Title:
HYBRID-IMAGE DISPLAY DEVICE
Document Type and Number:
WIPO Patent Application WO/2016/189413
Kind Code:
A1
Abstract:
According to an implementation of the present specification there is provided an apparatus for creating a hybrid image. The apparatus comprises: a first light source configured to emit a first light in a first direction; a reflector disposed to intercept the first light and configured to at least partially reflect the first light in an output direction to produce a reflected first light. The apparatus also comprises a second light source configured to emit a second light, the reflector configured to at least partially transmit the second light in the output direction to produce a transmitted second light. The apparatus also comprises an input terminal configured to receive an input used to control one or more of the first light source and the second light source. The hybrid image comprises a combination of the reflected first light and the transmitted second light.

Inventors:
FOSTER NEIL ANDREW (CA)
MARDON GREGORY SCOTT (CA)
Application Number:
PCT/IB2016/052802
Publication Date:
December 01, 2016
Filing Date:
May 13, 2016
Assignee:
FOSTER NEIL ANDREW (CA)
MARDON GREGORY SCOTT (CA)
International Classes:
G02B27/18; A47F3/00; G02B27/10; G02B30/00; G06F3/048; G06F3/0488; G06Q20/00; G06Q30/02
Domestic Patent References:
WO2009138741A2 (2009-11-19)
Foreign References:
EP0559889A1 (1993-09-15)
US20130169551A1 (2013-07-04)
US20070058144A1 (2007-03-15)
US20030046166A1 (2003-03-06)
Attorney, Agent or Firm:
PERRY + CURRIER (Suite 50, Toronto Ontario M4T 1X3, CA)
Claims:
We claim:

1. An apparatus for creating a hybrid image, the apparatus comprising: a first light source configured to emit a first light in a first direction; a reflector disposed to intercept the first light, the reflector configured to at least partially reflect the first light in an output direction to produce a reflected first light; a second light source configured to emit a second light, the reflector configured to at least partially transmit the second light in the output direction to produce a transmitted second light; and an input terminal configured to receive an input used to control one or more of the first light source and the second light source; wherein the hybrid image comprises a combination of the reflected first light and the transmitted second light.

2. The apparatus of claim 1, wherein the input terminal comprises a control screen configured to display a user interface; and the input comprises a touch input received through the user interface.

3. The apparatus of claim 2, wherein the second light source comprises a first portion of a display disposed on a side of the reflector opposite the output direction, the first portion covered by the reflector; and the control screen comprises a second portion of the display, the second portion extending beyond a perimeter of the reflector thereby allowing the second portion to receive the touch input unobstructed by the reflector.

4. The apparatus of any one of claims 2 to 3, wherein the user interface comprises one or more dynamic input zones configured to receive the touch input, receiving the touch input triggering changes in an appearance of at least one of the input zones.

5. The apparatus of any one of claims 1 to 4, wherein the reflector is disposed at about 45° to the first direction.

6. The apparatus of any one of claims 1 to 5, further comprising one or more of a motion sensor and a proximity sensor configured to sense motion and proximity respectively in the vicinity of the apparatus.

7. The apparatus of any one of claims 1 to 6, wherein the second light source comprises a display oriented about parallel to the reflector.

8. The apparatus of any one of claims 1 to 7, further comprising: a first surface disposed at a first angle to the reflector, the first surface disposed on a side of the reflector opposite the output direction; and a first surface light source configured to illuminate at least a portion of the first surface.

9. The apparatus of claim 8, further comprising: a second surface disposed at a second angle to the reflector, the second surface disposed on the side of the reflector opposite the output direction, the second angle different from the first angle; and a second surface light source configured to illuminate at least a portion of the second surface.

10. The apparatus of claim 9, wherein at least one portion of one or more of the first surface and the second surface is at least partially translucent to human-visible light, the at least one portion configured to be backlit by its corresponding one of the first surface light source and the second surface light source.

11. The apparatus of any one of claims 9 to 10, wherein one or more of the first surface and the second surface comprises one or more of a fabric and an acrylic material.

12. The apparatus of any one of claims 9 to 11, wherein one or more of the first surface light source and the second surface light source is controlled by the input.

13. The apparatus of any one of claims 9 to 12, wherein the hybrid image comprises the combination of the reflected first light and the transmitted second light combined with one or more of a first surface light emanating from the first surface and a second surface light emanating from the second surface.

14. The apparatus of any one of claims 1 to 13, further comprising a sound emitter controlled by the input.

15. The apparatus of any one of claims 1 to 14, further comprising an output terminal configured to connect to auxiliary sources of one or more of light, sound, smell, physical movement, and materials, the auxiliary sources external to the apparatus, the output terminal configured to allow the input to control one or more of the auxiliary sources.

16. The apparatus of any one of claims 1 to 15, further comprising one or more of a payment information reader and an identification information reader configured to read payment information and identification information respectively and control one or more of the first light source and the second light source based on one or more of the payment information and the identification information.

17. A method of creating a hybrid image, the method comprising: receiving an input at an input terminal; controlling one or more of a first light source emitting a first light and a second light source emitting a second light based on the input; producing a reflected first light by at least partially reflecting the first light from a reflector; producing a transmitted second light by at least partially transmitting the second light through the reflector; and creating the hybrid image comprising a combination of the reflected first light and the transmitted second light.

18. The method of claim 17, wherein the input terminal comprises a control screen; and the receiving the input comprises displaying a user interface on the control screen; and receiving a touch input at the control screen.

19. The method of claim 18, wherein the second light source comprises a first portion of a display and the control screen comprises a second portion of the display.

20. The method of any one of claims 18 to 19, further comprising: displaying one or more dynamic input zones on the control screen, the dynamic input zones configured to receive the touch input; and changing an appearance of at least one of the input zones in response to the touch input.

21. The method of any one of claims 17 to 20, further comprising: sensing one or more of motion and proximity in the vicinity of one or more of the first light source, the second light source, and the reflector; and controlling one or more of the first light source and the second light source based on one or more of the sensed motion and proximity.

22. The method of any one of claims 17 to 21, further comprising: one or more of: illuminating a first surface using a first surface light source, the first surface disposed at a first angle to the reflector; and illuminating a second surface using a second surface light source, the second surface disposed at a second angle to the reflector, the second angle different from the first angle; and wherein the hybrid image comprises the combination of the reflected first light and the transmitted second light combined with one or more of: a first surface light emanating from the first surface and transmitted through the reflector, and a second surface light emanating from the second surface and transmitted through the reflector.

23. The method of claim 22, further comprising: controlling one or more of the first surface light source and the second surface light source based on the input.

24. The method of any one of claims 22 to 23, further comprising producing, using a sound emitter, one or more sounds based on the input.

25. The method of claim 24, further comprising synchronizing with each other two or more of the first light source, the second light source, the first surface light source, the second surface light source, and the sound emitter to produce a hybrid presentation.

26. The method of any one of claims 17 to 25, wherein one or more of the first light source and the second light source comprises a display having a plurality of pixels; and the controlling comprises selecting, based on the input, one or more of a given set of one or more images from a library of sets of images for being displayed by one or more of the first light source and the second light source; and a given number of and given coordinates of the pixels of one or more of the first light source and the second light source for displaying the given set of one or more images.

27. The method of any one of claims 17 to 26, further comprising controlling, based on the input, one or more auxiliary sources of one or more of light, sound, smell, physical movement, and materials.

Description:
HYBRID-IMAGE DISPLAY DEVICE

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority from United States Provisional Patent Application No. 62/167,113 filed on May 27, 2015, which is incorporated herein by reference in its entirety.

FIELD

[0002] The present specification relates to interactive hybrid-image display devices and methods, and more particularly to retail marketing interactive hybrid-image display devices and methods.

BACKGROUND

[0003] Generally retail customers want to visually examine and interact with a product before making the decision to purchase that product. However, conventional retail presentation of products has several limitations that restrict the ability of a customer to examine and interact with a product. First, the limited shelf space in a store presents a physical limit to the number of products that can be conveniently presented to the customer. Second, when a visual representation of a product is presented on a paper flyer or a conventional in-store screen, these visual representations often lack sufficient detail and do not allow the customer to interact with and examine the product through its visual representation. Third, product packaging, especially packaging intended to be tamper-proof or to protect the integrity of products, can present a further obstacle to potential customers' ability to examine and interact with the product.

SUMMARY

[0004] According to an implementation of the present specification there is provided an apparatus for creating a hybrid image. The apparatus comprises: a first light source configured to emit a first light in a first direction; a reflector disposed to intercept the first light, the reflector configured to at least partially reflect the first light in an output direction to produce a reflected first light; a second light source configured to emit a second light, the reflector configured to at least partially transmit the second light in the output direction to produce a transmitted second light; and an input terminal configured to receive an input used to control one or more of the first light source and the second light source. The hybrid image comprises a combination of the reflected first light and the transmitted second light.

[0005] The input terminal can comprise a control screen configured to display a user interface; and the input can comprise a touch input received through the user interface.

[0006] The second light source can comprise a first portion of a display disposed on a side of the reflector opposite the output direction, the first portion covered by the reflector; and the control screen can comprise a second portion of the display, the second portion extending beyond a perimeter of the reflector thereby allowing the second portion to receive the touch input unobstructed by the reflector.

[0007] The user interface can comprise one or more dynamic input zones configured to receive the touch input, receiving the touch input triggering changes in an appearance of at least one of the input zones.

[0008] The reflector can be disposed at about 45° to the first direction.

[0009] The apparatus can further comprise one or more of a motion sensor and a proximity sensor configured to sense motion and proximity respectively in the vicinity of the apparatus.

[0010] The second light source can comprise a display oriented about parallel to the reflector.

[0011] The apparatus can further comprise: a first surface disposed at a first angle to the reflector, the first surface disposed on a side of the reflector opposite the output direction; and a first surface light source configured to illuminate at least a portion of the first surface.

[0012] The apparatus can further comprise: a second surface disposed at a second angle to the reflector, the second surface disposed on the side of the reflector opposite the output direction, the second angle different from the first angle; and a second surface light source configured to illuminate at least a portion of the second surface.

[0013] At least one portion of one or more of the first surface and the second surface can be at least partially translucent to human-visible light, the at least one portion configured to be backlit by its corresponding one of the first surface light source and the second surface light source.

[0014] One or more of the first surface and the second surface can comprise one or more of a fabric and an acrylic material.

[0015] One or more of the first surface light source and the second surface light source can be controlled by the input.

[0016] The hybrid image can comprise the combination of the reflected first light and the transmitted second light combined with one or more of a first surface light emanating from the first surface and a second surface light emanating from the second surface.

[0017] The apparatus can further comprise a sound emitter controlled by the input.

[0018] The apparatus can further comprise an output terminal configured to connect to auxiliary sources of one or more of light, sound, smell, physical movement, and materials, the auxiliary sources external to the apparatus, the output terminal configured to allow the input to control one or more of the auxiliary sources.

[0019] The apparatus can further comprise one or more of a payment information reader and an identification information reader configured to read payment information and identification information respectively and control one or more of the first light source and the second light source based on one or more of the payment information and the identification information.

[0020] According to another implementation of the present specification there is provided a method of creating a hybrid image, the method comprising: receiving an input at an input terminal; controlling one or more of a first light source emitting a first light and a second light source emitting a second light based on the input; producing a reflected first light by at least partially reflecting the first light from a reflector; producing a transmitted second light by at least partially transmitting the second light through the reflector; and creating the hybrid image comprising a combination of the reflected first light and the transmitted second light.

[0021] The input terminal can comprise a control screen; and the receiving the input can comprise displaying a user interface on the control screen; and receiving a touch input at the control screen.

[0022] The second light source can comprise a first portion of a display and the control screen can comprise a second portion of the display.

[0023] The method can further comprise: displaying one or more dynamic input zones on the control screen, the dynamic input zones configured to receive the touch input; and changing an appearance of at least one of the input zones in response to the touch input.

[0024] The method can further comprise: sensing one or more of motion and proximity in the vicinity of one or more of the first light source, the second light source, and the reflector; and controlling one or more of the first light source and the second light source based on one or more of the sensed motion and proximity.

[0025] The method can further comprise one or more of: illuminating a first surface using a first surface light source, the first surface disposed at a first angle to the reflector; and illuminating a second surface using a second surface light source, the second surface disposed at a second angle to the reflector, the second angle different from the first angle. The hybrid image can comprise the combination of the reflected first light and the transmitted second light combined with one or more of: a first surface light emanating from the first surface and transmitted through the reflector, and a second surface light emanating from the second surface and transmitted through the reflector.

[0026] The method can further comprise controlling one or more of the first surface light source and the second surface light source based on the input.

[0027] The method can further comprise producing, using a sound emitter, one or more sounds based on the input.

[0028] The method can further comprise synchronizing with each other two or more of the first light source, the second light source, the first surface light source, the second surface light source, and the sound emitter to produce a hybrid presentation.

[0029] One or more of the first light source and the second light source can comprise a display having a plurality of pixels; and the controlling can comprise selecting, based on the input, one or more of a given set of one or more images from a library of sets of images for being displayed by one or more of the first light source and the second light source; and a given number of and given coordinates of the pixels of one or more of the first light source and the second light source for displaying the given set of one or more images.

[0030] The method can further comprise controlling, based on the input, one or more auxiliary sources of one or more of light, sound, smell, physical movement, and materials.

BRIEF DESCRIPTION OF THE DRAWINGS

[0031] Implementations of the present specification will now be described, by way of example only, with reference to the attached Figures, wherein:

[0032] Fig. 1 depicts a top perspective view of an exemplary hybrid-image display device, according to non-limiting implementations.

[0033] Fig. 2 depicts a schematic, side elevation, cross-section of a portion of the device of Fig. 1.

[0034] Fig. 3 depicts a side elevation view of the device of Fig. 1, with a selection of internal components shown in dashed lines.

[0035] Fig. 4 depicts a front elevation view of the device of Fig. 1, with a selection of internal components shown in dashed lines.

[0036] Fig. 5 depicts a rear elevation view of the device of Fig. 1, with a rear outer panel of the device removed to reveal some internal components. Fig. 5 also shows a selection of otherwise obscured other internal components, shown in dashed lines.

[0037] Fig. 6 depicts a top perspective view of another exemplary hybrid-image display device, according to non-limiting implementations.

[0038] Fig. 7 depicts a side view of the device of Fig. 6.

[0039] Fig. 8 depicts a front view of the device of Fig. 6.

[0040] Fig. 9 depicts a bottom rear perspective view of the device of Fig. 6.

[0041] Fig. 10 depicts a top perspective view of another exemplary hybrid-image display device, according to non-limiting implementations.

[0042] Fig. 11 depicts a flow chart showing steps of a method of creating a hybrid image, according to non-limiting implementations.

DETAILED DESCRIPTION

[0043] The present specification discloses a display device capable of forming an interactive, hybrid image which can include, but is not limited to, an interactive hybrid image of an object or a product. This display device can allow a large number of products to be presented to potential customers without the need for additional retail shelf space for each additional product. This hybrid image can produce in a viewer the perception that the hybrid image has depth and/or is three-dimensional (3D). Moreover, since the 3D image is interactive, the potential customer can manipulate the image and examine the visual details of the product from multiple perspectives unhindered by any product packaging. The present specification also discloses a method of forming such interactive 3D hybrid images.

[0044] Fig. 1 shows an exemplary hybrid-image display device 100 comprising a housing 105 which supports a first display 110, a reflector 115, and a second display 120. First display 110 and/or second display 120 can comprise LCD, LED, OLED, plasma, or any other suitable type of display. Reflector 115 is disposed at an acute angle to first display 110. Second display 120 is disposed on the side of reflector 115 opposite the side closest to first display 110. Reflector 115 partially overlaps second display 120 such that a first portion 125 of display 120 is covered by reflector 115. A second portion 130 of display 120 extends beyond a perimeter of reflector 115, and as such is not covered or obstructed by reflector 115. As will be described in greater detail below, reflector 115 reflects at least a portion of a light emitted by first display 110 and transmits at least a portion of the light emitted by first portion 125 of second display 120. The combination and/or layering of the reflected light and the transmitted light forms a hybrid image which can appear to the viewer as having depth and/or being 3D. Second portion 130 of display 120 can be used as a control screen to display a user interface and to receive touch input from a user/viewer. Hereafter, "user" and "viewer" are used interchangeably. The touch input can be used to control first display 110 and/or second display 120, thereby controlling the hybrid image and allowing the user to manipulate the hybrid image. This, in turn, can allow a user to interact with the hybrid image.

[0045] Device 100 can further comprise optional sensors to sense its external environment, including the presence of a user. For example, device 100 can comprise proximity sensors 135 to detect whether a user is nearby. In addition and/or instead, device 100 can comprise motion sensors 140 to detect movement in its proximity. One or a combination of these sensors can be used by device 100 to detect a user and to produce a response. For example, the response can comprise turning on or waking up device 100 to capture the attention of a nearby user for potential retail marketing purposes.

[0046] Device 100 can also comprise an optional magnetic card reader 145 that can be used to read information from cards including, but not limited to, identification, loyalty, rewards, gift, and/or payment cards. In some implementations, card reader 145 can also write information back onto these cards. This information can also be used to control one or more of the first display 110 and the second display 120, in order to control the hybrid image.

[0047] Device 100 can also comprise one or more optional openings 150 in one or more of its side panels. Openings 150 can allow for air circulation into and out of housing 105 to cool the electronic components inside housing 105. In addition, openings 150 can allow sound generated by any speakers inside housing 105 to more easily exit housing 105.

[0048] Fig. 2 shows in cross-section a schematic drawing of a portion of device 100, and depicts the light rays that can form the hybrid image. First display 110 emits light X in the first direction 205. Light X propagates towards reflector 115, which intercepts light X and at least partially reflects light X to form reflected light X' propagating in an output direction 210 and towards viewer 215. First portion 125 of second display 120 emits light Y, which is at least partially transmitted by reflector 115 to form transmitted light Y' also propagating in the output direction 210 towards viewer 215. The combination and/or layering of reflected light X' and transmitted light Y' forms the hybrid image which can be perceived by viewer 215 as having depth or a 3D quality.
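
As a rough illustration that is not stated in the specification, if reflector 115 is assumed to have reflectance R toward output direction 210 and transmittance T, the layered hybrid image reaching viewer 215 can be modelled as

\[ I_{\mathrm{hybrid}} \approx R\,I_X + T\,I_Y \]

where \(I_X\) and \(I_Y\) are the intensities emitted by first display 110 and by first portion 125 of second display 120, respectively; the optional backlight Z described below would contribute a further transmitted term \(T\,I_Z\).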

[0049] It is contemplated that reflected light X' can be substantially the same as light X, except for its direction. It is also contemplated that transmitted light Y' can be substantially the same as light Y.

[0050] Second portion 130 of display 120 can be used to display a user interface for receiving input from viewer 215. In addition, second portion 130 of display 120 can act as an input terminal for receiving a touch input from viewer 215, which touch input can be used to control one or more of first display 110 and second display 120, thereby controlling and manipulating the hybrid image. The touch input can include, but is not limited to, single- or multi-finger touch, gestures, swipes, force-sensitive touch, and/or single- or multi-object touch such as touch by a stylus or an electronic pen or controller. The receiving of touch input can allow viewer 215 to interact with the hybrid image; for example to zoom, rotate, translate, transform, actuate, or activate the object represented in the hybrid image. This in turn can allow viewer 215 to view and/or examine the object represented in the hybrid image from different directions and perspectives, and in different states.

[0051] Although Fig. 2 shows one output direction 210, viewing angles of device 100 are not limited to output direction 210. It is contemplated that output directions can comprise a range of directions away from reflector 115 and outwardly of device 100 in which a potential viewer can perceive at least some of the light from first display 110 as reflected from reflector 115 and at least some of the light from first portion 125 of second display 120 transmitted through reflector 115.

[0052] Optionally, a further backlight Z can also be transmitted through reflector 115 to form transmitted backlight Z' propagating in the output direction 210 towards viewer 215. Transmitted backlight Z' can combine with reflected light X' and transmitted light Y' to contribute to the formation of the hybrid image. It is contemplated that in some implementations transmitted light Z' can be substantially the same as light Z.

[0053] Backlight Z can be emitted by any number of suitable sources. For example, optionally device 100 can comprise a base 217 disposed on the side of reflector 115 opposite output direction 210 (i.e. the side of reflector 115 opposite the side closest to first display 110). As shown in Fig. 2, base 217 can be disposed at an acute angle to reflector 115. The angle can be about 45°. Base 217 can define a surface 220 which can be illuminated by an optional corresponding internal surface light source 230. The resulting light emanating from surface 220 can form all or a portion of backlight Z. In addition, optionally, device 100 can comprise a back 222 disposed on the side of reflector 115 opposite output direction 210. As shown in Fig. 2, back 222 can be disposed at an acute angle to reflector 115. The angle can be about 45°. Back 222 can define a surface 225 which can be illuminated by a corresponding internal light source, such as surface light source 230. The resulting light emanating from surface 225 can also form all or a portion of backlight Z. In some implementations, light emitted by surface light source 230 itself can directly form all or a portion of backlight Z.

[0054] In some implementations, at least a portion of one or more of base 217 and back 222 can be at least partially translucent to human-visible light and can be backlit by an internal light source located outside of the space defined by reflector 115, base 217, and back 222. For example, one or more of base 217 and back 222 can comprise fabric, acrylic, or other suitable materials. While the internal surface light source is shown as being located in one corner of the space defined by reflector 115, base 217, and back 222, it is contemplated that surface light source 230 can be located in any suitable location in that space. Surface light source 230, and any other similar surface light sources, can also be controlled by the input received from the user at second portion 130 of second display 120.

[0055] In device 100, reflector 115 is disposed at about 45° to first direction 205 of light X emitted by first display 110. This angle can reduce stray reflections in device 100. In other implementations, reflector 115 can form an angle with first direction 205 of light X that is between about 30° and about 60°. In yet other implementations, reflector 115 can form an angle with first direction 205 of light X that is greater than about 60° or less than about 30°. In some implementations, the angle between reflector 115 and first direction 205 of light X can be adjustable, which can allow for adjusting optimal viewing directions and the hybrid image.
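
As a purely illustrative sketch that is not part of the specification, the 90° turn produced by a reflector at about 45° to first direction 205 follows from the standard ray-reflection formula r = d − 2(d·n)n; the vector values and the reflect() helper below are assumptions chosen only to match the geometry of Fig. 2.

    import math

    def reflect(d, n):
        """Reflect direction vector d off a planar reflector with unit normal n."""
        dot = d[0] * n[0] + d[1] * n[1]
        return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

    # Light X travels in first direction 205, taken here as straight down.
    d = (0.0, -1.0)
    # A reflector at 45 degrees has its unit normal 45 degrees from vertical.
    theta = math.radians(45)
    n = (math.sin(theta), math.cos(theta))

    # Prints approximately (1.0, 0.0): reflected light X' leaves horizontally,
    # i.e. turned 90 degrees into output direction 210.
    print(reflect(d, n))

Changing theta in this sketch also shows why an adjustable reflector angle shifts the optimal viewing direction.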

[0056] In device 100, first portion 125 of second display 120 is oriented about parallel to reflector 115 to reduce optical artifacts. However, it is contemplated that first portion 125 can be oriented at any acute angle to reflector 115. In addition, Fig. 2 shows a small gap between first portion 125 and reflector 115. However, it is contemplated that first portion 125 can be adjacent to and/or in contact with reflector 115.

[0057] Reflector 115 can be shaped as a plate, a sheet, or other optically suitable shape. Reflector 115 generally comprises a material that can reflect at least a portion of the light X incident upon it, while transmitting at least a portion of the light Y existing on the side of the reflector 115 opposite the side receiving the light X. Reflector 115 can comprise, but is not limited to, a glass material, a plastic material, a glass composite material, and/or a plastic composite material. Reflector 115 can comprise an at least partially transmissive glass or plastic matrix having embedded within it a set of oriented reflective particles. These reflective particles can comprise metallic flakes. Reflector 115 can comprise Pepper's ghost glass, treated glass, and/or optical hologram film.

[0058] Fig. 3 shows a side elevation view of device 100, with a selection of the internal components shown in dashed lines. Fig. 3 shows housing 105 supporting first display 110, reflector 115, and second display 120. Fig. 3 also shows motion sensor 140, card reader 145, and openings 150 of device 100. In addition, Fig. 3 shows base 217 defining surface 220 and back 222 defining surface 225.

[0059] Fig. 4 shows a front elevation view of device 100, with a selection of the internal components shown in dashed lines. Fig. 4 shows housing 105 supporting first display 110, reflector 115, and second display 120 having a first portion 125 and second portion 130. First portion 125 is shown in dashed lines as it is covered by reflector 115. Base 217 defining surface 220 is also depicted. Proximity sensors 135 and motion sensors 140 are also shown secured to an upper portion of housing 105 near first display 110. Magnetic card reader 145 is also visible, secured to housing 105 on one side of reflector 115.

[0060] Fig. 5 shows a rear elevation view of device 100, with a rear outer panel removed to reveal a selection of internal components of device 100. A selection of other internal components that would otherwise be obscured from view is shown in dashed lines. Fig. 5 shows housing 105 supporting first display 110. Internal surface light source 230 is secured to a portion of housing 105, such that internal surface light source 230 can illuminate surface 225 defined by back 222 (surface 225 and back 222 not visible in Fig. 5, but shown in Fig. 3). An optional second internal surface light source 505 is also secured to housing 105, and can be configured to illuminate surface 220 defined by base 217. Surface light source 230 can also be referred to as "back light" and surface light source 505 can also be referred to as "stage light". In some implementations, one or more of the back light and the stage light can be controlled by the input received from the user.

[0061] Fig. 5 also shows controllers 510,515 configured to control the images displayed by first display 110 and second display 120. Controllers 510,515 can comprise a memory and processors such as a Central Processing Unit and/or a Graphics Processing Unit. In some implementations, controllers 510,515 can comprise a video decoding unit capable of decoding two videos simultaneously. Controllers 510,515 can receive the input from the user (e.g. received at second portion 130 of second display 120) and determine what image or images are to be displayed on first display 110 and second display 120. Controllers 510,515 can also determine what images are displayed on first portion 125 of second display 120 and what user interface is displayed on second portion 130 of second display 120 (first portion 125 and second portion 130 not labeled in Fig. 5, but labeled in Fig. 4), which can also be determined based on the input received from the user. Input from the user is not limited to input received at second portion 130 of second display 120, and can include, but is not limited to, data received through barcode scanners, motion sensors, RFID, GPIO and USB button controls, IR remote controls, serial devices, keyboards and mice.

[0062] Controllers 510 and 515 can each control a corresponding one of first display 110 and second display 120. In some implementations, device 100 can comprise only one controller to control all the displays. While device 100 is shown as having onboard memory and/or processors in the form of controllers 510,515, it is contemplated that device 100 can have no onboard memory and/or processors, i.e. have no controllers. Alternatively, the memory and/or processors can be integrated into each corresponding display; for example, one or more of first display 110 and second display 120 can comprise a tablet such as an iPad™.

[0063] Device 100 can also comprise an optional GPS locator 520 that can be used to track the location of device 100. If device 100 is implemented as part of a mobile marketing unit, GPS locator 520 can be used to tailor or customize the content displayed by one or more of first display 110 and second display 120 based on the location of device 100. An example of a mobile marketing unit can comprise, but is not limited to, a truck or other vehicle transporting marketing materials, messages, and/or devices to various locations.

[0064] Device 100 can also comprise an optional circuit board and/or microcontroller 525 configured to communicate on/off and RGB commands to one or more of first internal surface light source 230 and second internal surface light source 505. Microcontroller 525 can receive commands from one or more of controllers 510,515 and in response send programmed responses to internal surface light sources 230 and 505. In some implementations, these internal light sources can be controlled directly by one or more of controllers 510,515, which can send on/off and other commands directly to internal surface light sources 230 and 505.
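
A minimal sketch, under assumptions not found in the specification, of how a controller such as controller 510 or 515 might relay on/off and RGB commands toward internal surface light sources 230 and 505 through microcontroller 525; the text command format, the channel names, and the send() transport are all hypothetical.

    def make_command(channel: str, on: bool, rgb: tuple) -> str:
        """Encode an on/off and RGB command as a simple text message, e.g. 'BACK ON 255,160,60'."""
        r, g, b = rgb
        return f"{channel} {'ON' if on else 'OFF'} {r},{g},{b}"

    def send(command: str) -> None:
        # Placeholder transport; a real build might use a serial or GPIO link to microcontroller 525.
        print("-> microcontroller 525:", command)

    # Example: warm back light (surface light source 230) on, stage light (505) off.
    send(make_command("BACK", True, (255, 160, 60)))
    send(make_command("STAGE", False, (0, 0, 0)))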

[0065] Device 100 can also comprise one or more optional sound emitters 530, which can comprise, but are not limited to, speakers. Sound emitters 530 can be controlled by one or more of controllers 510,515 and can be activated and/or controlled based on the input from the user. Sound emitters 530 can also be controlled by input from sensors incorporated in device 100 and/or based on internal programming of controllers 510,515. In some implementations, controllers 510,515 can be configured to synchronize the sound emitted from sound emitters 530 with images and/or video displayed by one or more of first display 110 and second display 120 to create a hybrid audio-visual presentation.

[0066] Housing 105 of device 100 can be made of materials comprising metals or any other sufficiently rigid and strong material such as high-strength plastic, wood, and the like. In some implementations, housing 105 can comprise one or more panels removably secured to a structural frame. Removable panels can allow for easy access to internal components of device 100 to service and/or exchange those components.

[0067] Fig. 6 shows a top perspective view of another implementation of hybrid-image display device 600, which is generally similar to device 100. The differences between device 600 and device 100 are described in greater detail below. Device 600 comprises a housing 605 supporting first display 110, reflector 115, and second display 120. Housing 605 comprises a display guard 610 which surrounds the edges of second portion 130 of second display 120. Display guard 610 can protect second display 120 from impact by users, shopping carts, and other objects and forces external to device 600. It is also contemplated that display guard 610 can surround a subset or all of the exposed edges of second portion 130 of second display 120. Device 600 can also comprise openings 650 to facilitate sound generated inside housing 605 to travel outside housing 605.

[0068] Fig. 7 shows a side view of device 600, showing housing 605 supporting first display 110 and comprising display guard 610 and openings 650.

[0069] Fig. 8 shows a front view of device 600, showing housing 605 supporting first display 110, reflector 115, and second display 120. Fig. 8 also shows display guard 610 surrounding and protecting edges of second portion 130 of second display 120.

[0070] Fig. 9 shows a rear, bottom perspective view of device 600, comprising housing 605 supporting first display 110 and having openings 650. Housing 605 also comprises openings 955 in a bottom panel of housing 605 near edges of second portion 130 of second display 120. Openings 955 can be configured to facilitate sound to emerge from inside housing 605, and also to allow air to circulate into and out of housing 605 to cool the components inside housing 605. In addition, housing 605 can comprise openings 960 in a rear panel of housing 605. Openings 960 can perform a function similar to openings 955.

[0071] Fig. 10 shows a top perspective view of another implementation of the hybrid-image display device 1000. Device 1000 comprises display device 100 secured to a stand 1005. Stand 1005 can be of any shape or dimension so long as stand 1005 can support device 100 at a height at which a user can view the hybrid image formed by device 100 and can give input to device 100 and interact with the hybrid image. In other implementations, instead of device 100, device 600 can be secured to stand 1005. In yet other implementations, stand 1005 can have an adjustable height to adjust the level of device 100 or device 600 for the height of each user.

[0072] While devices 100 and 600 are exemplary implementations of the hybrid-image display device, other variations and implementations are also possible. For example, and without limiting the possible implementations and variations, one or more of the first display and the second display can comprise any other suitable light source. For example, a light projector can be used instead of a display. In general, the first and second displays can comprise any light source capable of emitting light and producing an image.

[0073] In addition, while in devices 100 and 600 second portion 130 of second display 120 acts as the input terminal to receive touch input from the user, it is contemplated that any other type of input terminal can be used. For example, the input terminal can be separate from the first portion 125 of second display 120, i.e. second display 120 may not have a second portion 130. The input terminal can comprise a keypad, a mouse, a joystick, a pedal, a controller, a touch screen separate from second display 120, a mobile device, or any other suitable input device configured to receive input from the user.

[0074] In implementations where the input terminal comprises a control screen, the control screen can be configured to display a user interface and to receive a touch input from the user through the user interface. The user interface can be generated and/or controlled by one or more of controllers 510,515. Controllers 510,515 can detect the input from the user, and in response control one or more of the first and second displays, the first and second internal surface light sources (i.e. the back light and the stage light), and the sound emitter. Controllers 510,515 can also change the user interface based on the input from the user. The user interface can comprise one or more dynamic input zones configured to receive the touch input, where receiving the touch input can trigger corresponding changes in an appearance of at least one of the input zones. The dynamic input zones can comprise touch zones, whereby touching anywhere in that zone constitutes a given selection by the user.
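
The dynamic input zones can be sketched as follows; this is a non-authoritative illustration with invented names (InputZone, handle_touch) showing only the idea that touching anywhere inside a zone constitutes a selection and changes that zone's appearance.

    from dataclasses import dataclass

    @dataclass
    class InputZone:
        x: int
        y: int
        width: int
        height: int
        label: str
        highlighted: bool = False  # stands in for the zone's appearance

        def contains(self, tx: int, ty: int) -> bool:
            return (self.x <= tx < self.x + self.width
                    and self.y <= ty < self.y + self.height)

    def handle_touch(zones, tx, ty):
        """Return the touched zone, toggling its appearance; any touch inside a zone counts as that selection."""
        for zone in zones:
            if zone.contains(tx, ty):
                zone.highlighted = not zone.highlighted
                return zone
        return None

    zones = [InputZone(0, 0, 200, 100, "Rotate"), InputZone(0, 100, 200, 100, "Zoom")]
    selected = handle_touch(zones, 50, 150)
    print(selected.label if selected else "no zone touched")  # prints "Zoom"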

[0075] In implementations where the input terminal comprises second portion 130 of second display 120, second portion 130 extends beyond the perimeter of reflector 115 thereby allowing second portion 130 to receive touch input unobstructed by reflector 115.

[0076] Moreover, while in devices 100 and 600 first portion 125 and second portion 130 are shown as being two rectangular portions of about comparable size, the respective shapes and relative sizes of the first and second portions can be different. For example, the boundary between the first and second portions can be sloped, curved, jagged, or otherwise irregularly shaped. In other words, the edge of reflector 115 which defines the boundary between first and second portions can be sloped, curved, jagged, or otherwise irregularly shaped. In implementations where the second light source and the input terminal are not portions of the same display, the second light source and the input terminal can be separated and/or spaced from one another.

[0077] In addition, in implementations where the second light source comprises a display and the input terminal comprises a touch screen, it is contemplated that the display and the touch screen may not be coplanar or disposed on parallel planes. For example, the angle and orientation of the second display can be selected to optimize forming the hybrid image, whereas the angle and orientation of the touch screen input terminal can be selected to optimize displaying a user interface to and receiving touch input from the user.

[0078] In device 100 (and also device 600), the projection of reflector 115 onto the plane defined by first display 110 can have dimensions comparable to dimensions of first display 110 and can overlay first display 110; see Fig. 2. This, in turn, can allow for all or substantially all the light emitted by first display 110 in direction 205 to be intercepted by reflector 115, and potentially reflected in the output direction 210 and useable to generate the hybrid image. Moreover, referring again to Fig. 2, a width of first portion 125 can be about commensurate with the width of reflector 115, also to allow for all or substantially all of the light emitted by first portion 125 to be incident upon reflector 115, and potentially transmitted in the output direction 210 and useable to form the hybrid image. The "width" can be the dimension in the out-of-page direction, i.e. the direction perpendicular to the direction 205 and to the output direction 210.

[0079] In some implementations, the hybrid-image display device can also comprise an output terminal configured to connect to auxiliary sources of one or more of light, sound, smell, physical movement, and materials. These auxiliary sources can be external to the display device. The output terminal can be configured to allow the input from the user and/or the controllers to control one or more of the auxiliary sources. In this manner, the user can also interact with the environment external to and/or surrounding the display device. In addition, the sounds and images produced by the display device can be synchronized with the effects produced by the auxiliary sources to produce a multi-sensory effect on the user.

[0080] Moreover, while device 100 is depicted as having magnetic card reader 145, in some implementations an RFID and/or chip card reader/writer can be used in addition and/or instead of magnetic card reader 145. In yet other implementations, a bar code reader can be used in addition and/or instead of magnetic card reader 145.

[0081] In some implementations, the hybrid-image display device can have wired or wireless data connectivity ability, for example using a network card in communication with one or more controllers 510,515. Network data connectivity can allow the display device to be remotely programmed, for example by utilizing remote content changeability. In addition, data connectivity can allow the display device to communicate and/or pair with a mobile device of a user in order for the user to control the display device and/or for the display device to display content from the user's mobile device such as a smartphone and smart wearable device.

[0082] In some implementations, controllers 510,515 have stored in their memory a collection of pre-recorded image sets or videos. Based on the input, the controllers can apply their internal programming to determine which content to play, which display(s) to play it on, and over which set of pixels on each display to play that content. This eliminates the need to render images and/or video on the fly and reduces the computing and power needs of device 100. This can also increase speed and enhance responsiveness of device 100. In other implementations, the controllers can render video on the fly based on the input from the user.
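
The content-selection idea of paragraph [0082] can be sketched as a simple lookup in a pre-recorded library; the data model, clip names, display labels, and pixel regions below are invented for illustration only, and the sketch merely shows why no on-the-fly rendering is required.

    from dataclasses import dataclass

    @dataclass
    class Placement:
        display: str   # "first" (display 110) or "second" (first portion 125 of display 120)
        region: tuple  # (x, y, width, height) in pixels
        clip: str      # identifier of a pre-recorded image set or video

    # Hypothetical library keyed by the selection made through the input terminal.
    LIBRARY = {
        "rotate_product": [
            Placement("first", (0, 0, 1920, 1080), "product_rotation.mp4"),
            Placement("second", (200, 100, 800, 600), "product_shadow.mp4"),
        ],
        "show_features": [
            Placement("second", (0, 0, 1920, 540), "feature_callouts.mp4"),
        ],
    }

    def select_content(user_input: str):
        """Return the pre-recorded placements to play for this input."""
        return LIBRARY.get(user_input, [])

    for placement in select_content("rotate_product"):
        print(f"play {placement.clip} on the {placement.display} display at {placement.region}")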

[0083] Fig. 11 shows in a flow chart the steps in an exemplary method 1100 for creating a hybrid image, such as the hybrid images that can be formed by the hybrid-image display devices described herein. In step 1105, an input is received at the input terminal. An exemplary input can comprise, but is not limited to, input from a user and/or viewer. In step 1110, based on the input, a first light source emitting a first light is controlled and/or a second light source emitting a second light is controlled. An exemplary first light source can comprise, but is not limited to, first display 110. An exemplary second light source can comprise, but is not limited to, first portion 125 of second display 120. For example, first light can comprise, but is not limited to, light X and second light can comprise, but is not limited to, light Y as shown in Fig. 2.

[0084] In step 1115, a reflected first light is produced by at least partially reflecting the first light from a reflector. For example, the reflected first light can comprise, but is not limited to, light X'. The reflector can comprise, but is not limited to, reflector 115. In step 1120, a transmitted second light is produced by at least partially transmitting the second light through the reflector. For example, the transmitted second light can comprise, but is not limited to, light Y'. In step 1125, the hybrid image is created comprising a combination and/or a layering of the reflected first light and the transmitted second light.

[0085] The input terminal can comprise a control screen and the receiving the input can comprise displaying a user interface on the control screen and receiving a touch input at the control screen.

[0086] In some implementations, the second light source can comprise a first portion of a display and the control screen can comprise a second portion of the same display. For example, as in the case of devices 100 and 600, the second light source can comprise first portion 125 of second display 120 and the control screen can comprise second portion 130 of second display 120. In some implementations, for example devices 100 and 600, first portion 125 and second portion 130 are not separate displays, but merely represent different sets of pixels of the same second display 120 which sets of pixels perform different functions: first portion 125 functions as the second light source whose light is transmitted through reflector 115 to contribute to forming of the hybrid image, whereas second portion 130 functions to display a user interface to the user and to receive touch input through the user interface from the user.

[0087] In some implementations, method 1100 further comprises displaying one or more dynamic input zones on the control screen, where the dynamic input zones are configured to receive the touch input. The input zones are visual zones which are dynamic in the sense that once a touch input is received, the appearance of at least one of the input zones can be changed in response to the touch input. In a device like device 100, this change of appearance can be performed by one or more of controllers 510,515.

[0088] In some implementations, method 1100 can further comprise sensing one or more of motion and proximity in the vicinity of one or more of the first light source, the second light source, and the reflector; and controlling one or more of the first light source and the second light source based on one or more of the sensed motion and proximity. For example, in device 100, proximity sensor 135 and motion sensor 140 can be used to detect user proximity or motion near device 100, and then communicate their output to one or more of controllers 510,515, which controllers in turn control first display 110 and second display 120 to change one or more of the hybrid image and the user interface.
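
A minimal polling sketch, under assumed sensor and display interfaces, of the sensing-and-control behaviour just described; read_proximity(), read_motion(), and the Display class are hypothetical stand-ins for proximity sensor 135, motion sensor 140, and displays 110 and 120.

    import random
    import time

    def read_proximity() -> bool:
        return random.random() < 0.1   # placeholder for proximity sensor 135

    def read_motion() -> bool:
        return random.random() < 0.1   # placeholder for motion sensor 140

    class Display:
        def __init__(self, name: str):
            self.name = name
            self.awake = False

        def wake(self) -> None:
            if not self.awake:
                self.awake = True
                print(f"{self.name}: waking up to show attract content")

    first_display = Display("first display 110")
    second_display = Display("second display 120")

    for _ in range(20):  # poll the sensors a few times
        if read_proximity() or read_motion():
            first_display.wake()
            second_display.wake()
        time.sleep(0.05)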

[0089] Method 1100 can also comprise one or more of: illuminating a first surface using a first surface light source, the first surface disposed at a first angle to the reflector, and illuminating a second surface using a second surface light source, the second surface disposed at a second angle to the reflector, the second angle different from the first angle. In the non-limiting example of device 100, the first surface can comprise surface 220, the first surface light source can comprise internal surface light source 505, the second surface can comprise surface 225, and the second surface light source can comprise internal surface light source 230. The hybrid image can comprise the combination and/or layering of the reflected first light and the transmitted second light combined and/or layered with one or more of a first surface light emanating from the first surface and transmitted through the reflector, and a second surface light emanating from the second surface and transmitted through the reflector. In the non-limiting example of device 100, light Z' can represent one or more of a first surface light emanating from the first surface and transmitted through the reflector, and a second surface light emanating from the second surface and transmitted through the reflector.

[0090] In some implementations, these first and second surface light sources can be controlled by the input from the user. For example, in the non-limiting case of device 100, the input can be communicated to one or more of controllers 510,515 which can in turn control one or more of internal surface light sources 230 and 505.

[0091] The method can further comprise using a sound emitter to produce one or more sounds based on the input. In the non-limiting example of device 100, the sound emitter can comprise sound emitter 530.

[0092] In some implementations, the method can further comprise synchronizing with each other two or more of the first light source, the second light source, the first surface light source, the second surface light source, and the sound emitter to produce a hybrid presentation. This hybrid presentation can comprise a hybrid audio-visual presentation.

[0093] In some implementations, one or more of the first light source and the second light source comprises a display having a plurality of pixels. Non-limiting examples include LCD, LED, OLED, and plasma displays which comprise addressable pixels. Step 1110 can comprise selecting, based on the input, a given set of one or more images from a library of sets of images for being displayed by one or more of the first light source and the second light source. In addition and/or in the alternative, step 1110 can comprise selecting, based on the input, a given number of and given coordinates of the pixels of one or more of the first light source and the second light source for displaying the given set of one or more images.

[0094] For example, based on the input, one or more of controllers 510,515 can select an image or a set of images (e.g. in the form of video) to display on one or more of first display 110 and second display 120. The controllers can also decide which subset of pixels (i.e. number and coordinate of pixels) should be used to display that content. The content can be selected from a library of pre-recorded or pre-rendered images and videos to reduce the computation and power needs of implementing this step and to enhance responsiveness.

[0095] Method 1100 can further comprise controlling, based on the input, one or more auxiliary sources of one or more of light, sound, smell, physical movement, and materials.

[0096] The methods described herein, including method 1100, are not limited to being performed on any of the devices described herein. The steps of these methods are described in reference to components of devices 100 and 600 for demonstrative purposes only. These methods can be performed using any suitable components and apparatuses.

[0097] The devices and method of the present specification can allow a user to interact and engage with a hybrid image having a 3D appearance. The hybrid image can allow the user to visually inspect various aspects of the objects represented in the hybrid image and to receive additional information about those objects that would not be readily available from a 2D image or video. Such interactive, 3D hybrid images can produce in the user a perception of tactile interaction with an object, as the user is able to manipulate a 3D representation of the object using touch input.

[0098] In the retail marketing context, the devices and methods of the present specification can enable merchants to attract and educate the consumer through a new marketing platform that allows for a higher profile within the retail environment and for interaction between the consumer and representations of a merchant's products without the need for the physical products to be present.

[0099] The above-described implementations are intended to be examples of the present invention and alterations and modifications may be effected thereto, by those of skill in the art, without departing from the scope of the invention which is defined solely by the claims appended hereto.