

Title:
DEVICE HAVING COMBINATION 2D-3D DISPLAY
Document Type and Number:
WIPO Patent Application WO/2023/039617
Kind Code:
A2
Abstract:
In accordance with an embodiment, a device includes: a frame and an autostereoscopic display coupled to the frame. The autostereoscopic display includes: a display panel including light-emitting diodes in a first far-field region, in a second far-field region, and in a near-field region, the near-field region disposed between the first far-field region and the second far-field region; a first microlens overlapping the first far-field region of the display panel; and a second microlens overlapping the second far-field region of the display panel, the second microlens spaced apart from the first microlens, where the near-field region of the display panel is free of microlenses.

Inventors:
WU PINGFAN (US)
HUA CHIWEI (US)
HONG HEATHER (US)
PENG LIANG (US)
Application Number:
PCT/US2022/082564
Publication Date:
March 16, 2023
Filing Date:
December 29, 2022
Assignee:
FUTUREWEI TECHNOLOGIES INC (US)
Attorney, Agent or Firm:
CORTIAUS, Stephen (US)
Claims:
WHAT IS CLAIMED:

1. A device comprising: a frame; and an autostereoscopic display coupled to the frame, the autostereoscopic display comprising: a display panel comprising light-emitting diodes in a first far-field region, in a second far-field region, and in a near-field region, the near-field region disposed between the first far-field region and the second far-field region; a first microlens overlapping the first far-field region of the display panel; and a second microlens overlapping the second far-field region of the display panel, the second microlens spaced apart from the first microlens, wherein the near-field region of the display panel is free of microlenses.

2. The device of Claim 1, wherein a distance between the first microlens and the second microlens is different from a size of the first microlens and a size of the second microlens.

3. The device of any of Claims 1-2, wherein the first microlens and the second microlens are round microlenses in a top-down view.

4. The device of any of Claims 1-2, wherein the first microlens and the second microlens are rectangular microlenses in a top-down view.

5. The device of any of Claims 1-2, wherein the first microlens and the second microlens are hexagonal microlenses in a top-down view.

6. The device of any of Claims 1-5, wherein the first far-field region and the second far-field region are ones of a plurality of far-field regions, the near-field region is one of a plurality of near-field regions, and a total area of the near-field regions is greater than a total area of the far-field regions.

7. The device of any of Claims 1-6 further comprising: a controller configured to change transparency of the first far-field region and the second far-field region.


8. A device comprising: a frame; a display panel coupled to the frame, the display panel comprising: a transistor layer; and a light-emitting diode layer over the transistor layer; a first microlens over a first portion of the light-emitting diode layer; and a second microlens over a second portion of the light-emitting diode layer, the first microlens and the second microlens each having a microlens size, the first microlens being separated from the second microlens by a separation distance, the separation distance being different from the microlens size.

9. The device of Claim 8, wherein the transistor layer is opaque.

10. The device of Claim 8, wherein the transistor layer is transparent.

11. The device of any of Claims 8-10, wherein the first microlens and the second microlens are part of a microlens array over the display panel.

12. The device of Claim 11, wherein microlenses of the microlens array are disposed a same distance from one another along a horizontal direction and along a vertical direction in a top-down view.

13. The device of any of Claims 8-10, wherein the first microlens is part of a first microlens strip over the display panel, and the second microlens is part of a second microlens strip over the display panel.

14. The device of any of Claims 8-10, wherein the first microlens and the second microlens are part of microlens strips over the display panel, and each of the microlens strips are disposed a same distance from one another.

15. The device of any of Claims 8-10, wherein the first microlens and the second microlens are part of a randomized microlens array over the display panel.

16. The device of any of Claims 8-15, wherein the separation distance is greater than the microlens size.

17. The device of any of Claims 8-15, wherein the separation distance is less than the microlens size.


18. A device comprising: a thin-film transistor layer comprising transistors; a light-emitting diode layer over the thin-film transistor layer, the light-emitting diode layer comprising light-emitting diodes, the transistors configured to control the light-emitting diodes; a parallax barrier over the light-emitting diode layer, the parallax barrier comprising: a first microlens over the light-emitting diode layer, the first microlens aligned with a first array of the light-emitting diodes; and a second microlens over the light-emitting diode layer, the second microlens aligned with a second array of the light-emitting diodes, the second microlens spaced apart from the first microlens.

19. The device of Claim 18, wherein the thin-film transistor layer is opaque.

20. The device of Claim 18, wherein the thin-film transistor layer is transparent.

Description:
Device Having Combination 2D-3D Display

PRIORITY CLAIM AND CROSS-REFERENCE

[0001] This patent application claims priority to U.S. Provisional Application No. 63/421,849, filed on November 02, 2022, which is hereby incorporated by reference herein as if reproduced in its entirety.

TECHNICAL FIELD

[0002] The present invention relates generally to a device having a three-dimensional (3D) display, and in particular embodiments, to techniques and mechanisms for a device having a combination two-dimensional (2D)-3D display.

BACKGROUND

[0003] The metaverse is part of the next-generation internet that is real-time, interactive, social, and persistent. A key technology for the metaverse is augmented reality that includes displays that provide real imagery overlaid with generated display items. Optical combiner displays may be used for augmented reality once such functionality can be produced. As the interest in augmented reality has grown, a need for improved optical combiner displays that support such functionality has emerged. Other types of displays, such as 2D displays, may also have applications in the metaverse.

SUMMARY OF THE INVENTION

[0004] Technical advantages are generally achieved by embodiments of this disclosure, which describe a device having an optical see-through three-dimensional (3D) display.

[0005] In accordance with an embodiment, a device includes: a frame; and an autostereoscopic display coupled to the frame, the autostereoscopic display including: a display panel including light-emitting diodes in a first far-field region, in a second far-field region, and in a near-field region, the near-field region disposed between the first far-field region and the second far-field region; a first microlens overlapping the first far-field region of the display panel; and a second microlens overlapping the second far-field region of the display panel, the second microlens spaced apart from the first microlens, where the near-field region of the display panel is free of microlenses. In accordance with some embodiments of the device, a distance between the first microlens and the second microlens is different from a size of the first microlens and a size of the second microlens. In accordance with some embodiments of the device, the first microlens and the second microlens are round microlenses in a top-down view. In accordance with some embodiments of the device, the first microlens and the second microlens are rectangular microlenses in a top-down view. In accordance with some embodiments of the device, the first microlens and the second microlens are hexagonal microlenses in a top-down view. In accordance with some embodiments of the device, the first far-field region and the second far-field region are ones of a plurality of far-field regions, the near-field region is one of a plurality of near-field regions, and a total area of the near-field regions is greater than a total area of the far-field regions. In accordance with some embodiments, the device further includes: a controller configured to change transparency of the first far-field region and the second far-field region.

[0006] In accordance with an embodiment, a device includes: a frame; a display panel coupled to the frame, the display panel including: a transistor layer; and a light-emitting diode layer over the transistor layer; a first microlens over a first portion of the light-emitting diode layer; and a second microlens over a second portion of the light-emitting diode layer, the first microlens and the second microlens each having a microlens size, the first microlens being separated from the second microlens by a separation distance, the separation distance being different from the microlens size. In accordance with some embodiments of the device, the transistor layer is opaque. In accordance with some embodiments of the device, the transistor layer is transparent. In accordance with some embodiments of the device, the first microlens and the second microlens are part of a microlens array over the display panel. In accordance with some embodiments of the device, microlenses of the microlens array are disposed a same distance from one another along a horizontal direction and along a vertical direction in a top-down view. In accordance with some embodiments of the device, the first microlens is part of a first microlens strip over the display panel, and the second microlens is part of a second microlens strip over the display panel. In accordance with some embodiments of the device, the first microlens and the second microlens are part of microlens strips over the display panel, and each of the microlens strips are disposed a same distance from one another. In accordance with some embodiments of the device, the first microlens and the second microlens are part of a randomized microlens array over the display panel. In accordance with some embodiments of the device, the separation distance is greater than the microlens size. In accordance with some embodiments of the device, the separation distance is less than the microlens size.

[0007] In accordance with an embodiment, a device includes: a thin-film transistor layer including transistors; a light-emitting diode layer over the thin-film transistor layer, the light-emitting diode layer including light-emitting diodes, the transistors configured to control the light-emitting diodes; a parallax barrier over the light-emitting diode layer, the parallax barrier including: a first microlens over the light-emitting diode layer, the first microlens aligned with a first array of the light-emitting diodes; and a second microlens over the light-emitting diode layer, the second microlens aligned with a second array of the light-emitting diodes, the second microlens spaced apart from the first microlens. In accordance with some embodiments of the device, the thin-film transistor layer is opaque. In accordance with some embodiments of the device, the thin-film transistor layer is transparent.

[0008] Embodiments may achieve advantages. Utilizing the microlenses on the display panel allows the resulting light-emitting units to emit a light field that is suitable for displaying 3D images. A display may be more compact than other types of displays, such as waveguide combiner displays.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] For a more complete understanding of the present disclosure, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:

[0010] Figures 1A-1B are views of eyeglasses, in accordance with some embodiments;

[0011] Figure 2 is a block diagram of eyeglasses, in accordance with some embodiments;

[0012] Figure 3 is a flow diagram of a method for displaying augmented reality content on a three-dimensional display, in accordance with some embodiments;

[0013] Figure 4 is a schematic diagram of a three-dimensional display during operation, in accordance with some embodiments;

[0014] Figure 5 is a schematic diagram of three-dimensional displays during display of a 3D image, in accordance with some embodiments;

[0015] Figures 6A-6B are views of a three-dimensional display, in accordance with some embodiments;

[0016] Figures 7A-7G are top-down views of three-dimensional displays, in accordance with various embodiments;

[0017] Figures 8-11 are cross-sectional views of intermediate stages in the manufacturing of a three-dimensional display, in accordance with various embodiments;

[0018] Figure 12 is a cross-sectional view of an intermediate stage in the manufacturing of a three-dimensional display, in accordance with various embodiments;

[0019] Figure 13 is a flow chart of a method of manufacturing a three-dimensional display, in accordance with some embodiments;

[0020] Figure 14 illustrates a diagram of an embodiment processing system;

[0021] Figure 15 is a view of an autostereoscopic device, in accordance with some embodiments;

[0022] Figures 16A-16B are schematic diagrams of an autostereoscopic display during display of a 3D image, in accordance with some embodiments;

[0023] Figures 17A-17B are views of a three-dimensional display, in accordance with some other embodiments;

[0024] Figure 18 is a flow chart of a method of manufacturing a three-dimensional display, in accordance with some embodiments;

[0025] Figures 19A-19B are schematic diagrams of an autostereoscopic display during display of a 3D image, in accordance with some other embodiments;

[0026] Figures 20A-20B are views of a three-dimensional display, in accordance with some other embodiments;

[0027] Figure 21 is a three-dimensional schematic diagram of an autostereoscopic display during display of 2D and 3D images, in accordance with some other embodiments;

[0028] Figure 22 is a flow chart of a method of switching operation of a three-dimensional display, in accordance with some embodiments; and

[0029] Figure 23 is a flow chart of a method of manufacturing a three-dimensional display, in accordance with some embodiments.

[0030] Corresponding numerals and symbols in the different figures generally refer to corresponding parts unless otherwise indicated. The figures are drawn to clearly illustrate the relevant aspects of the embodiments and are not necessarily drawn to scale.

DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENTS

[0031] The making and using of embodiments of this disclosure are discussed in detail below. It should be appreciated, however, that the concepts disclosed herein can be embodied in a wide variety of specific contexts, and that the specific embodiments discussed herein are merely illustrative and do not serve to limit the scope of the claims. Further, it should be understood that various changes, substitutions, and alterations can be made herein without departing from the spirit and scope of this disclosure as defined by the appended claims.

[0032] According to various embodiments, a device includes a three- dimensional display. In some embodiments, the device is a wearable device. In some embodiments, the device is a non-wearable autostereoscopic device. The three-dimensional display may be used as an optical combiner display to display augmented reality content to the user of the device as well as ordinary imagery that can be viewed. The three-dimensional display is formed using microlenses of a particular size and spacing that allow the display to be optical see-through (OST) while still having a desired display resolution. The microlenses are spaced apart such that there are gaps or spaces among the microlenses, and therefore a user of the device perceives the display as being transparent. Additionally, the three-dimensional display may be a direct-emissive display, which may be more compact than other types of displays.

[0033] Figures 1A-1B are views of eyeglasses 100, in accordance with some embodiments. Specifically, Figure 1A is a front view of the eyeglasses 100, and Figure 1B is a side view of the eyeglasses 100. The eyeglasses 100 are a wearable device having optical see-through three-dimensional (3D) displays. The eyeglasses 100 include a frame 102, three-dimensional displays 104, sensors 106, and a human interface device 108. As subsequently described in greater detail, the three-dimensional displays 104 are optical see-through displays that may act as optical combiner displays. The three-dimensional displays 104 are lenses for the eyeglasses 100. The human interface device 108 (subsequently described) is for interacting with the eyeglasses 100. Optionally, the eyeglasses 100 include other components, such as speakers 110. Although not separately illustrated in Figures 1A-1B, the eyeglasses 100 also include a controller 112 and a transceiver 114 (see Figure 2). In one embodiment, the controller 112 and the transceiver 114 are disposed with the human interface device 108. Alternatively, they may be disposed within any portion of or attached to the frame 102.

[0034] The frame 102 is an eyeglasses frame that holds the lenses (e.g., the three-dimensional displays 104) in the proper position for the user (e.g., wearer) of the eyeglasses 100. The frame 102 includes a pair of rims that surround, or at least partially surround, and hold the three-dimensional displays 104 in place, a bridge which connects the rims to one another, and earpieces connected to the sides of the rims. The frame 102 may be formed of any acceptable material such as plastic, metal, a combination thereof, or the like.

[0035] The three-dimensional displays 104 are coupled to the frame 102. Specifically, the three-dimensional displays 104 are held by the rims of the frame 102, such that the frame 102 extends at least partially around the three-dimensional displays 104. The three-dimensional displays 104 are direct-emissive, optical see-through, three-dimensional displays. Specifically, the three-dimensional displays 104 are optical see-through light field displays. In this context, a display is optical see-through when the user of the eyeglasses 100 perceives the display as being transparent at a desired viewing distance. The three-dimensional displays 104 are perceived as transparent when a real-world environment is visible through the three-dimensional displays 104 to the user of the eyeglasses 100. The three-dimensional displays 104 may be used to display content such as an augmented reality overlay, a user interface (UI), or the like to a user of the eyeglasses 100.

[0036] Because the real-world environment is visible through the three-dimensional displays 104, the user of the eyeglasses 100 may perceive augmented reality content as being overlaid on the real-world environment visible through the three-dimensional displays 104. Additionally, in this context, a display is direct-emissive when the display directly emits a desired light field without using a backlight for light emission. Specifically, the display emits light from each pixel with a desired intensity and color, such that the image is produced directly on the display. A direct-emissive display may simply be referred to as an emissive display.

[0037] The sensors 106 include one or more of visual sensors, audio sensors, position sensors, environment sensors, and the like. Examples of the sensors 106 include cameras such as wide-angle cameras, infrared cameras, or the like; depth sensors; position sensors such as orientation sensors, magnetometers, or the like; motion sensors such as accelerometers, gravity sensors, gyroscopes, or the like; environment sensors such as light sensors, temperature sensors, humidity sensors, air pressure sensors, or the like; audio sensors such as microphones; medical sensors such as blood-oxygen sensors, brain wave sensors, or the like; satellite navigation sensors such as global positioning system (GPS) sensors; or the like. Some or all of the sensors 106 are integrated or disposed within the frame 102. In some embodiments, the sensors 106 are disposed in the rims and bridge of the frame 102. The sensors 106 may be disposed symmetrically or asymmetrically around the frame 102, and may face towards and/or away from the user of the eyeglasses 100. For example, the sensors 106 may include cameras facing towards the user of the eyeglasses 100, which may be utilized to track the position(s) of the head, face, and/or eyes of the user. Similarly, the sensors 106 may include cameras facing away from the user of the eyeglasses 100, which may be utilized to track hand gestures of the user. Some of the sensors 106 may be networked sensors that are external to the frame 102 and are communicated with through the transceiver 114 (subsequently described for Figure 2).

[0038] The human interface device 108 includes features for interacting with the user interface of the eyeglasses 100. The human interface device 108 may include a touch pad, buttons, or the like. In some embodiments, the human interface device 108 is attached to the frame 102, such as to the earpieces of the frame 102. In some embodiments, the human interface device 108 is a networked human interface device that is external to the frame 102 and is communicated with through the transceiver 114 (subsequently described for Figure 2).

[0039] The speakers 110 are used for outputting information to the user of the eyeglasses 100. For example, the speakers 110 may be used to play a notification or alert. In an alternative embodiment, speakers 110 are not used to play a notification or alert if the transceiver 114 (subsequently described for Figure 2) is paired with remote speakers (e.g., Apple® Airpod® type ear speakers using a communication protocol such as but not limited to Bluetooth®). In this alternative embodiment, the controller 112 sends data to the transceiver 114 for wireless transmission instead of sending audio data or information to the speakers 110.

[0040] Figure 2 is a block diagram of the eyeglasses 100, in accordance with some embodiments. As noted above, the eyeglasses 100 also include a controller 112 and a transceiver 114.

[0041] The controller 112 is adapted to control the components of the eyeglasses 100 during operation. Specifically, the controller 112 is adapted to control the components of the eyeglasses 100 by receiving input signals from the input devices of the eyeglasses 100 (e.g., the sensors 106) and transmitting output signals to the output devices of the eyeglasses 100 (e.g., the three-dimensional displays 104 and the speakers 110). The controller 112 may include a control circuit, a processor, an application-specific integrated circuit, a microcontroller, or the like. For example, the controller 112 may include one or more processors and memories, such as non-transitory computer readable storage mediums, that store programming for execution by the processors. One or more modules within the controller 112 may be partially or wholly embodied as software and/or hardware for performing any functionality described herein. As subsequently described in greater detail, the controller 112 is adapted to display augmented reality content on the three-dimensional displays 104.

[0042] The transceiver 114 is used by the controller 112 to communicate with external devices. The transceiver 114 is adapted to transmit and receive signaling over a network. For example, the transceiver 114 may be adapted to communicate with an external device that assists the controller 112 with processing or even to receive information for generating imagery for the three-dimensional displays 104. The transceiver 114 may include a transmitter and receiver for a wireless telecommunications protocol, such as a cellular protocol (e.g., 5G, long-term evolution (LTE), etc.), a wireless local area network (WLAN) protocol (e.g., Wi-Fi, etc.), or any other type of wireless protocol (e.g., Bluetooth, near field communication (NFC), etc.). In such embodiments, the transceiver 114 includes one or more antenna/radiating elements for transmitting and/or receiving communication signals.

[0043] Figure 3 is a flow diagram of a method 300 for displaying augmented reality content on a three-dimensional display, in accordance with some embodiments. The method 300 is described in conjunction with Figures 1A-2. In this context, augmented reality content includes 3D images that are displayed on a three-dimensional display 104 such that the 3D images are overlaid on the real-world environment visible through the three-dimensional displays 104. The augmented reality content is spatially aligned with the real-world environment visible through the three-dimensional display 104.

[0044] In step 302, signals from the input devices of the eyeglasses 100 (e.g., the sensors 106) are analyzed. The signals may be sensor signals received from the sensors 106. The information from analyzing the signals from the input devices may be spatial alignment information that is used to spatially align augmented reality content on the three-dimensional display 104. For example, the position(s) of the head, face, and/or eyes of the user of the eyeglasses 100 may be determined by analyzing the signals from the sensors 106 (e.g., the cameras facing towards the user of the eyeglasses 100).

[0045] In step 304, augmented reality content is rendered according to the analyzed signals. Rendering the augmented reality content includes calculating the locations to display 3D images on the three-dimensional display 104 so that the 3D images are spatially aligned with the real-world environment. For example, the spatial alignment information may be used to render the augmented reality content in desired locations on the three-dimensional display 104.

[0046] In step 306, the rendered augmented reality content is output to the three-dimensional display 104. The 3D images may be output to the three-dimensional display 104 by transmitting signals to the three-dimensional display 104. Accordingly, the three-dimensional display 104 displays the 3D images.

[0047] At least a portion of the method 300 may be performed by the controller 112. In some embodiments, each step of the method 300 is performed by the controller 112. In some embodiments, some steps of the method 300 are performed by the controller 112, and other steps of the method 300 are performed by an external device, such as an external processor (e.g., a server), a cloud computing processor, an edge computing processor, or the like. For example, the rendering of the augmented reality content may take a considerable amount of computing power. In some embodiments, steps 302 and 306 are performed by a front-end processor (e.g., the controller 112) while step 304 is performed by a back-end processor (e.g., an external device). The front-end processor may communicate with the back-end processor via the transceiver 114.

[0048] Additionally, step 302 may be performed in only some circumstances. For example, a calibration process may be performed using the input devices of the eyeglasses 100 (e.g., the sensors 106, specifically, the cameras facing towards the user of the eyeglasses 100). The calibration process may produce the spatial alignment information that is used to spatially align the augmented reality content on the three-dimensional display 104. The spatial alignment information may be stored in a memory of the controller 112 and then reused for multiple rendering steps, without repeatedly recomputing the spatial alignment information.
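The analyze-render-output flow of the method 300 can be sketched in outline. This is an illustrative sketch only; every function and field name below is a hypothetical stand-in and does not come from the disclosure:

```python
# Sketch of the method-300 pipeline (steps 302, 304, 306).
# All names here are hypothetical stand-ins for illustration.

def analyze_signals(sensor_signals):
    """Step 302: derive spatial-alignment information (e.g., an eye
    position) from input-device signals."""
    xs = [s["eye_x"] for s in sensor_signals]
    return {"eye_x": sum(xs) / len(xs)}  # stand-in: average reported position

def render_content(alignment):
    """Step 304: place 3D images so they spatially align with the
    real-world view; in practice this heavy step may be offloaded to a
    back-end processor via the transceiver."""
    return {"overlay_x": alignment["eye_x"]}

def output_to_display(content, framebuffer):
    """Step 306: transmit the rendered content to the display."""
    framebuffer.append(content)

def display_frame(sensor_signals, framebuffer):
    alignment = analyze_signals(sensor_signals)
    content = render_content(alignment)
    output_to_display(content, framebuffer)
    return content
```

As noted above, the alignment produced in step 302 may also be computed once during calibration and cached, so that `display_frame` skips `analyze_signals` on subsequent frames.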

[0049] Figure 4 is a schematic diagram of a three-dimensional display 104 during operation, in accordance with some embodiments. The three-dimensional display 104 is a direct-emissive, optical see-through, three-dimensional display. The three-dimensional display 104 is used to display 3D images to a target 400. In this example, the three-dimensional display 104 is an eyeglasses lens, and the target 400 is an eye of the user.

[0050] The three-dimensional display 104 is adapted to display 3D images to the target 400. Specifically, the three-dimensional display 104 is a light field display. The three-dimensional display 104 includes a plurality of 3D light-emitting units 402. As subsequently described in greater detail, a 3D light-emitting unit 402 includes a microlens 604 (subsequently described for Figures 6A-6B) and a light-emitting diode array 614 (subsequently described for Figures 6A-6B), which includes a cluster of light-emitting diodes. Each 3D light-emitting unit 402 emits one or more light fields. In this context, a light field is described by a vector function that describes the amount of light flowing in a plurality of directions through a plurality of points in space. Specifically, a light field includes a plurality of light rays 404 defined by a plenoptic function. Each light ray 404 has a radiance, which is a measurement of the amount of light traveling along the light ray 404. Put another way, each 3D light-emitting unit 402 emits a plurality of light rays 404. In the illustrated example, only some of the light rays 404 radiating towards the target 400 are illustrated. The light rays 404 each travel in a direction that forms an acute angle with a direction that is perpendicular to a major surface of the three-dimensional display 104. As such, an appearance of three-dimensional depth may be created by the light rays 404 that are emitted by the 3D light-emitting units 402 (e.g., by the combination of the light-emitting diode arrays 614 and the microlenses 604).

[0051] The 3D light-emitting units 402 are self-emitting, and do not use a backlight for light emission. Accordingly, the three-dimensional display 104 is a direct-emissive display. Utilizing a direct-emissive display instead of a backlit display allows for a reduction in the size of the three-dimensional display 104, and also allows for enhanced brightness and contrast of the three-dimensional display 104. The 3D light-emitting units 402 also do not use a projector or waveguide for light emission. A direct-emissive display may have lower power consumption than a backlit or waveguided display.
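For reference, the plenoptic function mentioned above is conventionally written as a five-dimensional radiance function. This formulation is standard in the light-field literature and is not taken from the disclosure itself:

```latex
L = L(x, y, z, \theta, \phi)
```

Here $(x, y, z)$ is a point in space, $(\theta, \phi)$ is the direction of a light ray 404 through that point, and $L$ is the radiance carried along that ray.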

[0052] The three-dimensional display 104 is also an optical see-through display. Accordingly, an object 406 may be visible to the target 400 through the three-dimensional display 104. The 3D light-emitting units 402 are spaced apart from one another, such that each 3D light-emitting unit 402 is separated from others of the 3D light-emitting units 402. As subsequently described in greater detail, the ratio of the distance between the 3D light-emitting units 402 to the size of the 3D light-emitting units 402 is a particular ratio that permits light rays 408 from the object 406 to pass through the three-dimensional display 104. As such, the three-dimensional display 104 can display 3D images to the target 400 while also allowing the light rays 408 to pass through the three-dimensional display 104 and be visible to the target 400. Accordingly, augmented reality content displayed with the three-dimensional display 104 may be spatially aligned with the real-world environment (e.g., the object 406) visible through the three-dimensional display 104.
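The effect of that spacing-to-size ratio can be illustrated with a simple fill-factor estimate. This is a minimal sketch assuming square microlenses on a uniform square grid; the function name and the specific numbers are illustrative and not taken from the disclosure:

```python
def transparent_fraction(lens_size_um: float, separation_um: float) -> float:
    """Estimate the fraction of panel area left clear for light rays from
    real-world objects, assuming square microlenses on a uniform grid."""
    pitch = lens_size_um + separation_um   # center-to-center lens spacing
    covered = (lens_size_um / pitch) ** 2  # area fraction covered by lenses
    return 1.0 - covered

# With a separation three times the lens size (separation distance greater
# than microlens size, cf. Claim 16), most of the panel stays clear.
print(transparent_fraction(50.0, 150.0))  # 0.9375
```

A larger separation-to-size ratio leaves more of the panel free of microlenses, which is what lets the user perceive the display as transparent.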

[0053] Figure 5 is a schematic diagram of three-dimensional displays during display of a 3D image, in accordance with some embodiments. When displaying a 3D image with a wearable device (e.g., the eyeglasses 100), different three-dimensional displays 104 are used to display different light fields to different targets 400, e.g., to different eyes of the user. In this embodiment, the three-dimensional displays 104 are near-eye displays. The light fields are generated by the three-dimensional displays 104 so that the user perceives a virtual object at a displayed location 502 in three-dimensional space.

[0054] The sharpness of displaying a 3D image is determined by the pixel density of the three-dimensional displays 104. If the pixel density of the three-dimensional displays 104 is insufficient to display 3D images, then the displayed location 502 of a virtual object may be different from an intended location 504 of the virtual object. When the error between a displayed location 502 and an intended location 504 of a virtual object is large, augmented reality content may not be correctly overlaid on the real-world environment visible through the three-dimensional displays 104. Specifically, the displayed augmented reality content may be high-order grating diffraction patterns that appear to the user as ghost images.

[0055] The pixel density of a three-dimensional display 104, measured in pixels per degree (PPD), is determined by the density of the 3D light-emitting units 402 and by the viewing distance between the three-dimensional display 104 and a target 400. In some embodiments where the three-dimensional displays 104 are near-eye displays, the pixel density of the three-dimensional display 104 is in the range of 1 PPD to 10 PPD. A pixel density of less than 1 PPD may be insufficient to display 3D images, such that there is a large error between a displayed location 502 and an intended location 504 of a virtual object. A pixel density of greater than 10 PPD may be difficult to manufacture. In an embodiment where the viewing distance is about 10 mm and where the size of the 3D light-emitting units 402 is about 50 µm, the pixel density of the three-dimensional display 104 is about 3 PPD.
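As a rough check of the 3 PPD figure quoted above, pixels per degree can be estimated from the viewing distance and the unit pitch (a back-of-the-envelope sketch; the patent does not state this formula explicitly):

```python
import math

def pixels_per_degree(viewing_distance_um, unit_pitch_um):
    """Approximate PPD: arc length subtended by one degree at the viewing
    distance, divided by the pitch of the 3D light-emitting units."""
    arc_per_degree = viewing_distance_um * math.tan(math.radians(1.0))
    return arc_per_degree / unit_pitch_um

# Near-eye example from the text: ~10 mm viewing distance, ~50 um units.
ppd = pixels_per_degree(viewing_distance_um=10_000, unit_pitch_um=50)
assert 3 <= ppd <= 4  # about 3 PPD, consistent with the stated value
```

With these numbers the estimate comes to roughly 3.5 PPD, in line with the "about 3 PPD" stated in the text.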

[0056] Figures 6A-6B are views of a three-dimensional display 104, in accordance with some embodiments. Specifically, Figure 6A is a top-down view of a portion of the three-dimensional display 104, and Figure 6B is a cross-sectional view of a portion of the three-dimensional display 104. The three-dimensional display 104 includes a display panel 602 and a plurality of microlenses 604.

[0057] The display panel 602 is shown schematically in Figures 6A-6B, where some features (subsequently described for Figures 8-12) are omitted for illustration clarity. The display panel 602 includes a transparent layer 612, a plurality of light-emitting diode arrays 614 in/on the transparent layer 612, and a plurality of transistor arrays 616 under the light-emitting diode arrays 614.

[0058] The transparent layer 612 at least laterally surrounds the light-emitting diode arrays 614. The transparent layer 612 may be a glass layer, an ultraviolet absorbent layer, a liquid crystal display layer, or the like. The material of the transparent layer 612 may be selected based on the desired application of the three-dimensional display 104. A glass layer provides good transparency when the three-dimensional display 104 is a lens for clear eyeglasses. An ultraviolet absorbent layer is a photochromic layer that changes transmission based on the brightness of ambient light, which provides good sun protection when the three-dimensional display 104 is a lens for sunglasses. A liquid crystal display layer may be programmable such that its transmission may be changed as desired. For example, a liquid crystal display layer may be programmed to have high transparency in a low-light environment and may be programmed to have low transparency in a high-light environment. The controller 112 (see Figure 2) may be adapted to program the liquid crystal display layer.
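The programmable-transmission behavior described for a liquid crystal layer could be sketched as a simple mapping from ambient brightness to a target transmission level (the function name, lux thresholds, and transmission values are illustrative assumptions, not from the patent):

```python
def transmission_for_ambient(lux):
    """Map ambient brightness (lux) to a target LC-layer transmission (0.0-1.0).
    Thresholds and levels here are hypothetical, chosen only to illustrate the
    high-transparency-in-low-light behavior described in the text."""
    if lux < 100:       # low-light environment: keep the lens highly transparent
        return 0.9
    if lux < 10_000:    # indoor / overcast: moderate transmission
        return 0.6
    return 0.3          # bright sunlight: low transmission for sun protection

assert transmission_for_ambient(50) > transmission_for_ambient(50_000)
```

A controller such as the controller 112 could evaluate a mapping like this against an ambient-light sensor reading and drive the liquid crystal layer accordingly.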

[0059] The light-emitting diode arrays 614 are light-emitting pixel arrays, and each include any quantity of light-emitting diodes. In an embodiment, each light-emitting diode array 614 includes a grid having from 10 to 15 columns of light-emitting diodes and from 10 to 15 rows of light-emitting diodes. The light-emitting diodes are diodes that, during operation, self-emit light in a desired color without a backlight. Examples of suitable light-emitting diodes include organic light-emitting diodes (OLEDs), micro light-emitting diodes (µLEDs), and the like. The light-emitting diode arrays 614 are spaced apart from one another. In some embodiments, the spacing between the light-emitting diodes of adjacent light-emitting diode arrays 614 is greater than the spacing between the light-emitting diodes within a light-emitting diode array 614.

[0060] The microlenses 604 are placed over the display panel 602. Specifically, the microlenses 604 are over the light-emitting diode arrays 614 with a one-to-one correspondence. Each microlens 604 thus overlaps an underlying light-emitting diode array 614. A microlens 604 may be narrower than, wider than, or the same width as the underlying light-emitting diode array 614. The microlenses 604 may be half-sphere lenses, spherical lenses, aspherical lenses, flat lenses, holographic lenses, metalenses (e.g., nano-sized metasurface lenses), Fresnel lenses, or the like. Additionally, the microlenses 604 may be single-layered lenses, multi-element lenses, or the like. The microlenses 604 are spaced apart from one another such that the microlenses 604 do not contact one another in at least one direction. The spacings between the microlenses 604 expose portions of the transparent layer 612 in the top-down view.

[0061] The transistor arrays 616 are under the light-emitting diode arrays 614. Specifically, the transistor arrays 616 are beneath the light-emitting diode arrays 614 with a one-to-one correspondence. The transistor arrays 616 are spaced apart from one another. The transistor arrays 616 include transistors that control the diodes of the respective overlying light-emitting diode arrays 614. The transistor arrays 616 are formed on opaque layers (e.g., semiconductor layers, which are opaque to visible light) that block light from passing through the portions of the three-dimensional display 104 where the microlenses 604 are located. The microlenses 604 distort (e.g., refract) light, and preventing light from passing through the three-dimensional display 104 where the microlenses 604 are located helps reduce distortion of the light passing through the three-dimensional display 104.

[0062] Each 3D light-emitting unit 402 includes a microlens 604, a light-emitting diode array 614, and a transistor array 616. The microlens 604 refracts light rays emitted by the light-emitting diode array 614. Light rays emitted from a 3D light-emitting unit 402 may be directed in a desired direction by illuminating certain diodes of the light-emitting diode array 614 such that the light emitted by the diodes is refracted in the desired direction by the microlens 604. The relative position between a light-emitting diode and the center of the microlens 604 defines the exit angle of a light ray exiting from the microlens 604.
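The relationship between diode position and exit angle can be approximated with a standard thin-lens sketch (the focal length, the thin-lens treatment, and the numeric values are assumptions for illustration; the patent does not specify them): a diode offset from the lens axis produces a ray tilted by roughly atan(offset / focal length) from the display normal.

```python
import math

def exit_angle_deg(diode_offset_um, focal_length_um):
    """Thin-lens approximation: a diode offset from the microlens axis emits
    light that leaves the lens tilted by atan(offset / f) from the normal."""
    return math.degrees(math.atan(diode_offset_um / focal_length_um))

# Illustrative numbers only: a 10 um offset under a lens with a 40 um focal length.
angle = exit_angle_deg(10, 40)
assert 0 < angle < 90  # an acute angle with the display normal, as described
```

Under this sketch, selecting which diode in the array to illuminate selects the offset, and therefore the direction of the emitted ray.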

[0063] The display panel 602 has a plurality of emitting regions 602E and a plurality of transparent regions 602T. The emitting regions 602E are opaque regions where ambient light may not pass through the display panel 602. Instead, the emitting regions 602E emit light. The light-emitting diode arrays 614 and the transistor arrays 616 are formed in the emitting regions 602E. The microlenses 604 completely overlap the emitting regions 602E. The transparent regions 602T are regions where light may pass through the display panel 602. A transparent region 602T is between adjacent emitting regions 602E. The transparent layer 612 is formed in the transparent regions 602T. The microlenses 604 do not overlap the transparent regions 602T.

[0064] Although the emitting regions 602E are opaque (e.g., not transparent), the transparent regions 602T are large enough relative to the emitting regions 602E that the display panel 602 is optical see-through. The separation distance D between the microlenses 604 (in at least one direction) is large relative to the size S of the microlenses 604. Specifically, the separation distance D between the microlenses 604 is larger than the size S of the microlenses 604. In some embodiments, the size S of the microlenses 604 is in the range of 20 µm to 50 µm, the separation distance D between the microlenses 604 is in the range of 30 µm to 100 µm, and the pitch P of the microlenses 604 is in the range of 100 µm to 150 µm. In some embodiments, the size S of the microlenses 604 is in the range of 5 µm to 2000 µm (such as 200 µm to 2000 µm), the separation distance D between the microlenses 604 is in the range of 200 µm to 2000 µm, and the pitch P of the microlenses 604 is in the range of 400 µm to 4000 µm. The size S, the separation distance D, and the pitch P may also have other values, particularly as technology scales down. The ratio of the microlens separation distance D to the microlens size S determines whether the three-dimensional display 104 is optical see-through. In some embodiments, the ratio of the microlens separation distance D to the microlens size S is in the range of 1 to 6, and more specifically, in the range of 2 to 5. If the ratio of the microlens separation distance D to the microlens size S is less than 2, scattering or diffraction may occur and the three-dimensional display 104 may display ghost images. If the ratio of the microlens separation distance D to the microlens size S is greater than 5, the pixel density of the three-dimensional display 104 may be outside of the desired range (previously described) and insufficient to display 3D images.
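The D/S constraint described above lends itself to a simple consistency check (a sketch only; the 2-to-5 window is the narrower range the text calls out):

```python
def is_see_through(separation_um, size_um):
    """Check whether the separation-to-size ratio D/S falls in the 2-5 window
    described in the text: below ~2 risks scattering/diffraction and ghost
    images; above ~5 risks insufficient pixel density for 3D images."""
    ratio = separation_um / size_um
    return 2 <= ratio <= 5

# Example values from the text: 50 um lenses separated by 100 um -> ratio 2.
assert is_see_through(separation_um=100, size_um=50)
assert not is_see_through(separation_um=40, size_um=50)  # ratio 0.8: too dense
```

A check like this could be applied per region of the panel, since (as described later) the ratio may vary between central and edge portions.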

[0065] The microlenses 604 occupy a minority (e.g., less than half) of the area of the display panel 602 in the top-down view. Therefore, the total area of the transparent regions 602T is greater than the total area of the emitting regions 602E. In some embodiments, the ratio of the total area of the transparent regions 602T to the total area of the emitting regions 602E is in the range of 1 to 6. In some embodiments, the ratio is about 2, such that the emitting regions 602E occupy about one-third of the area of the display panel 602, and the transparent regions 602T occupy about two-thirds of the area of the display panel 602.

[0066] Figures 7A-7G are top-down views of three-dimensional displays 104, in accordance with various embodiments. The microlenses 604 may have various shapes in the top-down views. Additionally, the microlenses 604 may have various layouts in the top-down views. It should be appreciated that the light-emitting diode arrays underlying the microlenses 604 have the same layout as the microlenses 604.

[0067] In some embodiments, as shown in Figure 7A, the microlenses 604 are round microlenses and are laid out in a grid microlens array. In a grid microlens array, the microlenses 604 are aligned along rows and columns, in the top-down view. The spacing between the rows of the microlenses 604 may be different from the spacing between the columns of the microlenses 604. In some embodiments, the columns of the microlenses 604 are closer together than the rows of the microlenses 604, such that the microlenses 604 are in dense horizontal lines. The spacing between the rows/columns of the microlenses 604 determines the field of view of the three-dimensional display 104. For example, in the illustrated example where the columns of the microlenses 604 are closer together than the rows of the microlenses 604, the horizontal field of view of the three-dimensional display 104 may be larger than the vertical field of view of the three-dimensional display 104.

[0068] In some embodiments, as shown in Figure 7B, the microlenses 604 are round microlenses and are laid out in a checkerboard microlens array. In a checkerboard microlens array, the microlenses 604 are aligned along diagonal directions, in the top-down view. When the microlenses 604 are in a checkerboard layout, the horizontal field of view of the three-dimensional display 104 may be equal to the vertical field of view of the three-dimensional display 104.

[0069] In some embodiments, as shown in Figure 7C, the microlenses 604 are rectangular microlenses (e.g., square microlenses) and are laid out in microlens strips, where each of the microlens strips is spaced apart. In a microlens strip, the microlenses 604 are aligned along a same direction and are densely packed along the strip. The microlenses 604 within a strip may be closer to one another than previously described, and in some embodiments, the microlenses 604 within a strip are not spaced apart and instead contact one another.

[0070] Other acceptable microlens shapes may be utilized. For example, the microlenses 604 may have other polygon shapes. Similarly, the microlenses 604 may have non-polygon shapes. For example, the microlenses 604 may be truncated circular microlenses.

[0071] In some embodiments, as shown in Figure 7D, the microlenses 604 are rectangular microlenses and are laid out in microlens strips, where some of the microlens strips are not spaced apart and instead contact one another. For example, the microlens strips may be grouped such that a plurality of microlens strips extend along and are in contact with one another, and the groups of the microlens strips are spaced apart. When the microlens strips are grouped, the spacing between groups of the microlens strips may be large. For example, the microlens strips may each have a width of about 33 µm, three microlens strips may be grouped along each row for a total row width of about 100 µm, and the separation distance between the rows may be about 250 µm.

[0072] In some embodiments, as shown in Figure 7E, the microlenses 604 are hexagonal microlenses and are laid out in microlens strips, where some of the microlens strips are not spaced apart and instead contact one another. The hexagonal microlenses may be grouped in microlens strips with a honeycomb layout. The honeycomb structures are spaced apart. In this embodiment, the honeycomb structures are non-truncated. As such, the edges of the hexagonal microlenses define jagged edges of the microlens strips in the top-down view.

[0073] In some embodiments, as shown in Figure 7F, the microlenses 604 are hexagonal microlenses and are laid out in microlens strips with a honeycomb layout, where the honeycomb structures are truncated. As such, the edges of the hexagonal microlenses define straight edges of the microlens strips in the top-down view.

[0074] The microlenses 604 may be disposed closer to one another along a first direction (e.g., the horizontal direction in Figures 7A and 7C-7F) than along a second direction (e.g., the vertical direction in Figures 7A and 7C-7F). The horizontal direction is parallel to a line connecting the eyes of the user (e.g., a line between the three-dimensional displays 104 of the eyeglasses 100, see Figure 1). In some embodiments, the microlens separation distance along the first direction is smaller than the desired separation distance (previously described for Figures 6A-6B). The microlenses 604 may not be spaced apart in the first direction. However, the microlens separation distance along at least one direction, such as the second direction, is within the desired separation distance such that the three-dimensional display 104 is optical see-through. The human eye has a greater horizontal field of view than vertical field of view. As such, in some embodiments, the microlenses 604 are spaced apart by the desired separation distance along the vertical direction, such that the gaps between the microlenses 604 are horizontally oriented, e.g., oriented along a direction with a greater field of view.

[0075] The ratio of the microlens separation distance to the microlens size may vary across a three-dimensional display 104. For example, the ratio may have a first value in a first region of a three-dimensional display 104 and may have a second value in a second region of the three-dimensional display 104, where the first value is different from the second value, and the first and second values are both in the previously described range. In some embodiments, the ratio in a central portion of a three-dimensional display 104 is less than the ratio in edge portions of the three-dimensional display 104, where the central portion is between the edge portions. The edge portions may include the upper and lower portions of the three-dimensional display 104. Put another way, the separation distance between the microlenses 604 may be smaller in the central portion of a three-dimensional display 104 than in the edge portions of the three-dimensional display 104. In the embodiments of Figures 7C-7F, the microlens strips in the central portion of the three-dimensional display 104 may be closer together than the microlens strips in the edge portions of the three-dimensional display 104.

[0076] Other acceptable microlens layouts may be utilized. In some embodiments, as shown in Figure 7G, the microlenses 604 are laid out in a randomized microlens array. Specifically, the spacings between adjacent microlenses 604 may be randomized. In a randomized microlens array, the ratio of the microlens separation distance to the microlens size of the adjacent microlenses 604 may still be in the previously described range. In other words, the spacings between adjacent microlenses 604 vary, but the spacings are in the previously described range. Randomizing the spacing between adjacent microlenses 604 may help decrease the interference of light passing through the spaces between the microlenses 604. Although Figure 7G shows a randomized microlens array with round microlenses, the microlenses 604 of the randomized microlens array may be rectangular microlenses, hexagonal microlenses, or the like.

[0077] Figures 8-11 are cross-sectional views of intermediate stages in the manufacturing of a three-dimensional display 104, in accordance with various embodiments. The three-dimensional display 104 is manufactured by separately manufacturing the light-emitting diode arrays 614 and then transferring the light-emitting diode arrays 614 to the transparent layer 612.

[0078] In Figure 8, a thin-film transistor layer 802 is formed. A light-emitting diode layer 804 is formed on the thin-film transistor layer 802. The thin-film transistor layer 802 includes transistors, which may be arranged in transistor arrays 616, and is formed on an opaque layer, such as a semiconductor substrate. The light-emitting diode layer 804 includes light-emitting diodes, which may be arranged in light-emitting diode arrays 614. The transistors of the thin-film transistor layer 802 and the diodes of the light-emitting diode layer 804 may be formed by an acceptable complementary metal-oxide semiconductor (CMOS) process. For example, deposition, lithography, and etching processes may be performed to form various features (e.g., buffer layers, channel layers, pixel-defining layers, cathodes, etc.) for the transistors of the thin-film transistor layer 802 and the diodes of the light-emitting diode layer 804.

[0079] The microlenses 604 are then placed on the light-emitting diode layer 804. As an example of placing the microlenses 604, the microlenses 604 may be placed on a microlens sheet 806. The microlens sheet 806 may be formed of glass or the like. The microlens sheet 806 may then be placed on the light-emitting diode layer 804 and may be aligned with the light-emitting diode layer 804 such that the microlenses 604 are aligned with the respective underlying light-emitting diode arrays 614 of the light-emitting diode layer 804.

[0080] In Figure 9, the microlens sheet 806, the light-emitting diode layer 804, and the thin-film transistor layer 802 are diced to form display components 808. The display components 808 include the transistor arrays 616 (portions of the thin-film transistor layer 802), the light-emitting diode arrays 614 (portions of the light-emitting diode layer 804), and the microlenses 604. In some embodiments where a microlens array (e.g., a grid microlens array or checkerboard microlens array, see Figures 7A-7B) is utilized, each display component 808 is an individual unit comprising a single transistor array 616, a single light-emitting diode array 614, and a single microlens 604. In some embodiments where microlens strips (see Figures 7C-7F) are utilized, each display component 808 is a strip comprising a plurality of transistor arrays 616, a plurality of light-emitting diode arrays 614, and a plurality of microlenses 604. A dicing process is performed along scribe line regions, e.g., between the display components 808. The dicing process may include performing a sawing process, a laser cutting process, or the like. The dicing process separates the adjacent display components 808. The microlenses 604 may be trimmed by the dicing of the display components 808. After the dicing process, the respective light-emitting diode arrays 614, transistor arrays 616, and microlenses 604 are laterally coterminous.

[0081] In Figure 10, a transparent layer 612 is formed. Recesses 810 are then patterned in the transparent layer 612. The recesses 810 may be patterned utilizing acceptable photolithography and etching techniques. In some embodiments where a microlens array (e.g., a grid microlens array or checkerboard microlens array, see Figures 7A-7B) is utilized, the recesses 810 are pits. In some embodiments where microlens strips (see Figures 7C-7F) are utilized, the recesses 810 are grooves. The recesses 810 are patterned where the emitting regions 602E of the display panel 602 will be located.

[0082] In Figure 11, the display components 808 are transferred to the transparent layer 612, thereby forming the three-dimensional display 104. After transfer, the display components 808 are coupled to the transparent layer 612. The display components 808 may be transferred to the transparent layer 612 by placing the display components 808 in the recesses 810 (see Figure 10). The display components 808 may be placed in the recesses 810 by an acceptable pick-and-place process. Thus, portions of the display components 808 are disposed in the transparent layer 612. In some embodiments, the transistor arrays 616 and the light-emitting diode arrays 614 are disposed in the transparent layer 612. Other portions of the display components 808 protrude from the transparent layer 612. In some embodiments, the microlenses 604 protrude from the transparent layer 612. Electrical connections to the devices of the display components 808 (e.g., the transistors of the transistor array 616) may be formed on the transparent layer 612.

[0083] In this embodiment, the microlenses 604 are placed on the light-emitting diode arrays 614 before the display components 808 are placed in the recesses 810, e.g., before the display components 808 are diced. In another embodiment, the microlenses 604 are placed on the light-emitting diode arrays 614 after the display components 808 are placed in the recesses 810, e.g., after the display components 808 are diced. In such embodiments, a microlens sheet 806 (see Figure 8), including the microlenses 604, may be placed on the transparent layer 612 and the display components 808.

[0084] In some embodiments, one or more surfaces of the microlenses 604 are coated. The sidewalls 812 of the microlenses 604 may be coated with a reflective coating, such as a metal coating, which may help block light from obliquely passing through a transparent region 602T and a microlens 604. Such light would otherwise be deflected by the microlens 604. The rounded top surfaces 814 of the microlenses 604 may be coated with an anti-reflective coating, such as a dielectric multi-layer coating, which may help increase light transmission through the microlenses 604.

[0085] Figure 12 is a cross-sectional view of an intermediate stage in the manufacturing of a three-dimensional display 104, in accordance with various embodiments. This step is similar to the step of Figure 11, except the display components 808 are placed on a top surface of the transparent layer 612 instead of being placed in recesses in the transparent layer 612. The recessing of the transparent layer 612 may be omitted when the transparent layer 612 is a layer that cannot be patterned without causing damage, such as a liquid crystal display layer.

[0086] Figure 13 is a flow chart of a method of manufacturing a three-dimensional display, in accordance with some embodiments. The display may be a direct-emissive display for a wearable device. The method may be implemented using appropriate steps of the processes described for Figures 8-12.

[0087] In step 1302, a thin-film transistor layer and a light-emitting diode layer are diced to form a first display component and a second display component. The first display component includes a first transistor array and a first light-emitting diode array. The second display component includes a second transistor array and a second light-emitting diode array.

[0088] In step 1304, the first display component and the second display component are transferred to a transparent layer. In some embodiments, transferring the first display component and the second display component to the transparent layer includes patterning a first recess and a second recess in the transparent layer and placing the first display component and the second display component in, respectively, the first recess and the second recess. In some embodiments, transferring the first display component and the second display component to the transparent layer includes placing the first display component and the second display component on a top surface of the transparent layer.

[0089] In step 1306, a first microlens and a second microlens are placed on, respectively, the first light-emitting diode array and the second light-emitting diode array. The first microlens and the second microlens may be placed before transferring the first display component and the second display component to the transparent layer. In such embodiments, placing the first microlens and the second microlens includes placing a microlens sheet on the light-emitting diode layer, the microlens sheet diced with the light-emitting diode layer. The first microlens and the second microlens may be placed after transferring the first display component and the second display component to the transparent layer. In such embodiments, placing the first microlens and the second microlens includes placing a microlens sheet on the transparent layer, the first microlens and the second microlens being aligned to, respectively, the first light-emitting diode array and the second light-emitting diode array.

[0090] Figure 14 illustrates a block diagram of an embodiment processing system 1300 for performing methods described herein, which may be installed in a host device. For example, the processing system 1300 may be used to implement the controller 112 (see Figure 2). As shown, the processing system 1300 includes a processor 1404, a memory 1406, and interfaces 1410-1414, which may (or may not) be arranged as shown in Figure 14. The processor 1404 may be any component or collection of components adapted to perform computations and/or other processing-related tasks, and the memory 1406 may be any component or collection of components adapted to store programming and/or instructions for execution by the processor 1404. In an embodiment, the memory 1406 includes a non-transitory computer readable medium. The interfaces 1410, 1412, 1414 may be any component or collection of components that allow the processing system 1300 to communicate with other devices/components and/or a user. For example, one or more of the interfaces 1410, 1412, 1414 may be adapted to communicate data, control, or management messages from the processor 1404 to applications installed on the host device and/or a remote device. As another example, one or more of the interfaces 1410, 1412, 1414 may be adapted to allow a user or user device (e.g., personal computer (PC), etc.) to interact/communicate with the processing system 1300. The processing system 1300 may include additional components not depicted in Figure 14, such as long-term storage (e.g., non-volatile memory, etc.).

[0091] Embodiments may achieve advantages. Placing the microlenses 604 on the display panel 602 allows the resulting 3D light-emitting units 402 to emit a light field that is suitable for displaying 3D images. Utilizing the particular ratio (previously described) of the microlens separation distance D to the microlens size S allows the three-dimensional displays 104 to be optical see-through. Because the three-dimensional displays 104 of the eyeglasses 100 are direct-emissive displays, they may be more compact than other types of displays, such as backlit displays and waveguide combiner displays. Avoiding the use of backlights in the eyeglasses 100 may reduce the power consumption of the eyeglasses 100. The three-dimensional displays 104, being direct-emissive, have high optical emission efficiency that may reduce the power consumption to as low as 10 mW. Additionally, direct-emissive displays have better optical delivery efficiency than waveguide combiner displays. Avoiding the use of waveguide combiner displays in the eyeglasses 100 may reduce exposure of the user to excessively bright lights.

[0092] Figure 15 is a view of an autostereoscopic device 1500, in accordance with some embodiments. Specifically, Figure 15 is a view of a front of the autostereoscopic device 1500. The autostereoscopic device 1500 may be a handheld device (e.g., a tablet), a stationary device (e.g., a television or computer monitor), a tabletop device, or the like. The autostereoscopic device 1500 includes a frame 102, a three-dimensional display 104, and sensors 106. Although not separately illustrated in Figure 15, the autostereoscopic device 1500 may also include a human interface device 108, speakers 110, a controller 112 and a transceiver 114 (see Figure 2).

[0093] The frame 102 is similar to that described for Figure 1, except the frame 102 is a handheld or stationary frame. The frame 102 extends around the three-dimensional display 104 such that the front and back of the three-dimensional display 104 are exposed and are not covered. The frame 102 may be formed of any acceptable material such as plastic, metal, a combination thereof, or the like.

[0094] The three-dimensional display 104 is similar to those described for Figures 1-12, except the three-dimensional display 104 is an autostereoscopic display. Additionally, in this embodiment, the three-dimensional display 104 may (or may not) be a direct-emissive display.

[0095] The sensors 106 are similar to those described for Figure 1. For example, the sensors 106 may include cameras facing towards the user of the autostereoscopic device 1500, which may be utilized to track position(s) of the head, face, and/or eyes of the user. Similarly, the sensors 106 may include cameras facing away from the user of the autostereoscopic device 1500, which may be utilized to track the position of objects the user is facing.

[0096] Figures 16A-16B are schematic diagrams of an autostereoscopic display during display of a 3D image, in accordance with some embodiments. When displaying a 3D image with a non-wearable autostereoscopic device, a single three-dimensional display 104 is used to display different light fields to different targets 400, e.g., to different eyes of the device user. As such, the three-dimensional display 104 is an autostereoscopic display. In this embodiment, the three-dimensional display 104 is a far-eye display. The light fields are generated by the three-dimensional display 104 so that the device user perceives a virtual object at a displayed location 502 in three-dimensional space.

[0097] As noted above, the sharpness of displaying a 3D image is determined by the pixel density of the three-dimensional display 104. In some embodiments where the three-dimensional display 104 is a far-eye display, the pixel density of the three-dimensional display 104 is in the range of 30 PPD to 60 PPD. A pixel density of less than 30 PPD may be insufficient to display 3D images, such that there is a large error between a displayed location 502 and an intended location 504 of a virtual object. A pixel density of greater than 60 PPD may be difficult to manufacture. In an embodiment where the viewing distance is about 100 mm and where the size of the 3D light-emitting units 402 is about 50 µm, the pixel density of the three-dimensional display 104 is about 30 PPD. In an embodiment where the viewing distance is about 100 mm and where the pitch of the 3D light-emitting units 402 is about 30 µm, the pixel density of the three-dimensional display 104 is about 60 PPD. A pixel density of about 60 PPD is near the limit at which a human eye may discern pixels.
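Extending the same back-of-the-envelope geometry used for the near-eye case, the unit pitch needed to hit a target pixel density at a given viewing distance can be estimated (a sketch; the formula is an assumption consistent with the figures quoted above, not stated in the patent):

```python
import math

def pitch_for_ppd(viewing_distance_um, target_ppd):
    """Unit pitch that yields a target PPD at a given viewing distance:
    pitch = (arc length subtended by one degree) / PPD."""
    return viewing_distance_um * math.tan(math.radians(1.0)) / target_ppd

# Far-eye example from the text: ~100 mm viewing distance, ~60 PPD target.
pitch = pitch_for_ppd(viewing_distance_um=100_000, target_ppd=60)
assert 25 < pitch < 35  # about 30 um, matching the pitch quoted in the text
```

The same calculation at 30 PPD gives a pitch of roughly 58 µm, consistent with the ~50 µm unit size quoted for the 30 PPD embodiment.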

[0098] As noted above, the three-dimensional display 104 is an autostereoscopic display. A single three-dimensional display 104 may thus be utilized to display 3D images, without the use of headgear by a user. An autostereoscopic display may be operated more than 100 mm from the user. The microlenses 604 form a parallax barrier for the autostereoscopic display. Including a parallax barrier in the three-dimensional display 104 allows the three-dimensional display 104 to have motion parallax. Motion parallax allows the three-dimensional display 104 to be autostereoscopic.

[0099] The three-dimensional display 104 may be utilized to display multiple types of 3D images. In some embodiments, the three-dimensional display 104 is used to display floating 3D images, as illustrated in Figure 16A. Floating 3D images are perceived by a user as being located on the same side of the three-dimensional display 104 as the user. In some embodiments, the three-dimensional display 104 is used to display sinking 3D images, as illustrated in Figure 16B. Sinking 3D images are perceived by a user as being located on an opposite side of the three-dimensional display 104 from the user. Because the three-dimensional display 104 is optical see-through, it may be used to display sinking 3D images. As a result, a user may interact with objects that are outside the display panel. The ability of the three-dimensional display 104 to display sinking 3D images may be desirable for some applications of the display, such as medical applications and gaming applications.

[0100] Figures 17A-17B are views of a three-dimensional display 104, in accordance with some other embodiments. This embodiment is similar to the embodiment of Figures 6A-6B, except the display panel 602 is not a direct-emissive panel. In this embodiment, the display panel 602 is a backlit panel, such as a liquid crystal display (LCD) panel. Accordingly, the display panel 602 includes liquid crystal pixel arrays 624 instead of light-emitting diode arrays. The microlenses 604 are placed over the liquid crystal pixel arrays 624. In some embodiments, the display panel 602 further includes reflective layers 626 under the liquid crystal pixel arrays 624. The reflective layers 626 are opaque layers that block light from passing through the portions of the three-dimensional display 104 where the microlenses 604 are located. Utilizing a backlit panel instead of a direct-emissive panel may reduce device costs. Although the microlenses 604 in Figure 17A are round microlenses and are laid out in a grid microlens array, the microlenses 604 may also be rectangular or hexagonal microlenses, and may also be laid out in a checkerboard microlens array, microlens strips, or a randomized microlens array.

[0101] Figure 18 is a flow chart of a method of manufacturing a three-dimensional display, in accordance with some embodiments. The display may be an autostereoscopic display for an autostereoscopic device. The method may be implemented using appropriate steps of the processes described for Figures 8-12.

[0102] In step 1802, a thin-film transistor layer and a light-emitting diode layer are diced to form a first display component and a second display component. The first display component includes a first transistor array and a first light-emitting diode array. The second display component includes a second transistor array and a second light-emitting diode array.

[0103] In step 1804, the first display component and the second display component are transferred to a transparent layer. In some embodiments, transferring the first display component and the second display component to the transparent layer includes patterning a first recess and a second recess in the transparent layer and placing the first display component and the second display component in, respectively, the first recess and the second recess. In some embodiments, transferring the first display component and the second display component to the transparent layer includes placing the first display component and the second display component on a top surface of the transparent layer.

[0104] In step 1806, a parallax barrier is formed on the transparent layer, the first light-emitting diode array, and the second light-emitting diode array. In some embodiments, forming the parallax barrier includes, before transferring the first display component and the second display component to the transparent layer, placing a first microlens and a second microlens on, respectively, the first light-emitting diode array and the second light-emitting diode array. In some embodiments, forming the parallax barrier includes, after transferring the first display component and the second display component to the transparent layer, placing a first microlens and a second microlens on, respectively, the first light-emitting diode array and the second light-emitting diode array.

[0105] Figures 19A-19B are schematic diagrams of an autostereoscopic display during display of a 3D image, in accordance with some other embodiments. This embodiment is similar to the embodiment of Figures 16A-16B, e.g., where the three-dimensional display 104 is a far-eye display, except the three-dimensional display 104 may also be used to simultaneously display two-dimensional (2D) and 3D images. As such, the three-dimensional display 104 is a combination 2D-3D display. The three-dimensional display 104 further includes a plurality of 2D light-emitting units 1902 for displaying the 2D images.

[0106] Similar to the previously described embodiments, the 3D light-emitting units 402 are spaced apart from one another, but in this embodiment, the 2D light-emitting units 1902 are disposed between the 3D light-emitting units 402. The 3D light-emitting units 402 include microlenses, while the 2D light-emitting units 1902 do not include microlenses. This three-dimensional display 104 may be used to simultaneously display 3D images with the 3D light-emitting units 402 and display 2D images with the 2D light-emitting units 1902. The 3D light-emitting units 402 may be used to display far-field 3D images at a low resolution, while the 2D light-emitting units 1902 may be used to display near-field 2D images at a high resolution. Using a single three-dimensional display 104 to display both near-field 2D images and far-field 3D images may allow for more accurate depth perception by a user without sacrificing excessive screen real estate.

[0107] In this context, a near-field 2D image is an image perceived as located on the surface of the display panel 602, and a far-field 3D image is an image perceived as being floating or sinking, relative to the surface of the display panel 602. Similar to the embodiments of Figures 16A-16B, the three-dimensional display 104 may be used to display floating 3D images, as illustrated in Figure 19A, or may be used to display sinking 3D images, as illustrated in Figure 19B. Floating 3D images are perceived by a user as being located in front of the 2D images displayed with the 2D light-emitting units 1902. Sinking 3D images are perceived by a user as being located behind the 2D images displayed with the 2D light-emitting units 1902.

[0108] Figures 20A-20B are views of a three-dimensional display 104, in accordance with some other embodiments. This embodiment is similar to the embodiment of Figures 6A-6B, except the display panel 602 includes far-field regions 602F for displaying far-field 3D images and near-field regions 602N for displaying near-field 2D images.

[0109] The display panel 602 of this embodiment may be formed differently than the display panels 602 of previously described embodiments. Specifically, the structure described for Figure 8 may be formed but not diced. A thin-film transistor layer 802 (including transistors 2002) may be formed, a light-emitting diode layer 804 (including the light-emitting diodes 2004) may be formed on the thin-film transistor layer 802, and microlenses 604 may be placed on the light-emitting diode layer 804 (e.g., by placing a microlens sheet 806 on the light-emitting diode layer 804). The microlenses 604 are aligned with respective groups of the light-emitting diodes 2004. Thus, each microlens 604 is over and aligned with an array of the light-emitting diodes 2004, e.g., over a light-emitting diode array. In this embodiment, the layers are not diced and transferred to a transparent layer. Rather, the thin-film transistor layer 802, the light-emitting diode layer 804, and the microlens sheet 806 are components of the three-dimensional display 104.

[0110] The microlenses 604 are placed on the far-field regions 602F. Accordingly, the 3D light-emitting units 402 include the microlenses 604 and the subset of the light-emitting diodes 2004 in the far-field regions 602F.

Microlenses are omitted from the near-field regions 602N, such that the near-field regions 602N are free of microlenses. Accordingly, the 2D light-emitting units 1902 include the subset of the light-emitting diodes 2004 in the near-field regions 602N. The total area of the near-field regions 602N (e.g., the 2D light-emitting units 1902) is greater than the total area of the far-field regions 602F (e.g., the 3D light-emitting units 402). The 2D light-emitting units 1902 (which lack microlenses) are flat regions of the three-dimensional display 104, and the 3D light-emitting units 402 (which include microlenses) are raised regions of the three-dimensional display 104. The light-emitting diodes 2004 are uniformly distributed across the display panel 602, e.g., across the light-emitting diode layer 804. As such, the density of the light-emitting diodes 2004 in the far-field regions 602F (e.g., beneath the microlenses 604) is equal to the density of the light-emitting diodes 2004 in the near-field regions 602N (e.g., between the microlenses 604).

[0111] In this embodiment, the microlenses 604 are laid out in a grid microlens array. The microlenses 604 are disposed the same distance from one another along a first direction (e.g., the horizontal direction in Figure 20A) and along a second direction (e.g., the vertical direction in Figure 20A). Although the microlenses 604 in Figure 20A are round microlenses and are laid out in a grid microlens array, the microlenses 604 may also be rectangular or hexagonal microlenses, and may also be laid out in a checkerboard microlens array, microlens strips, or a randomized microlens array. In this embodiment, the ratio of the microlens separation distance to the microlens size may not vary across the three-dimensional display 104. Accordingly, the ratio may have a substantially identical value in all regions of the three-dimensional display 104. The microlenses 604 may be disposed a same distance from one another in the central portion and edge portions of the three-dimensional display 104.

[0112] The three-dimensional display 104 of this embodiment may (or may not) be optical see-through. In some embodiments, the transistors 2002 are formed on opaque layers, in which case the three-dimensional display 104 of this embodiment is not optical see-through. In some embodiments, the transistors 2002 are formed on transparent layers, in which case the three-dimensional display 104 of this embodiment is optical see-through. When the transistors 2002 are formed on transparent layers, other techniques may be utilized to block light from passing through the portions of the three-dimensional display 104 where the microlenses 604 are located, e.g., the far-field regions 602F of the display panel 602. In some embodiments, the transparency of the far-field regions 602F is programmatically changed (e.g., decreased and/or increased) during operation. For example, the light-emitting diodes 2004 in the far-field regions 602F may be controlled during operation (by the controller 112, see Figure 2) to emit a certain type of light (e.g., bright white light) to overwhelm the eyes of a viewer and cause the viewer to perceive the far-field regions 602F as being opaque. A similar technique may be utilized to temporarily block visibility through the near-field regions 602N, if desired. Other acceptable techniques may be utilized to change (e.g., decrease and/or increase) the transparency of the display panel 602.

[0113] Some variations of the three-dimensional display 104 of this embodiment are contemplated. In some embodiments, the total area of the near-field regions 602N (e.g., the 2D light-emitting units 1902) is less than the total area of the far-field regions 602F (e.g., the 3D light-emitting units 402). As such, the separation distance D between the microlenses 604 may be less than or greater than the size S of the microlenses 604. In either case, the separation distance D between the microlenses 604 is different from the size S of the microlenses 604.
In some embodiments, the ratio of the microlens separation distance D to the microlens size S is in the range of 0.5 to 6.
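The geometric constraints of paragraph [0113] can be expressed as a simple consistency check. The sketch below is illustrative and not part of the disclosure: the function name is hypothetical, it assumes a square-grid layout with a square S × S lens footprint per cell of pitch S + D, and it uses the lensed-area fraction only as a rough proxy for the far-field/near-field area comparison.

```python
def layout_check(size_s: float, distance_d: float) -> dict:
    """Check a grid microlens layout against the constraints in the text:
    the separation distance D differs from the lens size S, and the ratio
    D/S falls within the stated 0.5-6 range."""
    ratio = distance_d / size_s
    # On a square grid the cell pitch is S + D; approximating each lens
    # footprint as S x S gives the far-field (lensed) area fraction:
    far_field_fraction = (size_s / (size_s + distance_d)) ** 2
    return {
        "ratio": ratio,
        "ratio_in_range": 0.5 <= ratio <= 6,
        "distance_differs_from_size": distance_d != size_s,
        "near_field_area_larger": far_field_fraction < 0.5,
    }

# Example: lens size 1.0, separation 2.0 -> ratio 2.0, in range,
# and the near-field (unlensed) area dominates.
print(layout_check(1.0, 2.0))
```

Under this approximation, D/S greater than about 0.41 already makes the near-field area dominate, so layouts near the lower end of the 0.5-6 range correspond to the near-field area only slightly exceeding the far-field area.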

[0114] Figure 21 is a three-dimensional schematic diagram of an autostereoscopic display during display of 2D and 3D images, in accordance with some other embodiments. To display a 3D image with the three-dimensional display 104, light fields are generated so that the device user perceives a virtual object at a displayed location 502 in three-dimensional space. To display a 2D image with the three-dimensional display 104, pixels of the display panel 602 are illuminated so that the device user perceives a two-dimensional image 2102 located on the display panel 602. The 2D and 3D images are displayed simultaneously with the same display panel 602.

[0115] As previously noted, the 3D light-emitting units 402 may be used to display 3D images, and the 2D light-emitting units 1902 may be used to display 2D images. In some embodiments, 3D operation of the three-dimensional display 104 may be disabled, in which case the 3D light-emitting units 402 may also be used to display 2D images. Because the microlenses 604 are transparent, an image displayed with the light-emitting diodes 2004 beneath the microlenses 604 may still be perceived by a user as a 2D image located on the surface of the display panel 602, as a result of the user’s eyes focusing on the surface of the display panel 602.

[0116] Figure 22 is a flow chart of a method of switching operation of a three-dimensional display, in accordance with some embodiments. The method may be used to switch between 2D and 3D operation of a three-dimensional display 104, e.g., to enable or disable 3D operation of the three-dimensional display 104. In step 2202, input is received from a user. The input may be through a user interface of an autostereoscopic device. If, in step 2204, the input indicates the three-dimensional display 104 should operate in 3D, then, in step 2206, the far-field regions 602F (e.g., the 3D light-emitting units 402) are controlled to display 3D images and the near-field regions 602N (e.g., the 2D light-emitting units 1902) are controlled to display 2D images. If, in step 2204, the input indicates the three-dimensional display 104 should operate in 2D, then, in step 2208, the far-field regions 602F (e.g., the 3D light-emitting units 402) and the near-field regions 602N (e.g., the 2D light-emitting units 1902) are both controlled to display 2D images. When displaying a 2D image, the size of the 2D image is larger than the size of the microlenses 604, in which case the 2D image displayed with the light-emitting diodes 2004 beneath the microlenses 604 may still be perceived by a user as being 2D.
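The decision flow of Figure 22 can be sketched as follows. The names (`Mode`, `switch_display_mode`) are illustrative and do not appear in the disclosure; an actual controller 112 would drive the display regions directly rather than return a dictionary.

```python
from enum import Enum

class Mode(Enum):
    MODE_2D = "2d"
    MODE_3D = "3d"

def switch_display_mode(user_input: Mode) -> dict:
    """Sketch of the Figure 22 flow: route the far-field (3D) and
    near-field (2D) regions based on the mode requested in step 2202."""
    if user_input is Mode.MODE_3D:
        # Step 2206: far-field regions display 3D images while
        # near-field regions display 2D images.
        return {"far_field": "3D", "near_field": "2D"}
    # Step 2208: both region types are controlled to display 2D images.
    return {"far_field": "2D", "near_field": "2D"}

print(switch_display_mode(Mode.MODE_3D))  # far-field 3D, near-field 2D
print(switch_display_mode(Mode.MODE_2D))  # both regions 2D
```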

[0117] Figure 23 is a flow chart of a method of manufacturing a three-dimensional display, in accordance with some embodiments. The display may be a combination 2D-3D display for an autostereoscopic device. The method may be implemented using appropriate steps of the previously described processes.

[0118] In step 2302, a thin-film transistor layer is formed. The thin-film transistor layer includes transistors. In some embodiments, the thin-film transistor layer is opaque. In some embodiments, the thin-film transistor layer is transparent.

[0119] In step 2304, a light-emitting diode layer is formed over the thin-film transistor layer. The light-emitting diode layer includes light-emitting diodes. The transistors of the thin-film transistor layer are configured to control the lightemitting diodes.

[0120] In step 2306, a parallax barrier is formed over the light-emitting diode layer. The parallax barrier may be formed by placing a first microlens and a second microlens over the light-emitting diode layer, such as by placing a microlens sheet over the light-emitting diode layer. The first microlens is aligned with a first array of the light-emitting diodes. The second microlens is aligned with a second array of the light-emitting diodes. Further, the second microlens is spaced apart from the first microlens.

[0121] It should be appreciated that one or more steps of the embodiment methods provided herein may be performed by corresponding units or modules. For example, a signal may be transmitted by a transmitting unit or a transmitting module. A signal may be received by a receiving unit or a receiving module. A signal may be processed by a processing unit or a processing module. Other steps may be performed by an analyzing unit/module, a rendering unit/module, an input unit/module, an output unit/module, a displaying unit/module, a controlling unit/module, a sensing unit/module, and/or a networking unit/module. The respective units/modules may be hardware, software, or a combination thereof. For instance, one or more of the units/modules may be an integrated circuit, such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC).

[0122] Although the description has been described in detail, it should be understood that various changes, substitutions and alterations can be made without departing from the spirit and scope of this disclosure as defined by the appended claims. Moreover, the scope of the disclosure is not intended to be limited to the particular embodiments described herein, as one of ordinary skill in the art will readily appreciate from this disclosure that processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed, may perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.