WILSON AUSTIN (US)
WO2019173158A1 | 2019-09-12
WO2018165119A1 | 2018-09-13
US20170256095A1 | 2017-09-07
US20180292655A1 | 2018-10-11
US20190171005A1 | 2019-06-06
US20180024373A1 | 2018-01-25
AUSTIN WILSON ET AL.: "Design and demonstration of a vari-focal optical see-through head-mounted display using freeform Alvarez lenses", OPTICS EXPRESS, vol. 27, no. 11, 27 May 2019 (2019-05-27), pages 15627 - 15637, XP055803600, DOI: https://doi.org/10.1364/OE.27.015627
WILSON AUSTIN, HUA HONG: "Design and prototype of an augmented reality display with per-pixel mutual occlusion capability", OPTICS EXPRESS, vol. 25, no. 24, 21 November 2017 (2017-11-21), pages 30539 - 30549, XP055803610, DOI: https://doi.org/10.1364/OE.25.030539
CLAIMS

WHAT IS CLAIMED IS:

1. An occlusion-capable optical see-through head-mount display (OCOST-HMD), comprising: a polarization element configured to receive light from a real scene and to produce polarized light at the output thereof; a polarizing beam splitter (PBS); an objective lens; a spatial light modulator (SLM); an eyepiece lens; a quarter wave plate (QWP); and a reflective optical element configured to reflect substantially all or a portion of light that is incident thereupon in a first direction, and to transit substantially all or a portion of light received from a microdisplay that is incident thereupon from a second direction, wherein the SLM and the objective lens form a first double-pass configuration that allows at least a portion of light that passes through the objective lens to be reflected from the SLM and to propagate again through the objective lens, and the eyepiece lens and the reflective optical element form a second double-pass configuration that allows at least a portion of light that passes through the eyepiece lens to be reflected from the reflective optical element and to propagate again through the eyepiece lens.

2. The OCOST-HMD of claim 1, wherein: the PBS is positioned to receive the polarized light and reflect the polarized light towards the objective lens, the PBS is positioned to receive, and transmit therethrough toward the eyepiece lens, light that is output from the first double-pass configuration, and to reflect light that the PBS receives from the second double-pass configuration, including light from the microdisplay, towards a position of a human eye.

3.
The OCOST-HMD of claim 1, further comprising a first reflecting surface, wherein the PBS is positioned to: receive the polarized light and transmit therethrough the polarized light towards the objective lens, receive, and reflect toward the eyepiece lens, light that is output from the first double-pass configuration, and reflect light that the PBS receives from the second double-pass configuration, including light from the microdisplay, towards the first reflecting surface, and wherein the first reflecting surface is positioned to reflect light that is incident thereupon towards a position of a human eye.

4. The OCOST-HMD of claim 1, wherein the SLM is configured to modulate the light that is incident thereupon.

5. The OCOST-HMD of claim 4, wherein the SLM is configured to operate in an on-off modulation mode.

6. The OCOST-HMD of claim 1, further including an occlusion mask corresponding to a virtual image presented on the microdisplay, wherein the occlusion mask is used to effectuate modulation of one or more regions of the SLM.

7. The OCOST-HMD of claim 1, further including the microdisplay.

8. The OCOST-HMD of claim 7, wherein the reflective optical element is positioned on a surface of the microdisplay.

9. The OCOST-HMD of claim 7, wherein the microdisplay includes an organic light emitting diode (OLED) device.

10. The OCOST-HMD of claim 1, wherein the QWP is positioned between the eyepiece lens and the reflective optical element.

11. The OCOST-HMD of claim 1, wherein the QWP is positioned between the eyepiece lens and the PBS.

12. The OCOST-HMD of claim 1, wherein the SLM includes a liquid crystal on silicon (LCoS) device.

13. The OCOST-HMD of claim 1, wherein the OCOST-HMD is configured to produce an erect image without using a roof prism.

14.
The OCOST-HMD of claim 1, wherein the OCOST-HMD provides a pupil-matched optical configuration that maps a user's pupil, or relayed pupil, back to the user's eye position to enable a correct view point disparity to be maintained.

15. The OCOST-HMD of claim 1, wherein the OCOST-HMD is configured to produce a field of view (FOV) that is not limited by the eyepiece lens in at least one direction.

16. The OCOST-HMD of claim 1, wherein the OCOST-HMD has a field of view (FOV) greater than 40 degrees diagonally and an optical performance that is greater than 20% modulation contrast over a full FOV.

17. The OCOST-HMD of claim 1, wherein the OCOST-HMD has a see-through field of view (FOV) of 90 degrees by 40 degrees with an angular resolution of 1.0 arc minutes.

18. The OCOST-HMD of claim 1, wherein at least a portion of the OCOST-HMD corresponds to a set of two afocal 4f relays that image an entrance pupil to a conjugate intermediate pupil location.

19. The OCOST-HMD of claim 1, wherein the OCOST-HMD forms a single-layer, double-pass, pupil-matched OCOST-HMD.

20. The OCOST-HMD of claim 1, comprising one or both of the following: (a) an objective lens group that includes the objective lens, or (b) an eyepiece lens group that includes the eyepiece lens.

21.
An occlusion-capable optical see-through head-mount display (OCOST-HMD), comprising: a polarizer to produce polarized light associated with a real scene; a beam splitter (PBS); an objective lens; a spatial light modulator (SLM); an eyepiece lens; a retarder; and a half-mirror configured to reflect substantially all of light associated with an occlusion mask that is incident thereupon in a first direction, and to transit substantially all of light associated with a virtual scene that is incident thereupon from a second direction, wherein the PBS is positioned to: receive and direct the polarized light toward the SLM, receive and direct the light associated with the virtual scene toward a position for viewing by a user's eye, and receive and direct the light associated with the occlusion mask toward the half-mirror, the SLM is configured to modulate the light incident thereon in accordance with a two-dimensional shape of the occlusion mask, the OCOST-HMD is configured to produce an erect image, and the position of a user's pupil, or relayed pupil, is mapped to the position of the user's eye to enable a correct view point disparity to be maintained.
Table 1. Specifications of an Example System
[0016] One of the key parameters driving the example design is the choice of display technologies. We chose a 0.85" eMagin OLED microdisplay for the virtual display path. The eMagin OLED, having an effective area of 18.4mm by 11.5mm and an aspect ratio of 8:5, offers a pixel size of 9.6µm at a native resolution of 1920x1200 pixels. Based on this microdisplay, we aimed to achieve an OCOST-HMD prototype with a diagonal FOV of > 40°, or 34° horizontally and 22° vertically, and an angular resolution of 1.06 arcmins per pixel, corresponding to a Nyquist frequency of 53 cycles/mm in the microdisplay space or 28.6 cycles/degree in the visual space. Separately, for the SLM path, we used a 0.7" reflective LCoS from a projector. A reflective SLM was chosen for its substantial advantages in light efficiency and contrast, and for its low diffraction artifacts, which are commonly found in the transmissive SLMs used in previous works.

[0017] The selected LCoS offers a native resolution of 1400x1050 pixels, a pixel pitch of 10.7µm, and an aspect ratio of 4:3. Based on the different display specifications of the SLM, we aimed to achieve an optical mask diagonal FOV of > 42.5°, or 34° horizontally and 25.5° vertically, and an angular resolution of 1.45 arcmins per pixel, corresponding to a Nyquist frequency of 47 cycles/mm in the SLM space or 19.66 cycles/degree in the visual space. Further, our system requires an objective focal length of 24.4mm and an eyepiece focal length of 29.8mm, giving a relay magnification of 1:1.22. To allow eye rotation of about ±25° within the eye socket without causing vignetting, we set an exit pupil diameter (EPD) of 10mm. An eye clearance distance of 20mm was used to accommodate most head shapes.
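The resolution figures in paragraphs [0016] and [0017] follow directly from the stated pixel pitches, pixel counts, fields of view and focal lengths. As a rough sketch (ours, not part of the disclosure; small rounding differences from the quoted figures remain):

```python
def nyquist_cycles_per_mm(pixel_pitch_um: float) -> float:
    """Nyquist frequency in the display plane: one cycle per two pixels."""
    return 1.0 / (2.0 * pixel_pitch_um * 1e-3)  # convert um to mm

def arcmin_per_pixel(fov_deg: float, pixels: int) -> float:
    """Angular sampling: FOV in one direction divided by pixel count."""
    return fov_deg * 60.0 / pixels

# 0.85" OLED: 9.6 um pixels, 1920 px across the 34-deg horizontal FOV
oled_nyq = nyquist_cycles_per_mm(9.6)    # ~52 cycles/mm (quoted as 53)
oled_res = arcmin_per_pixel(34.0, 1920)  # ~1.06 arcmin per pixel

# 0.7" LCoS: 10.7 um pixels, 1400 px across the same 34-deg horizontal FOV
lcos_nyq = nyquist_cycles_per_mm(10.7)   # ~47 cycles/mm
lcos_res = arcmin_per_pixel(34.0, 1400)  # ~1.46 arcmin per pixel

# Relay magnification from the stated focal lengths (1:1.22)
relay_mag = 29.8 / 24.4
```

The small mismatch between the computed ~52 cycles/mm and the quoted 53 cycles/mm for the OLED is ordinary rounding in the source text.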
[0018] To achieve a high optical performance over the three optical paths, we optimized the system using 3 zoom configurations, each corresponding to a different optical path and design specification. FIG. 3 illustrates the lens layout of the final OCOST-HMD prototype design. The light path for the virtual display (eyepiece) is denoted by the rays designated with the dashed rectangle, the light path for the SLM (relay + eyepiece) is shown by the rays designated with the dashed oval, and the see-through path (objective + relay + eyepiece) is denoted by the rays designated with the dashed circle. It should be noted that the see-through path overlaps with the microdisplay and SLM paths after the PBS, and thus only the virtual display rays are traced to the eye pupil.
[0019] Overall, the final lens design of the prototype example in FIG. 3 includes 10 lenses: 5 stock crown glass lenses, 3 acrylic aspheric lenses and 2 custom flint glass lenses. Lenses labeled 2 to 6 form the eyepiece group. Lenses labeled 8 to 12 form the objective group. The half-mirror is illustrated as the gray element between the quarter-wave plate and the OLED 7. A wire grid polarizer and QWP film, in conjunction with a single custom PBS from Moxtek, were used to manipulate the polarization. The system was optimized for 3 wavelengths, 430, 555, and 625nm, with weights of 1, 2 and 1, respectively, in accordance with the dominant wavelengths of the OLED microdisplay. To ensure the system was properly pupil-matched for a correct viewing perspective in accordance with Eqs. (1) and (2), the objective and eyepiece were optimized to have a chief ray deviation of less than ±0.5°, demonstrating image-space telecentricity. The eyepiece lenses were cropped to achieve an eye clearance of 20mm and a 10mm EPD.
[0020] Tables 2 and 3 provide the optics prescriptions for the virtual display path and the see-through path, respectively, for the above prototype system. Both of the optical paths were ray-traced from the exit pupil of the system, which coincides with the entrance pupil of the eye. The term "Asphere" in the Tables refers to an aspherical surface which may be represented by the equation

z = cr² / (1 + √(1 − (1 + k)c²r²)) + Ar⁴ + Br⁶ + Cr⁸ + Dr¹⁰ + Er¹²

where z is the sag of the surface measured along the z-axis of a local x, y, z coordinate system, c is the vertex curvature, r is the radial distance, k is the conic constant, and A through E are the 4th, 6th, 8th, 10th and 12th order deformation coefficients, respectively. Tables 4 through 9 provide the aspheric coefficients for the aspheric surfaces 11, 12, 15, 16, 23, and 24, respectively.

Table 2: System prescription for the virtual display path
Table 3: System prescription for the see-through path
Table 4. Surface Prescription for Surface #11 of Table 2.
Table 5. Surface Prescription for Surface #12 of Table 2.
Table 6. Surface Prescription for Surface #15 of Table 3.
Table 7. Surface Prescription for Surface #16 of Table 3.
Table 8. Surface Prescription for Surface #23 of Table 3.
Table 9. Surface Prescription for Surface #24 of Table 3.

[0021] The simulated optical performance of the double-pass OCOST-HMD prototype system was assessed over the full FOV in the display space, where the spatial frequencies are characterized in terms of cycles per millimeter. In the example provided, the optical performance of the see-through path is limited to a 40° diagonal in accordance with the field that is passed through the system and optically overlaid on the virtual and masked image. Light from the real scene outside this field passes through only a single PBS, is not optically affected, and should otherwise be seen at the native resolution of the human eye.

[0022] FIGS. 4A to 4C show the polychromatic modulation transfer function (MTF) curves, evaluated with a 4-mm eye pupil to approximate the human eye, for several weighted fields of the virtual display, SLM and see-through paths. In these figures, the modulation transfer functions at 0, 6, 10, 15 and 20 degrees of the diagonal half FOV are evaluated in the tangential (Tan) and radial (Rad) directions with a 4mm pupil diameter, plotted up to a cutoff spatial frequency of 53 cycles/mm for the OCOST-HMD OLED optical path (FIG. 4A) and the OCOST-HMD SLM optical path (FIG. 4B), and 110 cycles/mm for the OCOST-HMD see-through optical path (FIG. 4C). Starting with FIG. 4A, the OLED display optical performance preserves roughly an average of 40% modulation, over the full field, at the designed Nyquist frequency of 53 cycles/mm, corresponding to the 9.6µm pixel size. The optical performance of the LCoS, shown in FIG. 4B, demonstrates an average of 50% modulation over the full field at the designed Nyquist frequency of 47 cycles/mm, corresponding to the 10.7µm pixel size. Lastly, FIG. 
4C shows the see-through optical performance which maintains an average modulation of 15% at the human eye's cutoff frequency of 110 cycles/mm corresponding to a 1.0 arcminute resolution or 20/20 human visual acuity.
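The even-asphere sag equation given in paragraph [0020] can be evaluated directly. The sketch below is ours, not part of the disclosure; the coefficient values are left to the reader, since the numeric prescriptions appear in Tables 4 through 9:

```python
import math

def asphere_sag(r, c, k, A, B, C, D, E):
    """Sag z of an even aspheric surface at radial distance r.

    c: vertex curvature (1/radius of curvature), k: conic constant,
    A..E: 4th/6th/8th/10th/12th-order deformation coefficients,
    matching the description in paragraph [0020].
    """
    base = c * r**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * r**2))
    return base + A*r**4 + B*r**6 + C*r**8 + D*r**10 + E*r**12
```

With k = 0 and all deformation coefficients zero, the expression reduces to the sag of a sphere of radius 1/c, which is a quick sanity check on any implementation.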
[0023] FIG. 5 shows the distortion grid of the see-through optical path over the full 40-degree diagonal FOV that is overlapped with the virtual image. In the figure, the actual and paraxial FOV are almost coincident, illustrating strong agreement between the two. Per our optical specifications in Table 1 above, the see-through path has < 1% distortion over the full field. This is important because, unlike the display distortion, which can be digitally corrected, the see-through distortion cannot. The distortion for the microdisplay (virtual image) was held under 5%, while the distortion for the SLM (optical mask) was held to 20% and left for digital correction.
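The distortion percentages quoted in paragraph [0023] compare the real chief-ray image height against the paraxial prediction. A minimal sketch (ours, with an illustrative, hypothetical field point):

```python
def percent_distortion(actual_height, paraxial_height):
    """Percent distortion of a field point: relative deviation of the real
    chief-ray image height from the paraxial (ideal) image height."""
    return 100.0 * (actual_height - paraxial_height) / paraxial_height

# E.g., a field point imaged at 10.05 mm where paraxial optics predicts
# 10 mm exhibits 0.5% (pincushion) distortion, comfortably inside the
# < 1% see-through budget stated above.
```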
[0024] Along with the MTF and distortion, several other metrics were used to characterize the optical performance of the virtual display path, such as wavefront error and spot diagram. Both the SLM and microdisplay paths suffer largely from lateral chromatic aberration and coma. This is because the non-pupil-forming, telecentric design of the eyepiece used in both the SLM and microdisplay paths does not allow the stop position to be moved to balance off-axis aberrations. Overall, the wavefront aberration in each of the three paths is sufficiently low, being under 1 wave. The average root mean square (RMS) spot diameter across the field is 9µm for both the see-through path and the display path, but jumps to 16.4µm for the SLM path due to the large distortion allowed in that path. Although this appears to be larger than the 10.7µm pixel size, the difference is largely due to lateral chromatic aberration and can be corrected.
Example System Prototype and Experimental Demonstration
[0025] FIG. 6 illustrates an example OCOST-HMD layout and prototype according to the disclosed embodiments. Panel (a) shows the front view of the fully assembled OCOST-HMD SolidWorks CAD design in reference to an average human head. Panel (b) shows the side view of the fully assembled OCOST-HMD CAD design, demonstrating the greatly reduced form factor due to the double-pass architecture. Due to multiple passes of the light through the same optical path, the optical performance is sensitive to the optical and mechanical tolerancing. For the mechanical design, individual lenses were held by set screws to allow more compensation and lower the tolerance stack, so as to meet the minimum MTF requirements. Panels (c) and (d) show a front view and an angled view, respectively, of the binocular prototype of the OCOST-HMD system built upon the optical design in FIG. 3, with a quarter (coin) included to provide a size reference. The overall height and width of the prototyped OCOST-HMD system was 130mm by 140mm, with a depth of 25mm and an adjustable interocular distance of 50-70mm.
[0026] FIG. 7 shows a qualitative demonstration of the occlusion capability of the prototyped OCOST-HMD of FIGS. 3 and 6. A real-world background scene comprising common objects (text, cans and boxes with different colors, shapes and printed fonts) was used to provide several different spatial frequencies and object depths; these items were placed against a well-illuminated white background wall (~300-500 cd/m²). The virtual 3D scene used in this example was a simple image of a wildcat. Panels (a) to (c) in FIG. 7 show a set of images captured with a digital camera placed at the exit pupil of the eyepiece. The same 16mm camera lens and 1/3" Point Grey image sensor with a 3.75µm pixel pitch was used, with the aperture increased to 4mm to better match the F/# of the human eye under typical lighting conditions.
[0027] By simply turning on the OLED microdisplay and applying no modulated mask to the SLM, panel (a) in FIG. 7 shows the augmented view of the real-world and virtual scenes without the occlusion capability enabled. Without the mask occluding the see-through path, the "wildcat" figure with a cowboy hat (see panel (b) for a better outline of the wildcat) looks washed out, appearing transparent and unrealistic due to the brightness of the background scene shared with the virtual display. This makes the depth of the wildcat spatially ambiguous.
[0028] Panel (b) portrays the opposite situation: a view of the real-world scene when the occlusion mask was displayed on the SLM but no virtual content was shown on the OLED display. This validates that the mask can effectively block the superimposed portion of the see-through view.
[0029] Panel (c) shows a view captured with the mask on the SLM and the virtual scene displayed on the OLED display, where the virtual wildcat is inserted between two real objects, demonstrating the mutual occlusion capability of the system. In this case, the full capability and correct depth perception, along with improved contrast, are rendered. By knowing the relative location of the WD-40 canister, which is meant to occlude part of the wildcat figure, we removed from the wildcat rendering the pixels that correspond to the projection of the occluding canister on the virtual display. Again, the significance of this result is that correct occlusion relationships can be created and used to give an unparalleled sense of depth to a virtual image in an OST-HMD.

[0030] The disclosed double-pass OCOST-HMD system can achieve a high optical performance and dynamic range of the real and virtual content with a significantly improved form factor, viewpoint perspective and technical specifications over our previous OCOST-HMD design.

[0031] Example Optical Performance Test Results: The vertical and horizontal FOV of the example system was measured for each optical path. It was determined that the see-through FOV was ~90° horizontally and ~40° vertically, with an occlusion-capable see-through FOV of ~34° horizontally and ~26° vertically, while the virtual display had an FOV of ~33.5° horizontally and ~23° vertically, giving a measured diagonal full FOV of 41.6°. Due to our improved double-pass architecture and added optical relay, the LCoS can fully occlude the virtually displayed scene.

[0032] The optical performance of the prototype system was further quantified by characterizing the MTF performance of the three optical paths through the prototype. A high-performance camera, consisting of a nearly diffraction-limited 25mm camera lens by Edmund Optics and a 1/2.3" Point Grey image sensor with a 1.55µm pixel pitch, was placed at the exit pupil of the system. 
It offers an angular resolution of about 0.5 arcminutes per pixel, significantly higher than the anticipated performance of the prototype. Therefore, it is assumed that no loss of MTF performance was caused by the camera. The camera then captured images of a slanted-edge target, which was either displayed at an angle on the microdisplay or printed and placed in the see-through view. To provide a separable quantification of the performance of the virtual and see-through paths, the virtual image of a slanted edge was captured while the see-through scene was completely blocked by the SLM. Similarly, the see-through image of the target was captured with the microdisplay turned off. The captured slanted-edge images were analyzed using Imatest® software to obtain the MTF of the corresponding light paths.

[0033] FIG. 8 shows the measured on-axis MTF performance of the SLM, OLED and optical see-through paths and the camera, along with each individual slanted edge. Due to the magnification difference between the pixel pitch of the camera sensor and that of the microdisplay and SLM, the horizontal axis of the MTF measurement by Imatest® was scaled by the pixel magnification difference between the camera and display and then converted to express the spatial frequency in the display space in terms of cycles/mm. The prototyped design was able to achieve a contrast greater than 50% at the 53 cycles/mm Nyquist frequency of the virtual display and similar performance for the SLM path, while the modulation contrast for the see-through path was about 15% at the cut-off frequency of 110 cycles/mm, corresponding to 1 arcminute. The curves shown in FIG. 8 closely resemble the on-axis curves in FIGS. 4A to 4C, demonstrating that the resolvability of the three optical paths through the occlusion module remains nearly intact relative to the design specifications originally set out.
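The frequency-axis rescaling described in paragraph [0033] can be sketched as follows. This is our reconstruction of the conversion, not code from the source; the angular-sampling values are taken from the surrounding text:

```python
def camera_to_display_freq(f_cam_cyc_per_px, cam_arcmin_per_px,
                           disp_arcmin_per_px, disp_pixel_pitch_mm):
    """Rescale a slanted-edge MTF frequency axis from camera pixels to
    display-space cycles/mm.

    f_cam_cyc_per_px: frequency in cycles per camera pixel (Imatest output).
    The camera-to-display pixel magnification is the ratio of their angular
    sampling; dividing by the display pixel pitch then yields cycles/mm in
    the display space.
    """
    mag = disp_arcmin_per_px / cam_arcmin_per_px   # camera px per display px
    f_disp_cyc_per_px = f_cam_cyc_per_px * mag     # cycles per display pixel
    return f_disp_cyc_per_px / disp_pixel_pitch_mm  # cycles per mm

# Example with the stated sampling: a ~0.5 arcmin/px camera viewing the
# ~1.06 arcmin/px OLED (9.6 um pitch). The display Nyquist frequency
# (0.5 cycles per display pixel) appears at ~0.236 cycles per camera pixel
# and maps back to ~52 cycles/mm in the display space.
oled_nyquist = camera_to_display_freq(0.5 / 2.12, 0.5, 1.06, 0.0096)
```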
[0034] We measured the image contrast between the virtual display and the real-world scene as a function of the real-world scene brightness for different spatial frequencies. A grayscale solid image, ranging from black to white in 10 linear steps, was displayed on an LCD monitor to create a controlled background scene with luminance varying from 0 to 350 cd/m². The monitor was placed roughly 10cm in front of the OCOST-HMD system to simulate a range of real-scene brightness. A sinusoidal grating pattern with a spatial frequency ranging from 0.88 to 28.2 cycles/degree was displayed on the OLED microdisplay (virtual path) to evaluate the effect of scene brightness on the image contrast of the virtual scene at different spatial frequencies. The fall-off in contrast of the virtual scene was then plotted and compared with occlusion enabled (SLM blocking see-through light) and without occlusion (SLM passing see-through light).

[0035] FIGS. 9A and 9B illustrate plots of the virtual object contrast with the see-through path un-occluded and occluded, respectively. We can observe that the contrast of the virtual object without occlusion (FIG. 9A) quickly deteriorated to zero for a well-lit environment luminance above 300 cd/m², while the contrast of the virtual target with occlusion of the real scene (FIG. 9B) remained nearly constant as the brightness increased. We further measured the obtainable contrast ratio by measuring a collimated depolarized light source through the system with full occlusion enabled and disabled. The dynamic range of the occlusion system was determined to be greater than 100:1. 
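The contrast fall-off measured in paragraphs [0034] and [0035] is consistent with a simple Michelson-contrast model in which unoccluded see-through light adds a constant luminance to the virtual grating. The model below is our illustration, not the patent's analysis:

```python
def michelson_contrast(l_max, l_min):
    """Michelson contrast of a sinusoidal grating."""
    return (l_max - l_min) / (l_max + l_min)

def contrast_with_background(l_max, l_min, l_bg):
    """See-through leakage adds the background luminance l_bg to both the
    bright and dark halves of the virtual grating, shrinking contrast."""
    return michelson_contrast(l_max + l_bg, l_min + l_bg)

# A full-contrast virtual grating (100 vs 0 cd/m^2):
# unoccluded against a 300 cd/m^2 wall, contrast collapses from 1.0 to ~0.14;
# occluded (l_bg ~ 0), contrast stays at ~1.0, matching the flat curves of
# FIG. 9B versus the rapid fall-off of FIG. 9A.
c_open = contrast_with_background(100.0, 0.0, 300.0)
c_occluded = contrast_with_background(100.0, 0.0, 0.0)
```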
[0036] One aspect of the disclosed embodiments relates to an occlusion-capable optical see-through head-mount display (OCOST-HMD) that includes a polarization element configured to receive light from a real scene and to produce polarized light at the output thereof, a polarizing beam splitter (PBS), an objective lens, a spatial light modulator (SLM), an eyepiece lens, a quarter wave plate (QWP), and a reflective optical element configured to reflect substantially all or a portion of light that is incident thereupon in a first direction, and to transit substantially all or a portion of light received from a microdisplay that is incident thereupon from a second direction. The SLM and the objective lens form a first double-pass configuration that allows at least a portion of light that passes through the objective lens to be reflected from the SLM and to propagate again through the objective lens. The eyepiece lens and the reflective optical element form a second double-pass configuration that allows at least a portion of light that passes through the eyepiece lens to be reflected from the reflective optical element and to propagate again through the eyepiece lens. [0037] In one example embodiment, the PBS is positioned to receive the polarized light and reflect the polarized light towards the objective lens; the PBS is also positioned to receive, and transmit therethrough toward the eyepiece lens, light that is output from the first double- pass configuration, and to reflect light that the PBS receives from the second double-pass configuration, including light from the microdisplay, towards a position of a human eye. 
In another example embodiment, the OCOST-HMD further includes a first reflecting surface, wherein, the PBS is positioned to (a) receive the polarized light and transmit therethrough the polarized light towards the objective lens, (b) receive, and reflect toward the eyepiece lens, light that is output from the first double-pass configuration, and (c) reflect light that the PBS receives from the second double-pass configuration, including light from the microdisplay, towards the first reflecting surface. In this example embodiment, the first reflecting surface is positioned to reflect light that is incident thereupon towards a position of a human eye. [0038] According to one example embodiment, the SLM is configured to modulate the light that is incident thereupon. For example, the SLM is configured to operate in an on-off modulation mode. In another example embodiment, the OCOST-HMD further includes an occlusion mask corresponding to a virtual image presented on the microdisplay, wherein the occlusion mask is used to effectuate modulation of one or more regions of the SLM. In yet another example embodiment, the OCOST-HMD further includes the microdisplay. In still another example embodiment, the reflective optical element is positioned on a surface of the microdisplay. According to another example embodiment, the microdisplay includes an organic light emitting diode (OLED) device. [0039] In another example embodiment, the QWP is positioned between the eyepiece lens and the reflective optical element. In one example embodiment, the QWP is positioned between the eyepiece lens and the PBS. In another example embodiment, the SLM includes a liquid crystal on silicon (LCoS) device. In still another example embodiment, the OCOST-HMD is configured to produce an erect image without using a roof prism. 
In another example embodiment, the OCOST-HMD provides a pupil-matched optical configuration that maps a user's pupil, or relayed pupil, back to the user's eye position to enable a correct view point disparity to be maintained. According to yet another example embodiment, the OCOST-HMD is configured to produce a field of view (FOV) that is not limited by the eyepiece lens in at least one direction. [0040] In another example embodiment, the OCOST-HMD has a field of view (FOV) greater than 40 degrees diagonally and an optical performance that is greater than 20% modulation contrast over a full FOV. In one example embodiment, the OCOST-HMD has a see-through field of view (FOV) of 90 degrees by 40 degrees with an angular resolution of 1.0 arc minutes. In yet another example embodiment, at least a portion of the OCOST-HMD corresponds to a set of two afocal 4f relays that image an entrance pupil to a conjugate intermediate pupil location. In another example embodiment, the OCOST-HMD forms a single-layer, double-pass, pupil-matched OCOST-HMD. In some example embodiments, the OCOST-HMD includes one or both of the following: (a) an objective lens group that includes the objective lens, or (b) an eyepiece lens group that includes the eyepiece lens. [0041] Another aspect of the disclosed embodiments relates to an occlusion-capable optical see-through head-mount display (OCOST-HMD) that includes a polarizer to produce polarized light associated with a real scene, a beam splitter (PBS), an objective lens, a spatial light modulator (SLM), an eyepiece lens, a retarder, and a half-mirror configured to reflect substantially all of light associated with an occlusion mask that is incident thereupon in a first direction, and to transit substantially all of light associated with a virtual scene that is incident thereupon from a second direction. 
In this configuration, the PBS is positioned to (a) receive and direct the polarized light toward the SLM, (b) receive and direct the light associated with the virtual scene toward a position for viewing by a user’s eye, and (c) receive and direct the light associated with the occlusion mask toward the half mirror. The SLM is configured to modulate the light incident thereon in accordance with a two-dimensional shape of the occlusion mask. The OCOST-HMD is configured to produce an erect image, and the position of a user's pupil, or relayed pupil, is mapped to the position of the user’s eye to enable a correct view point disparity to be maintained. [0042] FIG. 10 illustrates a block diagram of a device 1000 that can be used to implement certain aspects of the disclosed technology. For example, the device of FIG. 10 can be used to receive, process, store, provide for display and/or transmit various data and signals associated with disclosed image sensors that capture and process images, and/or microdisplays, and SLMs to enable control, display, storage and processing of the virtual content and the occlusion masks, as well as brightness control, light modulation or other operations associated with electronic and opto-electronic components disclosed herein. The device 1000 comprises at least one processor 1004 and/or controller, at least one memory 1002 unit that is in communication with the processor 1004, and at least one communication unit 1006 that enables the exchange of data and information, directly or indirectly, through the communication link 1008 with other entities, devices, databases and networks. 
The communication unit 1006 may provide wired and/or wireless communication capabilities in accordance with one or more communication protocols, and therefore it may comprise the proper transmitter/receiver, antennas, circuitry and ports, as well as the encoding/decoding capabilities that may be necessary for proper transmission and/or reception of data and other information. The exemplary device 1000 of FIG. 10 may be integrated as part of a larger component (e.g., a server, a computer, tablet, smart phone, etc.) that can be used for performing various computations, methods or algorithms disclosed herein. [0043] The processor(s) 1004 may include central processing units (CPUs) to control the overall operation of, for example, the host computer. In certain embodiments, the processor(s) 1004 accomplish this by executing software or firmware stored in memory 1002. The processor(s) 1004 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), graphics processing units (GPUs), or the like, or a combination of such devices. [0044] The memory 1002 can be or can include the main memory of a computer system. The memory 1002 represents any suitable form of random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices. In use, the memory 1002 may contain, among other things, a set of machine instructions which, when executed by processor 1004, causes the processor 1004 to perform operations to implement certain aspects of the presently disclosed technology. [0045] While this patent document contains many specifics, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. 
Certain features that are described in this patent document in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.

[0046] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Moreover, the separation of various system components in the embodiments described in this patent document should not be understood as requiring such separation in all embodiments.

[0047] It is understood that the various disclosed embodiments may be implemented individually, or collectively, in devices comprising various optical components, electronics hardware and/or software modules and components. These devices may, for example, comprise a processor, a memory unit, and an interface that are communicatively connected to each other, and may range from desktop and/or laptop computers to mobile devices and the like. The processor and/or controller can perform various disclosed operations based on execution of program code that is stored on a storage medium.
The processor and/or controller can, for example, be in communication with at least one memory and with at least one communication unit that enables the exchange of data and information, directly or indirectly, through the communication link with other entities, devices and networks. The communication unit may provide wired and/or wireless communication capabilities in accordance with one or more communication protocols, and therefore it may comprise the proper transmitter/receiver, antennas, circuitry and ports, as well as the encoding/decoding capabilities that may be necessary for proper transmission and/or reception of data and other information. For example, the processor may be configured to receive electrical signals or information from the disclosed sensors (e.g., CMOS sensors), and to process the received information to produce images or other information of interest.

[0048] Various information and data processing operations described herein may be implemented in one embodiment by a computer program product, embodied in a computer-readable medium, including computer-executable instructions, such as program code, executed by computers in networked environments. A computer-readable medium may include removable and non-removable storage devices including, but not limited to, read-only memory (ROM), random access memory (RAM), compact discs (CDs), digital versatile discs (DVDs), etc. Therefore, the computer-readable media described in the present application comprise non-transitory storage media. Generally, program modules may include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of the methods disclosed herein.
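The processing of raw sensor signals into images mentioned above can be sketched as follows. This is an illustrative example, not part of the patent text: it assumes a 10-bit CMOS sensor with a fixed black level and simply clips and rescales a raw frame to an 8-bit image. The function name `raw_to_image` and the specific black/white levels are hypothetical.

```python
import numpy as np

def raw_to_image(raw, black_level=64, white_level=1023):
    """Convert a raw sensor frame (assumed 10-bit values) to an 8-bit image
    by clipping to the valid range and scaling linearly to [0, 255]."""
    raw = np.clip(raw.astype(np.float64), black_level, white_level)
    scaled = (raw - black_level) / (white_level - black_level)
    return (scaled * 255).round().astype(np.uint8)

# Example: three raw readings spanning the sensor's dynamic range.
frame = np.array([[64, 543, 1023]])
img = raw_to_image(frame)
```

A real pipeline would also perform demosaicing, noise reduction and color correction; the point here is only the overall shape of sensor-signal-to-image processing that the processor may carry out.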
The particular sequence of such executable instructions or associated data structures represents examples of corresponding acts for implementing the functions described in such steps or processes.

[0049] Only a few implementations and examples are described; other implementations, enhancements and variations can be made based on what is described and illustrated in this patent document.