

Title:
DISPLAY OPTICAL SYSTEMS FOR STEREOSCOPIC IMAGING SYSTEMS WITH REDUCED EYE STRAIN
Document Type and Number:
WIPO Patent Application WO/2021/003009
Kind Code:
A1
Abstract:
A display optical system for a stereoscopic imaging system for use by a viewer for reducing eye strain caused by convergence-accommodation mismatch. The system includes an image generator that generates a display image and that emits display light. A relay lens section is arranged to receive the display light and includes an adjustable imaging lens assembly configured to receive the display light and therefrom form a real display image having a spherically curved focus surface at a time-varying real-image position along a first optical axis. A virtual imaging section is operably arranged relative to the relay lens section and is configured to form from the real image a virtual image as seen by the viewer. The virtual image has a time-varying virtual-image position along a second optical axis due to the time-varying real-image position.

Inventors:
COBB JOSHUA MONROE (US)
Application Number:
PCT/US2020/037363
Publication Date:
January 07, 2021
Filing Date:
June 12, 2020
Assignee:
CORNING INC (US)
International Classes:
G02B27/01; G02B30/52
Domestic Patent References:
WO2017062419A1 (2017-04-13)
Foreign References:
US20110075257A1 (2011-03-31)
US20150346495A1 (2015-12-03)
EP1798587A1 (2007-06-20)
EP1267197A2 (2002-12-18)
Attorney, Agent or Firm:
LANG, Amy T. (US)
Claims:
What is claimed is:

1. A display optical system for a stereoscopic imaging system for use by a viewer, comprising:

an image generator that generates a display image and that emits display light; a relay lens section arranged to receive the display light, the relay lens section comprising along a first optical axis an adjustable imaging lens assembly configured to receive the display light and form a real display image having a curved focus surface at a time-varying real-image position along the first optical axis; and

a virtual imaging section operably arranged relative to the relay lens section and configured to form from the real image a virtual image as seen by the viewer, the virtual image having a time-varying virtual-image position along a second optical axis due to the time-varying real-image position.

2. The display optical system according to claim 1, wherein the relay lens section comprises:

a beam splitting element that resides along the first optical axis and that defines the second optical axis; and

a concave mirror having a spherical reflecting surface and that resides along the first optical axis optically downstream of the beam splitting element.

3. The display optical system according to claim 1 or claim 2, wherein the virtual imaging section has a depth of field DOF, and wherein the time-varying real-image position has an axial range ΔP60 that is at least twice the depth of field.

4. The display optical system according to claim 3, wherein the axial range ΔP60 is from 1 millimeter to 13 millimeters.

5. The display optical system according to any one of claims 1-3, wherein the time-varying real-image position varies with a frequency f in the range from 60 Hertz (Hz) to 250 Hz.

6. The display optical system according to any one of claims 1-3 and 5, wherein the relay lens section comprises:

an image source configured to provide a time-varying display image to the image generator;

a controller operably connected to the image source and configured to control the time-varying real-image position of the real image; and

wherein the image source and controller operate to synchronize the time-varying display image with the time-varying real image position.

7. The display optical system according to any of claims 1-3, 5 and 6, wherein the relay lens section comprises an adjustable imaging lens assembly having an imaging lens with an adjustable focal length.

8. The display optical system according to any of claims 1-3, 5 and 6, wherein the relay lens section comprises an adjustable imaging lens assembly having an imaging lens with an adjustable axial position.

9. The display optical system according to any one of claims 1-3, 5-8, wherein the time-varying real-image position generates a time-varying magnification of the real image, and wherein the image generator is configured to adjust a size of the display image to compensate for the time-varying magnification.

10. The display optical system according to any one of claims 1-3, 5-9, wherein the relay lens section comprises a light-diffusing system comprising a light-diffusing element that resides at the real image and that is movable to reside at the time-varying real-image position.

11. A stereoscopic imaging system, comprising:

first and second display optical systems according to any one of claims 1-3, 5-9, wherein at least a portion of the first and second display optical systems are arranged side by side.

12. The display optical system according to any one of claims 7-11, wherein the imaging lens with the adjustable focal length includes a liquid lens.

13. A display optical system for a stereoscopic imaging system for use by a viewer, comprising:

a relay lens section arranged to receive display light from a display, the relay lens section comprising an adjustable imaging lens assembly configured to receive the display light and form a real display image having a time-varying real-image position that varies along a first optical axis over a range ΔP60 from 1 mm to 13 mm at a frequency f in the range from 60 Hertz (Hz) to 250 Hz; and

a virtual imaging section having a depth of field DOF and operably arranged relative to the relay lens section to form from the real image a virtual image as seen by the viewer, the virtual image having a time-varying virtual-image position along a second optical axis due to the time-varying real-image position, and wherein ΔP60 ≥ 2·DOF.

14. The display optical system according to claim 13, wherein the relay lens section comprises:

a beam splitting element that resides along the first optical axis and that defines the second optical axis; and

a concave mirror having a spherical reflecting surface and that resides along the first optical axis optically downstream of the beam splitting element.

15. The display optical system according to claims 13 or 14, wherein the relay lens section comprises an adjustable imaging lens assembly having either an imaging lens with an adjustable focal length or an imaging lens with an adjustable axial position.

16. The display optical system according to claims 13, 14 or 15, wherein the relay lens section comprises:

an image source configured to provide a time-varying display image to the display, wherein the display light is representative of the time-varying display image;

a controller operably connected to the image source and configured to control the time-varying real-image position of the real image; and wherein the image source and controller operate to synchronize the time-varying display image with the time variation of the real image position.

17. The display optical system according to any one of claims 13-16, wherein the time-varying real-image position generates a time-varying magnification of the real image, and wherein the image source is configured to adjust a size of the display image to compensate for the time-varying magnification.

18. The display optical system according to any one of claims 13-17, wherein the relay lens section comprises a light-diffusing system comprising a light-diffusing element that resides at the real image and is movable to reside at the time-varying real-image position.

19. A stereoscopic imaging system, comprising:

first and second display optical systems according to any of claims 13-18, wherein at least a portion of the first and second display optical systems are arranged side by side.

20. A method of reducing a convergence-accommodation mismatch when a viewer with left and right eyes forms a stereoscopic image, comprising:

generating left and right display images;

forming left and right real images of the left and right display images, wherein the left and right real images respectively have left and right time-varying real-image positions; the user respectively forming with the left and right eyes left and right virtual images from the left and right real images to form the stereoscopic image, wherein the left and right virtual images have respective left and right time-varying virtual-image positions due to the left and right time-varying real-image positions; and

wherein the left and right time-varying virtual image positions vary with a frequency f sufficient to reduce the convergence-accommodation mismatch as compared to if the left and right virtual image positions were stationary.

21. The method according to claim 20, wherein the frequency f is in the range from 60 Hz to 250 Hz.

22. The method according to claims 20 or 21, wherein the left and right real-image positions vary over a real-image position range ΔP60, and wherein 1 mm < ΔP60 ≤ 13 mm.

23. The method according to claims 20, 21 or 22, wherein the left and right display images each have a size, wherein the left and right time-varying real-image positions generate a time-varying magnification of the left and right real images, and further comprising adjusting the size of the left and right display images to compensate for the time-varying magnification of the left and right real images.

24. The method according to any one of claims 20-23, further comprising diffusing the left and right real images.

25. The method according to claim 24, wherein said diffusing comprises moving at least one light-diffusing element so that it coincides with the time-varying left and right real-image positions.

26. The method according to any one of claims 20-24, further comprising generating the left and right display images using respective left and right displays each comprising an array of pixels, and allocating select groups of the pixels for forming the left and right real images at each of the left and right real-image positions.

27. The method according to any one of claims 20-24 and 26, further comprising forming the left and right time-varying real-image positions by forming the left and right real images using respective left and right imaging lenses each having a focal length and an axial position, and either adjusting the focal lengths or adjusting the axial positions of the left and right imaging lenses.

28. The method according to any one of claims 20-24 and 26-27, wherein the generating of the left and right display images is synchronized with forming the left and right real images at the time-varying real image positions.

29. The method according to any one of claims 20-24 and 26-28, wherein the left and right time-varying real image positions each comprises either 2, 3 or 4 left and right time-varying real image positions.

Description:
DISPLAY OPTICAL SYSTEMS FOR STEREOSCOPIC IMAGING SYSTEMS WITH REDUCED EYE STRAIN

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of priority under 35 U.S.C. § 119 of U.S. Provisional Application Serial No. 62/868937 filed on June 30, 2019, the content of which is relied upon and incorporated herein by reference in its entirety.

FIELD

[0002] The present disclosure relates to display optical systems and in particular relates to display optical systems for stereoscopic imaging systems with reduced eye strain.

BACKGROUND

[0003] Personal display devices worn by a user (such as head-mounted displays) are used in applications where the use of conventional display screens would be inconvenient. Head-mounted devices (HMDs), such as display goggles, are useful wearable personal display devices for a variety of augmented reality applications for military, medical, dental, industrial, and game presentation purposes, among others. Stereoscopic imaging, with its enhanced spatial representation and improved presentation of relevant detail, can be particularly useful for presenting images that are more lifelike and that show depth information more accurately than is possible with 2-dimensional (2-D) flat displays.

Stereoscopic systems can be used for virtual reality applications where the viewed image is from a single stereoscopic source (i.e., matched displays that provide a display image) as well as augmented reality applications wherein the viewed image is a combination of a display image and a directly viewed scene.

[0004] Although advances have been made for improving usability, size, cost, and performance of stereoscopic systems, there remains considerable room for improvement.

In particular, the imaging optics that present electronically processed images from a display to the viewer have been disappointing. Conventional design approaches have proved difficult to scale to the demanding size, weight, and placement requirements, often poorly addressing optical problems related to field of view and distortion, eye relief, pupil size, and other factors.

[0005] One display attribute that has been difficult to achieve in stereoscopic systems is the ability to create images at different focal planes. Most of the stereoscopic systems create images of different horizontal disparity, which allows the user to converge their eyes to simulate depth. However, the focus of the image does not change. This creates a well-known human-factors issue known in the art as convergence-accommodation mismatch (also called convergence-accommodation conflict or vergence-accommodation conflict), which leads to eye strain.

SUMMARY

[0006] The head-mounted display optical system disclosed herein utilizes a monocentric imaging system to present a virtual image to a viewer. A real image of a display image from a display is formed by an optical system inside the focal surface of a concave spherical mirror. The real image has a substantially spherically curved surface. The real image is then propagated through a beam splitter to the mirror. A virtual image is then created from the real image by the mirror and presented to a viewer's eye. Changing the distance of the real image to the mirror changes the virtual image location as seen by the viewer. This reduces or eliminates eye strain and in particular eye strain caused by convergence-accommodation mismatch (conflict). The modulation of the position of the virtual image as seen by the viewer is synchronous with data being displayed on the flat display, i.e., is synchronized with the generation of display images seen by the viewer.

[0007] An embodiment of the disclosure is a display optical system for a stereoscopic imaging system for use by a viewer. The system comprises: an image generator that generates a display image and that emits display light; a relay lens section arranged to receive the display light, the relay lens section comprising along a first optical axis an adjustable imaging lens assembly configured to receive the display light and form a real display image having a curved focus surface at a time-varying real-image position along the first optical axis; and a virtual imaging section operably arranged relative to the relay lens section and configured to form from the real image a virtual image as seen by the viewer, the virtual image having a time-varying virtual-image position along a second optical axis due to the time-varying real-image position.

[0008] Another embodiment of the disclosure is a display optical system for a stereoscopic imaging system for use by a viewer. The system comprises: a relay lens section arranged to receive display light from a display, the relay lens section comprising an adjustable imaging lens assembly configured to receive the display light and form a real display image having a time-varying real-image position that varies along a first optical axis over a range ΔP60 from 1 mm to 13 mm at a frequency f in the range from 60 Hertz (Hz) to 250 Hz; and a virtual imaging section having a depth of field DOF and operably arranged relative to the relay lens section to form from the real image a virtual image as seen by the viewer, the virtual image having a time-varying virtual-image position along a second optical axis due to the time-varying real-image position, and wherein ΔP60 ≥ 2·DOF.

[0009] Another embodiment of the disclosure is a method of reducing a convergence-accommodation mismatch when a viewer with left and right eyes forms a stereoscopic image. The method comprises: generating left and right display images; forming left and right real images of the left and right display images, wherein the left and right real images respectively have left and right real-image positions that are time varying; the user respectively forming with the left and right eyes left and right virtual images from the left and right real images to form the stereoscopic image, wherein the left and right virtual images have respective left and right virtual-image positions that are time varying due to the time-varying left and right real-image positions; and wherein the left and right time-varying virtual image positions vary with a frequency f sufficient to reduce the convergence-accommodation mismatch as compared to if the left and right virtual image positions were stationary.

[0010] Additional features and advantages are set forth in the Detailed Description that follows, and in part will be apparent to those skilled in the art from the description or recognized by practicing the embodiments as described in the written description and claims hereof, as well as the appended drawings. It is understood that both the foregoing general description and the following Detailed Description are merely exemplary, and are intended to provide an overview or framework to understand the nature and character of the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] The accompanying drawings are included to provide a further understanding, and are incorporated in and constitute a part of this specification. The drawings illustrate one or more embodiment(s), and together with the Detailed Description explain principles and operation of the various embodiments. As such, the disclosure will become more fully understood from the following Detailed Description, taken in conjunction with the accompanying Figures, in which:

[0012] FIG. 1A is a schematic diagram of an example stereoscopic display imaging system as disclosed herein that includes the display optical system according to the disclosure.

[0013] FIG. 1B is a more detailed schematic diagram of the stereoscopic display imaging system of FIG. 1A and showing an example configuration of the display optical system.

[0014] FIG. 2 is a close-up view of a portion of the first optical axis (AX1) of the display optical system showing an example of three different real-image positions P60(t) (closest PC, middle PM and farthest PF) at three different times tC, tM and tF, along with the depth of field (DOF) of the concave mirror of the system.

[0015] FIG. 3 is a plot of the real-image position P60(t) (relative units) as a function of time t (relative units), illustrating how the real-image position can vary with a period p and a frequency f between the closest position PC and a farthest position PF that defines a position range ΔP60.

[0016] FIG. 4 is a close-up unfolded view of the user (viewer) and the concave mirror showing how the pupil of the user's eye is located a distance RM away from the concave mirror, where RM is the mirror's radius of curvature, so that the center of curvature (CC) resides substantially at the viewer's eye.

[0017] FIG. 5 is a simplified view of the relay lens section of the display optical system to illustrate the formation of virtual display images at different virtual image planes (positions) caused by moving the position of the real image formed by the relay lens section of the display optical system.

[0018] FIG. 6A is a close-up view of an example adjustable light-diffusing system that includes a light diffuser that is axially movable so that it can move with the moving real image.

[0019] FIG. 6B is a bottom-elevated view of an example light diffuser that has a substantially hemispherical shape and is suitable for use in the adjustable light-diffusing system of FIG. 6A.

[0020] FIG. 7 is a schematic diagram of an example of the stereoscopic imaging system disclosed herein and showing the two display imaging systems arranged side by side so that each eye of the viewer sees a corresponding one of the two display images to form a stereoscopic display image.

[0021] FIG. 8A is a close-up schematic diagram of an example adjustable imaging lens assembly that utilizes a liquid-based lens to change the focus and thus the axial position of the real image.

[0022] FIG. 8B is a close-up schematic diagram of an example adjustable imaging lens assembly that utilizes a collimating lens and movable imaging lens to change the position of the real image.

[0023] FIG. 9 is a schematic diagram of a viewer wearing head-mounted goggles that includes the stereoscopic imaging system disclosed herein.

DETAILED DESCRIPTION

[0024] Reference is now made in detail to various embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Whenever possible, the same or like reference numbers and symbols are used throughout the drawings to refer to the same or like parts. The drawings are not necessarily to scale, and one skilled in the art will recognize where the drawings have been simplified to illustrate the key aspects of the disclosure.

[0025] The claims as set forth below are incorporated into and constitute part of this Detailed Description.

[0026] Cartesian coordinates are shown in some of the Figures for the sake of reference and are not intended to be limiting as to direction or orientation.

[0027] The Figures shown and described herein are provided to illustrate key principles of operation and fabrication for an optical apparatus according to various embodiments and a number of these figures are not drawn with intent to show actual size or scale. Some exaggeration may be necessary to emphasize basic structural relationships or principles of operation.

[0028] The Figures provided may not show various supporting components, including optical mounts, power sources, image data sources, and related mounting structure for standard features used in a display device. It can be appreciated by those skilled in the optical arts that embodiments of the present invention can use any of a number of types of standard mounts and support components, including those used with both wearable and hand-held display apparatus.

[0029] In the context of the present disclosure, terms such as "top" and "bottom" or "above" and "below" or "beneath" are relative and do not indicate any necessary orientation of a component or surface, but are used simply to refer to and distinguish views, opposite surfaces, spatial relationships, or different light paths within a component or apparatus. Similarly, terms "horizontal" and "vertical" may be used relative to the figures, to describe the relative orthogonal relationship of components or light in different planes relative to standard viewing conditions, for example, but do not indicate any required orientation of components with respect to true horizontal and vertical orientation.

[0030] Where they are used, the terms "first", "second", "third", and so on, do not necessarily denote any ordinal or priority relation, but are used for more clearly distinguishing one element or time interval from another. These descriptors are used to clearly distinguish one element from another similar element in the context of the present disclosure and claims.

[0031] The terms "viewer", "observer", and "user" can be used interchangeably in the context of the present disclosure to indicate the person viewing an image from a personal display apparatus.

[0032] In the context of the present disclosure, two planes, direction vectors, or other geometric features are considered to be substantially orthogonal when their actual or projected angle of intersection is within +/- 4 degrees of 90 degrees.

[0033] In the context of the present disclosure, the term "oblique" or phrase "oblique angle" is used to mean a non-normal angle that is slanted so that it differs from normal, that is, differs from 90 degrees or from an integer multiple of 90 degrees, by at least about 4 degrees or more along at least one axis. For example, an oblique angle may be at least about 4 degrees greater than or less than 90 degrees using this general definition.

[0034] In the context of the present disclosure, the term "operably coupled" is intended to indicate a working mechanical association, connection, relation, or linking, between two or more components, such that the disposition of one component affects the spatial disposition of a component to which it is coupled. For mechanical coupling, two components need not be in direct contact, but can be linked through one or more intermediary components.

[0035] In the context of the present disclosure, the stereoscopic imaging system disclosed herein forms a "left eye image" and a "right eye image," which respectively describe virtual images that are viewed by the left and right eyes of the viewer. The phrases "left eye" and "right eye" may be used as adjectives to distinguish imaging components for forming each image of a stereoscopic image pair, as the concept is widely understood by those skilled in the stereoscopic imaging arts.

[0036] The term "at least one of" is used to mean one or more of the listed items can be selected. The term "about" or "approximately", when used with reference to a dimensional measurement or position, means within expected tolerances for measurement error and inaccuracy that are accepted in practice. The expressed value listed can be somewhat altered from the nominal value, as long as the deviation from the nominal value does not result in failure of the process or structure to conform to requirements for the illustrated embodiment.

[0037] With relation to dimensions, the term "substantially" means within better than +/- 12% of a geometrically exact dimension. Thus, for example, a first dimensional value is substantially half of a second value if it is in the range of from about 44% to about 56% of the second value. Positions in space are "near" each other or in close proximity when, relative to an appropriate reference dimension such as a radius of curvature, a focal point, a component location, or other point on an optical axis, distance dimensions are substantially the same, no more than about 12% apart, preferably within 5% or 1% or less distance from each other.

[0038] The term "exemplary" indicates that the description is used as an example, rather than implying that it is an ideal.

[0039] With respect to positions of components or centers of curvature or other features of an optical apparatus, the term "near" has its standard connotation as would be used by one skilled in the optical design arts, with consideration for expected manufacturing tolerances and for measurement inaccuracies, for example, as well as for expected differences between theoretical and actual behavior of light.

[0040] The terms "upstream" and "downstream" are used to indicate the relative positions of items A and B relative to the direction of travel of light, wherein "A is upstream (downstream) of B" means that the light is incident upon A before (after) it is incident upon B.

[0041] As is well known, the light distribution within and from a specific optical system depends on its overall configuration, which need not be geometrically perfect or exhibit ideal symmetry for suitable performance. For example, the light distribution for a curved mirror can be more accurately described as focused on a small region that is substantially centered about a focal point; however, for the purposes of description, the conventional terms such as "focal point" or "focal region" are used. The term "eye box" can be used to denote a region from which a virtual image formed by an optical system can be viewed.

[0042] When a scene is viewed from a single position and is presented to the observer at both eye positions the view lacks the perception of depth, the third dimensional effect. A scene viewed in this way is called bi-ocular. However, when a single scene is viewed from two positions spaced even slightly from one another, the view presented to the observer has the sense of depth. A scene viewed in this way is called binocular. A scene viewed from one eye position and produced from one position is called monocular and lacks the third dimensional effect.

[0043] As is well known to those skilled in the imaging arts, a virtual image is synthetically simulated by divergence of light rays provided to the eye from an optical system and viewed in space at an eye box. An optical system forms a virtual image that appears in the field of view of a viewer at a given position and distance. There is no corresponding "real" object in the field of view from which the rays actually diverge. So-called "augmented reality" viewing systems employ a virtual imaging system to provide superposition of the virtual image onto the real-world object scene that is viewed along a line of sight of the viewer.

[0044] This capability for forming a virtual image that can be combined with object scene image content in the viewer's field of view distinguishes augmented reality imaging devices from other display devices that provide only a real image to the viewer.

General configuration of the display optical system

[0045] FIG. 1A is a schematic diagram of an example stereoscopic display imaging system 400 as disclosed herein that includes a display optical system ("system") 10 according to the disclosure. FIG. 1B is a more detailed schematic diagram of the stereoscopic display imaging system 400 and the system 10 therein. It is noted that the system 10 described below is one of two identical systems 10 that operate side-by-side to define the stereoscopic display imaging system 400 for use by a user 200 with left and right eyes 210. Only one of the systems 10 is shown in FIG. 1A for ease of illustration and explanation.

[0046] With reference to FIGS. 1A and 1B, the system 10 comprises a relay lens section 10A and a virtual imaging section 10B. The relay lens section 10A has an input end 12 while the virtual imaging section 10B has an output end 14. The system 10 also comprises a first optical axis AX1 and a second optical axis AX2 substantially perpendicular to the first optical axis. Note that in practice the system 10 can include one or more fold mirrors (not shown) to make the system compact and to provide an optical path for the system that is favorable for integration into another system, such as a head-mounted stereoscopic display system 500 (see, e.g., FIG. 9).

[0047] The relay lens section 10A of the system 10 comprises in order along the first optical axis AX1 an image generator 20 that includes an imaging surface 22. The imaging surface 22 can be light emitting or spatial light modulating and supports a generated image 25, an example of which is shown in the first close-up inset I1. In an example, the image generator 20 comprises a display (and in a particular example, a micro-display) and the imaging surface 22 supports the generated image 25, which is also referred to as a display image 25.

[0048] In an example, the imaging surface 22 is substantially flat and has a width ("display width") WD, which for a micro-display can be in the range from a few millimeters to 50 millimeters, with about 20 mm being an exemplary display width. In an example, the generated image 25 is in the form of a video image. In an example, the imaging surface 22 conditions or emits light 26 to define the generated image 25. In the discussion below, the generated image 25 is referred to as the display image 25 and the light 26 is referred to as the display light 26. In an example, the imaging surface 22 comprises or is otherwise defined by pixels 23, as shown in the second close-up inset I2 of FIG. 1A.

[0049] The image generator 20 is operably coupled to an image source 40 that supplies an electronic version of the display image 25. In an example, the image source 40 comprises a video source, such as a computer, mobile phone, or a video storage device. In an example, the image generator 20 and the image source 40 are operably connected by a high-speed data line, such as an HDMI cable, or are wirelessly connected, e.g., via a Bluetooth® wireless link or WiFi. In an example, the image source 40 is part of a main computer/controller 44 that controls the overall operation of the system 10. The main computer/controller 44 can be a personal computer or like device configured with hardware and software (i.e., instructions embodied in a non-transitory computer-readable medium) to manage the flow of signals and data, perform calculations, and otherwise control the flow and processing of information in association with managing the operation of the system 10.

[0050] With reference to FIG. 1B, the relay lens section 10A of the system 10 also includes an adjustable imaging lens assembly 50 disposed along the first optical axis AX1 downstream of the image generator 20. The adjustable imaging lens assembly 50 includes an imaging lens 52 that defines an object plane OP and an image plane IP. In an example, the imaging lens 52 comprises multiple lens elements (not shown) and is achromatic. The imaging lens 52 has a focal length F50, which in an example can be adjusted, i.e., varied with time to define a time-varying focal length F50(t).

[0051] The adjustable imaging lens assembly 50 also includes an aperture stop 55 with an aperture 56 of diameter DP. The aperture stop 55 is located at or closely proximate to (or within) the imaging lens 52 so that the imaging lens diameter D52 is substantially the same as the aperture stop diameter DP. The aperture stop 55 defines the axial location (at the imaging lens 52) and size (diameter DP = D50) of the entrance pupil for the system 10.

[0052] The imaging surface 22 of the image generator 20 resides substantially at the object plane OP of the imaging lens 52. The image plane IP is curved (e.g., substantially spherically curved) and is where the imaging lens 52 forms a real image 60 of the imaging surface 22, i.e., a real image of the display image 25 using the display light 26. The real image 60 is formed at an axial position P60, referred to hereinafter as the real-image position, which is the same as the axial position of the (curved) image plane IP.

[0053] The adjustable imaging lens assembly 50 is configured to change (modulate) the real-image position P60, as explained below, so that the real-image position can be time varying, i.e., P60(t). The double-ended arrow AR in FIGS. 1A and 1B illustrates the back and forth axial movement of the time-varying real-image position P60(t).

[0054] The adjustable imaging lens assembly 50 is operably connected to a controller 80 that controls the adjusting (changing) of the real-image position P60 as defined by the adjustable imaging lens assembly 50. In an example, the controller 80 is operably connected to the main computer/controller 44. Different configurations for the adjustable imaging lens assembly 50 and the controller 80 are described in greater detail below.

[0055] FIG. 2 is a close-up view of a portion of the first optical axis AX1 showing an example of three different real-image positions P60(t) at three different times tC, tM and tF, i.e., closest, middle and farthest real-image positions P60(tC) = PC, P60(tM) = PM and P60(tF) = PF, respectively, as measured relative to the imaging lens 52. The axial distance ΔP60 between the closest and farthest real-image positions PC and PF is referred to as the real-image position range and is defined by ΔP60 = PF − PC.

[0056] FIG. 3 is a plot of the real-image position P60(t) versus time t (relative units) showing an example of how the real-image position can be continuously varied between the closest and farthest positions PC and PF. The plot of FIG. 3 shows by way of example three times tC, tM and tF associated with three real-image positions, namely a closest, a middle and a farthest real-image position, which are shown along the time t axis, along with sequential times t1, t2, t3, etc. Note that the closest, middle and farthest times tC, tM and tF repeat, e.g., tC = t1, t5, t9, ... while tM = t2, t4, t6, ... and tF = t3, t7, t11, ... Note also that as few as two discrete real-image positions P60 can be used. In an example, two, three or four discrete real-image positions P60 are used. In examples, the generation of the display image 25 is controlled (synchronized) so that it only appears at one of the two or more discrete real-image positions P60.

[0057] The real-image position P60(t) can be varied with a period p = 1/f, where f is the frequency, i.e., the frequency of the periodic change in the real-image position. In an example, the real-image position range ΔP60 is from 1 mm to 13 mm, with 3 mm being an exemplary range. An example period p is from 0.016 second (s) to 0.004 s, which corresponds to a frequency f = 1/p from 60 Hertz (Hz) to 250 Hz. The rationale behind the ranges on the above-described real-image position parameters is explained in greater detail below.
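
By way of illustration only, the timing relationship described in paragraphs [0056] and [0057] can be sketched in a few lines of code. The function names and the particular values of PC, ΔP60 and f below are assumptions chosen from the stated ranges, not requirements of the disclosure.

```python
# Sketch: commanded real-image position P60(t) for the adjustable imaging lens
# assembly 50. Values are illustrative only; names are hypothetical.

def p60_continuous(t, pc_mm=0.0, delta_p60_mm=3.0, f_hz=60.0):
    """Triangular sweep between the closest position PC and farthest PF = PC + ΔP60."""
    period = 1.0 / f_hz                      # p = 1/f, e.g. about 0.016 s at 60 Hz
    phase = (t % period) / period            # 0..1 within one period
    tri = 2.0 * phase if phase < 0.5 else 2.0 * (1.0 - phase)
    return pc_mm + tri * delta_p60_mm        # position in mm along AX1

def p60_discrete(frame_index, positions_mm=(0.0, 1.5, 3.0)):
    """Step through 2, 3 or 4 discrete positions (here PC, PM, PF), repeating
    C, M, F, M, C, M, F, ... as suggested by FIG. 3."""
    cycle = list(positions_mm) + list(positions_mm[-2:0:-1])   # e.g. C, M, F, M
    return cycle[frame_index % len(cycle)]

if __name__ == "__main__":
    for k in range(6):
        t = k / 240.0                        # sample four times per 60 Hz period
        print(f"t={t*1000:5.2f} ms  continuous={p60_continuous(t):4.2f} mm  "
              f"discrete={p60_discrete(k):3.1f} mm")
```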

[0058] With reference again to FIGS. 1A and 1B, the virtual imaging section 10B of the system 10 is configured to convert the real image 60 to a virtual image seen by the viewer 200. An example virtual imaging section 10B includes along the first optical axis AX1 a beam splitting element 100 and a concave mirror 110. The beam splitting element 100 includes opposite surfaces 102 and 103, with the surface 103 being closest to the concave mirror 110 and being partially reflective. In examples, the partially reflective surface 103 is at least 25% reflective (e.g., 25% to 75%, or 30% to 70%, or 40% to 60%, or 45% to 65%) in the visible spectrum, and has a transmission of at least 25% (e.g., 25% to 75%, or 30% to 70%, or 40% to 60%, or 45% to 65%) in the visible spectrum, consistent with the condition that the reflectivity and the transmissivity add up to 100% in the case of no absorption. For example, the partially reflective surface may be (for the visible spectrum): (i) 45% transmissive and 55% reflective, or (ii) 50% transmissive and 50% reflective, or (iii) 55% transmissive and 45% reflective. In the case where the coating absorbs some of the energy, the reflectivity and the transmissivity add up to 100% minus the absorption value.

[0059] The reflective surface 112 of the concave mirror 110 is spherical and has a center of curvature CC, a radius of curvature RM and a focal length FM = RM/2. The center of curvature CC resides at the axial location of the exit pupil of the imaging lens 52 and the aperture stop 55. This makes the substantially spherically curved focal plane 60 substantially concentric with the spherically curved surface 112 of the concave mirror 110. For convenience, the radius of curvature RM is taken as positive.

[0060] The virtual imaging section 10B has a depth of field DOF that is approximated by the equation DOF ≈ (F/#)² (in microns) of the spherical mirror 110, where F/# stands for F-number. The F-number of the spherical mirror is the focal length divided by the viewer's eye's pupil diameter dP. The F-number of the spherical mirror 110 will typically be in the range from about F/5 to about F/30. The DOF will thus range from about 25 microns to 900 microns. Thus, the real-image position range ΔP60 > DOF, and in examples, ΔP60 ≥ 2·DOF or ΔP60 ≥ 10·DOF or ΔP60 ≥ 20·DOF or ΔP60 ≥ 30·DOF. The significance of the condition on the real-image position range ΔP60 being greater (and in some cases substantially greater) than the depth of field DOF is discussed below.
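
As a rough worked example of the DOF ≈ (F/#)² relation above (a sketch only; the mirror focal length FM and eye pupil diameter dP are assumed values within the stated ranges):

```python
# Sketch: estimate the depth of field DOF of the virtual imaging section and check
# the condition ΔP60 ≥ 2·DOF from paragraph [0060]. All numbers are assumed.

def mirror_dof_mm(focal_length_mm, eye_pupil_mm):
    f_number = focal_length_mm / eye_pupil_mm   # F/# = FM / dP
    dof_um = f_number ** 2                      # DOF ≈ (F/#)^2, in microns
    return dof_um / 1000.0                      # convert microns to mm

fm_mm = 60.0         # assumed mirror focal length FM = RM/2
dp_mm = 4.0          # assumed eye pupil diameter dP (typically 2 mm to 6 mm)
delta_p60_mm = 3.0   # exemplary real-image position range ΔP60

dof_mm = mirror_dof_mm(fm_mm, dp_mm)            # F/15 gives DOF ≈ 0.225 mm here
print(f"F/# = {fm_mm / dp_mm:.1f}, DOF ≈ {dof_mm:.3f} mm")
print("ΔP60 ≥ 2·DOF:", delta_p60_mm >= 2.0 * dof_mm)
```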

[0061] The beam splitting element 100 is disposed between the concave mirror 110 and the focus position FPM of the concave spherical mirror and generates the second optical axis AX2 that runs in the X-direction. The user (viewer) 200 places one of their eyes 210 along the second optical axis AX2 for viewing, as described in greater detail below. The user's eye 210 has a pupil 215.

[0062] With reference to the close-up view of FIG. 4, the user's pupil 215 is located substantially at a distance RM from the concave mirror 110. Noting that the aperture stop 55 is also located a distance RM from the concave mirror 110, this configuration images the entrance pupil of the system 10 into the user's pupil 215. Here, the user's pupil 215 can be thought of as the exit pupil of the system 10. Of course, the user's pupil 215 constitutes the entrance pupil of the user's eye 210. This means that the aperture stop diameter DP is about 2 mm to 6 mm, i.e., the size of a typical human pupil 215.

[0063] FIG. 5 is a simplified view of the virtual imaging section 10B of the system 10 to illustrate the formation of virtual display images as seen by the user 200. FIG. 5 shows the user's eye 210, which sees, at a virtual image plane VIP having a position P60' along the second optical axis AX2, a virtual display image ("virtual image") 60' formed by the concave mirror 110 (not shown) and the beam splitting element 100 from the real image 60.

[0064] When the real-image position P60 of the real image 60 changes from the closest position PC to the farthest position PF as shown by the dark arrow AR1, then the virtual-image position P60' changes from a corresponding closest position PC' to a farthest position PF' as shown by the arrow AR2. This change in the virtual-image position P60' causes the user's eyes 210 to change the direction where they look (vergence) and also change focus (accommodation) to form the virtual images seen by the user 200. The virtual image plane VIP moves in the opposite direction when the real image 60 is shifted from the farthest position PF to the closest position PC.

[0065] This effect occurs because the real-image positions P60 move over a range ΔP60 that is greater than the depth of field DOF of the concave mirror 110. Real-image positions P60 that vary within the depth of field DOF do not result in a substantial change in the virtual image location of the display light 26C seen by the viewer 200.

[0066] FIG. 6A illustrates an example embodiment of the system 10 that includes an adjustable light-diffusing system 240 that comprises a light-diffusing element ("diffuser") 250 that resides at the real-image position P60. The diffuser 250 has a body 251 that defines a front surface 252 that faces the adjustable imaging lens assembly 50 and a back surface 254 that faces the beam splitting element 100. The close-up inset shows an example of the diffuser 250 wherein the body 251 includes light-diffusing features 255, which can be particulates (e.g., microparticles and/or nanoparticles), voids, index variations, etc. Other example diffusers 250 can have a rough surface that acts to diffuse light. With reference to FIG. 6B, the diffuser 250 can have a substantially hemispherical shape or like curved shape that substantially matches the substantially curved real image 60.

[0067] FIG. 6B shows a display light ray 26R incident upon the diffuser 250, which generates diffused display light 26D. The diffused display light 26D is then used by the virtual imaging section 10B as described below.

[0068] With reference again to FIG. 6A, the light diffuser system 240 further includes a support structure 260 that operably supports the diffuser 250, and also can include a translation stage 270 to which the support structure is operably attached. In an example, the translation stage 270 comprises a linear actuator system, such as a voice coil linear actuator system. The light diffuser system 240 also includes a translation stage driver 280 that is operably connected to the translation stage 270 and which controls the activation and movement of the translation stage. In an example, the translation stage driver 280 can be operably connected to the main computer/controller 44 and further can be operably connected to the controller 80. The translation stage 270 and translation stage driver 280 cooperate to axially move the diffuser 250 so that it resides at the adjustable real-image position P60(t). Operably connecting the translation stage driver 280 to the controller 80 allows for the movement of the diffuser 250 to be coordinated (synchronized) with the change in the real-image position P60(t) by using the control signals that adjust the adjustable imaging lens assembly as timing signals for the movement of the diffuser. In an example, the diffuser 250 acts like a secondary display screen for the real image 60.
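
One possible control-software sketch of this synchronization is shown below. The LensController and StageDriver classes are hypothetical placeholders standing in for the controller 80 and the translation stage driver 280; the positions are illustrative.

```python
# Sketch: drive the adjustable imaging lens assembly 50 and the diffuser translation
# stage 270 from the same position command, so the diffuser 250 stays at the
# time-varying real-image position P60(t). Not a real device API.

class LensController:                 # stands in for controller 80
    def set_real_image_position(self, p60_mm):
        print(f"lens: P60 -> {p60_mm:.1f} mm")

class StageDriver:                    # stands in for translation stage driver 280
    def move_to(self, p60_mm):
        print(f"diffuser: move to {p60_mm:.1f} mm")

def update(p60_mm, lens, stage):
    """Apply one synchronized position command to both subsystems."""
    lens.set_real_image_position(p60_mm)
    stage.move_to(p60_mm)             # the same command doubles as the timing signal

lens, stage = LensController(), StageDriver()
for p in (0.0, 1.5, 3.0, 1.5):        # one C, M, F, M cycle of assumed positions
    update(p, lens, stage)
```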

[0069] The advantages of using the light-diffusing system 240 are explained below.

General operation of the display optical system

[0070] With reference again to FIG. 1A and FIG. 1B, the image source 40 sends an image to the image generator 20, which generates the display image 25. The display light 26 from each point on the imaging surface 22 travels generally in the y-direction as a diverging light beam that is received by the adjustable imaging lens assembly 50. Two example diverging display light beams (ray bundles) 26a (on-axis) and 26b (edge of the imaging surface 22) are shown. In another embodiment, the imaging lens assembly 50 is not adjustable.

[0071] In FIG. 6A, the numerical aperture of the image forming light 26 is less than the numerical aperture of the light exiting the diffuser 250. Therefore, as the diffuser 250 is modulated through focus, it is within the depth of focus of the imaging lens system and outside the depth of focus of the virtual imaging system. The adjustable imaging lens assembly 50 focuses the display light 26 to form the real image 60. FIG. 1B shows two focused display light beams (ray bundles) 26Fa and 26Fb, which respectively correspond to the diverging display light beams (ray bundles) 26a and 26b. The real-image position P60(t) is varied with time as explained above and as explained further below.

[0072] The concave mirror 110 uses the real image 60 as an object that resides in a spherically curved object plane. Because this "object" resides inside the focal point FPM of the concave mirror 110, the focused display light 26F that forms the spherically curved focus surface 60 is converted by the concave mirror to a virtual image of display light 26C, which is partially reflected by the beam splitting element 100 to travel in the -X direction along the second optical axis AX2 and to the eye 210 of the viewer 200. The diameter of the virtual image display light 26C is substantially larger than the diameter dP of the user's pupil to enable multiple users with different inter-pupillary distances to use the system without adjustment.

[0073] The viewing of the virtual image light beam 26C for the different real-image positions P60(t) to form virtual display images 60' at different virtual image planes VIP with different axial positions P60' along the second optical axis AX2 was described above in connection with FIG. 5.

[0074] In the embodiment of the system 10 where the light-diffusing system 240 is employed, the light diffuser 250 generates diffused display light 26D, as shown in FIGS. 6A and 6B. This diffused display light 26D is then used by the virtual imaging section 10B of the system 10 to form the virtual image display light 26C seen by the viewer. Since the diffused display light 26D diverges at a larger angle than the original focused display light 26F, it alters the etendue of the system. In particular, it reduces the amount of luminance in the virtual image light 26C as seen by the viewer 200 but increases the fill at the viewer's pupil 215. Since the luminance of the image generator 20 is relatively large, the reduction in luminance from using the light diffuser 250 can be tolerated and in fact may be required given the large luminance generated by the image generator 20.

[0075] Meantime, in an example, scene light 300 from a scene 302 travels along the second optical axis AX2 and through the beam splitting element 100 to the eye 210 of the user 200. Thus, in an example, both virtual image display light 26C and scene light 300 overlap at the user's eye 210 to form an augmented stereoscopic reality image using two systems 10, as illustrated by the stereoscopic imaging system 400 of FIG. 7. In another embodiment, the scene light 300 is not used so that only the virtual image display light 26C is viewed stereoscopically using the stereoscopic imaging system 400.

[0076] The stereoscopic imaging system 400 of FIG. 7 comprises two example systems 10 (which can be referred to as left and right systems), wherein at least a portion of each of the system 10 reside side-by-side so the user's left and right eyes 210 receive a stereoscopic pair of the display image to create the stereoscopic effect.

[0077] An aspect of the disclosure includes a method of reducing a convergence-accommodation mismatch when a viewer 200 with left and right eyes 210 forms a stereoscopic image. The method includes generating left and right display images 25. The method also includes forming left and right real images 60 of the left and right display images, wherein the left and right real images respectively have left and right real-image positions P60 that are time varying. The method then includes the user 200 respectively forming with the left and right eyes 210 left and right virtual images 60' from the left and right real images 60 to form the stereoscopic image, wherein the left and right virtual images have respective left and right virtual-image positions P60' that are time varying due to the time-varying left and right real-image positions (see FIG. 5). The left and right time-varying virtual image positions P60' vary with a frequency f sufficient to reduce the convergence-accommodation mismatch as compared to if the left and right virtual image positions were not time varying. The frequency f of the variation in the virtual-image positions P60' is the same as the frequency of the variation in the real-image positions P60.
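
A minimal per-frame sketch of this method is given below; the EyeChannel class, the chosen positions, and the image tags are illustrative assumptions, not limitations of the method.

```python
# Sketch: per-frame flow for the left and right systems 10. Both channels are driven
# to the same real-image position while their display images are updated in step,
# keeping the vergence and accommodation cues consistent.

POSITIONS_MM = (0.0, 1.5, 3.0)          # assumed discrete P60 values

class EyeChannel:
    """One of the two side-by-side display optical systems 10 (placeholder)."""
    def __init__(self, name):
        self.name = name
    def set_focus(self, p60_mm):
        print(f"{self.name}: P60 -> {p60_mm:.1f} mm")
    def show(self, image_tag):
        print(f"{self.name}: display {image_tag}")

def frame_update(frame_index, left, right):
    p60 = POSITIONS_MM[frame_index % len(POSITIONS_MM)]
    for channel in (left, right):
        channel.set_focus(p60)                                # accommodation cue
        channel.show(f"frame {frame_index} @ {p60:.1f} mm")   # synchronized image

left, right = EyeChannel("left"), EyeChannel("right")
for k in range(4):
    frame_update(k, left, right)
```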

Example adjustable imaging lens assemblies

[0078] The adjustable imaging lens assembly 50 can have a variety of different configurations that change the focus position FP50(t) with time over the focus change distance ΔF.

[0079] FIG. 8A is a schematic diagram of an example adjustable imaging lens assembly 50 wherein the lens 52 comprises a liquid-based adjustable lens having a liquid-based lens element 53A supported within a lens housing 53B. An example of a liquid-based adjustable lens is one of the Varioptic® lenses, available from Corning Inc., Corning, New York. The controller 80 applies a voltage that causes the liquid lens element 53A to either change its shape or change its refractive index profile, which in turn changes the focal length F50. The example adjustable imaging lens assembly 50 is shown along with collimated incident light CLR and outgoing focused light FLR. Two example focal positions FP50 associated with two different focal lengths F50 formed at different times tA and tB from the focused light FLR are shown.

[0080] Application of a time-varying voltage results in a time-varying change in the focal length F50(t) and thus a time-varying change in the focus position FP50(t) as described above. In an example, the liquid lens element 53A is defined by two different liquids. Other types of liquid-based lenses are also commercially available, including those that employ liquid crystals and birefringence to change the focus, such as those available from LensVector Inc., of San Jose, California.

[0081] It is noted that a change in focal length F50 can result in a change in magnification of the real image 60. However, the changes in magnification are relatively small and are readily compensated electronically by adjusting the size of the generated image 25 on the image generator. For example, the image source can be controlled to provide select changes in magnification of the generated image that compensate for changes in focal length of the adjustable imaging lens assembly.
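
A sketch of this electronic size compensation, with assumed magnification values:

```python
# Sketch: rescale the generated image 25 by the inverse of the relative magnification
# so the real image 60 keeps a constant size as the focus is modulated. The
# magnification excursions and pixel width are assumed for illustration.

def compensated_display_width(nominal_width_px, m_nominal, m_current):
    """Choose a displayed width so that m_current * width stays constant."""
    return round(nominal_width_px * m_nominal / m_current)

for m in (0.98, 1.00, 1.02):            # small, assumed magnification excursions
    w = compensated_display_width(1920, 1.00, m)
    print(f"magnification {m:.2f} -> render display image {w} px wide")
```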

[0082] FIG. 8B illustrates another example configuration for the adjustable imaging lens assembly 50 wherein the lens 52 comprises a collimating lens 52C and an axially movable imaging lens 52F. The axially movable imaging lens 52F is supported by a support structure 260, which in turn is attached to a translation stage 270. The support structure 260 can also define the aperture stop 55. The collimating lens 52C receives and collimates the display light 26 from the image generator (display) 20 to form collimated display light 26C. The imaging lens 52F receives the collimated display light 26C and forms the real image 60 (just the real-image position P60 is shown).

[0083] The axially movable imaging lens 52F is movable by the translation stage 270 axially moving the support structure 260. In the left-hand configuration of the adjustable imaging lens assembly 50, the axially movable imaging lens 52F is at a first axial position P1. In this position, the axially movable imaging lens 52F is relatively close to the collimating lens 52C and, by way of example, forms the real image at the close position PC. In the right-hand configuration of the adjustable imaging lens assembly 50, the axially movable imaging lens 52F has been axially moved by action of the translation stage 270 under the control of the controller 80, which in an example can comprise a translation stage driver. In an example, the translation stage 270 can comprise a voice coil and the controller comprises a voice coil controller. In an example, the translation stage can comprise a linear voice coil motor.

[0084] The axial movement of the axially movable imaging lens extends the axial length over which the collimated display light 26C travels. The second position P2 of the axially movable imaging lens 52F allows this lens to establish the far real-image position PF by way of example.

[0085] Note that the focal length of the imaging lens 52F has not changed. Instead, its axial position has moved farther away from the image generator 20, thereby allowing the focus position FP50(t) to also be farther away from the image generator. Continuous movement of the axially movable imaging lens 52F allows for continuous movement of the real-image position P60(t), as described above in connection with FIG. 3.
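
For collimated light entering the imaging lens, both adjustment schemes shift the real-image position by the same amount that the focal length is changed (FIG. 8A) or that the lens is translated (FIG. 8B). The short sketch below illustrates this with assumed numbers.

```python
# Sketch: real-image position along AX1 for collimated input light, for a tunable
# focal length (FIG. 8A style) and for a translated fixed-focal-length lens
# (FIG. 8B style). The 30 mm focal length and 3 mm excursion are assumed values.

def image_position_mm(lens_position_mm, focal_length_mm):
    """With collimated input, the real image forms one focal length past the lens."""
    return lens_position_mm + focal_length_mm

# FIG. 8A style: fixed lens at 0 mm, focal length tuned from 30 mm to 33 mm
print(image_position_mm(0.0, 30.0), image_position_mm(0.0, 33.0))   # 30.0 33.0

# FIG. 8B style: fixed 30 mm focal length, lens translated from 0 mm to 3 mm
print(image_position_mm(0.0, 30.0), image_position_mm(3.0, 30.0))   # 30.0 33.0
```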

[0086] FIG. 9 is a schematic diagram of a viewer 200 wearing head-mounted goggles 500 that includes the stereoscopic imaging system 400 disclosed herein. The head-mounted goggles 500 include a housing 510 within which the two systems 10 can be operably supported. As noted above, this may involve the use of folding mirrors and like light-turning elements (not shown) to make the systems 10 for each eye sufficiently compact and have a suitable optical path configuration within the housing 510. The head-mounted goggles 500 show the beam splitting elements 100 (which can comprise a single beam splitting element) and the back sides of the concave mirrors 110.

Synchronizing the display image with the focus position

[0087] As the focus of each of the systems 10 in the stereoscopic imaging system 400 is adjusted in time using the respective adjustable imaging lens assemblies 50, the respective image generators 20 generate respective display images 25 synchronized with the focus positions FP50(t). In an example, this synchronization is carried out in the main computer/controller 44. This allows a variety of image data to be presented to the viewer 200 in different focus positions. In an example, the imaging surface 22 and the display light 26 emitted therefrom can be modulated in resonance and synchronized to a clock that controls the display frequency. In an example, the user can set the start of travel position, e.g., via an input device (switch) on the main computer/controller, on the image generator, etc.

[0088] In an example, the adjustable imaging lens assemblies 50 continuously change the focus positions FP50(t) of the two systems 10 in the stereoscopic imaging system 400 while the display images 25 are modulated to match. In an example, if there are two real-image positions P60, and the display frame rate (frequency) f is 60 Hz, then each real image 60 at each of the two real-image positions is displayed at f/2 = 30 Hz. In an example, the two real images 60 respectively formed at the two different real-image positions are formed on the diffuser 250, with the real images "displayed" on the diffuser 250 during the corresponding phase of the modulation.
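
The frame-budget arithmetic of this example can be sketched as follows (values assumed):

```python
# Sketch: with display frame rate f and N discrete real-image positions, each focus
# plane is refreshed at f/N, and successive frames are assigned to positions in a
# repeating cycle, as in the 60 Hz / two-position example of paragraph [0088].

def per_position_rate_hz(display_rate_hz, n_positions):
    return display_rate_hz / n_positions

def position_for_frame(frame_index, n_positions):
    return frame_index % n_positions      # which focus plane this frame feeds

print(per_position_rate_hz(60.0, 2))                   # 30.0 Hz per position
print([position_for_frame(k, 2) for k in range(6)])    # [0, 1, 0, 1, 0, 1]
```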

[0089] In an example, the imaging surface 22 of the image generator 20 comprises pixels, and different select groups (sets) of pixels are allocated to each real-image position P60. In this way, the frame rate of the image generator (display) 20 can be reduced or the number of real images formed at different real-image positions P60 can be increased without unduly taxing the speed (frame rate) of the image generator.

[0090] An advantage of the systems and methods disclosed herein is that they reduce user eye strain, and in particular reduce the convergence-accommodation mismatch inherent in typical stereoscopic imaging systems. This is accomplished by the left and right systems 10 in the stereoscopic imaging system 400 each presenting to the viewer at least two virtual images having different virtual image locations. This causes the viewer to adjust their eye position/direction (vergence) and eye focus (accommodation) to form virtual images having positions that better comport with what the eye expects to see in forming a single stereoscopic image, instead of having to perform vergence and accommodation with virtual images formed at a fixed position that the eyes cannot properly resolve into a stereoscopic image (e.g., seeing double). For example, display images 25 presented as a close object that requires a close eye convergence by the user can also have a close focus by virtue of the real image 60 having a "close focus" real-image position P60, so that there is no conflict (or a substantially reduced conflict) between the eye vergence (direction) and eye accommodation (focus).

[0091] It will be apparent to those skilled in the art that various modifications to the preferred embodiments of the disclosure as described herein can be made without departing from the spirit or scope of the disclosure as defined in the appended claims. Thus, the disclosure covers the modifications and variations provided they come within the scope of the appended claims and the equivalents thereto.