


Title:
AUTOFOCUS EYEGLASS SYSTEM
Document Type and Number:
WIPO Patent Application WO/2014/179857
Kind Code:
A1
Abstract:
An autofocus eyeglass system comprises adjustable focus lenses, a distance-finding subsystem for determining the distance to an object being viewed, a focus adjustment subsystem for changing the focus of the adjustable focus lenses to bring the object into focus for the user, a controller, at least one non-volatile memory, a power source, and a frame. The autofocus eyeglass system and associated method use a sparse set of measurements of the eyes of a user to determine the gaze directions of the two eyes, determine from the two gaze directions the gaze distance of the two eyes, and then focus the eyeglasses based on the gaze distance. Features of the face may be used to detect a change in position of an eyeglass relative to a corresponding eye and to compensate for the change in position. Predetermined viewing distances may be separated from each other based on a constant incremental lens power.

Inventors:
SHINKODA ICHIRO (CA)
Application Number:
PCT/CA2014/000377
Publication Date:
November 13, 2014
Filing Date:
April 25, 2014
Assignee:
RAVENNI TECHNOLOGY INC (CA)
International Classes:
A61B3/113; G02B7/28; G02C7/08; H04N5/335
Foreign References:
US8274578B2 (2012-09-25)
US6478425B2 (2002-11-12)
US20120133891A1 (2012-05-31)
Attorney, Agent or Firm:
BENOÎT & CÔTÉ INC. (Montreal, Québec H3A 1X6, CA)
Claims:
WHAT IS CLAIMED IS:

1. A method for determining the gaze direction of an eye of a user, the method comprising:

a. directing an image of the eye onto a first plurality of imaging pixels of an imager; and

b. analyzing image information from a predetermined second plurality of pixels of the imager.

2. The method of claim 1, wherein the second plurality of pixels is a smaller number of pixels than the first plurality of pixels.

3. The method of claim 2, wherein the second plurality of pixels is a sparse subset of all the pixels of the imager.

4. The method of claim 3, wherein the analyzing comprises comparing an intercept of a feature of the eye with the sparse subset of pixels with predetermined intercepts of the feature with the sparse subset of pixels.

5. The method of claim 4, wherein the predetermined intercepts are based on image data from the first plurality of imaging pixels.

6. The method of claim 5, further comprising selecting the second plurality of pixels from at least one scan line of pixels of the imager.

7. The method of claim 6, wherein the selecting the second plurality of pixels is based on a signal-to-noise ratio of the at least one scan line of pixels.

8. The method of claim 5, further comprising selecting the second plurality of pixels to be an integer number of scan lines of the imager.

9. The method of claim 8, wherein the selecting the second plurality of pixels comprises selecting the second plurality of pixels to be at least one set of contiguously arrayed scan lines of the imager.

10. The method of claim 9, wherein the at least one set of scan lines is oriented parallel to the horizon of the eye.

11. The method of claim 3, wherein the analyzing comprises comparing the intercept of a limbus of the eye with predetermined intercepts of the limbus with the second plurality of pixels.

12. The method of claim 11, comprising selecting the second plurality of pixels to be located one of above and below the horizon of the eye.

13. The method of claim 3, wherein the analyzing comprises comparing the intercept of a feature on the sclera of the eye with predetermined intercepts of the feature on the sclera with the second plurality of pixels.

14. A method for determining a gaze distance of first and second eyes of a user, the method comprising:

a. determining a first gaze direction of the first eye by directing an image of the first eye onto a first plurality of imaging pixels of a first imager and analyzing image information from a predetermined second plurality of pixels of the first imager;

b. determining a second gaze direction of the second eye by directing an image of the second eye onto a third plurality of imaging pixels of a second imager and analyzing image information from a predetermined fourth plurality of pixels of the second imager; and

c. determining the gaze distance from a mutual spatial intercept of the first and second gaze directions.

15. The method of claim 14, wherein the first and second imagers are the same imager.

16. A method for determining a gaze distance of first and second eyes of a user, the method comprising:

a. determining a first gaze direction of the first eye by directing an image of the first eye onto a first plurality of imaging pixels of a first imager and comparing the intercept of a first feature of the first eye with predetermined intercepts of the first feature with a first sparse subset of the first plurality of pixels;

b. determining a second gaze direction of the second eye by directing an image of the second eye onto a second plurality of imaging pixels of a second imager and comparing the intercept of a second feature of the second eye with predetermined intercepts of the second feature with a second sparse subset of the second plurality of pixels;

c. determining the gaze distance from a mutual spatial intercept of the first and second gaze directions.

17. The method of claim 16, wherein the first feature is one of a limbus of the first eye and a feature of a sclera of the first eye and the second feature is one of a limbus of the second eye and a feature of a sclera of the second eye.

18. The method of claim 16, wherein the first and second imagers are the same imager.

19. An automatically focused eyeglass system comprising:

a. a frame configured to engage with a nose and ears of a user;

b. first and second adjustable focus lenses disposed in the frame to be located in front of respectively a first and a second eye of the user when the frame is engaged with the nose and ears of the user;

c. first and second imagers comprising first and second respective pluralities of imaging pixels disposed to image respectively the first and second eye;

d. a focus adjustment subsystem for changing the focus of the first and second adjustable focus lenses;

e. a memory;

f. a power source; and

g. a controller configured for, when the frame is engaged with the nose and ears of the user, directing the focus adjustment subsystem to change a first focal length of the first adjustable focus lens based on first information about the first eye obtained from a first sparse subset of the first plurality of imaging pixels and to change a second focal length of the second adjustable focus lens based on second information about the second eye obtained from a second sparse subset of the second plurality of imaging pixels.

20. The apparatus of claim 19, wherein at least one of the first information and the second information is information about a limbus of the corresponding one of the first and second eyes.

21. The apparatus of claim 19, wherein at least one of the first information and the second information is information about a feature on the sclera of the corresponding one of the first and second eyes.

22. The apparatus of claim 19, wherein at least one of the first and second sparse subsets comprises an integer number of contiguously arrayed scan lines of the corresponding imager.

23. The apparatus of claim 19, wherein the focus adjustment subsystem, the memory, the power source and the controller are embedded in the frame.

24. The apparatus of claim 23, further comprising a computer in communication with the controller for receiving and sending data between the controller and the computer.

25. The apparatus of claim 19, wherein at least one of the first and second sparse subsets forms a mathematically describable curve.

26. The apparatus of claim 19, wherein the first and second imagers are the same imager.

27. The apparatus of claim 19, wherein the focus adjustment subsystem is restricted to change the focus of the first and second adjustable focus lenses to an integer number of predetermined viewing distances.

28. The apparatus of claim 27, wherein the integer number of predetermined viewing distances is three.

29. The apparatus of claim 28, wherein the three predetermined viewing distances are 35 cm +/- 5 cm, 65 cm +/- 10 cm, and 250 cm +/- 40 cm.

30. The apparatus of claim 27, wherein successive predetermined viewing distances among the integer number of predetermined viewing distances are separated from each other based on a constant incremental lens power.

31. The apparatus of claim 30, wherein the constant incremental lens power is a constant factor of a user focal depth lens power variation.

32. The apparatus of claim 31, wherein the constant factor is greater than zero and less than or equal to 2.

33. The apparatus of claim 30, wherein the constant incremental lens power is a predetermined amount of lens power.

34. The apparatus of claim 33, wherein the predetermined amount of lens power is ¼ diopter.

35. The apparatus of claim 19, wherein at least one of the first and second imagers is further disposed to image at least one feature of a face of the user.

36. The apparatus of claim 35, wherein the controller is further configured:

a. for determining from the image of the face whether one of the first and second adjustable focus lenses has substantially changed position relative to the corresponding eye; and

b. for directing the focus adjustment subsystem to change a focal length of the corresponding adjustable focus lens based on the changed position.

Description:
AUTOFOCUS EYEGLASS SYSTEM

BACKGROUND OF THE INVENTION

Field of the Invention

[0002] The invention relates to the automatic adjustment of the focal length of eyeglasses to adapt to a varying object distance.

Description of the Related Art

[0003] Presbyopia is the loss of accommodation in the lenses of the eye due to the aging process. In humans the symptoms typically appear near the age of 40 and generally, by the age of 55, a pair of bifocal glasses is required to read a book and to correct vision for a different distance. Absolute presbyopia is the condition in which the depth of field of the eye is predominantly determined by the size of the iris; it pertains when the lenses of the eyes have lost almost all of their ability to change their power. At that point a pair of progressive glasses may be required, with optical compensation to enable reading a book, reading a monitor, seeing distant objects, and viewing at the regions between those distances.

[0004] Nature attempts to compensate by decreasing the aperture size to which the pupils can contract. This increases the depth of focus of the aging eye, but presbyopia still limits the ability of otherwise able-bodied people to concentrate on close objects for extended periods, causing eyestrain and the headaches that accompany it.

[0005] Progressive glasses are eyeglasses in which the optical power compensation of a particular lens is varied across that lens with very little demarcation between the regions of differing optical power. The optical power compensation of the lens may be considered a continuous function or near-continuous function of position. These devices are popular as the transition lines found in bifocal lenses and trifocal lenses are not visible and the user avoids any real or perceived stigma associated with wearing trifocal glasses.

[0006] Another method by which to compensate for the effect of presbyopia is to employ devices that can change the optical power of a substantial area of the lens. Such devices are commercially available, examples being the Superfocus™ and Empower™ products. The first product varies the optical power of a liquid lens contained within a flexible membrane structure; this is achieved by varying the amount of liquid in the lens to change the optical surface shape of the membrane structure. In the second product a voltage applied to an index-changing electro-active material is varied to thereby vary the compensation. In principle the optical compensation can be adjusted accurately for the viewing distance.

[0007] The devices developed to date for finding the distance to the object being viewed include eye-safe laser range finders, infrared gaze monitoring devices and electrooculography detection systems. Each has its strengths and weaknesses. Generally, these devices are prominent and not suitable given the constraints of the form factor of an ordinary eyeglass, much of which is dictated by aesthetics.

SUMMARY OF THE INVENTION

[0008] In a first aspect of the invention a method is provided for determining the gaze direction of an eye of a user, the method comprising: (a) directing an image of the eye onto a first plurality of imaging pixels of an imager; and (b) analyzing image information from a predetermined second plurality of pixels of the imager. The second plurality of pixels may be a smaller number of pixels than the first plurality of pixels and/or a sparse subset of all the pixels of the imager. The analyzing may comprise comparing an intercept of a feature of the eye with the sparse subset of pixels with predetermined intercepts of the feature with the sparse subset of pixels. The predetermined intercepts may be based on image data from the first plurality of imaging pixels.

[0009] The method may further comprise selecting the second plurality of pixels from at least one scan line of pixels of the imager. The selecting the second plurality of pixels may be based on a signal-to-noise ratio of the at least one scan line of pixels. The method may further comprise selecting the second plurality of pixels to be an integer number of scan lines of the imager. The selecting the second plurality of pixels may comprise selecting the second plurality of pixels to be at least one set of contiguously arrayed scan lines of the imager and the at least one set of scan lines may be oriented parallel to the horizon of the eye.
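The scan-line selection idea above can be illustrated by a minimal sketch, assuming a frame is given as rows of pixel intensities and that a crude mean-over-deviation estimate stands in for the signal-to-noise ratio; the ranking criterion and row count are illustrative, not taken from the disclosure.

```python
# Sketch: rank candidate scan lines of an imager by a simple SNR
# estimate and keep the best few as the sparse subset of pixels.

def row_snr(row):
    """Crude SNR estimate: mean signal over standard deviation (assumption)."""
    n = len(row)
    mean = sum(row) / n
    var = sum((p - mean) ** 2 for p in row) / n
    std = var ** 0.5
    return mean / std if std > 0 else float("inf")

def select_scan_lines(frame, k=3):
    """Return the indices of the k scan lines with the highest SNR."""
    ranked = sorted(range(len(frame)), key=lambda i: row_snr(frame[i]), reverse=True)
    return sorted(ranked[:k])
```

Only the selected lines would then be read out and analyzed, keeping memory and computation small.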

[00010] The analyzing may comprise comparing the intercept of a limbus of the eye with predetermined intercepts of the limbus with the second plurality of pixels. The method may comprise selecting the second plurality of pixels to be located one of above and below the horizon of the eye.

[00011] Alternatively, the analyzing may comprise comparing the intercept of a feature on the sclera of the eye with predetermined intercepts of the feature on the sclera with the second plurality of pixels.

[00012] In another aspect the invention provides a method for determining a gaze distance of first and second eyes of a user, the method comprising: (a) determining a first gaze direction of the first eye by directing an image of the first eye onto a first plurality of imaging pixels of a first imager and analyzing image information from a predetermined second plurality of pixels of the first imager; (b) determining a second gaze direction of the second eye by directing an image of the second eye onto a third plurality of imaging pixels of a second imager and analyzing image information from a predetermined fourth plurality of pixels of the second imager; and (c) determining the gaze distance from a mutual spatial intercept of the first and second gaze directions. In some implementations, the first and second imagers may be the same imager.
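The "mutual spatial intercept" of the two gaze directions can be sketched as the closest-approach point of two rays, one per eye; the midpoint-of-shortest-segment convention and the eye/direction coordinates below are modeling assumptions, not the patent's stated implementation.

```python
import math

# Sketch: each gaze is a ray p + t*d from an eye position p along
# direction d. The mutual spatial intercept is taken as the midpoint
# of the shortest segment between the two (possibly skew) rays.

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))

def gaze_distance(p1, d1, p2, d2):
    r = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # parallel gazes: effectively at infinity
        return float("inf")
    t = (b * e - c * d) / denom       # closest-approach parameters
    s = (a * e - b * d) / denom
    q1 = [p1[i] + t * d1[i] for i in range(3)]
    q2 = [p2[i] + s * d2[i] for i in range(3)]
    mid = [(q1[i] + q2[i]) / 2 for i in range(3)]
    origin = [(p1[i] + p2[i]) / 2 for i in range(3)]   # midpoint between the eyes
    return math.sqrt(dot(sub(mid, origin), sub(mid, origin)))
```

With eyes 6 cm apart both aimed at a point 1 m ahead, the function returns a gaze distance of 1 m.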

[00013] The method for determining a gaze distance of first and second eyes of a user, may in other embodiments comprise: (a) determining a first gaze direction of the first eye by directing an image of the first eye onto a first plurality of imaging pixels of a first imager and comparing the intercept of a first feature of the first eye with predetermined intercepts of the first feature with a first sparse subset of the first plurality of pixels; (b) determining a second gaze direction of the second eye by directing an image of the second eye onto a second plurality of imaging pixels of a second imager and comparing the intercept of a second feature of the second eye with predetermined intercepts of the second feature with a second sparse subset of the second plurality of pixels; and (c) determining the gaze distance from a mutual spatial intercept of the first and second gaze directions. The first feature may be one of a limbus of the first eye and a feature of a sclera of the first eye and the second feature may be one of a limbus of the second eye and a feature of a sclera of the second eye. In some implementations, the first and second imagers may be the same imager.

[00014] In another aspect the invention provides an automatically focused eyeglass system comprising: (a) a frame configured to engage with a nose and ears of a user; (b) first and second adjustable focus lenses disposed in the frame to be located in front of respectively a first and a second eye of the user when the frame is engaged with the nose and ears of the user; (c) first and second imagers comprising first and second respective pluralities of imaging pixels disposed to image respectively the first and second eye; (d) a focus adjustment subsystem for changing the focus of the first and second adjustable focus lenses; (e) a memory; (f) a power source; and (g) a controller configured for, when the frame is engaged with the nose and ears of the user, directing the focus adjustment subsystem to change a first focal length of the first adjustable focus lens based on first information about the first eye obtained from a first sparse subset of the first plurality of imaging pixels and to change a second focal length of the second adjustable focus lens based on second information about the second eye obtained from a second sparse subset of the second plurality of imaging pixels.

[00015] At least one of the first information and the second information may be information about a limbus of the corresponding one of the first and second eyes. Alternatively, at least one of the first information and the second information is information about a feature on the sclera of the corresponding one of the first and second eyes. At least one of the first and second sparse subsets may comprise an integer number of contiguously arrayed scan lines of the corresponding imager. The focus adjustment subsystem, the memory, the power source and the controller may be embedded in the frame. The automatically focused eyeglass system may further comprise a computer in communication with the controller for receiving and sending data between the controller and the computer. At least one of the first and second sparse subsets may form a mathematically describable curve. The first and second imagers may be the same imager.

The focus adjustment subsystem may be restricted to change the focus of the first and second adjustable focus lenses to an integer number of predetermined viewing distances. The integer number of predetermined viewing distances may be three. The three predetermined viewing distances may be 35 cm +/-5cm, 65cm +/- 10cm, and 250cm +/- 40cm.

The successive predetermined viewing distances among the integer number of predetermined viewing distances may be separated from each other based on a constant incremental lens power. The constant incremental lens power may be a constant factor of a user focal depth lens power variation. The constant factor may be greater than zero and less than or equal to 2. In other embodiments, the constant incremental lens power may be a predetermined amount of lens power. The predetermined amount of lens power may be ¼ diopter.
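The constant-power-step relation above can be made concrete through the reciprocal relation P = 1/d between lens power in diopters and viewing distance in meters; the 1.25 D step below is an illustrative assumption chosen to land near the 35 cm / 65 cm / 250 cm values, not a figure from the disclosure.

```python
# Sketch: viewing distances whose powers start at 1/nearest_m and
# decrease by a constant step_diopter per setting (P = 1/d, d in meters).

def viewing_distances(nearest_m, step_diopter, count):
    p0 = 1.0 / nearest_m
    return [1.0 / (p0 - k * step_diopter) for k in range(count)]

# Example: 1/0.35 m = 2.857 D; subtracting 1.25 D twice gives 1.607 D
# (0.62 m) and 0.357 D (2.80 m), near the three distances given above.
```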

At least one of the first and second imagers may further be disposed to image at least one feature of a face of the user and the controller may be configured for determining from the image of the face whether one of the first and second adjustable focus lenses has substantially changed position relative to the corresponding eye; and for directing the focus adjustment subsystem to change a focal length of the corresponding adjustable focus lens based on the change in position.

BRIEF DESCRIPTION OF THE DRAWINGS

[00016] The abovementioned and other features and objects of this invention, and the manner of attaining them, will become more apparent and the invention itself will be better understood by reference to the following description of an embodiment of the invention taken in conjunction with the accompanying drawings, wherein:

[00017] Figure 1 is an autofocus eyeglass system in use with a user.

[00018] Figure 2a is a projection of three horizontal pixel sections onto an image of the eye of a user.

[00019] Figure 2b is a projection of the three horizontal pixel sections of Fig. 2a onto an image of the eye of the user with the gaze of the eye directed upwards.

[00020] Figure 3 is a flowchart of a method for determining the gaze direction of an eye in an autofocus eyeglass system.

[00021] Figure 4 is a flowchart of a method for calibrating an autofocus eyeglass system.

[00022] Figure 5 is a flowchart of a method for automatically focusing an autofocus eyeglass system.

[00023] Although the drawings represent embodiments of the present invention, the drawings are not necessarily to scale and certain features may be exaggerated in order to better illustrate and explain the present invention. Any flow charts are also representative in nature, and actual embodiments of the invention may include further features or steps not shown in the drawings. The exemplification set out herein illustrates an embodiment of the invention, in one form, and such exemplifications are not to be construed as limiting the scope of the invention in any manner.

DESCRIPTION OF EMBODIMENTS OF THE PRESENT INVENTION

[00024] The embodiments disclosed below are not intended to be exhaustive or to limit the invention to the precise form disclosed in the following detailed description. Rather, the embodiments are chosen and described so that others skilled in the art may utilize their teachings.

[00025] The autofocus eyeglass system 100 of the present invention is shown in Figure 1 and comprises adjustable focus lenses 10 and 20, a distance-finding subsystem 30 for determining the distance to an object being viewed, focus adjustment subsystem 40 for changing the focus of the adjustable focus lenses 10 and 20 to bring the object into focus for the user, a controller 50, at least one non-volatile memory 60, a power source 90, and a frame 25. As shown in Figure 1, distance-finding subsystem 30, power source 90, and non-volatile memory 60 may be physically incorporated in frame 25. The distance-finding subsystem 30 and focus adjustment subsystem 40 can be combined in some embodiments or distributed in various devices in other embodiments to conform to space and weight constraints. Other embodiments may include various combinations of the components of the autofocus eyeglasses to conform to space and weight constraints. Further embodiments of the apparatus may include a miniature connector, such as the micro-USB port commonly found in consumer devices, to charge the battery or to provide a means to communicate with a device to program the controller 50 or write to non-volatile memory 60. Still further embodiments may include a wireless means to charge the battery or to communicate with a device to program the controller or write to memory.

[00026] The viewing distance can be inferred from the two individual gaze directions of the two eyes 12 and 22 of the user, eye 12 being served by lens 10 and eye 22 being served by lens 20. In a first embodiment of the invention the distance-finding subsystem 30 comprises two detectors 70 and 80, which may be without limitation two cameras, attached to the frame 25. Camera 70 is directed at eye 12, and camera 80 is directed at eye 22. The total field of view of each camera is at least 125% of the diameter of the cornea of the human eye in the horizontal direction and at least 50% of that diameter in the vertical direction. The cameras may each be, without limitation, a digital camera. The digital camera may be a miniature digital camera and the lens of the digital camera may be, without limitation, a single aspherical lens. The camera may comprise a multipixel detector array.

[00027] In another embodiment of the invention the distance-finding subsystem 30 comprises a detector, which may be without limitation a single camera, attached to the bridge of the frame. The single camera is directed to capture images of both eyes. The digital camera may be a miniature digital camera and the lens of the digital camera may be, without limitation, a fisheye lens. In another embodiment, the optics of the digital camera may include a curved mirror to direct the light from both eyes to be received by the digital camera.

[00028] The non-volatile memory 60 may contain predetermined information regarding the characteristics of a typical human eye. The memory may also contain predetermined custom information regarding characteristics of the eyes 12 and 22 of the individual user. The predetermined information may include, without limitation, the interocular distance, the position of the pupils of the eyes 12 and 22 relative to the lenses 10 and 20, and the distance of the eyeglasses from the vertices of the corneas. These three quantities place the center of the pupil of each eye 12 and 22 in three dimensions with respect to the lens 10 or 20 serving that eye. To this end a reference point may be selected on each of lenses 10 and 20 with respect to which to express position.

[00029] The autofocus eyeglass system 100 of the present invention is prepared for use by first calibrating the distance-finding subsystem 30 to characteristics of the eyes 12 and 22 of the user. The calibration process may be conducted, for example, by putting the autofocus eyeglass system 100 into a calibration mode. In an initial calibration step of the calibration mode the user looks at a clearly definable object at some close distance, while input buttons 52 and 54 on the frame 25 are adjusted to vary the focus of the lenses 10 and 20 until the object of interest is in sharp focus as determined by the user. The user then may enter the focus adjustment subsystem settings into the non-volatile memory 60 by, for example without limitation, pressing "Enter" button 56. In a second calibration step in the calibration mode, the user looks at a second object at a distance different from the first object, and the autofocus system adjusts the focus of the lenses 10 and 20 again. The new settings of the focus adjustment subsystem 40 may be entered into the non-volatile memory 60 by, for example without limitation, pressing "Enter" button 56.

[00030] The number of times the steps must be repeated will depend on the number of parameters being adjusted while the autofocus eyeglass system 100 is in the calibration mode. For example, the three-dimensional position and orientation of the cameras 70 and 80, expressed in the reference frames of the lenses 10 and 20 respectively, may be one group of such parameters. Measured optical parameters of the lenses 10 and 20, and the interocular separation of the eyes, are other parameters that may be modified in calibration mode. In this case, an initial number may be used, but fine-tuning of the parameter will occur during the calibration mode. The cycle repeats until no further manual adjustments are required. The device by these steps builds or refreshes a table of parameters characterizing the user's eye vergence-accommodation requirements and self-calibrates any changes to the operating characteristics of the autofocus eyeglass system 100. It is advantageous to vary both the gaze distance and the direction of the gaze of each eye during the calibration mode. The gaze distances measured while the autofocus eyeglass system 100 is in the calibration mode may be varied over a substantial portion of the range from a nearest point of about 20 cm to a far point of about 7 meters. The direction of the gaze of each eye may substantially cover a range of 50 degrees to either side of looking directly ahead.
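The repeat-until-no-adjustment cycle above can be sketched as a simple fixed-point loop. This is a toy model under loud assumptions: a single scalar parameter, a user-judged error signal abstracted as `measure_error`, and an arbitrary gain and tolerance; none of these names or values come from the disclosure.

```python
# Sketch: repeat manual-style corrections to a stored parameter until
# no further adjustment is required (the calibration-mode cycle).

def calibrate(measure_error, initial_param, gain=0.5, tol=1e-3, max_steps=50):
    param = initial_param
    for _ in range(max_steps):
        err = measure_error(param)    # focus error as judged at this step
        if abs(err) < tol:            # no further manual adjustment required
            break
        param -= gain * err           # refine the stored parameter
    return param
```

For example, with a toy error model `lambda p: p - 2.0` the loop settles near the true value 2.0.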

[00031] The field of view of the cameras 70 and 80 may also include features of the face. The positions and/or sizes of these features in the images taken by the cameras may be used to determine the orientation and position of the frame relative to the eyes. The set of determined information may be stored in memory. The orientation and position of the frame, as determined from subsequent images, may be compared with the stored information. If a substantial change is detected, the autofocusing eyeglass system can modify the setting of the lenses 10 and 20 to compensate for the shift in the lenses relative to the eyes. Cameras 70 and 80 may be the same camera.
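A minimal sketch of the stored-versus-current comparison above, assuming a single facial feature tracked as an (x, y) pixel position and an illustrative 4-pixel threshold for what counts as a "substantial" change:

```python
# Sketch: compare the stored image position of a facial feature with its
# position in a new image; report the offset only if the frame has
# shifted beyond a threshold, so the focus setting can be compensated.

def frame_shift(stored_xy, current_xy, threshold_px=4.0):
    """Return the (dx, dy) shift if it is substantial, else None."""
    dx = current_xy[0] - stored_xy[0]
    dy = current_xy[1] - stored_xy[1]
    if (dx * dx + dy * dy) ** 0.5 > threshold_px:
        return (dx, dy)
    return None
```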

[00032] The controller 50 can compensate for a change in the position of the lenses 10, 20 relative to the eyes 12, 22 of the user by changing the focus setting of the lenses 10, 20 from the focus setting used when the lenses 10, 20 are in a nominal position. The controller 50 can compensate for the change in the distance between the cornea of each eye 12, 22 and the lens 10, 20 corresponding to that eye, a change in gaze direction of each eye due to any optical translation or deviation as a consequence of the movement of the lens 10, 20 relative to the eye 12, 22, and a change in a distance between the lens 10, 20 and a viewed object. The modified quantities are stored in memory and used in subsequent calculations and process inputs.

[00033] The size of the pupil may be determined from images of the eyes 12, 22 by cameras 70, 80, and the controller 50 can compensate for a change in the depth of focus of eyes 12, 22 with the lighting conditions.

[00034] In the calibration mode, in some embodiments, the computing processing power or storage capacity of the autofocusing eyeglass system 100 may be augmented by communicating with an external computer 285. The external computer may also provide a means to input instructions and parameters during the calibration mode.

[00035] In Figure 2a eye 12 is shown in a generally "straight ahead" viewing orientation 200 with three horizontal pixel sections comprising imaging pixels of camera 70 superimposed. Alternatively, this may be viewed as the image of eye 12 projected onto the multi-pixel imaging array of camera 70. Three horizontal pixel sections 240, 242 and 244, each comprising three rows of pixels, are shown. The present invention may be implemented using one or more horizontal pixel sections of the multi-pixel imaging array of camera 70; Figure 2a shows an embodiment employing three. As shown in Figure 2a, the horizontal pixel sections, and thereby the image frames of camera 70, may be aligned substantially horizontally with respect to the view perceived by eye 12. While Figure 2a shows three rows of pixels per horizontal pixel section, the number of rows included in each horizontal pixel section may vary dynamically with the signal-to-noise ratio of the pixels within a row: the lower the signal-to-noise ratio, the greater the number of rows required, as will become clear below.
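The row-count trade-off above can be sketched numerically: averaging n rows improves the signal-to-noise ratio by roughly the square root of n, so a per-row SNR estimate determines how many rows a horizontal pixel section needs. The target SNR and row cap below are illustrative assumptions.

```python
import math

# Sketch: smallest number of adjacent rows whose averaged SNR meets a
# target, given the SNR of a single row (SNR scales ~ sqrt(n) under
# the usual independent-noise assumption).

def rows_needed(per_row_snr, target_snr=20.0, max_rows=16):
    if per_row_snr <= 0:
        return max_rows
    n = math.ceil((target_snr / per_row_snr) ** 2)
    return max(1, min(n, max_rows))
```

A clean row (SNR 20) needs one row; a noisy row (SNR 10) needs four; very noisy rows saturate at the cap.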

[00036] In other embodiments, the image frame of camera 70 may be aligned at an angle to the horizontal with respect to the view perceived by eye 12. In yet further embodiments the alignment of image frames of camera 70 to eye 12 and the alignment of image frames of camera 80 to eye 22 may be at an angle with respect to each other.

[00037] With the camera 70 unchanged and the eye 12 reoriented to orientation 200' of Figure 2b for a different gaze direction, the eye 12 now maps differently onto horizontal pixel sections 240, 242 and 244. This difference in eye-to-horizontal pixel section mapping may be employed to determine the gaze direction of eye 12. While not shown in a similar figure, the same principle applies to eye 22 and camera 80 and to the determination of the gaze direction of eye 22. A collection of data, either predefined for the system or determined or refined during the calibration mode, is used to determine the focus setting of the lenses for a measured gaze direction of each eye. This collection of data is a gaze map of the autofocus eyeglass system 100.

[00038] In some embodiments, during the calibration mode, the focus setting of each lens 10 and 20 may be adjusted separately. It is advantageous to interface the autofocusing eyeglass system 100 with the computer 285 during the calibration mode.

[00039] While the principle of employing the features of the human eye to determine its gaze direction is known in the art, the methods employed are typically computation intensive and are based on the use of a substantial portion of the image of the eye. The challenge in the present invention lies in the use of a minimum of memory and computational power, and in limiting the demands on the limited power supply, which has to be integrated with the autofocus eyeglass system 100.

[00040] We consider now the features of the human eye that make it possible to employ only the limited number of pixels contained in the horizontal pixel sections 240, 242 and 244 of Figures 2a and 2b to accurately determine the gaze angle of eye 12. The limbus 215 of the human eye 12 is the transition region between the sclera 210 and the cornea 220. The inside diameter of the limbus 215 is approximately 11.7 mm and its width is approximately 1.5 mm. Since the iris 230 behind the cornea 220 is mostly circular, the number of rows of pixels to include in the horizontal pixel sections depends on what portion of the limbus 215 is measured.

[00041] In daily use, the autofocusing eyeglass system 100 may be placed in a tracking mode. With reference to Figure 2a, the tracking mode may use a method for determining the transition point between the sclera and the cornea of eye 12, the method comprising choosing an initial trial number n0 of adjacent rows of imaging pixels of the imaging array of camera 70 to include in horizontal pixel sections 240, 242 and 244. The initial trial number n0 may be different for each of the horizontal pixel sections. The signal is then averaged over corresponding pixels adjacent to each other across the n0 rows. This averaging will generally improve the signal-to-noise ratio. Using this set of n0 rows per horizontal pixel section, the image of eye 12 is analyzed and a transition region with the general characteristics of the limbus 215 is identified. The number of rows of pixels included may then be increased from n0. The noise within the signal at this transition region, and thereby the accuracy, is traded off against the sharpness of the transition representing the limbus 215. The number of rows of pixels included may thus be optimized in order to obtain the minimum number of rows that yields a suitably consistent and reliable determination of the limbus 215.
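The row-averaging and transition-identification steps above can be sketched as follows. This is an illustrative sketch only, not part of the claimed method; the function names, the mid-level threshold, and the list-of-lists representation of a pixel section are assumptions made for illustration.

```python
def average_rows(section):
    """Average the n0 rows of a horizontal pixel section column by
    column; averaging N rows reduces uncorrelated pixel noise by
    roughly sqrt(N), improving the signal-to-noise ratio."""
    n_rows = len(section)
    n_cols = len(section[0])
    return [sum(row[c] for row in section) / n_rows for c in range(n_cols)]


def find_transitions(profile):
    """Return the column indices at which the averaged profile crosses
    the mid-level between its brightest (sclera) and darkest
    (iris/limbus) values -- candidate limbus transition regions."""
    lo, hi = min(profile), max(profile)
    mid = lo + 0.5 * (hi - lo)
    below = [v < mid for v in profile]
    return [i for i in range(1, len(below)) if below[i] != below[i - 1]]
```

Increasing the number of rows passed to `average_rows` trades readout cost against noise, mirroring the optimization of the number of rows described above.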

[00042] Typically the 50% image signal point of the transition is chosen as the transition point. Methods for determining a value for the transition point of a one-dimensional curve defined by discrete data are well known in the art. The transition point accurately describes an intercept of the limbus with a particular pixel section 240, 242 or 244.

[00043] In one embodiment, three horizontal pixel sections comprising the optimal number of rows as determined above are selected. They are chosen such that the three resulting horizontal pixel sections 240, 242 and 244 have a spacing that is more than 20% of the diameter of the cornea. Usefully, the middle one of the three horizontal pixel sections 240, 242 and 244 may be chosen to intercept the center of the cornea 220 of eye 12.
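One well-known realization of the 50% transition point for discrete data is linear interpolation between the two samples that bracket the 50% level. The sketch below is one such textbook choice, offered for illustration only and not as the patent's prescribed method.

```python
def half_crossings(profile):
    """Sub-pixel positions where a 1-D intensity profile crosses the
    50% point between its minimum and maximum, found by linear
    interpolation between adjacent samples."""
    lo, hi = min(profile), max(profile)
    half = lo + 0.5 * (hi - lo)
    points = []
    for i in range(1, len(profile)):
        a, b = profile[i - 1], profile[i]
        if (a - half) * (b - half) < 0:  # strict sign change across the 50% level
            points.append((i - 1) + (half - a) / (b - a))
    return points
```

Each returned value is a sub-pixel column position describing one intercept of the limbus with the pixel section.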

[00044] In one embodiment, only the pixels in horizontal pixel section 242 in Figures 2a and 2b are employed to determine the gaze angle of eye 12. The horizontal positions of the transition points may be used to determine the horizontal component of the gaze direction by reference to the gaze map created during the calibration mode. The horizontal positions of the transition points may also be used to determine the vertical component of the gaze direction by reference to the gaze map created during the calibration mode.

[00045] Under normal operating conditions, the power consumed by the imaging system can be reduced in proportion to the ratio between the number of pixels in the active horizontal sections and the total number of pixels in the sensor. Reducing the number of pixels to be processed also allows for the use of a microprocessor that consumes less power to process information to determine the focus setting of the lenses. In a human eye, when the eye changes gaze direction, the eye rotates around a center of rotation. The cornea of the human eye is an optical lens and has a vertex. The center of rotation is located about 13.5 mm behind the vertex of the cornea. Using only two horizontal sections of the type described above will generally provide four positions of intercept on the limbus 215. From these four positions a center of the pupil 230 of eye 12 can be inferred. This center may be determined to an accuracy of within approximately 0.08 degrees as measured relative to the rotation point of the eye. With the same information derived also for eye 22, the gaze directions of the two eyes 12 and 22 may then be determined with a suitable accuracy to allow the gaze distance of the two eyes 12 and 22 to be calculated to within an uncertainty equivalent of plus or minus 0.1 diopter. The measured total depth of field for a typical aged human eye with absolute presbyopia is approximately equivalent to a defocus of 0.2 diopter. A value of 0.1 diopter, as an acceptable tolerance for the maximum uncertainty in the calculated equivalent distance, is one possible choice for the required accuracy to which the gaze distance needs to be determined. Even in the case where one of the four positions on the limbus 215 is obscured by an eyelid, the remaining three can provide an estimate of the gaze direction. Using three horizontal lines as in Figures 2a and 2b improves the estimate further. If only two horizontal sections of the type described above are employed, it may be advantageous to separate the sections by approximately half of the diameter of the cornea 220.
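Given the intercept positions on the (approximately circular) limbus, a center can be inferred; the sketch below uses the standard circumcenter formula for three points, so that with four intercepts the fourth point provides the redundancy mentioned above. The pixel-coordinate representation and function name are illustrative assumptions, not the patent's specified computation.

```python
def circle_center(p1, p2, p3):
    """Center of the circle through three points, e.g., three limbus
    intercepts measured in pixel coordinates (circumcenter formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    s1, s2, s3 = x1**2 + y1**2, x2**2 + y2**2, x3**2 + y3**2
    ux = (s1 * (y2 - y3) + s2 * (y3 - y1) + s3 * (y1 - y2)) / d
    uy = (s1 * (x3 - x2) + s2 * (x1 - x3) + s3 * (x2 - x1)) / d
    return (ux, uy)
```

With four intercepts available, any three determine a candidate center; comparing the candidates (or checking the fourth point against the fitted circle) allows an outlier, such as an intercept disturbed by an eyelid, to be detected and discarded.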

[00046] Other embodiments may use additional features of the eye to improve the estimation of the gaze direction in the tracking mode. In one embodiment a feature that corresponds to the pupil 230 of the eye 12 may be present in image data of at least one of the horizontal pixel sections 240, 242 or 244. In the human eye the pupil may range in size from 2 to 8 mm. The pupil is smaller than the inner diameter of the limbus and is generally darker in color than the sclera. The image of the pupil thus provides a signature distinct from the signature of the limbus 215 in the data from a horizontal pixel section that includes a portion of the pupil image. Placing a light filter in the light path from the eye to the light sensor, to selectively transmit the spectrum of light complementary to the color of the iris of the user, improves the detection of the limbus 215 region. Advantageously, the filter can also be implemented as a digital filter on the image data, which can be more readily enabled and disabled than an optical filter.
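A digital analogue of the complementary-color filter might simply select, from color image data of the sparse pixel sections, the channel the iris reflects least. The channel mapping below is a rough illustrative assumption (e.g., a brown iris reflects little blue light), not a calibrated rule from the specification.

```python
def complementary_channel(iris_color):
    """Channel in which a dark iris contrasts most with the bright
    sclera -- an illustrative lookup that a real system would refine
    per user during the calibration mode."""
    mapping = {"brown": "blue", "blue": "red", "green": "red"}
    return mapping.get(iris_color, "green")


def apply_digital_filter(pixels_rgb, channel):
    """Keep only the selected channel of each (r, g, b) pixel; easily
    enabled or disabled, unlike a physical optical filter."""
    idx = {"red": 0, "green": 1, "blue": 2}[channel]
    return [p[idx] for p in pixels_rgb]
```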

[00047] In another embodiment, the right eye 12 may be tracked with the camera 70 positioned on the right temple side of the frame 25 and pointed substantially to the left. When the gaze of the right eye 12 is directed forward, the image of the eye is captured in profile by camera 70. The gaze direction of the eye 12 can be determined by measuring the portion of the limbus 215 visible in the profile view.

[00048] When the object of interest is situated to the left-hand side and the user looks at the object without turning the user's head, there are situations in which no portion of the limbus 215 may be captured by the camera. When the gaze of the right eye 12 is pointed far enough to the left that the right-hand side of the limbus 215 of the eye 12 is about to disappear from the field of view of camera 70, a second method of determining the gaze direction can be used in conjunction with the first method. At least one fixed feature, such as a prominent blood vessel structure or feature on the sclera 210, may be used. The calibration mode may include the determination of the structures to utilize for the gaze tracking and the creation of a gaze map using the positions of those structures in the image frame as indexing parameters. The gaze map for the second method may use a different set of pixels from the first method for determining the gaze direction. A total map that is a combination of the gaze maps for the first and the second methods may be used to track the gaze of the right eye 12. Advantageously, the total map includes a region where the gaze maps of the first and second methods overlap. The same method applies to eye 22 and camera 80 and to the determination of the gaze direction of eye 22. The autofocus eyeglass system 100 may employ the limbus of both eyes, a feature of the sclera of both eyes, or the sclera of one eye and the limbus of the other while tracking both eyes. Since the occasions where the user is looking to the extreme left or right without turning their head are rare, higher power consumption during that time will not significantly affect the operating time of the autofocus eyeglass system 100.
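One way the total map could combine the two gaze maps is a simple precedence lookup: the limbus-based first method answers whenever it covers the current gaze, and the sclera-feature map serves the extreme gaze angles; in the overlap region both answer and the lower-power first method is preferred. The key and dictionary structures below are assumptions for illustration only.

```python
def total_map_lookup(limbus_key, sclera_key, limbus_map, sclera_map):
    """Gaze lookup over the combined (total) map: prefer the limbus
    map (first method), fall back to the sclera-feature map (second
    method) outside its coverage."""
    if limbus_key in limbus_map:
        return limbus_map[limbus_key]
    return sclera_map.get(sclera_key)
```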

[00049] In another implementation horizontal pixel sections 240, 242 and 244 need not be horizontal and may, in fact, be vertical or be at some angle to the horizontal. Using a sparse set of measurements to determine the gaze direction, preferably with some redundancy so that outlier measurements can be discarded, the system is able to determine accurately the required focus setting to provide sharp focus to the user.

[00050] In one embodiment, when autofocusing eyeglass system 100 is unable to determine a focus position setting to the required tolerance, autofocusing eyeglass system 100 may set the focus setting to provide focus between about 1 and 3 meters or some other distance range. Once autofocusing eyeglass system 100 determines a focus position setting to the required tolerance, it may set the focus setting to the determined position.

[00051] In another embodiment of the invention, the controller 50 may vary the focus setting of the adjustable focus lenses 10 and 20 to one of a set of predetermined viewing distances. These positions may correspond to a standard reading distance in a range around 35 cm, a standard computer monitor viewing distance in a range around 65 cm, and a distance of about 2.5 meters. An estimate of the distance to an object of interest is determined by distance-finding subsystem 30. The controller 50 then determines which of the predetermined viewing distances optimizes the focus of the object for the user. This represents a trifocal configuration for autofocusing eyeglass system 100 in which the focus adjustment subsystem 40 is restricted to adjust the focus of lenses 10 and 20 to three predetermined fixed viewing distances each.
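Selecting among the predetermined viewing distances is naturally done in lens power (diopter) space, since defocus tolerance is roughly constant in diopters rather than meters. The sketch below uses the trifocal distances quoted above (35 cm, 65 cm, 2.5 m); the nearest-power rule is an assumption for illustration, not the controller's specified algorithm.

```python
def nearest_preset(measured_distance_m, presets_m=(0.35, 0.65, 2.5)):
    """Predetermined viewing distance whose lens power (1/distance,
    in diopters) is closest to that of the measured distance."""
    measured_power = 1.0 / measured_distance_m
    return min(presets_m, key=lambda d: abs(1.0 / d - measured_power))
```

For example, an object measured at 1 m maps to the 65 cm preset, because 1.0 D is closer to 1.54 D (65 cm) than to 0.40 D (2.5 m) in power terms, even though 1 m is closer to neither in meters.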

[00052] In other embodiments, the set of predetermined viewing distances may include more than three viewing distances. In other embodiments, the user is able to manually adjust the focus setting of any one of the set of predetermined viewing distances to compensate for fluctuations in the optimal focus setting for the eyes. The manually adjusted focus setting may be stored temporarily in memory 60 and used as the focus setting for that predetermined viewing distance. In other embodiments, the controller 50 may store in memory more than one set of focus settings for a set of predetermined viewing distances. The user may cause the controller 50 to store a new set of focus settings. In other embodiments, there may be more than one set of predetermined viewing distances.

[00053] The depth of focus of most optical devices depends on the focus setting. A convenient means for expressing the total depth of focus in a manner substantially independent of the position of focus is to describe the total depth of focus as an equivalent lens power. When the corresponding focal length is expressed in units of meters, the equivalent lens power is expressed in diopters.

[00054] In another aspect of the invention, a method for determining the viewing distances to be included in the set of predetermined viewing distances comprises choosing a minimum viewing distance and a maximum viewing distance to include in the set of predetermined viewing distances. The term "viewing range" is used in the present specification to describe the physical distance range between the two limits so determined. Associated with this viewing range is a corresponding lens power range. The method further comprises dividing the lens power range into equal segments of incremental lens power. This implies that the corresponding selection of viewing distances is not separated by equal distance segments. A preferred segment size or increment size for the lens power range may be based on the lens power variation corresponding to the depth of focus of the compound lens formed by the eye 12, 22 of the user and the corresponding lens 10, 20 of the autofocus eyeglass system 100. The term "user focal depth lens power variation" is used in the present specification to describe the variation in lens power that is equivalent to the depth of focus of the compound lens formed by the eye 12, 22 of the user and the corresponding lens 10, 20 of the autofocus eyeglass system 100. In particular, the incremental lens power may be chosen to be a constant factor of the user focal depth lens power variation. The constant factor may be greater than zero and less than or equal to two. This approach has the benefit of creating the smallest number of set points to store in memory. This prevents the controller 50 from "hunting" for distance settings and thereby keeps the required processing power and time to a minimum.
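The division of the lens power range into equal increments can be sketched as follows; note that the resulting viewing distances are not equally spaced in meters, as stated above. The 0.25 m to 4 m viewing range in the usage note is an illustrative assumption.

```python
def preset_distances(min_dist_m, max_dist_m, increment_diopter):
    """Viewing distances whose lens powers divide the power range
    [1/max_dist, 1/min_dist] into equal increments of
    `increment_diopter` diopters."""
    p = 1.0 / min_dist_m               # nearest distance -> highest power
    p_min = 1.0 / max_dist_m           # farthest distance -> lowest power
    distances = []
    while p >= p_min - 1e-9:           # small tolerance guards float drift
        distances.append(1.0 / p)
        p -= increment_diopter
    return distances
```

For a 0.25 m to 4 m viewing range in quarter-diopter steps (the industry-standard unit of paragraph [00055]) this yields 16 set points, from 0.25 m out to 4 m, with the spacing between adjacent distances growing toward the far end of the range.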

[00055] In other embodiments, the adjustment range is divided into predetermined lens power units. One preferred predetermined lens power unit for segmenting the range of viewing distances is a quarter of a diopter, a unit commonly used in industry for adjustable optical equipment.

[00056] In another aspect of the invention, a method for determining the gaze direction of the eye 12 comprises, as shown in the flow chart of Fig. 3, capturing [310] an image data set for the eye 12 of the user, and transferring [315] to the controller 50 only a sparse portion of the image data set corresponding to the regions of the horizontal sections 240, 242, and 244 determined in the calibration mode. The method proceeds further by analyzing [320] the sparse portion of the image data set to determine the transition points of the limbus of the eye 12, and determining [325] the gaze direction of the eye 12 using the parameters in a gaze map generated during a calibration mode.
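Steps [315] to [325] above can be sketched as a single tracking pass. Everything here is an illustrative assumption rather than the patented implementation: the frame is a list of pixel rows, the sparse portion is a set of calibrated row indices, and the gaze map is keyed on the tuple of transition columns.

```python
def tracking_step(frame, sparse_rows, gaze_map):
    """One tracking-mode pass: take only the calibrated sparse rows
    from the captured frame [315], locate limbus transition columns
    in each row [320], and look up the gaze direction [325]."""
    sparse = [frame[r] for r in sparse_rows]
    positions = []
    for row in sparse:
        lo, hi = min(row), max(row)
        mid = lo + 0.5 * (hi - lo)
        below = [v < mid for v in row]
        positions += [i for i in range(1, len(below)) if below[i] != below[i - 1]]
    return gaze_map.get(tuple(positions))
```

Only `len(sparse_rows)` rows of the frame are ever touched after capture, which is the source of the memory and power savings described in paragraph [00045].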

[00057] With reference to Figure 4, the calibration mode may employ a method for determining the gaze map for autofocus eyeglass system 100 for use in the tracking mode, the method comprising: determining [410] which of the device parameters and user-specific parameters described with reference to Figure 1 are to be optimized; capturing [420] a full frame image of each of the eyes of the user while the user is looking at an object at a first distance; determining [430] a user-preferred focus setting for optimal focus for the first object distance; changing [440] the object distance to a second object distance; repeating steps [420] to [440] to gather a data pair comprising image data and preferred focus setting; collecting data pairs until, at query point [450], the number of data pairs is at least equal to the number of parameters to be optimized; optimizing [460] the parameters to be optimized until, at query point [470], determining the focus setting using the image data provides focus setting values that agree with the corresponding user-preferred focus settings; determining [480] a gaze direction of the eye in the images in the data set using the optimized parameters; choosing [485] a sparse set of pixels for determining the transition points of the limbus in each image of the data set; and optimizing [490] a second set of parameters by varying them until, at query point [495], calculating the gaze direction of the eye using only the information in the set of sparse pixels of each image of the data set and the values of the second set of parameters provides gaze directions that agree with the corresponding gaze directions calculated using the full frame images. This second set of parameters includes parameters for describing the eccentricity and asymmetry of the shape of the limbus, parameters to account for the non-uniform color of the iris and the patterns in the iris, as well as the device and user-specific parameters. The method thus generates a gaze map for the device for use in the tracking mode.

[00058] In another method for the calibration mode, at the beginning of step 430 of the previous method, the controller may use the existing values of the parameters to adjust the focus of the lenses prior to accepting input from the user.

[00059] In yet another method for the calibration mode, any new information that may affect a value of one of the parameters being optimized during the calibration mode may initiate the respective optimization method to determine new values for the parameters.

[00060] Choosing a sparse set of pixels for determining the transition points of the limbus described with reference to Figures 1, 2a and 2b may comprise choosing one or more horizontal sections, a mathematically describable curve, a set of piecewise contiguously arrayed scan lines, a set of pixels which is substantially smaller than the set of pixels in the entire image frame, or some combination thereof. The term "contiguously arrayed" is used in the present specification to describe scan lines of the imagers 70 and 80 that are immediately adjoining one another along their long sides. A mathematically describable curve, as a set of pixels, may include those pixels whose area includes a point that belongs to a mathematical function overlaid on the multi-pixel array sensor surface.

[00061] In another aspect of the invention, a method for automatically focusing the autofocusing eyeglass system 100 comprises, as shown in the flow chart of Fig. 5, selecting [510] a tracking mode, and then determining [520] a gaze direction for each of eyes 12 and 22, as described with reference to Fig. 3. The gaze distance is then determined [530] using the parameters determined in the calibration mode described with reference to Fig. 4. The method proceeds by setting [540] the focus for the lenses 10 and 20 for the gaze direction and distance.
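A simple geometric model of step [530] determines the gaze distance from the two gaze directions by vergence: the two lines of sight form a triangle over the interpupillary distance. The 63 mm interpupillary distance and the inward-angle convention below are assumptions for illustration, not parameters fixed by the specification.

```python
import math

def gaze_distance(theta_left_deg, theta_right_deg, ipd_m=0.063):
    """Distance to the convergence point of the two gaze directions,
    each angle measured inward from straight ahead (converging gaze
    gives positive angles)."""
    vergence = math.radians(theta_left_deg + theta_right_deg)
    if vergence <= 0.0:
        return math.inf                # parallel or diverging gaze: far
    return (ipd_m / 2.0) / math.tan(vergence / 2.0)

def focus_power_diopter(distance_m):
    """Lens power needed at step [540] to focus at the gaze distance
    (0 diopters, i.e. distance focus, for an effectively infinite gaze)."""
    return 0.0 if math.isinf(distance_m) else 1.0 / distance_m
```

The plus-or-minus 0.1 diopter tolerance of paragraph [00045] applies to the output of `focus_power_diopter`, which is why accuracy requirements on the gaze angles tighten sharply at near viewing distances.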

[00062] While this invention has been described as having an exemplary design, the present invention may be further modified within the spirit and scope of this disclosure. This application is therefore intended to cover any variations, uses, or adaptations of the invention using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains.