

Title:
IMAGING FOR LOCAL SCALING
Document Type and Number:
WIPO Patent Application WO/2015/043275
Kind Code:
A1
Abstract:
An imaging method and device for local scaling are provided. The method can comprise: determining a gazed object of an eye; determining corresponding sub-areas of an imaging lens group according to the gazed object, the imaging lens group being configured to scale and image for the gazed object and comprising a plurality of sub-areas with an adjustable scaling property; and determining a scaling parameter of the corresponding sub-areas according to the gazed object. In the method and the device of at least one embodiment of this application, the image of the gazed object on the user's fundus can be scaled in a local scaling mode, so as to avoid changing the overall view of the user and to enable the user to conveniently observe the gazed object while still correctly perceiving the surrounding environment.

Inventors:
DU LIN (CN)
SHI WEI (CN)
YU KUIFEI (CN)
Application Number:
PCT/CN2014/081492
Publication Date:
April 02, 2015
Filing Date:
July 02, 2014
Assignee:
BEIJING ZHIGU RUITUO TECH CO LTD (CN)
International Classes:
H04N5/232
Domestic Patent References:
WO2012165203A12012-12-06
Foreign References:
CN103595912A2014-02-19
CN101448098A2009-06-03
CN101247461A2008-08-20
Attorney, Agent or Firm:
PATENTSINO IP FIRM (Shang Fang Building No.7 Middle Road of North Third Ring Road,Xi Cheng District, Beijing 9, CN)
Claims

1. A method, comprising:

determining, by a system comprising a processor, a gazed object at which an eye is gazing;

determining corresponding sub-areas of an imaging lens group according to the gazed object, the imaging lens group being configured to scale and image for the gazed object and comprising a plurality of sub-areas with an adjustable scaling property; and

determining a scaling parameter of the corresponding sub-areas according to the gazed object.

2. The method of Claim 1, wherein the determining the gazed object comprises:

detecting a position of a focus point of the eye and determining the gazed object according to the position of the focus point of the eye.

3. The method of Claim 2, wherein the detecting the position of the focus point of the eye comprises:

determining the position of the focus point of the eye according to an optical parameter corresponding to images presented on the fundus and with a defined clarity determined to be greater than a preset value, wherein the optical parameter is the optical parameter of an optical path between an image collection position and the eye.

4. The method of Claim 3, wherein the determining the position of the focus point of the eye according to the optical parameter corresponding to the images presented on the fundus and with the defined clarity determined to be greater than the preset value comprises:

collecting fundus images;

adjusting the imaging parameter of the optical path between the eye and the image collection position for collecting a set of images with the defined clarity determined to be greater than the preset value;

processing the fundus images;

acquiring the optical parameter of the eye according to an imaging parameter corresponding to the set of images with the defined clarity determined to be greater than the preset value, wherein the imaging parameter is the imaging parameter of the optical path between the image collection position and the eye; and

determining the position of the focus point of the eye according to the optical parameter of the eye.

5. The method of Claim 4, wherein the optical parameter of the eye comprises an equivalent focal length and a line-of-sight direction of the eye.

6. The method of Claim 2, wherein the detecting the position of the focus point of the eye comprises:

tracking line-of-sight directions of two eyes and acquiring the position of the focus point of the eye through an intersection of the line-of-sight directions of the two eyes.

7. The method of Claim 2, wherein the detecting the position of the focus point of the eye comprises:

tracking a line-of-sight direction of the eye;

acquiring a scene depth of a scene containing the position of the focus point of the eye according to the line-of-sight direction; and

determining and acquiring the position of the focus point of the eye according to the scene depth.

8. The method of Claim 1, wherein the determining the gazed object of the eye comprises:

collecting fundus images and determining the gazed object according to the fundus images.

9. The method of Claim 1, wherein the imaging lens group comprises at least two lenses, and the at least two lenses are adjustable in scaling property with respective portions of the corresponding sub-areas.

10. The method of Claim 9, wherein the scaling property is adjusted by changing respective focal lengths of the at least two lenses.

11. The method of Claim 9, wherein the scaling property is adjusted by changing a relative position between the at least two lenses.

12. The method of Claim 1, wherein the plurality of sub-areas are distributed in an array.

13. The method of Claim 12, wherein the plurality of sub-areas are distributed in a rectangular array.

14. The method of Claim 12, wherein the plurality of sub-areas are distributed in a radial concentric circle array.

15. The method of Claim 1, wherein the determining the corresponding sub-areas of the imaging lens group according to the gazed object comprises:

determining the corresponding sub-areas according to a projection of the gazed object on the imaging lens group.

16. The method of Claim 1, wherein the determining the scaling parameter of the corresponding sub-areas according to the gazed object comprises:

determining the scaling parameter of the corresponding sub-areas according to an actual viewing distance from the gazed object to the eye.

17. The method of Claim 16, wherein the determining the scaling parameter of the corresponding sub-areas according to the actual viewing distance from the gazed object to the eye comprises:

acquiring the actual viewing distance from the gazed object to the eye; and determining the scaling parameter of the corresponding sub-areas according to the actual viewing distance.

18. The method of Claim 16, wherein the determining the scaling parameter of the corresponding sub-areas according to the actual viewing distance from the gazed object to the eye comprises:

presetting a target viewing distance from the gazed object to the eye and a buffer of the target viewing distance;

acquiring the actual viewing distance from the gazed object to the eye; and determining the scaling parameter of the corresponding sub-areas according to the target viewing distance, the actual viewing distance and the buffer.

19. The method of Claim 18, wherein the buffer is zero.

20. The method of Claim 1, wherein the determining the scaling parameter of the corresponding sub-areas according to the gazed object comprises:

determining the scaling parameter of the corresponding sub-areas according to an actual area proportion of the fundus images of the gazed object on the fundus.

21. The method of Claim 20, wherein the determining the scaling parameter of the corresponding sub-areas according to the actual area proportion of the fundus images of the gazed object on the fundus comprises:

acquiring the actual area proportion of the fundus images of the gazed object on the fundus; and

determining the scaling parameter of the corresponding sub-areas according to the actual area proportion.

22. The method of Claim 20, wherein the determining the scaling parameter of the corresponding sub-areas according to the actual area proportion of the fundus images of the gazed object on the fundus comprises:

presetting a target area proportion of the fundus images of the gazed object on the fundus and a buffer of the target area proportion;

acquiring the actual area proportion of the fundus images of the gazed object on the fundus; and

determining the scaling parameter of the corresponding sub-areas according to the target area proportion, the actual area proportion and the buffer.

23. The method of Claim 22, wherein the buffer is zero.

24. The method of Claim 1, further comprising:

determining whether a time for the eye to observe the gazed object exceeds a predetermined time, and in response to the time for the eye to view the gazed object being determined to exceed the predetermined time, determining the corresponding sub-areas of the imaging lens group according to the gazed object and determining the scaling parameter of the corresponding sub-areas according to the gazed object.

25. The method of Claim 1, further comprising:

determining whether the eye has ametropia and generating ametropia information about the eye in response to the eye being determined to have ametropia,

wherein the determining the scaling parameter of the corresponding sub-areas according to the gazed object comprises:

determining the scaling parameter of the corresponding sub-areas according to the gazed object and the ametropia information.

26. The method of Claim 1, further comprising:

adjusting the scaling property of the corresponding sub-areas according to the scaling parameter.

27. The method of Claim 1, wherein the gazed object is a static object or a mobile object.

28. An imaging device, comprising:

an object determining unit configured to determine a gazed object of an eye;

an imaging lens group configured to scale and image for the gazed object, comprising a plurality of sub-areas with an adjustable scaling property;

a sub-area determining unit configured to determine corresponding sub-areas of the imaging lens group according to the gazed object; and

a parameter determining unit configured to determine a scaling parameter of the corresponding sub-areas according to the gazed object.

29. The imaging device of Claim 28, wherein the object determining unit comprises:

a first object determining sub-unit configured to detect a position of a focus point of the eye and determine the gazed object according to the position of the focus point of the eye.

30. The imaging device of Claim 29, wherein the first object determining sub-unit comprises:

a first focus point detecting module configured to determine the position of the focus point of the eye according to an optical parameter corresponding to images presented on a fundus and with a defined clarity determined to be greater than a preset value, wherein the optical parameter is the optical parameter of an optical path between an image collection position and the eye.

31. The imaging device of Claim 30, wherein the first focus point detecting module comprises:

an image collecting sub-module configured to collect fundus images;

an image adjusting sub-module configured to adjust the optical parameter of the optical path between the eye and the image collection position and configured to collect a set of images with the defined clarity determined to be greater than a preset value;

an image processing sub-module configured to process the collected images and acquire the optical parameter of the eye according to an imaging parameter corresponding to the images with the defined clarity greater than the preset value; and

a focus point determining sub-module configured to determine the position of the focus point of the eye according to the optical parameter of the eye.

32. The imaging device of Claim 29, wherein the first object determining sub-unit comprises:

a second focus point detecting module configured to track line-of-sight directions of two eyes and acquire the position of the focus point of the eye through an intersection of the line-of-sight directions of the two eyes.

33. The imaging device of Claim 29, wherein the first object determining sub-unit comprises:

a third focus point detecting module configured to track a line-of-sight direction of the eye, acquire a scene depth of a scene containing the position of the focus point of the eye according to the line-of-sight direction, and determine and acquire the position of the focus point of the eye according to the scene depth.

34. The imaging device of Claim 28, wherein the object determining unit comprises:

a second object determining sub-unit configured to collect fundus images and determine the gazed object according to the fundus images.

35. The imaging device of Claim 28, wherein the imaging lens group comprises at least two lenses, and the at least two lenses are adjustable in scaling property with each portion of the corresponding sub-areas.

36. The imaging device of Claim 28, wherein the plurality of sub-areas are distributed in an array.

37. The imaging device of Claim 36, wherein the plurality of sub-areas are distributed in a rectangular array.

38. The imaging device of Claim 36, wherein the plurality of sub-areas are distributed in a radial concentric circle array.

39. The imaging device of Claim 28, wherein the sub-area determining unit is configured to determine the corresponding sub-areas according to the projection of the gazed object on the imaging lens group.

40. The imaging device of Claim 28, wherein the parameter determining unit comprises:

a first parameter determining sub-unit configured to determine the scaling parameter of the corresponding sub-areas according to an actual viewing distance from the gazed object to the eye.

41. The imaging device of Claim 40, wherein the first parameter determining sub-unit comprises:

an actual viewing distance acquiring module configured to acquire the actual viewing distance from the gazed object to the eye; and

a parameter determining module configured to determine the scaling parameter of the corresponding sub-areas according to the actual viewing distance.

42. The imaging device of Claim 40, wherein the first parameter determining sub-unit comprises:

a presetting module configured to preset a target viewing distance from the gazed object to the eye and a buffer of the target viewing distance;

an actual viewing distance acquiring module configured to acquire the actual viewing distance from the gazed object to the eye; and

a parameter determining module configured to determine the scaling parameter of the corresponding sub-areas according to the target viewing distance, the actual viewing distance and the buffer.

43. The imaging device of Claim 28, wherein the parameter determining unit comprises:

a second parameter determining sub-unit configured to determine the scaling parameter of the corresponding sub-areas according to an actual area proportion of fundus images of the gazed object on a fundus.

44. The imaging device of Claim 43, wherein the second parameter determining sub-unit comprises:

an actual area proportion acquiring module configured to acquire the actual area proportion of the fundus images of the gazed object on the fundus; and

a parameter determining module configured to determine the scaling parameter of the corresponding sub-areas according to the actual area proportion.

45. The imaging device of Claim 43, wherein the second parameter determining sub-unit comprises:

a presetting module configured to preset a target area proportion of the fundus images of the gazed object on the fundus and a buffer of the target area proportion;

an actual area proportion acquiring module configured to acquire the actual area proportion of the fundus images of the gazed object on the fundus; and

a parameter determining module configured to determine the scaling parameter of the corresponding sub-areas according to the target area proportion, the actual area proportion and the buffer.

46. The imaging device of Claim 28, wherein the device further comprises: a time judging unit configured to determine whether a time for the eye to view the gazed object exceeds a predetermined time, and, in response to the time for the eye to view the gazed object being determined to exceed the predetermined time, enable the sub-area determining unit, the imaging lens group and the parameter determining unit.

47. The imaging device of Claim 28, wherein the device further comprises: a refraction judging unit configured to determine whether the eye has ametropia problems and generate ametropia information of the eye if the eye has ametropia problems, and wherein the parameter determining unit is further configured to determine the scaling parameter of the corresponding sub-areas according to the gazed object and the ametropia information.

48. The imaging device of Claim 28, wherein the device further comprises: a property adjusting unit configured to adjust the scaling property of the corresponding sub-areas according to the scaling parameter.

49. The imaging device of Claim 28, wherein the device is a pair of glasses.

50. A computer readable storage device, comprising at least one executable instruction, which, in response to execution, causes an imaging device comprising a processor to perform operations, comprising:

determining a gazed object of an eye;

determining corresponding sub-areas of an imaging lens group according to the gazed object; and

determining a scaling parameter of the corresponding sub-areas according to the gazed object.

51. An imaging device, characterized by comprising a processor and a memory, the memory storing the executable instructions, the processor being connected with the memory through a communication bus, wherein, when the imaging device is operating, the processor executes or facilitates execution of the executable instructions stored by the memory to cause the imaging device to perform operations, comprising:

determining a gazed object of an eye;

determining corresponding sub-areas of an imaging lens group according to the gazed object; and

determining a scaling parameter of the corresponding sub-areas according to the gazed object.

Description:
Imaging for Local Scaling

Related Application

[0001] This application claims the priority of Chinese Patent Application No. 201310461019.1, entitled "Imaging Method and Device for Local Scaling", filed on Sep. 30, 2013, which is hereby incorporated by reference herein in its entirety.

Technical Field

[0002] This application relates to the technical field of imaging, and, more particularly, to imaging for local scaling.

Background

[0003] For users with healthy eyes, when the objects being viewed are small or far away, it is difficult for the eyes to observe the desired details. For example, when a person sits in a distant seat to watch a ball game, it is difficult to see the details of the athletes' movements and expressions. For users whose eyes have shortsightedness or farsightedness, when the objects being viewed are small or far away, it is even more difficult for the eyes to identify the details of the objects or persons being viewed. Conversely, when the objects being viewed are too large or too close, it is difficult for the users to observe the global information of the gazed objects. For example, when a user stands in front of a tall building or a mountain, it is difficult to observe the overall situation of the building or the mountain.

[0004] Conventional optical scaling devices, such as a telescope or a magnifier, usually adopt global unified scaling. Fig. 1a is a schematic diagram of the view of a user, wherein A, B, C represent three objects in the view 110. Assuming that the user wants to amplify the viewed object B, as shown in Fig. 1b, when a global unified scaling mode is adopted, the object B is amplified and, meanwhile, the object C is also amplified, while the object A falls out of the view 110; that is, at this moment, the user cannot see the object A. Thus, global unified scaling causes a change in the overall view of the user. In many scenarios, such as an AR (Augmented Reality) environment, this brings discomfort to the user and causes inconvenience in use.

Summary

[0005] The following presents a simplified summary in order to provide a basic understanding of some example embodiments disclosed herein. This summary is not an extensive overview. It is intended to neither identify key or critical elements nor delineate the scope of the example embodiments disclosed. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.

[0006] At least one embodiment of this application aims to provide an imaging device and method for local scaling, so as to make it convenient for the user to view the gazed object.

[0007] According to one example embodiment of this application, a method comprises:

determining, by a system comprising a processor, a gazed object at which an eye is gazing;

determining corresponding sub-areas of an imaging lens group according to the gazed object, the imaging lens group being configured to scale and image for the gazed object and comprising a plurality of sub-areas with an adjustable scaling property; and

determining a scaling parameter of the corresponding sub-areas according to the gazed object.

[0008] According to another example embodiment of this application, an imaging device comprises:

an object determining unit configured to determine a gazed object of an eye;

an imaging lens group configured to scale and image for the gazed object, comprising a plurality of sub-areas with an adjustable scaling property;

a sub-area determining unit configured to determine corresponding sub-areas of the imaging lens group according to the gazed object; and

a parameter determining unit configured to determine a scaling parameter of the corresponding sub-areas according to the gazed object.

[0009] According to another example embodiment of this application, a computer readable storage device is provided, comprising at least one executable instruction, which, in response to execution, causes an imaging device comprising a processor to perform operations, comprising:

determining a gazed object of an eye;

determining corresponding sub-areas of an imaging lens group according to the gazed object; and

determining a scaling parameter of the corresponding sub-areas according to the gazed object.

[0010] According to another example embodiment of this application, an imaging device is provided, characterized by comprising a processor and a memory, the memory storing executable instructions, the processor being connected with the memory through a communication bus, wherein, when the imaging device is operating, the processor executes or facilitates execution of the executable instructions stored by the memory to cause the imaging device to perform operations, comprising:

determining a gazed object of an eye;

determining corresponding sub-areas of an imaging lens group according to the gazed object; and

determining a scaling parameter of the corresponding sub-areas according to the gazed object.

[0011] The method and the device of at least one embodiment of this application adopt an imaging lens group comprising a plurality of sub-areas with an adjustable scaling property to scale and image for the gazing object of the eye, and can automatically determine the scaling parameter of the corresponding sub-areas according to the gazing object, thus scaling the images of the gazing object on the user's fundus in a local scaling mode, avoiding changing the whole view of the user, facilitating the user's observation of the gazing object, and also enabling the user to correctly perceive the surrounding environment.

Brief Description of the Drawings

[0012] Fig. 1a is an example schematic diagram of the view of a user;

[0013] Fig. 1b is an example schematic diagram after objects in the view of the user are uniformly scaled;

[0014] Fig. 1c is an example schematic diagram after objects in the view of the user are locally scaled;

[0015] Fig. 2 is an example flow diagram of the imaging method for local scaling in an embodiment of this application;

[0016] Fig. 3a is an example structural diagram of a module of an implementation of the imaging device for local scaling in an embodiment of this application;

[0017] Fig. 3b is an example structural diagram of a module of the object determining unit in an embodiment of this application;

[0018] Fig. 3c is an example structural diagram of a module of the first object determining sub-unit in an embodiment of this application;

[0019] Fig. 3d is an example structural diagram of another module of the first object determining sub-unit in an embodiment of this application;

[0020] Fig. 3e is an example structural diagram of another module of the first object determining sub-unit in an embodiment of this application;

[0021] Fig. 3f is an example structural diagram of a module of the object determining unit in an embodiment of this application;

[0022] Fig. 3g is an example structural diagram of a module of the parameter determining unit in an embodiment of this application;

[0023] Fig. 3h is an example structural diagram of a module of the first parameter determining sub-unit in an embodiment of this application;

[0024] Fig. 3i is an example structural diagram of another module of the first parameter determining sub-unit in an embodiment of this application;

[0025] Fig. 3j is an example structural diagram of a module of the parameter determining unit in an embodiment of this application;

[0026] Fig. 3k is an example structural diagram of a module of the second parameter determining sub-unit in an embodiment of this application;

[0027] Fig. 3L is an example structural diagram of another module of the second parameter determining sub-unit in an embodiment of this application;

[0028] Fig. 4 is an example structural diagram of a module of another implementation of the imaging device for local scaling in an embodiment of this application;

[0029] Fig. 5a is an example structural diagram of a module of an implementation of the focus point detecting system of an eye in an embodiment of this application;

[0030] Fig. 5b is an example structural diagram of a module of another implementation of the focus point detecting system of an eye in an embodiment of this application;

[0031] Fig. 5c is an example structural diagram of a module of the determining unit of optical axis direction of an eye in an embodiment of this application;

[0032] Fig. 5d is an example structural diagram of another module of the determining unit of optical axis direction of an eye in an embodiment of this application;

[0033] Fig. 5e is an example diagram of a facula pattern in an embodiment of this application;

[0034] Fig. 5f is an example fundus image captured when the facula pattern is projected in an embodiment of this application;

[0035] Fig. 5g is an example diagram of eye imaging in an embodiment of this application;

[0036] Fig. 5h is an example diagram of a distance from the focus point of an eye to the eye acquired according to the known optical parameter of the system and the optical parameter of the eye in this application;

[0037] Fig. 6 is an example diagram of a specific example when the focus point detecting system of an eye is applied to glasses in an embodiment of this application;

[0038] Fig. 7a is an example diagram of a specific example when the imaging device for local scaling is applied to glasses in an embodiment of this application;

[0039] Fig. 7b is an example diagram of a sub-area when the imaging device for local scaling is applied to glasses in an embodiment of this application;

[0040] Fig. 8a and Fig. 8b are example diagrams of the application scenes of the imaging device for local scaling in an embodiment of this application; and

[0041] Fig. 9 is an example structural diagram of the imaging device for local scaling in an embodiment of the present invention.

Detailed Description

[0042] The specific implementations of this application are further described in detail below in combination with the drawings and the embodiments. The following embodiments are used to illustrate this application, but do not limit the scope of this application.

[0043] In many application scenarios, the user only wants to locally scale the viewed object while keeping a sense of distance from the surrounding environment. For example, when the user is driving, the user wants to carefully observe the license plate numbers of distant vehicles ahead while not neglecting the other vehicles on the road (if the other vehicles are neglected, danger may occur). In this case, the unified scaling mode should not be used, as it changes the overall view of the user. Thus, at least one embodiment of this application provides an imaging method for local scaling. When the method of this embodiment is adopted to amplify the object B in the user's view shown in Fig. 1a, the effect is as shown in Fig. 1c. It can be seen that in the amplified view, only the object B is amplified, while the object A and the object C keep their original size. Moreover, the overall size of the user's view is not changed, and the object A is still within the view 110.

[0044] As shown in Fig. 2, the method comprises:

[0045] S210: determining the gazing object of an eye.

[0046] Wherein, the gazing object generally means an object, a person, etc. viewed by the user for an observation time longer than a predetermined time, and may be a static object or a mobile object.

[0047] S220: determining the corresponding sub-areas of an imaging lens group according to the gazing object.

[0048] Wherein, the imaging lens group is configured to scale and image for the gazing object, comprising a plurality of sub-areas with adjustable scaling property.

[0049] S230: determining the scaling parameter of the corresponding sub-areas according to the gazing object; and

adjusting the scaling property of the corresponding sub-areas according to the scaling parameter after determining the scaling parameter.

[0050] The method of this embodiment adopts an imaging lens group comprising a plurality of sub-areas with an adjustable scaling property to scale and image for the gazing object of the eye, and can automatically determine the scaling parameter of the corresponding sub-areas according to the gazing object, thus scaling the images of the gazing object on the user's fundus in a local scaling mode, preventing the whole view of the user from being changed, and facilitating the user's observation of the gazing object.
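As a rough illustrative sketch only (not part of the original disclosure), the three steps can be viewed as a small software pipeline; the four callables passed in below are hypothetical placeholders for the gaze-detection, projection-mapping, parameter-selection and lens-adjustment stages that the rest of this section describes.

```python
def local_scaling_step(detect_gazing_object, determine_subareas,
                       determine_scaling_parameter, adjust_subareas):
    """One pass of S210-S230, plus the adjustment mentioned in [0049]."""
    gazing_object = detect_gazing_object()                    # S210
    subareas = determine_subareas(gazing_object)              # S220
    parameter = determine_scaling_parameter(gazing_object)    # S230
    adjust_subareas(subareas, parameter)                      # apply the scaling
    return subareas, parameter

# Toy usage with stubbed stages:
subareas, parameter = local_scaling_step(
    detect_gazing_object=lambda: "object B",
    determine_subareas=lambda obj: [(2, 3), (2, 4)],
    determine_scaling_parameter=lambda obj: 2.0,
    adjust_subareas=lambda areas, p: None,
)
print(subareas, parameter)   # [(2, 3), (2, 4)] 2.0
```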

[0051] Specifically, the step S210 can adopt any one of the following several realization ways:

1) detecting the position of the focus point of the eye, and determining the gazing object according to the position of the focus point of the eye.

[0052] Wherein, the detection of the position of the focus point of the eye may have several realization ways, and three optional realization ways are provided as follows:

a) determining the position of the focus point of the eye according to an optical parameter corresponding to images presented on the fundus and with the clarity greater than a preset value, wherein the optical parameter is the optical parameter of an optical path between an image collection position and the eye.

[0053] Specifically, the realization way a) comprises:

[0054] S111: collecting fundus images.

[0055] S112: adjusting the imaging parameter of the optical path between the eye and the image collection position for collecting images with the clarity greater than the preset value.

[0056] S113: processing the collected images, and acquiring the optical parameter of the eye according to an imaging parameter corresponding to the images with the clarity greater than the preset value, the imaging parameter being the imaging parameter of the optical path between the image collection position and the eye. The step S113 specifically comprises: analyzing the collected images to find the images with the clarity greater than the preset value; and calculating the optical parameter of the eye according to the images with the clarity greater than the preset value and the imaging parameter of the optical path corresponding to those images.

[0057] Wherein, in the step S113, in order to improve the accuracy, the collected images may be analyzed to select the clearest image from the images with the clarity greater than the preset value, and the optical parameter of the eye is calculated according to the clearest image and the imaging parameter of the optical path corresponding to the clearest image.

[0058] S114: calculating the position of the focus point of the eye according to the optical parameter of the eye, wherein the optical parameter of the eye comprises an equivalent focal length and a line-of-sight direction of the eye.
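A highly simplified sketch of the S111-S113 idea follows: sweep a candidate imaging parameter, score each collected fundus image with a sharpness metric, and keep the parameter whose image is clearest. The gradient-energy metric and the capture callable are illustrative assumptions and stand in for the actual fundus-imaging optics.

```python
def clarity(image):
    """Sharpness score for a 2-D list of pixel values: sum of squared
    differences between horizontally and vertically adjacent pixels."""
    rows, cols = len(image), len(image[0])
    score = 0.0
    for r in range(rows):
        for c in range(cols):
            if c + 1 < cols:
                score += (image[r][c + 1] - image[r][c]) ** 2
            if r + 1 < rows:
                score += (image[r + 1][c] - image[r][c]) ** 2
    return score


def sweep_imaging_parameter(capture_fundus_image, candidate_params, preset_value):
    """S111-S113 sketch: capture a fundus image at each candidate imaging
    parameter and return the parameter whose image is clearest, provided
    its clarity exceeds the preset value; otherwise return None."""
    best_param, best_score = None, preset_value
    for param in candidate_params:
        image = capture_fundus_image(param)   # hypothetical capture call
        score = clarity(image)
        if score > best_score:
            best_param, best_score = param, score
    return best_param
```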

[0059] b) tracking the line-of-sight directions of two eyes and acquiring the position of the focus point of the eye through the intersection of the line-of-sight directions of the two eyes.

[0060] c) tracking the line-of-sight direction of the eye; acquiring the scene depth of a scene containing the position of the focus point of the eye according to the line-of-sight direction; and calculating and acquiring the position of the focus point of the eye according to the scene depth.
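For modes b and c above, the position of the focus point follows from line-of-sight geometry; the sketch below illustrates the binocular case of mode b by taking the midpoint of the shortest segment between the two line-of-sight rays (a standard closest-point construction). The eye positions and the viewed point are illustrative values.

```python
def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def scale(a, k): return (a[0] * k, a[1] * k, a[2] * k)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]


def focus_from_gaze_rays(p1, d1, p2, d2):
    """Midpoint of the shortest segment between the rays p1 + t*d1 and
    p2 + s*d2, used as the estimated focus point of the two eyes."""
    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:        # lines are (almost) parallel
        return None
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1, q2 = add(p1, scale(d1, t)), add(p2, scale(d2, s))
    return scale(add(q1, q2), 0.5)


# Two eyes 6 cm apart, both looking at a point 2 m straight ahead:
left, right = (-0.03, 0.0, 0.0), (0.03, 0.0, 0.0)
target = (0.0, 0.0, 2.0)
print(focus_from_gaze_rays(left, sub(target, left), right, sub(target, right)))
# -> approximately (0.0, 0.0, 2.0)
```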

[0061] 2) collecting fundus images and determining the gazing object according to the fundus images.

[0062] There is a macular area in the center of the retina of the human fundus, and the macular area, located in the central optical zone of the human eye, is the projection point of the eye's optical axis. A depression in the center of the macular area (the yellow spot), known as the central fovea, is the part of the eye with the sharpest vision, and the eye's gazing object is projected onto the central fovea of the macular area. Accordingly, the user's gazing object may be determined by collecting the corresponding images at the central fovea of the macular area in the fundus images.

[0063] When the corresponding sub-areas of the imaging lens group are determined in the step S220, the corresponding sub-areas may be determined according to the projection of the gazing object on the imaging lens group.

[0064] Wherein, the imaging lens group comprises at least two lenses, and the at least two lenses are adjustable in scaling property in each corresponding portion of the sub-areas. The scaling property may be adjusted by changing the respective focal lengths of the at least two lenses or by changing the relative position between the at least two lenses. In the imaging lens group, the sub-areas are distributed in an array, for example in a rectangular array or in a radial concentric circle array.

[0065] The projection of the gazing object on the imaging lens group along the line-of-sight directions covers some relevant sub-areas, i.e., the corresponding sub-areas, and these corresponding sub-areas covered by the projection are the sub-areas required to be adjusted in the scaling property.
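As a sketch of how the covered sub-areas might be located for a rectangular array, the snippet below maps an assumed bounding box of the projection (measured on the lens plane) onto grid indices; the lens size, grid dimensions and projected box are illustrative values, not parameters taken from this disclosure.

```python
def covered_subareas(proj_min_xy, proj_max_xy, lens_width, lens_height, rows, cols):
    """Return the (row, col) indices of the sub-areas in a rows x cols
    rectangular array that the projected bounding box of the gazing object
    overlaps.  Coordinates are on the lens plane, origin at a lens corner."""
    cell_w, cell_h = lens_width / cols, lens_height / rows
    col_min = max(0, int(proj_min_xy[0] // cell_w))
    col_max = min(cols - 1, int(proj_max_xy[0] // cell_w))
    row_min = max(0, int(proj_min_xy[1] // cell_h))
    row_max = min(rows - 1, int(proj_max_xy[1] // cell_h))
    return [(r, c) for r in range(row_min, row_max + 1)
                   for c in range(col_min, col_max + 1)]

# A 6 x 8 grid on a 40 mm x 30 mm lens; the projection covers a 9 mm x 7 mm patch:
print(covered_subareas((14.0, 10.0), (23.0, 17.0), 40.0, 30.0, rows=6, cols=8))
# -> six sub-areas, from (2, 2) through (3, 4)
```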

[0066] After the sub-areas required to be adjusted in the scaling property are determined, the corresponding scaling parameter is required to be determined. The step S230 may use any one of the following several implementations:

[0067] 1) The scaling parameter of the corresponding sub-areas is determined according to the actual viewing distance from the gazing object to the eye. Specifically, this manner 1) may also comprise:

[0068] Mode d:

acquiring the actual viewing distance from the gazing object to the eye; and

determining the scaling parameter of the corresponding sub-areas according to the actual viewing distance. Alternatively,

[0069] Mode e:

presetting the target viewing distance from the gazing object to the eye and the buffer of the target viewing distance; acquiring the actual viewing distance from the gazing object to the eye; and

determining the scaling parameter of the corresponding sub-areas according to the target viewing distance, the actual viewing distance and the buffer.

[0070] Wherein, in the modes d and e, acquiring the actual viewing distance from the gazing object to the eye may be realized according to the equivalent focal length of the eye in the above mode a, or by calculating it after the position of the focus point of the eye is acquired according to the above mode b or c.
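As mentioned above, one way to turn the equivalent focal length of mode a into a viewing distance is the Gaussian thin-lens relation 1/f = 1/u + 1/v. The sketch below treats the eye as a single thin lens in air with a fixed, assumed lens-to-retina distance v; this is only a rough reduced-eye approximation for illustration, not the calibration used by an actual detection system.

```python
def viewing_distance_from_focal_length(equiv_focal_length_m, retina_distance_m=0.017):
    """Estimate the object distance u from 1/f = 1/u + 1/v, with v an
    assumed, fixed equivalent lens-to-retina distance (reduced-eye model)."""
    inv_u = 1.0 / equiv_focal_length_m - 1.0 / retina_distance_m
    if inv_u <= 0:
        return float("inf")    # focused at (or beyond) optical infinity
    return 1.0 / inv_u

print(viewing_distance_from_focal_length(0.0167))   # -> about 0.95 m
print(viewing_distance_from_focal_length(0.0164))   # -> about 0.46 m
```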

[0071] In mode d, the scaling parameter of the corresponding sub-areas determined according to the actual viewing distance may be a magnification, and there are various modes for acquiring the magnification according to the actual viewing distance; for example, the corresponding magnification is determined according to a piecewise function of the viewing distance or by looking up a table. This implementation selects the quick table-lookup way, i.e., presetting a correspondence table between the actual viewing distance and the magnification, and then determining the currently needed magnification by looking up the table during the implementation of the method. Wherein, the magnification may be 1, a constant greater than 1, or a value greater than zero and smaller than 1. Table 1 below is an example of a magnification list; it can be seen that, corresponding to each actual viewing distance D0, a preset magnification T is stored in Table 1. For example, when the actual viewing distance D0 is 20 m, its corresponding magnification may be determined as 5 by looking up the table.

Table 1: First Magnification List
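A minimal sketch of the table-lookup approach of mode d: the magnification for a given actual viewing distance D0 is read from the nearest lower breakpoint of a preset table. Only the 20 m -> 5 pair comes from the example in the text; the remaining entries are placeholders.

```python
import bisect

# Illustrative Table 1 entries: (viewing-distance breakpoint in metres, magnification).
DISTANCE_BREAKPOINTS = [0.0, 5.0, 10.0, 20.0, 50.0]
MAGNIFICATIONS       = [1.0, 2.0,  3.0,  5.0,  8.0]

def magnification_for_distance(actual_distance_m):
    """Look up the magnification for the actual viewing distance D0."""
    i = bisect.bisect_right(DISTANCE_BREAKPOINTS, actual_distance_m) - 1
    return MAGNIFICATIONS[max(i, 0)]

print(magnification_for_distance(20.0))   # -> 5.0 (the example from the text)
print(magnification_for_distance(7.5))    # -> 2.0 (placeholder entry)
```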

[0072] In mode e, the target viewing distance is the viewing distance the user's eye is expected to reach, i.e., the desired value of the viewing distance when the user observes the gazing object, such as 10 m. When the actual viewing distance of the user's eye equals the target viewing distance, the user will feel that the distance from the gazing object to himself or herself is moderate, and the fundus images will be neither too large nor too small. Besides, the target viewing distance that makes the user feel comfortable is generally not a single distance point, but rather a distance range; accordingly, a buffer of the target viewing distance is also arranged in mode e. Generally, the buffer of the target viewing distance is a distance range preset on both sides of the target viewing distance. For example, assuming that the target viewing distance is DT, the buffer may be (DT - DL, DT) ∪ (DT, DT + DR), wherein DT, DL and DR are constants. Consequently, the viewing distance range (DT - DL, DT + DR) is set as the viewing distance range that makes the user feel comfortable. DL may be equal to DR; in this case, the first sub-buffer (DT - DL, DT) and the second sub-buffer (DT, DT + DR) of the buffer of the target viewing distance are of equal size and take DT as the center. DL may also be unequal to DR; in this case, the first sub-buffer (DT - DL, DT) and the second sub-buffer (DT, DT + DR) differ in size.

[0073] In the step of determining the scaling parameter of the corresponding sub-areas according to the target viewing distance, the actual viewing distance and the buffer, for the scaling parameter:

[0074] Under the circumstance that the actual viewing distance is less than the target viewing distance and the actual viewing distance is beyond the buffer of the target viewing distance, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the actual viewing distance will be increased to the target viewing distance, to shrink the fundus images of the gazing object.

[0075] Under the circumstance that the actual viewing distance is greater than the target viewing distance and the actual viewing distance is beyond the buffer of the target viewing distance, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the actual viewing distance will be decreased to the target viewing distance, to amplify the fundus images of the gazing object.

[0076] In some implementations demanding simple control, the buffer of the target viewing distance may be set to zero, i.e., equivalent to not setting the buffer of the target viewing distance; in this case, this is equivalent to determining the scaling parameter of the corresponding sub-areas according to the target viewing distance and the actual viewing distance, for the scaling parameter:

[0077] Under the circumstance that the actual viewing distance is less than the target viewing distance, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the actual viewing distance will be increased to the target viewing distance, to shrink the fundus images of the gazing object.

[0078] Under the circumstance that the actual viewing distance is greater than the target viewing distance, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the actual viewing distance will be decreased to the target viewing distance, to amplify the fundus images of the gazing object.
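A sketch of the mode e decision in paragraphs [0073]-[0078], under the assumption that the perceived viewing distance scales inversely with the magnification, so a magnification of actual/target moves the perceived distance to the target; the target and buffer values are illustrative.

```python
def scaling_for_viewing_distance(actual_m, target_m=10.0,
                                 buffer_left_m=2.0, buffer_right_m=5.0):
    """Mode e sketch: return the magnification for the corresponding sub-areas.
    Inside the buffer (target - buffer_left, target + buffer_right) nothing is
    changed; outside it, actual/target brings the perceived viewing distance
    back to the target (>1 amplifies a distant object, <1 shrinks a close one)."""
    if target_m - buffer_left_m < actual_m < target_m + buffer_right_m:
        return 1.0
    return actual_m / target_m

print(scaling_for_viewing_distance(50.0))   # -> 5.0, a distant object is amplified
print(scaling_for_viewing_distance(11.0))   # -> 1.0, inside the buffer
print(scaling_for_viewing_distance(2.0))    # -> 0.2, a very close object is shrunk
```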

[0079] 2) determining the scaling parameter of the corresponding sub-areas according to the actual area proportion of the fundus images of the gazing object on the fundus. Specifically, this manner 2) may also comprise:

[0080] Mode f:

acquiring the actual area proportion of the fundus images of the gazing object on the fundus; and

determining the scaling parameter of the corresponding sub-areas according to the actual area proportion. Alternatively,

[0081] Mode g:

presetting the target area proportion of the fundus images of the gazing object on the fundus and the buffer of the target area proportion;

acquiring the actual area proportion of the fundus images of the gazing object on the fundus; and

determining the scaling parameter of the corresponding sub-areas according to the target area proportion, the actual area proportion and the buffer.

[0082] Wherein, in the modes f and g, the area of the user's fundus is generally fixed; after the user's fundus images are collected, the images in the central fovea area of the yellow spot may be extracted therefrom and used as the fundus images of the gazing object, such that the area of the fundus images of the gazing object may be acquired and then the actual area proportion of the fundus images of the gazing object on the fundus may be acquired.

[0083] In mode f, the scaling parameter of the corresponding sub-areas determined according to the actual area proportion may be a magnification, and there are various modes for determining the corresponding magnification according to the actual area proportion; for example, the corresponding magnification is determined according to a piecewise function of the actual area proportion or by looking up a table. This implementation selects the quick table-lookup way, i.e., presetting a correspondence table between the actual area proportion and the magnification, and then determining the currently needed magnification by looking up the table during the implementation of the method. Wherein, the magnification may be 1, a constant greater than 1, or a value greater than zero and smaller than 1. Table 2 below is an example of a magnification list; it can be seen that, corresponding to each actual area proportion SRE, a preset magnification T is stored in Table 2. For example, when the actual area proportion SRE is 20%, its corresponding magnification may be determined as 2 by looking up the table.

Table 2: Second Magnification List

[0084] In mode g, the target area proportion is the area proportion that the fundus images of the gazing object are expected to occupy on the fundus, such as 50%. Under the circumstance that the area proportion of the fundus images of the gazing object on the fundus equals the target area proportion, the user will feel that the distance from the gazing object to himself or herself is moderate, and the fundus images will be neither too large nor too small. Besides, the area proportion of the fundus images of the gazing object that makes the user feel comfortable is generally not a single area proportion point, but rather an area proportion range; accordingly, a buffer of the target area proportion is also arranged in mode g. Generally, the buffer is an area proportion range preset on both sides of the target area proportion. For example, assuming that the target area proportion is ST, the buffer may be (ST - SL, ST) ∪ (ST, ST + SR), wherein ST, SL and SR are constants. Consequently, the area proportion range (ST - SL, ST + SR) is set as the area proportion range that makes the user feel comfortable. SL may be equal to SR; in this case, the third sub-buffer (ST - SL, ST) and the fourth sub-buffer (ST, ST + SR) of the buffer are of equal size and take ST as the center. SL may also be unequal to SR; in this case, the third sub-buffer (ST - SL, ST) and the fourth sub-buffer (ST, ST + SR) differ in size.

[0085] In the step of determining the scaling parameter of the corresponding sub-areas according to the target area proportion, the actual area proportion and the buffer, for the scaling parameter:

[0086] In the case that the actual area proportion is less than the target area proportion and the actual area proportion is beyond the buffer, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the fundus images of the gazing object may be amplified to the target area proportion.

[0087] In the case that the actual area proportion is greater than the target area proportion and the actual area proportion is beyond the buffer, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the fundus images of the gazing object may be shrunk to the target area proportion.

[0088] In some implementations demanding simple control, the buffer may also be set as zero, i.e., equivalent to not setting the buffer; in this case, this is equivalent to determining the scaling parameter of the corresponding sub-areas according to the target area proportion and the actual area proportion, for the scaling parameter:

[0089] In the case that the actual area proportion is less than the target area proportion, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the fundus images of the gazing object may be amplified to the target area proportion.

[0090] In the case that the actual area proportion is greater than the target area proportion, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the fundus images of the gazing object may be shrunk to the target area proportion.
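A parallel sketch for mode g, assuming the area proportion on the fundus scales with the square of the linear magnification, so the magnification that moves the actual proportion to the target is the square root of their ratio; the target and buffer values are illustrative.

```python
import math

def scaling_for_area_proportion(actual, target=0.5, buffer_left=0.1, buffer_right=0.1):
    """Mode g sketch: return the magnification for the corresponding sub-areas.
    Inside the buffer (target - buffer_left, target + buffer_right) nothing is
    changed; outside it, sqrt(target/actual) moves the area proportion of the
    gazing object's fundus image to the target."""
    if target - buffer_left < actual < target + buffer_right:
        return 1.0
    return math.sqrt(target / actual)

print(scaling_for_area_proportion(0.125))   # -> 2.0, a small image is amplified
print(scaling_for_area_proportion(0.45))    # -> 1.0, inside the buffer
print(scaling_for_area_proportion(0.98))    # -> ~0.71, an oversized image is shrunk
```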

[0091] In order to prevent the user's fundus images from being changed in a non-gazing state, such as during random sweeping of the eyes, which would affect the user experience, the method may also comprise:

[0092] S240: judging whether the time for the eye to observe the gazing object exceeds the predetermined time or not, and if exceeded, executing the steps S220 and S230.

[0093] Wherein, the predetermined time shall be set so as to make sure that the user is gazing at the current observation object. Generally, when human eyes view a target, an optical impression can be obtained with a minimum observation time of 0.07-0.3 s, and the predetermined time shall be longer than this minimum observation time; for example, it may be set to 1 s, 2 s, etc. In addition, the time for the user to observe the gazing object may be acquired by monitoring how long the position of the focus point of the user's eye remains unchanged; under the circumstance that the time during which the position of the focus point of the user's eye remains unchanged exceeds the predetermined time, it can be judged that the user is currently gazing at the object at the position of the focus point. It may also be acquired by monitoring the dwell time of the corresponding image in the central fovea of the yellow spot; under the circumstance that the dwell time of the image corresponding to the same object in the central fovea exceeds the predetermined time, it can be judged that the user is currently gazing at the object.
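The S240 check can be sketched as a dwell test over periodically sampled focus-point positions: the gaze is considered held once the focus position has stayed within a small tolerance for longer than the predetermined time. The sampling rate, tolerance and time values below are illustrative assumptions.

```python
def gaze_held(samples, predetermined_time_s=1.0, tolerance_m=0.05):
    """samples: time-ordered list of (timestamp_s, (x, y, z)) focus-point
    measurements.  Returns True once the latest run of samples that stay
    within tolerance_m of the run's first sample spans predetermined_time_s."""
    if not samples:
        return False
    run_start_t, run_start_p = samples[0]
    for t, p in samples[1:]:
        drift = max(abs(p[i] - run_start_p[i]) for i in range(3))
        if drift > tolerance_m:              # the focus moved: restart the run
            run_start_t, run_start_p = t, p
    return samples[-1][0] - run_start_t >= predetermined_time_s

# Samples every 0.1 s: the focus settles on (0, 0, 2) after 0.2 s and stays there.
samples = [(0.0, (0.0, 0.0, 5.0)), (0.1, (0.2, 0.0, 3.0))] + \
          [(0.1 * k, (0.0, 0.0, 2.0)) for k in range(2, 14)]
print(gaze_held(samples))   # -> True (held for about 1.1 s >= 1.0 s)
```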

[0094] When the gazing object is a mobile object, the judgment on whether the time for the eye to observe the mobile object exceeds the predetermined time is only made in the beginning; once the time is judged to exceed the predetermined time, the steps S220 and S230 are triggered. When the user's line of sight follows the mobile object, the judgment need not be made again on whether the gazing time exceeds the predetermined time, as long as the user's eyes are gazing at the mobile object all the time (the user does not need to turn his or her head but only move his or her eyeballs), thereby facilitating the user in observing the scaling of the mobile object.

[0095] In addition, human eyes may have ametropia problems such as farsightedness, nearsightedness and/or astigmatism, and therefore, the method also comprises:

[0096] S250: judging whether the eye has ametropia problems, and generating the ametropia information about the eye if the eye has ametropia problems;

[0097] correspondingly, in this case, the step S230 comprises:

[0098] determining the scaling parameter of the corresponding sub-areas according to the gazing object and the ametropia information.

[0099] It should be understood that, in various embodiments of the present invention, the sequence numbers of the above processes do not imply an execution sequence, and the execution sequence of the processes should be determined according to the functions and internal logic, which is not intended to limit the implementation processes of the embodiments of the present invention in any way.

[00100] From the above, the method of this embodiment can scale the images of the gazing object on the user's fundus in a local scaling mode, so as to avoid changing the overall view of the user and to facilitate the user's observation of the gazing object.

[00101] Furthermore, an embodiment of this application also provides a computer readable medium comprising computer readable instructions which, when executed, perform the operations of the steps S210, S220 and S230 of the method in the above-mentioned implementation shown in Fig. 2.

[00102] The embodiment of this application also provides an imaging device for local scaling, as shown in Fig. 3a, the device 300 comprises: an object determining unit 310, an imaging lens group 320, a sub-area determining unit 330 and a parameter determining unit 340.

[00103] The object determining unit 310 is configured to determine the gazing object of the eye.

[00104] The imaging lens group 320 is configured to scale and image for the gazing object and comprises a plurality of sub-areas with adjustable scaling property.

[00105] The sub-area determining unit 330 is configured to determine the corresponding sub-areas of the imaging lens group according to the gazing object.

[00106] The parameter determining unit 340 is configured to determine the scaling parameter of the corresponding sub-areas according to the gazing object.

[00107] The device of this embodiment uses the imaging lens group comprising a plurality of sub-areas with adjustable scaling property to scale and image for the gazing object of the eye and can automatically determine the scaling parameter of the corresponding sub-areas according to the gazing object, thus scaling the images of the gazing object on the user's fundus in a local scaling mode, so as to avoid changing the overall view of the user and to facilitate the user's observation for the gazing object.

[00108] Specifically, the object determining unit 310 may use any one of the following several implementations:

[00109] 1) As shown in Fig. 3b, the object determining unit 310 may comprise a first object determining sub-unit 310' configured to detect the position of the focus point of an eye and determine the gazing object according to the position of the focus point of the eye.

[00110] Wherein, the function of the first object determining sub-unit 310' may be realized in various modes and three optional implementations are provided as follows:

[00111] a) As shown in Fig. 3c, the first object determining sub-unit 310' may comprise a first focus point detecting module 310a' configured to determine the position of the focus point of the eye according to an optical parameter corresponding to the images presented on the fundus and with the clarity greater than a preset value, and the optical parameter being the optical parameter of the optical path between the image collection position and the eye.

[00112] Specifically, the first focus point detecting module 310a' comprises:

[00113] an image collecting sub-module 311a' configured to collect fundus images;

[00114] an image adjusting sub-module 312a' configured to adjust the imaging parameter of the optical path between the eye and the image collection position to collect the images with the clarity greater than the preset value; and

[00115] an image processing sub-module 313a' configured to process the collected images and acquire the eye's optical parameter according to the imaging parameter corresponding to the images with the clarity greater than the preset value, the imaging parameter being the imaging parameter of the optical path between the image collection position and the eye. The image processing sub-module is specifically configured to: analyze the collected images to find the images with the clarity greater than the preset value; and calculate the optical parameter of the eye according to the images with the clarity greater than the preset value and the imaging parameter of the optical path corresponding to those images. Wherein, in order to improve the accuracy, the collected images may be analyzed to select the clearest image from the images with the clarity greater than the preset value, and the optical parameter of the eye is calculated according to the clearest image and the imaging parameter of the optical path corresponding to the clearest image.

[00116] a focus point determining sub-module 314a' configured to calculate the position of the focus point of the eye according to the optical parameter of the eye, wherein the optical parameter of the eye comprises an equivalent focal length and a line-of-sight direction of the eye.

[00117] The function of the first focus point detecting module may be realized by using a focus point detecting system of the eye; the focus point detecting system of the eye will be detailed below, and the details are not described here.

[00118] b) As shown in Fig. 3d, the first object determining sub-unit 310' may comprise a second focus point detecting module 310b' configured to track the line-of-sight directions of two eyes to obtain the position of the focus point of the eye through the intersection of the line-of-sight directions of the two eyes.

[00119] c) As shown in Fig. 3e, the first object determining sub-unit 310' may comprise a third focus point detecting module 310c' configured to track the line-of-sight direction of the eye, obtain the scene depth of a scene containing the position of the focus point of the eye according to the line-of-sight direction, and calculate and obtain the position of the focus point of the eye according to the scene depth.

[00120] 2) As shown in Fig. 3f, the object determining unit 310 may comprise a second object determining sub-unit 310" configured to collect the fundus images and determine the gazing object according to the fundus images.

[00121] There is a macular area in the center of the retina of the human fundus, and the macular area, located in the central optical zone of the human eye, is the projection point of the eye's optical axis. A depression in the center of the macular area (the yellow spot), known as the central fovea, is the part of the eye with the sharpest vision, and the eye's gazing object is projected onto the central fovea of the macular area. Accordingly, the user's gazing object may be determined by collecting the corresponding images at the central fovea of the macular area in the fundus images.

[00122] The imaging lens group 320 comprises at least two lenses, and the at least two lenses are adjustable in the scaling property in each corresponding portion of the sub-areas. The scaling property may be adjusted by changing the respective focal lengths of the at least two lenses or by changing the relative position between the at least two lenses. In the imaging lens group, the sub-areas are distributed in an array, for example in a rectangular array or in a radial concentric circle array.

[00123] When the corresponding sub-areas of the imaging lens group 320 are determined by the sub-area determining unit 330, the corresponding sub-areas may be determined according to the projection of the gazing object on the imaging lens group 320. The projection of the gazing object on the imaging lens group along the line-of-sight direction covers some relevant sub-areas, i.e., the corresponding sub-areas, and these corresponding sub-areas covered by the projection are the sub-areas required to be adjusted in the scaling property.

[00124] After the sub-areas required to be adjusted in the scaling property are determined, the corresponding scaling parameter is required to be determined. The parameter determining unit 340 may use any one of the following several implementations:

[00125] 1) As shown in Fig. 3g, the parameter determining unit 340 may comprise a first parameter determining sub-unit 340' configured to determine the scaling parameter of the corresponding sub-areas according to the actual viewing distance from the gazing object to the eye.

[00126] Specifically, as shown in Fig. 3h, in an optional implementation, the first parameter determining sub-unit 340' may comprise:

[00127] an actual viewing distance acquiring module 341a' configured to acquire the actual viewing distance from the gazing object to the eye. The actual viewing distance may be acquired according to the equivalent focal length of the eye obtained by the first focus point detecting module, or calculated according to the position of the focus point of the eye obtained by the second focus point detecting module or the third focus point detecting module.

[00128] a parameter determining module 342a' configured to determine the scaling parameter of the corresponding sub-areas according to the actual viewing distance. The scaling parameter determined according to the actual viewing distance may be a magnification, and there are various ways to obtain the magnification from the actual viewing distance; for example, the corresponding magnification may be determined according to a piecewise function of the viewing distance or by table lookup. This implementation adopts the quick way of table lookup, i.e., a corresponding relation table between the actual viewing distance and the magnification is preset, and the currently needed magnification is then determined by looking up the table when the method is implemented. The corresponding relation table between the actual viewing distance and the magnification is shown in Table 1, and the details are not described here.
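
The table lookup described in paragraph [00128] may be organized, for example, as a piecewise-constant mapping from the actual viewing distance to a magnification. The sketch below shows one such arrangement; since Table 1 is not reproduced in this text, the distance bands and magnifications used here are placeholder values, not the values of Table 1.

```python
# Illustrative correspondence "table": each entry is (upper bound of the
# actual viewing distance in metres, magnification to apply). The concrete
# numbers are placeholders, not the values of Table 1.
VIEWING_DISTANCE_TABLE = [
    (20.0, 1.0),            # close enough, no scaling
    (50.0, 2.0),
    (100.0, 4.0),
    (float("inf"), 8.0),
]

def magnification_for_distance(actual_viewing_distance_m):
    # Walk the table and return the magnification of the first band that
    # contains the actual viewing distance (a piecewise-constant function).
    for upper_bound, magnification in VIEWING_DISTANCE_TABLE:
        if actual_viewing_distance_m <= upper_bound:
            return magnification
    return VIEWING_DISTANCE_TABLE[-1][1]

if __name__ == "__main__":
    for d in (10.0, 35.0, 120.0):
        print(f"{d:6.1f} m -> magnification {magnification_for_distance(d)}")
```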

[00129] In another optional implementation, as shown in Fig. 3i, the first parameter determining sub-unit 340' comprises:

[00130] a presetting module 341b' configured to preset the target viewing distance from the gazing object to the eye and the buffer of the target viewing distance. The target viewing distance is the viewing distance that the user's eye is expected to reach, i.e., the desired value of the viewing distance when the user observes the gazing object, such as 10 m. When the actual viewing distance of the user's eye equals the target viewing distance, the user will feel that the distance from the gazing object to himself or herself is moderate, and the fundus image will be neither too large nor too small. Besides, the target viewing distance that makes the user feel comfortable is generally not a single distance point but a distance range; accordingly, a buffer of the target viewing distance is also arranged in this implementation. Generally, the buffer of the target viewing distance is a distance range preset on both sides of the target viewing distance. For example, assuming that the target viewing distance is D_T, the buffer may be (D_T - D_L, D_T) ∪ (D_T, D_T + D_R), wherein D_T, D_L and D_R are constants. Consequently, the viewing distance range (D_T - D_L, D_T + D_R) is set as the range of viewing distances that makes the user feel comfortable. D_L may be equal to D_R; in this case, the first sub-buffer (D_T - D_L, D_T) and the second sub-buffer (D_T, D_T + D_R) of the buffer of the target viewing distance are of equal size and are centered on D_T. D_L may also be unequal to D_R; in this case, the first sub-buffer (D_T - D_L, D_T) and the second sub-buffer (D_T, D_T + D_R) differ in size.

[00131] an actual viewing distance acquiring module 342b' configured to acquire the actual viewing distance from the gazing object to the eye. The actual viewing distance acquiring module may be the same as that in the above implementation, and the details are not described here again.

[00132] a parameter determining module 343b' configured to determine the scaling parameter of the corresponding sub-areas according to the target viewing distance, the actual viewing distance and the buffer.

[00133] Wherein, in the step of determining the scaling parameter of the corresponding sub-areas according to the target viewing distance, the actual viewing distance and the buffer, for the scaling parameter:

[00134] Under the circumstance that the actual viewing distance is less than the target viewing distance and the actual viewing distance is beyond the buffer of the target viewing distance, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the actual viewing distance will be increased to the target viewing distance, to shrink the fundus images of the gazing object.

[00135] Under the circumstance that the actual viewing distance is greater than the target viewing distance and the actual viewing distance is beyond the buffer of the target viewing distance, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the actual viewing distance will be decreased to the target viewing distance, to amplify the fundus images of the gazing object.

[00136] In some implementations demanding simple control, the buffer of the target viewing distance may be set to zero, i.e., equivalent to not setting the buffer of the target viewing distance; in this case, this is equivalent to determining the scaling parameter of the corresponding sub-areas according to the target viewing distance and the actual viewing distance, for the scaling parameter:

[00137] Under the circumstance that the actual viewing distance is less than the target viewing distance, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the actual viewing distance will be increased to the target viewing distance, to shrink the fundus images of the gazing object.

[00138] Under the circumstance that the actual viewing distance is greater than the target viewing distance, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the actual viewing distance will be decreased to the target viewing distance, to amplify the fundus images of the gazing object.
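
Paragraphs [00130] to [00138] can be read as a small decision rule: only when the actual viewing distance falls outside the buffer around the target viewing distance is a scaling parameter produced that moves the perceived viewing distance back to the target. The sketch below illustrates that rule; expressing the scaling parameter as a simple ratio of distances is an assumption made for illustration only.

```python
def scaling_decision(actual_d, target_d, buffer_left, buffer_right):
    """Return a scaling parameter for the corresponding sub-areas, or None.

    actual_d     -- actual viewing distance from the gazing object to the eye
    target_d     -- preset target viewing distance D_T
    buffer_left  -- D_L, extent of the buffer below the target distance
    buffer_right -- D_R, extent of the buffer above the target distance

    The returned value is expressed here, for illustration only, as the
    distance ratio actual_d / target_d: values below 1 shrink the fundus
    image of the gazing object, values above 1 amplify it.
    """
    if target_d - buffer_left < actual_d < target_d + buffer_right:
        return None                    # inside the comfort buffer: no change
    return actual_d / target_d         # move the perceived distance to target_d

if __name__ == "__main__":
    target, d_l, d_r = 10.0, 2.0, 2.0
    for actual in (4.0, 9.0, 11.0, 30.0):
        print(actual, "->", scaling_decision(actual, target, d_l, d_r))
```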

[00139] 2) As shown in Fig. 3j, the parameter determining unit 340 may comprise a second parameter determining sub-unit 340" configured to determine the scaling parameter of the corresponding sub-areas according to the actual area proportion of the fundus images of the gazing object on the fundus.

[00140] Specifically, as shown in Fig. 3k, in an optional implementation, the second parameter determining sub-unit 340" may comprise:

[00141] an actual area proportion acquiring module 341a" configured to acquire the actual area proportion of the fundus images of the gazing object on the fundus. The area of the user's fundus is generally fixed; after the user's fundus images are collected, the image in the central fovea area of the macular area may be extracted therefrom and used as the fundus image of the gazing object, so that the area of the fundus image of the gazing object may be acquired, and the actual area proportion of the fundus images of the gazing object on the fundus may then be obtained.

[00142] a parameter determining module 342a" configured to determine the scaling parameter of the corresponding sub-areas according to the actual area proportion. The scaling parameter determined according to the actual area proportion may be a magnification, and there are various ways to determine the corresponding magnification from the actual area proportion; for example, the corresponding magnification may be determined according to a piecewise function of the actual area proportion or by table lookup. This implementation adopts the quick way of table lookup, i.e., a corresponding relation table between the actual area proportion and the magnification is preset, and the currently needed magnification is then determined by looking up the table when the method is implemented. The corresponding relation table between the actual area proportion and the magnification is shown in Table 2, and the details are not described here again.
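
The two modules of paragraphs [00141] and [00142] first measure what fraction of the fundus image is occupied by the gazing object and then map that proportion to a magnification, for example by table lookup. The sketch below illustrates both steps on a binary mask; the mask construction, the table values (Table 2 is not reproduced in this text) and the function names are illustrative assumptions.

```python
import numpy as np

# Illustrative correspondence "table": (upper bound of the actual area
# proportion, magnification). The numbers are placeholders, not Table 2.
AREA_PROPORTION_TABLE = [
    (0.05, 6.0),
    (0.15, 3.0),
    (0.35, 1.5),
    (1.00, 1.0),   # already large enough, no scaling
]

def actual_area_proportion(gazing_object_mask):
    # gazing_object_mask: boolean array over the fundus image, True where the
    # image of the gazing object (around the central fovea) lies.
    return float(np.count_nonzero(gazing_object_mask)) / gazing_object_mask.size

def magnification_for_proportion(proportion):
    for upper_bound, magnification in AREA_PROPORTION_TABLE:
        if proportion <= upper_bound:
            return magnification
    return AREA_PROPORTION_TABLE[-1][1]

if __name__ == "__main__":
    fundus = np.zeros((100, 100), dtype=bool)
    fundus[45:55, 45:55] = True        # a 10x10 gazed region: 1% of the fundus area
    p = actual_area_proportion(fundus)
    print(f"actual area proportion {p:.2%} -> magnification "
          f"{magnification_for_proportion(p)}")
```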

[00143] As shown in Fig. 3L, in another optional implementation, the second parameter determining sub-unit 340" may comprise:

[00144] a presetting module 341b" configured to preset the target area proportion of the fundus images of the gazing object on the fundus and the buffer of the target area proportion. The target area proportion is the area proportion that the fundus images of the gazing object are expected to occupy on the fundus, such as 50%. When the area proportion of the fundus images of the gazing object on the fundus equals the target area proportion, the user will feel that the distance from the gazing object to himself or herself is moderate, and the fundus image will be neither too large nor too small. Besides, the area proportion of the fundus images of the gazing object that makes the user feel comfortable is generally not a single value but a range; accordingly, a buffer of the target area proportion is also arranged in this implementation. Generally, the buffer is an area proportion range preset on both sides of the target area proportion. For example, assuming that the target area proportion is S_T, the buffer may be (S_T - S_L, S_T) ∪ (S_T, S_T + S_R), wherein S_T, S_L and S_R are constants. Consequently, the area proportion range (S_T - S_L, S_T + S_R) is set as the range of area proportions that makes the user feel comfortable. S_L may be equal to S_R; in this case, the third sub-buffer (S_T - S_L, S_T) and the fourth sub-buffer (S_T, S_T + S_R) of the buffer are of equal size and are centered on S_T. S_L may also be unequal to S_R; in this case, the third sub-buffer (S_T - S_L, S_T) and the fourth sub-buffer (S_T, S_T + S_R) differ in size.

[00145] an actual area proportion acquiring module 342b" configured to acquire the actual area proportion of the fundus images of the gazing object on the fundus. The actual area proportion acquiring module may be the same as that in the above implementation, and the details are not described here.

[00146] a parameter determining module 343b" configured to determine the scaling parameter of the corresponding sub-areas according to the target area proportion, the actual area proportion and the buffer.

[00147] Wherein, in the step of determining the scaling parameter of the corresponding sub-areas according to the target area proportion, the actual area proportion and the buffer, for the scaling parameter:

[00148] In the case that the actual area proportion is less than the target area proportion and the actual area proportion is beyond the buffer, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the fundus images of the gazing object may be amplified to the target area proportion.

[00149] In the case that the actual area proportion is greater than the target area proportion and the actual area proportion is beyond the buffer, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the fundus images of the gazing object may be shrunk to the target area proportion.

[00150] In some implementations demanding simple control, the buffer may also be set to zero, i.e., equivalent to not setting the buffer; in this case, this is equivalent to determining the scaling parameter of the corresponding sub-areas according to the target area proportion and the actual area proportion, for the scaling parameter:

[00151] In the case that the actual area proportion is less than the target area proportion, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the fundus images of the gazing object may be amplified to the target area proportion.

[00152] In the case that the actual area proportion is greater than the target area proportion, after the scaling property of the corresponding sub-areas is adjusted according to the scaling parameter, the fundus images of the gazing object may be shrunk to the target area proportion.

[00153] In order to avoid the situation in which the fundus images of the user are changed while the user is not gazing, for example during random glancing, which would affect the user experience, as shown in Fig. 4, the device may also comprise:

[00154] a time judging unit 410 configured to judge whether the time for which the eye observes the gazing object exceeds a predetermined time, and, if it does, to enable the sub-area determining unit 330, the imaging lens group 320 and the parameter determining unit 340.

[00155] Wherein, the predetermined time shall be set so as to ensure that the user is indeed gazing at the current observation object. Generally, when human eyes view a target, an optical impression may be obtained with a minimum observation time of 0.07-0.3 s, and the predetermined time shall be longer than this minimum observation time; for example, it may be set at 1 s, 2 s, etc. In addition, the time for which the user observes the gazing object may be obtained by monitoring the time for which the position of the focus point of the user's eye remains unchanged; in the case that this time exceeds the predetermined time, it can be judged that the user is gazing at the object currently at the position of the focus point. Alternatively, it may be obtained by monitoring the dwell time of the corresponding image at the central fovea of the macular area; in the case that the dwell time of the corresponding image of the same object at the central fovea exceeds the predetermined time, it can be judged that the user is currently gazing at that object.
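
The time judgment of paragraphs [00154] and [00155] can be sketched as a small stateful check that reports gazing once the focus point has stayed, within a tolerance, at the same position for longer than the predetermined time. The class name, the tolerance and the use of explicit timestamps in the sketch below are illustrative assumptions.

```python
import math

class GazeTimer:
    # Reports True once the focus point has remained (within 'tolerance') at
    # the same position for at least 'predetermined_time' seconds.
    def __init__(self, predetermined_time=1.0, tolerance=0.05):
        self.predetermined_time = predetermined_time
        self.tolerance = tolerance
        self._anchor = None      # (position, timestamp) where the dwell started

    def update(self, focus_point, timestamp):
        if self._anchor is None or math.dist(focus_point, self._anchor[0]) > self.tolerance:
            self._anchor = (focus_point, timestamp)  # focus moved: restart the dwell
            return False
        return timestamp - self._anchor[1] >= self.predetermined_time

if __name__ == "__main__":
    timer = GazeTimer(predetermined_time=1.0)
    samples = [((0.0, 0.0, 2.0), 0.0), ((0.01, 0.0, 2.0), 0.5), ((0.0, 0.01, 2.0), 1.2)]
    for point, t in samples:
        print(t, "gazing" if timer.update(point, t) else "not yet")
```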

[00156] When the gazing object is a moving object, the judgment on whether the time for the eye to observe the moving object exceeds the predetermined time is only made at the beginning. Once the time is judged to exceed the predetermined time, the sub-area determining unit 330, the imaging lens group 320 and the parameter determining unit 340 are enabled; thereafter, as the user's line of sight follows the moving object, the judgment on whether the gazing time exceeds the predetermined time is not performed again as long as the user's eyes keep gazing at the moving object (the user need not turn the head, but only move the eyeballs), thereby facilitating the user in observing the scaling of the moving object.

[00157] In addition, human eyes may have ametropia problems such as farsightedness, nearsightedness and/or astigmatism; therefore, the device may also comprise:

[00158] a refraction judging unit 420 configured to judge whether the eye has an ametropia problem and to generate ametropia information about the eye if it does;

[00159] Correspondingly, the parameter determining unit 340 is configured to determine the scaling parameter of the corresponding sub-areas according to the gazing object and the ametropia information.

[00160] In addition, in order to adjust the imaging lens group 320, the device also comprises:

[00161] a property adjusting unit 430 configured to adjust the scaling property of the corresponding sub-areas according to the scaling parameter.

[00162] As shown in Fig. 5a, the above-mentioned focus point detecting system of the eye is described as follows. The focus point detecting system of the eye 500 may comprise:

[00163] an image collecting device 510 configured to collect the image presented on the fundus;

[00164] an adjusting device 520 configured to adjust the imaging parameter between the eye and the image collecting device 510 so that the image collecting device 510 obtains an image with the definition greater than the preset value;

[00165] an image processing device 530 configured to process the image obtained by the image collecting device 510, in order to obtain the optical parameter of the eye corresponding to the image with the definition greater than the preset value.

[00166] The system 500 obtains the optical parameter of the eye corresponding to the image with the definition greater than the preset value by analyzing and processing the fundus images, thereby calculating the current position of the focus point of the eye.

[00167] The images presented on the "fundus" herein are mainly the images presented on the retina, which may be the images of the fundus itself or images of other objects projected to the fundus.

[00168] As shown in Fig. 5b, in a possible implementation, the image collecting device 510 is a micro-camera; in another possible implementation of the embodiment of this application, the image collecting device 510 may also directly adopt a photosensitive imaging device, such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) device.

[00169] In a possible implementation, the adjusting device 520 comprises: an adjustable lens unit 521 located in the optical path between the eye and the image collecting device 510, whose focal length and/or position in the optical path is adjustable. Through the adjustable lens unit 521, the equivalent focal length of the system between the eye and the image collecting device 510 may be adjusted, so that, at a certain position or in a certain state of the adjustable lens unit 521, the image collecting device 510 obtains the clearest image of the fundus. In this implementation, the adjustable lens unit 521 is adjusted continuously and in real time during the detection process.

[00170] Wherein, in a possible implementation, the adjustable lens unit 521 is a focal length adjustable lens configured to adjust its own focal length by adjusting its refractivity and/or shape. Specifically: 1) the focal length is adjusted by adjusting the curvature of at least one surface of the focal length adjustable lens, for example, by adding or removing the liquid medium in a cavity formed by a double-layer transparent layer; 2) the focal length is adjusted by changing the refractivity of the focal length adjustable lens, for example, for a focal length adjustable lens filled with a specific liquid crystal medium, the arrangement mode of the liquid crystal medium is adjusted by adjusting the voltage of the electrodes corresponding to the liquid crystal medium, thereby changing the refractivity of the focal length adjustable lens.

[00171] In another possible implementation, the adjustable lens unit 521 comprises: a lens group configured to adjust the relative position between the lenses of the lens group to adjust the focal length of the lens group.

[00172] In addition to the two methods above mentioned for changing the optical path parameter of the system by adjusting the property of the adjustable lens unit 521, the optical path parameter of the system may also be changed by adjusting the position of the adjustable lens unit 521 in the optical path.

[00173] Wherein, in a possible implementation, in order not to affect the user's viewing experience of the observation object and to allow the system to be applied portably to a wearable device, the adjusting device 520 also comprises: a spectroscopic device 522 configured to form the light transmission paths between the eye and the observation object as well as between the eye and the image collecting device 510, thereby folding the optical path, reducing the system volume and avoiding affecting other experiences of the user as far as possible.

[00174] Wherein, in this implementation, the spectroscopic device comprises: a first spectroscopic unit located between the eye and the observation object configured to transmit the light from the observation object to the eye and transfer the light from the eye to the image collecting device.

[00175] The first spectroscopic unit may be a spectroscope, a spectroscopic optical waveguide (comprising optical fiber) or other suitable spectroscopic devices.

[00176] In a possible implementation, the image processing device 530 of the system comprises an optical path calibrating module configured to calibrate the optical path of the system, for example, performing the alignment and calibration for the optical axis of the optical path, in order to ensure the accuracy of the measurement.

[00177] In a possible implementation, the image processing device 530 comprises:

[00178] an image analyzing module 531 configured to analyze the image acquired by the image collecting device to find out the image with the definition greater than the preset value; and

[00179] a parameter calculating module 532 configured to calculate the optical parameter of the eye based on the image with the definition greater than the preset value as well as the known imaging parameter of the system corresponding to the image with the definition greater than the preset value.

[00180] In this implementation, the image collecting device 510 can acquire the image with the definition greater than the preset value through the adjusting device 520, but it is required that the image with the definition greater than the preset value is found through the image analyzing module 531, and then, the optical parameter of the eye can be calculated based on the image with the definition greater than the preset value and the known optical path parameter of the system. The optical parameter of the eye herein may comprise the optical axis direction of the eye.

[00181] In a possible implementation of the embodiment of this application, the system also comprises: a projection device 540 configured to project the facula to the fundus. In a possible implementation, the functions of the projection device can be realized through a micro-projector.

[00182] The projected facula herein may have no specific pattern and may be used only to illuminate the fundus.

[00183] In a preferred implementation, the projected facula comprises a pattern rich in features, which may facilitate detection and increase the detection accuracy. Fig. 5e shows an exemplary diagram of a facula pattern 550, and the pattern may be formed by a facula pattern generator such as frosted glass; Fig. 5f shows a fundus image taken when the facula pattern 550 is projected.

[00184] In order not to affect the normal view of the eye, preferably, the facula is an infrared facula invisible to the eye.

[00185] At this moment, in order to reduce the interference of other spectra:

[00186] The emergent surface of the projection device may be provided with a transmission filter of the light invisible to the eye.

[00187] The incident plane of the image collecting device may be provided with a transmission filter of the light invisible to the eye.

[00188] Wherein, in a possible implementation, the image processing device 530 also comprises:

[00189] a projection control module 534 configured to control the brightness of the projected facula of the projection device according to the results obtained by the image analyzing module.

[00190] For example, the projection control module 534 can adaptively adjust the brightness according to the characteristics of the images obtained by the image collecting device 510. The characteristics of the images herein comprise the contrast of the image features and the texture feature.

[00191] A special case of controlling the brightness of the projected facula of the projection device is to turn the projection device on or off; for example, the projection device may be turned off periodically when the user keeps gazing at one point, or, when the user's fundus is bright enough, the light emitting source may be turned off and the distance from the current focus point of the line of sight to the eye may be detected using the fundus information alone.

[00192] In addition, the projection control module 534 may further control the brightness of projected facula of the projection device according to the ambient light.

[00193] Wherein, in a possible implementation, the image processing device 530 may also comprise: an image calibrating module 533 configured to calibrate the fundus images to obtain at least one reference image corresponding to the image presented on the fundus.

[00194] The image analyzing module 531 compares the image acquired by the image collecting device 510 with the reference image, thereby acquiring the image with the definition greater than the preset value. Here, the image with the definition greater than the preset value may be the obtained image with the minimum difference from the reference image. In this implementation, the difference between the currently acquired image and the reference image is calculated by an existing image processing algorithm, such as an automatic focusing algorithm using classical phase differences.
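
Paragraph [00194] selects, among the collected images, the one that differs least from the calibrated reference image. A minimal sketch of that comparison follows; the plain mean-squared-difference measure used here is an illustrative stand-in for the phase-difference autofocus algorithm mentioned above, and the data layout is an assumption.

```python
import numpy as np

def difference(image, reference):
    # Illustrative difference measure: mean squared difference to the reference.
    return float(np.mean((image.astype(float) - reference.astype(float)) ** 2))

def closest_to_reference(collected, reference):
    # 'collected' is a list of (image, imaging_parameter) pairs; return the
    # pair whose image is closest to the calibrated reference image.
    return min(collected, key=lambda pair: difference(pair[0], reference))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    reference = rng.random((32, 32))
    near = reference + rng.normal(0.0, 0.01, (32, 32))   # close to the reference
    far = np.full((32, 32), reference.mean())            # far from the reference
    image, parameter = closest_to_reference([(far, 4.8), (near, 5.2)], reference)
    print("selected imaging parameter:", parameter)
```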

[00195] Wherein, in a possible implementation, the parameter calculating module 532 comprises:

[00196] a determining unit of optical axis direction of the eye 5321 configured to obtain the optical axis direction of the eye according to the characteristics of the eye corresponding to the image with the definition greater than the preset value. The line-of-sight direction may be obtained according to the optical axis direction of the eye and the fixed angle between the optical axis direction of the eye and the line-of-sight direction.

[00197] The characteristics of the eye herein may be obtained from the image with the definition greater than the preset value or acquired additionally.

[00198] Wherein, as shown in Fig. 5c, in a possible implementation, the determining unit of optical axis direction of the eye 5321 comprises: a first determining sub-unit 5321a configured to obtain the optical axis direction of the eye according to the characteristics of the fundus corresponding to the image with the definition greater than the preset value. Compared with obtaining the optical axis direction of the eye through the surface characteristics of the pupil and the eyeball, determining the optical axis direction of the eye through the characteristics of the fundus provides higher accuracy.

[00199] When the facula pattern is projected to the fundus, the size of the facula pattern may be larger than or smaller than the visible area in the fundus, wherein:

[00200] when the area of the facula pattern is smaller than or equal to that of the visible area of the fundus, a classical feature point matching algorithm (such as the Scale Invariant Feature Transform (SIFT) algorithm) can be used to determine the optical axis direction of the eye by detecting the position of the facula pattern on the image relative to the fundus;

[00201] when the area of the facula pattern is greater than or equal to that of the visible area of the fundus, the optical axis direction of the eye can be determined through the position of the facula pattern on the obtained image relative to the original facula pattern (obtained through the image calibrating module), thereby determining the line-of-sight direction of the user.
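
Paragraphs [00200] and [00201] locate the projected facula pattern within the captured fundus image in order to infer the optical axis direction. The sketch below uses brute-force normalized correlation instead of a full feature-matching pipeline such as SIFT, and only returns the pixel offset of the pattern; converting that offset into an optical axis direction, as well as all names used, are illustrative assumptions.

```python
import numpy as np

def locate_pattern(fundus_image, pattern):
    # Slide the known facula pattern over the fundus image and return the
    # (row, col) offset with the highest normalized correlation. Brute force,
    # for illustration only; a feature-based matcher could be used instead.
    ih, iw = fundus_image.shape
    ph, pw = pattern.shape
    p = (pattern - pattern.mean()) / (pattern.std() + 1e-12)
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - ph + 1):
        for c in range(iw - pw + 1):
            window = fundus_image[r:r + ph, c:c + pw]
            w = (window - window.mean()) / (window.std() + 1e-12)
            score = float(np.sum(w * p))
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pattern = rng.random((8, 8))
    fundus = rng.random((40, 40)) * 0.1
    fundus[12:20, 22:30] += pattern        # embed the pattern at offset (12, 22)
    print("pattern found at offset:", locate_pattern(fundus, pattern))
```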

[00202] As shown in Fig. 5d, in another possible implementation, the determining unit of the optical axis direction of the eye 5321 comprises: a second determining sub-unit 5321b configured to obtain the optical axis direction of the eye according to the characteristics of the pupil of the eye corresponding to the image with the definition greater than the preset value. The characteristics of the pupil of the eye herein may be acquired from the image with the definition greater than the preset value or acquired additionally. Obtaining the optical axis direction of the eye through the characteristics of the pupil is prior art, so it is not repeated herein.

[00203] Wherein, as shown in Fig. 5b, in a possible implementation, the image processing device 530 also comprises: a calibrating module of optical axis direction of the eye 535 configured to calibrate the optical axis direction of the eye, in order to determine the optical axis direction of the eye more accurately.

[00204] In this implementation, the known imaging parameter of the system comprises the fixed imaging parameter and the real-time imaging parameter, wherein the real-time imaging parameter is the parameter information of the adjustable lens unit corresponding to the image with the definition greater than the preset value, and the parameter information can be obtained by real-time recording when the image with the definition greater than the preset value is acquired.

[00205] After the current optical parameter of the eye is obtained, the distance from the focus point of the eye to the eye may be calculated as follows:

[00206] Fig. 5g shows a schematic diagram of the eye imaging, and in combination with the lens imaging formula in the classical optical theory, the formula (1) can be obtained from Fig. 5g:

[00207] 1/d_o + 1/d_e = 1/f_e    (1)

[00208] wherein, d_o and d_e are respectively the distance from the current observation object 5010 of the eye to the equivalent lens 5030 of the eye and the distance from the real image 5020 on the retina to the equivalent lens 5030 of the eye, f_e is the equivalent focal length of the equivalent lens 5030 of the eye, and X is the line-of-sight direction of the eye.

[00209] Fig. 5h shows a schematic diagram of obtaining the distance from the focus point of the eye to the eye according to the known optical parameter of the system and the optical parameter of the eye; the facula 5040 in Fig. 5h may form a virtual image through the adjustable lens unit 521. Assuming that the distance from the virtual image to the lens is x, the following system of equations may be obtained by combining formula (1):

[00210] 1/d_p + 1/x = 1/f_p ,    1/(d_i + x) + 1/d_e = 1/f_e    (2)

[00211] wherein, d_p is the optical equivalent distance from the facula 5040 to the adjustable lens unit 521, d_i is the optical equivalent distance from the adjustable lens unit 521 to the equivalent lens 5030 of the eye, and f_p is the focal length value of the adjustable lens unit 521.

[00212] From formulas (1) and (2), the distance d_o from the current observation object 5010 (the focus point of the eye) to the equivalent lens 5030 of the eye (i.e., the actual distance of the focus point of the eye) can be obtained, as shown in formula (3):

[00213] d_o = d_i + d_p·f_p / (d_p - f_p)    (3)

[00214] According to the actual distance of focus point of the eye and the line-of-sight, the position of the focus point of the eye may be obtained.
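
Putting formula (3) and the line-of-sight direction together, the distance and position of the focus point can be computed as in the sketch below; the numeric values in the example and the vector representation of the line-of-sight direction are illustrative assumptions.

```python
import numpy as np

def focus_distance(d_p, f_p, d_i):
    # Formula (3): d_o = d_i + d_p * f_p / (d_p - f_p), the distance from the
    # focus point of the eye to the equivalent lens of the eye.
    return d_i + d_p * f_p / (d_p - f_p)

def focus_point_position(eye_position, sight_direction, d_o):
    # The focus point lies at distance d_o along the line-of-sight direction X.
    direction = np.asarray(sight_direction, dtype=float)
    direction /= np.linalg.norm(direction)
    return np.asarray(eye_position, dtype=float) + d_o * direction

if __name__ == "__main__":
    # Illustrative values (metres): facula-to-adjustable-lens distance d_p,
    # adjustable lens focal length f_p, adjustable-lens-to-eye distance d_i.
    d_o = focus_distance(d_p=0.05, f_p=0.045, d_i=0.02)
    print("distance of focus point d_o =", round(d_o, 3), "m")
    print("focus point:", np.round(focus_point_position([0, 0, 0], [0, 0, 1], d_o), 3))
```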

[00215] Fig. 6 shows a diagram of a specific example when the focus point detecting system of the eye 600 is applied to glasses in an embodiment of this application. The focus point detecting system of the eye 600 can be configured to realize the function of the first focus point detecting module.

[00216] Wherein, the function of the micro-camera 610 is the same as that of the image collecting device in Fig. 5b, and in order not to affect the user's line of sight for normally viewing the object, the micro-camera is arranged on the outer right side of the glasses;

[00217] the function of the first spectroscope 620 is the same as that of the first spectroscopic unit shown in Fig. 5b, and the first spectroscope 620 is arranged, at a certain angle of inclination, at the intersection of the gazing direction of the eye and the incidence direction of the camera 610, so as to transmit the light from the gazing object into the eye 200 and to reflect the light from the eye 200 to the camera 610; and

[00218] the function of the focal length adjustable lens 630 is the same as that of the focal length adjustable lens in Fig. 5b, and the focal length adjustable lens is located between the first spectroscope 620 and the camera 610 to adjust the focal length value in real time, such that the camera 610 can capture the clearest image on the fundus at a certain focal length value.

[00219] In this implementation, the image processing device is not shown in Fig. 6, and the function thereof is the same as that of the image processing device shown in Fig. 5b.

[00220] Generally, the fundus is not bright enough, so it is better to illuminate it; in this implementation, the fundus is illuminated by a light emitting source 640. In order not to affect the user experience, the light emitting source 640 herein preferably emits light invisible to the eye; therefore, a near-infrared light emitting source, which has a weak impact on the eye 200 and to which the camera 610 is relatively sensitive, is preferable.

[00221] In this implementation, the light emitting source 640 is located on the outside of the right spectacle frame, so the light emitted by the light emitting source 640 is transmitted to the fundus through a second spectroscope 650 and the first spectroscope 620 together. In this implementation, the second spectroscope 650 is also located in front of the incident plane of the camera 610, so it is also required to transmit the light coming from the fundus.

[00222] It can be seen that, in this implementation, in order to improve the user experience and enhance the clarity of the images collected by the camera 610, the first spectroscope 620 may have the property of high infrared reflectivity and high visible light transmittance. For example, this property may be achieved by arranging an infrared reflective film on the side of the first spectroscope 620 facing the eye 200.

[00223] It can be seen from Fig. 6 that, in this implementation, the focus point detecting system of the eye 600 is located on the side of the glasses lens away from the eye 200, so the glasses lens may be regarded as a part of the eye when the optical parameter of the eye is calculated, and the optical property of the glasses lens does not need to be known.

[00224] In other implementations of the embodiment of this application, the focus point detecting system of the eye 600 may be located on the side of the glasses lens near the eye 200; in this case, the optical property parameter of the glasses lens needs to be obtained in advance, and the influence of the glasses lens needs to be considered when the distance of the focus point is calculated.

[00225] The light emitted from the light emitting source 640 passes through the glasses lens and enters the user's eye through the reflection of the second spectroscope 650, the transmission of the focal length adjustable lens 630 and the reflection of the first spectroscope 620, and finally arrives at the retina of the fundus; the camera 610 captures the image of the fundus, through the pupil of the eye 200, along the optical path formed by the first spectroscope 620, the focal length adjustable lens 630 and the second spectroscope 650.

[00226] Fig. 7a is a schematic diagram of a specific example in which the device of the embodiment of this application is applied to glasses, wherein the glasses may be ordinary glasses or other optical devices such as a helmet, a front windshield or a contact lens. As shown in Fig. 7a, the glasses of the embodiment use the focus point detecting system of the eye 600 to determine the gazing object of the eye, i.e., to realize the function of the object determining unit 310; the realization of the focus point detecting system of the eye 600 is not repeated here.

[00227] Wherein, the imaging lens group 320 is arranged in the lens, comprises at least two lenses and is divided into a plurality of sub-areas, the portions of the at least two lenses corresponding to the sub-areas being adjustable in the scaling property. For simplicity, the imaging lens group 320 in Fig. 7a comprises the first lens 321 on the side near the eye and the second lens 322 on the side near the gazing object.

[00228] Wherein, the scaling property may be adjusted by changing the focal lengths of the at least two lenses respectively, and the focal lengths may be adjusted by the following methods: 1) adjusting the focal length by adjusting the curvature of at least one surface of the lens; for example, for a lens comprising a cavity formed by a double-layer transparent layer, the focal length is adjusted by adding or removing the liquid medium in the cavity. In this case, the above-mentioned scaling parameter may mean, for example, that the liquid medium is reduced or increased by a certain amount; 2) adjusting the focal length by changing the refractivity of the lens; for example, for a lens filled with a specific liquid crystal medium, the arrangement mode of the liquid crystal medium is adjusted by adjusting the voltage of the electrode corresponding to the liquid crystal medium, thereby changing the refractivity of the lens. In this case, the above-mentioned scaling parameter may mean, for example, that the voltage of the electrode is increased or reduced by a certain value.

[00229] In addition to the focal lengths mentioned above, the scaling property may also be adjusted by changing the relative position between the at least two lenses. Herein, the relative position between the lenses may be changed by adjusting the relative distance between the lenses along the optical axis direction, and/or the relative position between the lenses in the direction perpendicular to the optical axis, and/or the relative rotation angle around the optical axis.

[00230] Wherein, the first lens 321 may be set such that the curvature of its side facing the user's eye 200 is adjustable, the second lens 322 may be set such that the curvature of its side facing the gazing object is adjustable, and the positions of the first lens 321 and the second lens 322 are fixed, so that the glasses are simple in structure, lightweight and portable.

[00231] As shown by the first imaging lens group 3201 on the left of the glasses in Fig. 7b, in a possible implementation of the embodiment of this application, the plurality of sub-areas 3201c with adjustable scaling property are distributed in a rectangular array. In this embodiment, the sub-areas 3201c are of the same size with the rows and columns aligned; and in other embodiments, the sub-areas 3201c may also be set with unaligned rows and columns.

[00232] As shown by the second imaging lens group 3202 on the right of the glasses in Fig. 7b, the plurality of sub-areas 3202c with adjustable scaling property are distributed in a radial concentric circle (consisting of several concentric circles 3202d and several radial lines 3202e radially connecting adjacent concentric circles 3202d) array. In this embodiment, the radial lines 3202e of the radial concentric circles are aligned, and in other embodiments, the radial lines 3202e between two adjacent concentric circles 3202d may also be unaligned.

[00233] In Fig. 7b of this implementation, for the purpose of description, two imaging lens groups 320 with differently distributed sub-areas are arranged in one pair of glasses; in practical application, the left and right imaging lens groups 320 of a pair of glasses are generally the same or similar in sub-area distribution.

[00234] Of course, persons skilled in the art will appreciate that, in addition to the rectangular array and the radial concentric circle array above, the sub-areas may also be distributed in other arrays or in a non-array manner.

[00235] The sub-area determining unit 330 is configured to determine the corresponding sub-areas according to the projection of the gazing object on the imaging lens group, wherein, the projection is a projection of the gazing object on the imaging lens group in the line-of-sight direction, and may be calculated according to the relative position between the imaging lens group 320 and the gazing object and the line-of-sight direction.
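
As a rough illustration of the sub-area determination of paragraph [00235], the sketch below intersects the line of sight with the plane of the imaging lens group and maps the resulting projection point to a rectangular sub-area index; the lens plane position, the grid layout and the function names are illustrative assumptions, and the projection of an extended gazing object would in general cover several sub-areas.

```python
import numpy as np

def project_onto_lens_plane(eye_position, sight_direction, lens_plane_z):
    # Intersect the line of sight with the (assumed) plane z = lens_plane_z.
    eye = np.asarray(eye_position, dtype=float)
    d = np.asarray(sight_direction, dtype=float)
    t = (lens_plane_z - eye[2]) / d[2]
    return eye + t * d                       # 3-D point on the lens plane

def sub_area_index(point_on_lens, lens_width, lens_height, rows, cols):
    # Map the projection point to the (row, col) index of a rectangular
    # sub-area grid centred on the optical axis.
    x, y = point_on_lens[0], point_on_lens[1]
    col = int((x + lens_width / 2) / lens_width * cols)
    row = int((y + lens_height / 2) / lens_height * rows)
    return min(max(row, 0), rows - 1), min(max(col, 0), cols - 1)

if __name__ == "__main__":
    # Eye at the origin, looking slightly up and to the right; lens plane 2 cm away.
    p = project_onto_lens_plane([0, 0, 0], [0.1, 0.05, 1.0], lens_plane_z=0.02)
    print("projection point on lens:", np.round(p, 4))
    print("corresponding sub-area:", sub_area_index(p, 0.04, 0.03, rows=6, cols=8))
```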

[00236] The parameter determining unit 340 may determine the scaling parameter of the corresponding sub-areas according to the actual viewing distance from the gazing object to the eye or the actual area proportion of the fundus images of the gazing object on the fundus, and see relevant description of step S230 in the method embodiment for the specific determination method.

[00237] Generally, the sub-area determining unit 330 and the parameter determining unit 340 may be integrated into the same processor, thereby reducing the weight of the glasses and increasing their portability. The processor may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement the embodiment of this application.

[00238] The property adjusting unit 430 is not shown in Fig. 7a; it generally adjusts the scaling property of the corresponding sub-areas by outputting voltage or current signals corresponding to the scaling parameter to the imaging lens group 320.

[00239] In addition, the glasses may also comprise the time judging unit 410 and the refraction judging unit 420.

[00240] Wherein, the time judging unit 410 generally comprises a timer configured to monitor the time for which the position of the focus point of the user's eye remains unchanged, or to monitor the dwell time of the corresponding image at the central fovea of the macular area; in the case that the time for which the position of the focus point of the user's eye remains unchanged exceeds the predetermined time, or the dwell time of the corresponding image of the same object at the central fovea exceeds the predetermined time, it can be judged that the user is currently gazing at the object.

[00241] The refraction judging unit 420 may be realized by using an existing refraction detecting device; this is prior art and the details are not described here.

[00242] Fig. 8a and Fig. 8b are diagrams of application scenarios of the device 300 of this application. Fig. 8a is a schematic diagram of the original view of a user driving a car, wherein the front car 810 is the car in front of the car driven by the user (not shown), the side car 820 is the car diagonally in front of the car driven by the user, and the lane line 830 is the lane line between the car driven by the user and the side car 820. Suppose the user wants to look carefully at the license plate number of the front car 810 at this moment, to identify whether his or her friend is driving that car, but the front car 810 is more than 100 m away and the license plate number cannot be identified by the naked eye.

[00243] Fig. 8b is a schematic diagram of the view obtained by the user through the device 300. It can be seen that the front car 810 is brought close and its license plate number 123456 is clear, while the side car 820 and the lane line 830 are still in the view and keep their original distances from the user's car, thereby helping to increase the user's driving safety.

[00244] The structure of the imaging device for local scaling of the embodiment of the present invention is shown in Fig. 9. The specific embodiment of the present invention does not limit the specific implementation of the imaging device for local scaling. As shown in Fig. 9, the imaging device 900 may comprise:

[00245] a processor 910, a Communications Interface 920, a memory 930 and a communication bus 940. Wherein:

[00246] The processor 910, the communications interface 920 and the memory 930 complete the mutual communication through the communication bus 940.

[00247] The communications interface 920 is configured to communicate with other network elements.

[00248] The processor 910 is configured to execute the program 932; specifically, it can implement the relevant steps in the above method embodiment.

[00249] Specifically, the program 932 may comprise program code, which comprises computer operating instructions.

[00250] The processor 910 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more integrated circuits configured to implement the embodiment of the present invention.

[00251] The memory 930 is configured to store the program 932. The memory 930 may contain a high-speed RAM memory and may also comprise a non-volatile memory, such as at least one disk memory. Specifically, the program 932 may execute the following steps:

[00252] determining the gazing object of an eye;

[00253] determining the corresponding sub-areas of the imaging lens group according to the gazing object; the imaging lens group being configured to scale and image for the gazing object, comprising a plurality of sub-areas with adjustable scaling property;

[00254] and determining the scaling parameter of the corresponding sub-areas according to the gazing object.

[00255] For the specific implementation of the steps in the program 932, see the corresponding steps or modules in the above-mentioned embodiment, and the details are not described here. It may be clearly understood by persons skilled in the art that, for the purpose of convenient and brief description, for the detailed working process of the above mentioned device and module, refer to the corresponding process description in the above-mentioned method embodiment, and the details are not described here again. In summary, in the device of the embodiment, the user's fundus images of the gazing object can be scaled in a local scaling mode, so as to avoid changing the overall view of the user and to enable the user to conveniently observe the gazing object and to simultaneously correctly perceive the surrounding environment.

[00256] Persons of ordinary skill in the art should appreciate that, in combination with the examples described in the embodiments herein, the units and method steps can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether the functions are executed by hardware or software depends on the particular applications and design constraints of the technical schemes. Skilled persons may use different methods to implement the described functions for each particular application, but such implementation should not be considered to go beyond the scope of this application.

[00257] When being implemented in the form of a software functional unit and sold or used as a separate product, the functions may be stored in a computer-readable storage medium. Based on this understanding, the technical scheme of this application, in essence, or the part contributing to the prior art, or a part of the technical scheme, may be embodied in the form of a software product; the computer software product is stored in a readable storage medium and comprises various instructions for enabling a computer apparatus (which may be a personal computer, a server, a network apparatus, etc.) to implement all or some of the steps of the method of each embodiment of this application. The above-mentioned storage medium comprises any medium that may store program codes, such as a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.

[00258] The above implementations are only used for illustrating this application and are not intended to limit it; various modifications and improvements may be made by persons of ordinary skill in the relevant technical field within the spirit and scope of this application; accordingly, all equivalent technical schemes belong to the scope of this application, and the scope of patent protection of this application shall be defined by the Claims.