


Title:
OCULAR SCREENING
Document Type and Number:
WIPO Patent Application WO/2018/100589
Kind Code:
A1
Abstract:
Examples of ocular screening are described herein. In an example, an examination system (100) for ocular screening comprises an illumination unit (104) for illuminating an eye of an examinee (142), and a metering engine (120) coupled to the illumination unit (104). The metering engine (120) may receive examining parameters and, based on the examining parameters received, determine characteristics of illumination receivable from the illumination unit (104) and generate control instructions to configure the illumination unit (104). In another example, based on the examining parameters received, the metering engine (120) may generate control instructions to configure an imaging unit (106). Further, an inspection engine (122) may receive examinee information and, based on the examinee information received, determine a type of engaging-stimuli and trigger generation of an engaging-stimuli to obtain the attention of the examinee. The images may be assessed to detect various ocular defects.

Inventors:
DAVIDE BREE ANNE (AU)
LAROIA MANISHA (IN)
LINGARD BRODIE MAE (AU)
TRIPATHI SUTEERTH (IN)
SHARMA PRADEEP (IN)
Application Number:
PCT/IN2017/050561
Publication Date:
June 07, 2018
Filing Date:
November 30, 2017
Assignee:
SEC DEP OF BIOTECHNOLOGY (IN)
International Classes:
A61B3/00
Foreign References:
US20120293773A12012-11-22
US20070171363A12007-07-26
US20150061995A12015-03-05
Other References:
None
Attorney, Agent or Firm:
LAKSHMIKUMARAN, Malathi et al. (IN)
Claims:
I/We claim:

1. A system (100) for ocular screening, the system (100) comprising:

a plurality of sensors (110);

an illumination unit (104) for illuminating the eyes of an examinee (142);

a metering engine (120) coupled to the illumination unit (104), wherein the metering engine (120) is to:

receive examining parameters from the plurality of sensors (110), wherein the examining parameters comprise one of ambient light, distance between the system (100) and the examinee, and pupil size of the examinee;

based on the examining parameters received, determine characteristics of illumination required for the illumination unit (104) and generate control instructions to configure the illumination unit (104).

2. The system (100) as claimed in claim 1, wherein the illumination unit (104) comprises:

an illumination source (202) emitting a light beam for irradiating an eye or both the eyes of the examinee (142); and

an illumination driver (204) for accurate directing of the light beam at the eye of the examinee (142).

3. The system (100) as claimed in claim 1, comprising an imaging unit (106) for obtaining images, wherein the imaging unit (106) is coupled to the metering engine (120), and wherein the metering engine (120) is to:

based on the examining parameters received, generate control instructions to configure the imaging unit (106).

4. The system (100) as claimed in claim 3, wherein the imaging unit (106) comprises an imaging device (206), one or more optical lenses (208) and an optical aperture (212).

5. The system (100) as claimed in claim 1, wherein the plurality of sensors (110) comprise:

an ambient light sensor (112) for measuring ambient light; and

a distance sensor (114) for measuring distance between the system (100) and the examinee (142).

6. The system (100) as claimed in claim 1, wherein the characteristics of illumination comprise one of eccentricity of illumination and intensity of illumination.

7. The system (100) as claimed in claim 1, wherein examinee information comprises one of identity, age and ethnicity of the examinee.

8. The system (100) as claimed in claim 1 comprises:

an inspection engine (122) to:

receive examinee information;

based on the examinee information received, determine a type of engaging-stimuli to be generated, and trigger generation of the engaging-stimuli to obtain engagement of the examinee.

9. The system (100) as claimed in claim 8, wherein the inspection engine (122) is to:

receive input from an examiner (144); and

based on the examiner input received, trigger generation of an engaging-stimuli to obtain engagement of the examinee (142).

10. The system (100) as claimed in claim 8, wherein the engaging-stimuli comprises one of visual patterns and audio patterns.

11. The system (100) as claimed in claim 3 comprises:

an assessment engine (124) to:

upon configuring of the illumination unit (104) and the imaging unit (106), and obtaining engagement of the examinee, trigger obtaining of images by the imaging unit (106).

12. A method (400) for ocular screening comprising:

receiving examining parameters from a plurality of sensors (110); and

based on the examining parameters received, generating, by a metering engine (120), control instructions for configuring an imaging unit (106).

13. The method (400) as claimed in claim 12, wherein configuring the imaging unit (106) comprises:

adjusting position of optical lenses (208); and

adjusting position and size of optical aperture (212).

14. The method (400) as claimed in claim 12, wherein the method comprises:

receiving, by an inspection engine (122), examinee information; and

based on the examinee information received, triggering, by the inspection engine (122), generation of an engaging-stimuli to obtain engagement of the examinee.

15. The method (400) as claimed in claim 12 comprises:

upon configuring of the illumination unit (104) and the imaging unit (106) and obtaining of engagement of the examinee, triggering, by an assessment engine (124), obtaining of images by the imaging unit (106), and outputting, by the assessment engine (124), the images obtained by the imaging unit (106).

16. The method (500) as claimed in claim 12 comprises:

upon configuring of the illumination unit (104) and the imaging unit (106), and obtaining engagement of the examinee, triggering, by an assessment engine (124), obtaining of images by the imaging unit (106), and outputting, by the assessment engine (124), the images obtained by the imaging unit (106).

17. A non-transitory computer readable medium comprising instructions which, when executed by a processing resource, cause the processing resource to:

receive examining parameters;

based on the examining parameters received, generate control instructions for configuring an imaging unit (106); and

determine characteristics of illumination receivable from an illumination unit (104) and generate control instructions to configure the illumination unit (104).

Description:
OCULAR SCREENING

TECHNICAL FIELD

[0001] The present subject matter relates, in general, to ocular screening.

BACKGROUND

[0002] Various screening techniques are used for diagnostic or for medical examination. With the advances in technology, screening techniques have become more efficient, less invasive and widely adopted in clinical practice.

[0003] Current methods of ocular screening may be easily implemented for adults, who are cooperative and capable of following instructions from the practitioner. However, the same may not hold true for younger patients, such as children or toddlers. For example, young children may not be able to describe their vision or visual symptoms, due to which it may not always be possible to accurately examine a child's eyes. Furthermore, younger patients may not follow the instructions from the practitioner. This may result in an incomplete medical examination and, in some cases, may also result in a medical condition going undetected.

BRIEF DESCRIPTION OF THE DRAWINGS

[0004] The detailed description is described with reference to the accompanying figures. In the figures, the left- most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of the system(s) in accordance with the present subject matter are described, by way of examples only, and with reference to the accompanying figures, in which:

[0005] Fig. 1 illustrates a block diagram of an example system for ocular screening;

[0006] Fig. 2 illustrates a system diagram of an example of an examining unit of a system for ocular screening;

[0007] Fig. 3 illustrates a system diagram of an example device for ocular screening;

[0008] Fig. 4 illustrates a flow chart of an example method for ocular screening; and

[0009] Fig. 5 illustrates a flow chart of another example method for ocular screening.

DETAILED DESCRIPTION

[0010] Ocular screening is performed for medical examination and for detecting probable eye-related abnormalities, disorders or diseases. Photo-refraction, also known as photo-retinoscopy or photo-screening, is an ocular screening technique which employs photography for measuring the refractive error and accommodative response of an eye of a patient. Light emitted from a source placed close to an imaging device, such as a camera, is reflected from the eye and then received by the imaging device. Photorefractive methods are generally preferred as they provide objective results and are quicker to perform. Further, photorefractive methods do not require the patient to maintain a specific posture suitable for the medical examination, which makes these methods highly suitable for examining the eyes of infants and young children.

[0011] Various photorefraction techniques may be used, such as orthogonal photorefraction, isotropic photorefraction, and eccentric photorefraction. The eccentric photo-screening technique is generally used to detect the presence of ocular defects, such as astigmatism and strabismus. However, conventional eccentric photo-screening techniques may lack the ability to adequately determine these defects. In eccentric photorefraction, a crescent-shaped red reflex (retinal reflex) may be generated in the pupil plane. Inaccuracies may arise in detecting the crescent boundary of the reflex when assessing the refractive error in the eye. A lack of pupil data defining its size and location during image processing may account for the low sensitivity and specificity of current devices. A common problem occurs when eyes with different refractive errors give similar red-reflex results, particularly due to a poor light stimulus.

[0012] Different examining machines or systems may have different operation characteristics and sensitivities, and thus may be operated differently. For example, such machines may require a specific distance to be maintained between the examinee's eye and the examining machine for accurate results, and inconsistencies in the distance may hamper the accuracy of the results. Also, some examining machines may require a very short distance between the examining machine and the examinee's eyes. This may lead to discomfort and behavioural reflexes, such as closing of the eye, which may further affect the quality of the results.

[0013] Furthermore, paediatric eye screening is often not designed to accommodate child behaviour, such as a short attention span and limited verbal communication skills. A short attention span prevents younger patients or toddlers from concentrating on test objects during examination. Furthermore, young patients may also not completely follow the instructions from the practitioner for the ocular examination to be carried out. Hence, multiple repetitions of the screening procedure may be required. Also, as conventional diagnostic techniques may employ multiple testing charts, such as Teller, Lea, Cardiff, ETDRS and tumbling E/C, repetition of the charts may make the overall procedure time consuming.

[0014] Furthermore, current photorefraction devices work at a fixed working distance, and any variation in the working distance may affect the accuracy of the results. Hence, operating such devices requires a skilled operator to fix the distance.

[0015] The aforementioned examination techniques may involve measuring the pupil size and pupil reflex of the eye. However, the pupil size and the pupil reflex may be affected by ambient light. Consequently, the examination results obtained may require adjustments to accommodate the effect of ambient light, which may make the overall screening process time consuming and, therefore, uncomfortable for the examinee and the examiner.

[0016] Conventional ocular screening techniques may employ an attention-seeking stimulus to obtain the attention of the examinee for accurate examination. However, conventional techniques generally employ the same attention-seeking stimulus, using fixed light and sound patterns, irrespective of the age and cognitive abilities of the examinee. Therefore, such attention-seeking stimuli may not prove effective on all examinees.

[0017] The present subject matter provides approaches for ocular screening. The ocular screening may be performed by an examination system, which may include an examining unit. The examining unit may further include an illumination unit and an imaging unit. The illumination unit may emit a light beam for irradiating either a portion or the entirety of the eye of the patient. The light reflected from the eye is received by the imaging unit. The examination system may also be adapted for examining both eyes of the examinee simultaneously. Further, examining of the eye, hereinafter, may refer to examining one eye at a time or both eyes of the examinee simultaneously.

[0018] In parallel, one or more examining parameters are obtained prior to or during screening to avoid erroneous examination results. Examples of examining parameters include, but are not limited to, ambient light, the distance between the examination system and the examinee, and the pupil size of the examinee. In other examples, examinee information, such as the identity, age and ethnicity of the examinee, may be noted as well. The examining parameters may be obtained from various sensors, such as an ambient light sensor and a distance sensor, and by measuring the pupil size. However, the examining parameters may be obtained from any other source as well without deviating from the scope of the present subject matter. Based on the examining parameters, the examination system for ocular screening configures the illumination unit and the imaging unit. For example, based on the examining parameters, the characteristics of illumination, such as the intensity and eccentricity of illumination produced by the illumination unit, and the focus and field of view of the imaging unit are adjusted before carrying out the screening procedure.

[0019] The examination system for ocular screening may then be aligned with the examinee's eyes for ocular screening.

[0020] The present subject matter provides for ocular screening which allows the screening to be performed at varying working distances between the examination system and the examinee. Further, the present subject matter takes into account the effects of ambient light on the pupil size and the pupil reflex. Furthermore, the present subject matter allows for real-time assessment of images, thereby minimizing the chances of obtaining unsuitable images and of repeating the procedure due to inappropriate examining conditions. Yet further, the subject matter allows for generating an attention-seeking stimulus taking into account the examinee's characteristics, such as age and cognitive ability.

[0021] These and other advantages of the present subject matter are described in greater detail in conjunction with the following figures. While aspects of the described ocular screening can be implemented in any number of different configurations, the embodiments are described in the context of the following examination system(s).

[0022] These and other advantages of the present subject matter are described in greater detail in conjunction with Figs. 1-5 in the following description. The manner in which the ocular screening is implemented and operated shall be explained in detail with respect to Figs. 1-5.

[0023] It should be noted that the description merely illustrates the principles of the present subject matter. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present subject matter and are included within its scope. Furthermore, all examples recited herein are intended only to aid the reader in understanding the principles of the present subject matter. Moreover, all statements herein reciting principles, aspects and embodiments of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.

[0024] Fig. 1 illustrates a block diagram representation of an examination system 100 for ocular screening, as per an example. It should be noted that Fig. 1 provides various functional blocks and should not be construed as a limitation. The examination system 100 includes an examining unit 102, sensor(s) 110, engines 118, interface(s) 128, memory 130, and data 132. The examining unit 102 may further include an illumination unit 104, an imaging unit 106 and other examination unit(s) 108. The sensor(s) 110 may further include an ambient light sensor 112, a distance sensor 114 and other sensor(s) 116. It should be understood that the sensor(s) 110 may be implemented as an array of individual sensors, or may be implemented as consolidated sensors configurable to implement multiple functions, without deviating from the scope of the present subject matter. The engine(s) 118 may include a metering engine 120, an inspection engine 122, an assessment engine 124 and other engine(s) 126. The data 132 includes data that is either predefined or generated as a result of the functionalities implemented by any of the engine(s) 118. Further, the data 132 may be partially or completely stored in the memory 130. Further, the data 132 may include sensor data 134, examining data 136, predetermined ocular data 138 and other data 140. It should be noted that such exemplifications are only indicative and should not be construed as limitations. The interface(s) 128 may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, network devices, and the like. The interface(s) 128 facilitate communication between the engine(s) 118 and various devices, such as the sensor(s) 110, connected in the examination system 100. In some cases, the interface(s) 128 may also facilitate communication between the examination system 100 and one or more other computing devices.

[0025] The engine(s) 118 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the engine(s) 118. In the examples described herein, such combinations of hardware and programming may be implemented in a number of different ways. For example, the programming for the engine(s) 118 may be processor-executable instructions stored on a non-transitory machine-readable storage medium, and the hardware for the engine(s) 118 may include a processing resource (for example, one or more processors) to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the engine(s) 118. In such examples, the examination system 100 may include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the examination system 100 and the processing resource. In other examples, the engine(s) 118 may be implemented by electronic circuitry. In another example, the engine(s) 118 may be implemented on external computing devices which communicate with the examination system 100 through the interface(s) 128.

[0026] The memory 130 may store one or more computer-readable instructions, which may be fetched and executed so as to provide access to digital content using a machine-readable link. The memory 130 may include any non-transitory computer-readable medium including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.

[0027] The examination system 100 may be implemented inside a housing as a standalone device, and may have capabilities to communicate with other devices, or configurations which allow it to be coupled with other devices, such as an ophthalmoscope, a torchlight, or a computing device, such as a phone, a tablet or a desktop.

[0028] The illumination unit 104 of the examining unit(s) 102 may further include an illumination source for irradiating the eye. The illumination source may emit a beam of light which may then be directed at one or both of the eyes of an examinee 142 to irradiate the eye. The illumination unit 104 may be powered by an electric source.

[0029] The imaging unit 106 may obtain images of the eye irradiated by the illumination unit 104. The imaging unit 106 may further include an imaging device, such as a camera, one or more optical lenses, and an aperture. Further, the imaging device, the one or more optical lenses and the aperture may be configurable to focus on the eye for capturing images.

[0030] The examination system 100 may further include the metering engine 120. The metering engine 120 may receive examining parameters, such as ambient light, distance between the examination system 100 and the examinee, and pupil size of the examinee. The metering engine 120 may receive the examining parameters from various sensors, such as ambient light sensor and distance sensor.

[0031] In an example, based on the examining parameters received, the metering engine 120 may determine one or more operating characteristics for operating the illumination unit 104. The operating characteristics may be further processed for generating control instructions to configure and/or operate the illumination unit 104. The one or more operating characteristics for operating the illumination unit 104 may include intensity of illumination and eccentricity of illumination. Further, the intensity of the illumination unit 104 may be configured by varying the power input to the illumination unit 104, and the eccentricity of the illumination unit 104 may be configured by changing the position of the illumination unit 104. In another example, based on the examining parameters received, the metering engine 120 may generate control instructions to configure and/or operate the imaging unit 106.

[0032] In operation, the metering engine 120 receives the examining parameters, such as the ambient light, the distance between the examination system 100 and the examinee 142, and the pupil size of the examinee. Upon receiving the examining parameters, the metering engine 120 determines the characteristics of the illumination required. In an example, the metering engine 120, based on the ambient light, the distance between the examination system 100 and the examinee 142, and the pupil size of the examinee 142, may determine the intensity of the light required from the illumination unit 104. Further, the metering engine may determine the eccentricity of the light required from the illumination unit 104 based on the distance between the examination system 100 and the examinee. In another example, the metering engine may additionally receive examinee information, such as the age and ethnicity of the examinee, and based on the examinee information may determine the operating characteristics of the illumination unit 104. For example, for younger examinees, such as children, the intensity of light required from the illumination unit 104 may be different as compared to adults. Further, the intensity of light required from the illumination unit 104 may also vary on the basis of the ethnicity of the examinees.

[0033] Upon configuring of the illumination unit 104, based on the control instructions generated by the metering engine 120, the illumination unit 104 may emit a beam of light to irradiate the eye of the examinee.
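The metering step described above can be sketched in code. The following Python sketch is illustrative only: the patent does not disclose concrete formulas, so the constants, the linear relationships, and the function name `determine_illumination` are all assumptions.

```python
def determine_illumination(ambient_lux, distance_cm, pupil_mm, examinee_age=None):
    """Illustrative metering step: map examining parameters (ambient light,
    working distance, pupil size) to illumination characteristics
    (intensity, eccentricity). All constants are hypothetical."""
    # Brighter ambient light and a smaller pupil call for a stronger stimulus.
    intensity = 0.2 + 0.001 * ambient_lux + 0.05 * max(0.0, 6.0 - pupil_mm)
    # Hypothetical age adjustment: a gentler beam for young children.
    if examinee_age is not None and examinee_age < 5:
        intensity *= 0.8
    intensity = min(intensity, 1.0)  # clamp to the driver's normalized range
    # Eccentricity (source offset from the camera axis) scales with distance.
    eccentricity_mm = 0.02 * distance_cm
    return {"intensity": round(intensity, 3), "eccentricity_mm": eccentricity_mm}
```

The returned record stands in for the "control instructions" passed on to the illumination driver; in the described system these would configure the power input and the source position.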

[0034] The imaging unit 106 may obtain images of the examinee's eye irradiated by the illumination unit 104. In an example, the imaging unit 106 includes an imaging device, such as a camera. The imaging unit may further include one or more optical lenses and an aperture. However, in other examples, the imaging unit 106 may be implemented as an arrangement of one of photodiodes, photoresistors, and phototransistors. Further, the imaging unit 106 may be sensitive to multiple wavelengths of light, so that the imaging unit 106 may record the combined intensities of all wavelengths together. Alternatively, the imaging unit 106 may be sensitive to individual wavelengths and may record different wavelengths separately.

[0035] In an example, the metering engine 120, based on the examining parameters received, further generates control instructions to configure the imaging unit 106. The configuring of the imaging unit 106 may include adjusting the sensitivity of the imaging device, adjusting the position of the optical lenses, and adjusting the position and size of the aperture.
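The three adjustments named above can be represented as a control-instruction record. This is a minimal sketch under stated assumptions: the field names, units, and threshold rules below are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class ImagingConfig:
    """Illustrative control instructions for the imaging unit: one field
    per adjustment named in the description (sensitivity, lens position,
    aperture). Names and units are assumptions."""
    sensitivity_iso: int
    lens_position_mm: float
    aperture_diameter_mm: float

def configure_imaging(ambient_lux, distance_cm):
    # In dim ambient light, raise sensor sensitivity (hypothetical rule).
    iso = 800 if ambient_lux < 50 else 200
    # Focus the lenses for the measured working distance.
    lens_mm = 10.0 + 0.01 * distance_cm
    # Open the aperture wider when less light is available.
    aperture_mm = 4.0 if ambient_lux < 50 else 2.0
    return ImagingConfig(iso, lens_mm, aperture_mm)
```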

[0036] The examination system 100 further includes an inspection engine 122. In an example, the inspection engine 122 receives examinee information. Further, based on the examinee information received, the inspection engine 122 automatically determines a type of engaging-stimuli to be generated. Upon determining a type of engaging-stimuli to be generated, the inspection engine 122 may then trigger the generation of an engaging-stimuli to obtain engagement of the examinee.

[0037] As mentioned before, certain examinees, especially children, may have short attention spans and may be easily distracted. Lack of attention on the part of the examinee may lead to the imaging unit obtaining faulty images, which may turn out to be unsuitable for examination or may affect the accuracy of the examination results. Therefore, it may be desirable to obtain the examinee's full attention while the examination is performed by the examination system 100. To obtain the attention of the examinee, the examinee may be provided an engaging-stimuli that may prevent the examinee from being distracted. The engaging-stimuli may use the technique of arrested nystagmus based on the optokinetic response of the eye, which may allow for accurate reading of hypermetropia and strabismus. Further, an engaging-stimuli may be selected which may work best for the examinee.

[0038] Returning to Fig. 1, based on the examinee information received, the inspection engine 122 may determine a type of engaging-stimuli to be generated. The engaging-stimuli may be provided in the form of predetermined audio patterns and/or visual patterns. The visual patterns may include an arrangement of coloured LEDs and/or images. For example, if the examinee information indicates that the examinee is a toddler, the inspection engine 122 may trigger playing of predetermined audio patterns. On the other hand, if the examinee information indicates a pre-school child, the inspection engine 122 may trigger playback of predetermined visual patterns, such as popular cartoons or arrangements of alphabets, or a combination of audio patterns and visual patterns. Further, the audio patterns may include buzzer sounds, or sounds of objects, animals and characters, which may be provided by buzzers or speakers. The visual patterns may be provided through a display, an arrangement of visible light sources, such as coloured LEDs, or an image of a character or an object. Once the attention of the examinee 142 is obtained, the imaging unit 106 may capture images of the eye of the examinee. Further, the examiner 144 may judge whether the attention of the examinee has been obtained based on visual inspection. The visual inspection may include checking whether the examinee's eye is open, whether the pupil size and illumination are in the required ranges, and whether the eye and the examination system 100 are aligned.
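The age-based selection described above can be sketched as a simple mapping. The age cut-offs and pattern names below are illustrative assumptions; the description only gives the toddler and pre-school examples.

```python
def select_stimulus(age_years):
    """Illustrative inspection-engine choice: audio patterns for toddlers,
    visual (or combined) patterns for pre-school children, a plain fixation
    target otherwise. Cut-offs and pattern names are hypothetical."""
    if age_years < 3:   # toddler: trigger predetermined audio patterns
        return {"type": "audio", "patterns": ["buzzer", "animal_sounds"]}
    if age_years < 6:   # pre-school: visual patterns, e.g. cartoons, or a mix
        return {"type": "visual+audio", "patterns": ["cartoon", "alphabet_leds"]}
    return {"type": "visual", "patterns": ["fixation_target"]}
```

In the described system the returned choice would then be played back through the speakers, LEDs or display before image capture begins.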

[0039] In another example, the inspection engine 122 receives an input from an examiner 144 for generating an engaging-stimuli. Based on the input received from the examiner, the inspection engine 122 triggers generation of an engaging-stimuli to obtain engagement of the examinee. In other words, the examiner may decide whether an engaging-stimuli is required and the type of engaging-stimuli to be provided to the examinee. For example, the examiner may decide to provide the engaging-stimuli based on a visual observation of the examinee.

[0040] The examination system 100 may further include an assessment engine 124. Upon configuring of the illumination unit 104 and the imaging unit 106, and obtaining the attention of the examinee, the assessment engine 124 may trigger obtaining of images by the imaging unit 106. The images obtained by the imaging unit 106, along with the examinee information, may be stored as examining data 136.

[0041] The assessment engine 124 may perform real-time assessment of the obtained images. Based on the assessment, if the obtained images are not found to be of required quality, the assessment engine 124 may provide an indication to the examiner 144 to repeat the examination procedure. Further, the assessment engine 124 may indicate a need for reconfiguring the examining unit 102 (illumination unit 104 and imaging unit 106), based on which the examiner may then provide an input for reconfiguring of the examining unit 102. However, the assessment engine 124 may also coordinate with the metering engine 120 to automatically reconfigure the examining unit 102. In another example, the assessment engine 124 may indicate lack of attention of the examinee 142, following which the inspection engine 122 may cycle through the various audio patterns and visual patterns until examinee's attention is obtained. Alternatively, the examiner 144 may cycle through the various audio and visual patterns and their combinations manually. The examination procedure may be repeated until images of required quality are obtained.
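The capture-assess-retry loop described above can be sketched as follows. This is a minimal sketch: `capture` and `quality_ok` are hypothetical callables standing in for the imaging unit and the real-time quality assessment, and the retry limit is an assumption.

```python
def run_examination(capture, quality_ok, stimuli, max_attempts=5):
    """Illustrative assessment loop: capture an image, assess it in real
    time, and on failure cycle through the engaging-stimuli and repeat,
    as the description outlines. Returns (image, attempts_used)."""
    for attempt in range(max_attempts):
        stimulus = stimuli[attempt % len(stimuli)]  # cycle audio/visual patterns
        image = capture(stimulus)
        if quality_ok(image):
            return image, attempt + 1
    return None, max_attempts  # signal the examiner that screening failed
```

For instance, with a capture routine that yields an acceptable image only on the third attempt, the loop would cycle through the stimuli twice before succeeding.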

[0042] Upon obtaining images of the required quality, the assessment engine 124 may assess the images to detect probable ocular defects, ocular conditions, and abnormalities. In an example, the assessment engine 124 compares the images (stored as examining data 136) with predetermined ocular data 138. The predetermined ocular data 138 may include data pertaining to normal eye conditions, data pertaining to various eye disorders, and data indicating deviation from the normal eye conditions. Examples of ocular conditions and abnormalities detected by the assessment engine 124 may include, but are not limited to, refractive errors, strabismus, cataract and the presence of foreign bodies. The images obtained, as well as the ocular conditions and abnormalities detected by the assessment engine 124, may then be displayed to the examiner via a display screen.

[0043] Fig. 2 illustrates an example of the examining unit 102. The examining unit 102, as illustrated, includes an illumination unit 104 and an imaging unit 106. The illumination unit 104 further includes an illumination source 202 and an illumination driver 204. The illumination source may be a near-infrared (NIR) LED and/or a visible light LED. In another example, the illumination unit 104 may include a plurality of illumination sources arranged so as to obtain an accurate eccentric photorefraction image. Furthermore, the illumination sources may be monochromatic or polychromatic. The imaging unit 106 further includes an imaging device 206, optical lenses 208, an optical adjuster 210 and an aperture 212 for aligning the eye of the examinee.

[0044] In operation, a light beam emitted by the illumination source 202 may be directed at one or both eyes of the examinee, and the reflected light beam may be received by the imaging unit 106. The position of the illumination source 202 of the illumination unit 104 may be adjustable to allow accurate directing of the light beam at the eyes of the examinee 142. The illumination source 202 may be adjusted by the illumination driver 204. As shown in Fig. 2, the illumination source 202 may slide along the length of the illumination driver 204, and its position may be fixed at a point wherefrom the light beam may be accurately directed at the eye of the examinee. The position of the illumination source 202 may be adjusted and fixed manually by the examiner 144. Alternatively, the illumination driver 204 may be coupled to the assessment engine 124, and based on the images of the eye obtained from the imaging unit 106, the assessment engine 124 may cause the illumination driver 204 to adjust and fix the position of the illumination source 202. Further, the illumination driver 204 may adjust the eccentricity and intensity of the illumination source 202. The illumination driver 204 may be coupled to the metering engine 120. Based on the examining parameters, the metering engine 120 may determine the characteristics of illumination receivable from the illumination unit 104 and generate control instructions to configure the illumination source 202. The control instructions may be picked up by the illumination driver 204, which may then configure the eccentricity and intensity of the illumination source 202. The illumination unit 104 may be controlled to provide constant illumination, pulsed illumination, synchronized flashes, or a combination of these. In an example, multiple illumination sources may be used, such that some of the illumination sources may be illuminated constantly, while the rest may be pulsed or synchronized with image capturing.
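The metering logic described above can be sketched in a few lines. The following Python fragment is an illustrative aid only, not part of the disclosed implementation: the function name, thresholds, and the particular mapping from examining parameters to drive level, eccentricity, and mode are hypothetical choices made to show how control instructions for the illumination source 202 might be derived.

```python
from dataclasses import dataclass
from enum import Enum


class IlluminationMode(Enum):
    CONSTANT = "constant"
    PULSED = "pulsed"
    SYNCHRONIZED = "synchronized"  # flashes synchronized with image capture


@dataclass
class IlluminationConfig:
    intensity: float        # normalised drive level, 0.0-1.0
    eccentricity_mm: float  # offset of the source from the camera axis
    mode: IlluminationMode


def meter_illumination(ambient_lux: float, distance_m: float) -> IlluminationConfig:
    """Derive illumination control instructions from examining parameters.

    Brighter rooms and larger working distances call for a stronger drive
    level; the eccentricity grows with distance so the photorefraction
    crescent remains measurable. All coefficients are illustrative.
    """
    intensity = min(1.0, 0.3 + ambient_lux / 1000.0 + 0.2 * distance_m)
    eccentricity_mm = 5.0 + 2.0 * distance_m
    mode = (IlluminationMode.SYNCHRONIZED if ambient_lux > 200
            else IlluminationMode.CONSTANT)
    return IlluminationConfig(intensity, eccentricity_mm, mode)
```

A real implementation would calibrate these mappings against the specific illumination source 202 and imaging device 206 used.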

[0045] The imaging unit 106 receives the light beam emitted by the illumination source 202 after reflection from the eye of the examinee. As mentioned earlier, the imaging unit 106 may include an imaging device 206, optical lenses 208, an aperture 212, and an optical adjuster 210. The imaging device 206 may be a digital camera. The aperture 212 may allow the light beam reflected from the eye of the examinee 142 to be received by the imaging device 206 via the optical lenses 208. The optical lenses 208 may focus the reflected light beam on the imaging device 206. Further, the optical lenses 208 and the aperture 212 may be adjustable by means of the optical adjuster 210 for accurately focusing the reflected light beam on the imaging device 206. As illustrated in Fig. 2, the optical lenses 208 may slide relative to each other along the optical adjuster 210. For example, the position of the various optical lenses 208 may be adjusted and fixed to accurately focus the reflected light beam on the imaging device 206. Furthermore, the optical lenses 208 may be adjusted and fixed manually by the examiner 144. In an example, the optical adjuster 210 may be coupled to the assessment engine 124, and based on the images of the eye obtained by the imaging device 206, the assessment engine 124 may cause the optical adjuster 210 to adjust the position of the optical lenses 208, control opening and closing of the aperture 212, and control the size of the aperture 212.

[0046] In an example, as shown in Fig. 2, the illumination unit 104 may be positioned near the aperture side of the imaging unit 106. Therefore, during operation, light from the illumination unit 104 irradiates the examinee's eye placed near the aperture 212, and the light reflected from the eye is received by the imaging device 206 after passing through the optical lenses 208. In another example (not shown), the illumination unit 104 may be positioned near the imaging device side of the imaging unit 106. In operation, the light beam from the illumination unit 104 passes through the optical lenses 208 to irradiate the eye of the examinee 142, and the light reflected from the eye of the examinee is received by the imaging device 206 upon passing through the optical lenses 208.

[0047] Fig. 3 illustrates an example of the examination system 100 implemented as a device 300. The device 300 includes a housing 302. The housing 302 may be made of any suitable material, such as plastic or metal. The housing 302 houses the examining unit(s) 102, the sensor(s) 110, the engines 118, the interface 128, a memory storage device, and a data storage device.

[0048] The device 300 may further include a display screen 304, a sensor aperture 306, an illumination light aperture 308, an optical aperture 212, and an input touchscreen panel 310. The display screen 304 may provide engaging stimuli in the form of images and video playback. The sensor aperture 306 may facilitate the operation of the various sensors, such as the ambient light sensor and the distance sensor. For example, the sensor aperture 306 may allow infrared light rays to be emitted in case the sensors are implemented as infrared sensors. The illumination light aperture 308 may allow passing of illumination light from the illumination source to the examinee's eye, and the optical aperture 212 may facilitate the receiving of light reflected from the examinee's eye by the imaging device 206. The illumination light aperture 308 may be in the form of a slot that allows the illumination source to slide, and its position to be adjusted and fixed, to ensure that the illuminating light is accurately directed at the eye of the examinee.

[0049] In an operation of the device 300, the examiner may position the device suitably to allow the light beam from the illumination source, via the illumination light aperture 308, to irradiate the eye of the examinee. The position of the illumination source may be adjusted on the illumination light aperture 308 (slot) to accurately direct the light beam from the illumination source on the eye of the examinee. The position of the illumination source may be adjusted manually by the examiner, or alternatively, the illumination source may be adjusted automatically by an illumination driver coupled to the assessment engine. The assessment engine may obtain sensor data, for example, the distance between the device and the examinee and the ambient light, along with the images of the eyes from the imaging unit, and based on these, the assessment engine may cause the illumination driver to adjust and fix the position of the illumination source. For example, if the distance between the examinee and the device changes, the position of the illumination source can be adjusted. In another example, if the examinee's eye has not been accurately captured by the imaging unit due to improper illumination of the eye, the assessment engine may cause the illumination driver to change the intensity of the illumination source.
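The closed-loop correction described in this paragraph, where the assessment engine changes the illumination intensity when the eye is improperly illuminated, can be sketched as a single proportional feedback step. This is an illustrative Python sketch, not part of the disclosure: the function name, the target brightness, the tolerance band, and the step size are all hypothetical.

```python
def adjust_illumination(mean_pupil_brightness: float,
                        target: float = 0.5,
                        current_intensity: float = 0.5,
                        step: float = 0.1) -> float:
    """One feedback step: nudge the source drive level toward the setting
    that yields a well-exposed pupil in the captured frame.

    All quantities are normalised to 0.0-1.0; the 0.05 dead band avoids
    oscillating around the target.
    """
    if mean_pupil_brightness < target - 0.05:
        # Frame too dark: raise intensity, clamped at full drive.
        return min(1.0, current_intensity + step)
    if mean_pupil_brightness > target + 0.05:
        # Frame too bright: lower intensity, clamped at zero.
        return max(0.0, current_intensity - step)
    return current_intensity  # within tolerance: leave unchanged
```

In practice such a step would run once per captured frame until the pupil exposure settles inside the tolerance band.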

[0050] The various sensors may sense the examining parameters, such as the ambient light and the distance between the examination system device 300 and the examinee's eye, and generate corresponding sensing signals. The sensing signals may be picked up by the metering engine 120. Based upon the sensing signals, the metering engine 120 may generate control signals to configure the examining unit. In an example, the metering engine 120 may generate control signals to configure the intensity and eccentricity of the illumination unit 104. In another example, the metering engine 120 may generate control signals to configure the various characteristics of the imaging unit 106, such as the size and position of the aperture and the position of the various optical lenses.

[0051] Upon configuring the examining unit, engaging stimuli may be provided to the examinee. The engaging stimuli may be provided in the form of audio patterns or visual patterns based on the type of examinee. For example, toddlers may be provided with audio patterns through speakers, while pre-schoolers may be provided with visual patterns through the display screen 304.
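The selection of an engaging stimulus by examinee type may be sketched as a simple lookup. The table entries, age thresholds, and file names below are hypothetical placeholders chosen only to illustrate the audio-for-toddlers, visual-for-pre-schoolers behaviour described above; they are not part of the disclosure.

```python
# Hypothetical stimulus table: channel selects speaker vs. display output.
STIMULI = {
    "toddler":     {"channel": "audio",  "pattern": "nursery_chime.wav"},
    "preschooler": {"channel": "visual", "pattern": "animated_star.mp4"},
    "adult":       {"channel": "visual", "pattern": "fixation_dot.png"},
}


def select_stimulus(age_years: float) -> dict:
    """Pick an engaging stimulus based on the examinee's type (here
    approximated by age): audio patterns for toddlers, visual patterns
    on the display for pre-schoolers and older examinees."""
    if age_years < 3:
        key = "toddler"
    elif age_years < 6:
        key = "preschooler"
    else:
        key = "adult"
    return STIMULI[key]
```

An inspection engine could equally key this table on examinee information entered by the examiner rather than on age alone.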

[0052] Light from the eye of the examinee is then received by the imaging unit via the optical aperture 212. A real-time assessment of the obtained images may be performed to determine whether quality images have been obtained or whether the screening procedure is to be repeated. Based on the real-time assessment, the examining unit(s) may be re-configured and/or a variation of engaging-stimuli may be provided to regain the attention of the examinee. Once images of the required quality are obtained, the images may be analysed to detect probable ocular defects, ocular conditions, and abnormalities.

[0053] The device 300 may further include an input touchscreen 310 providing a communication interface. For example, the input touchscreen 310 may be used by the examiner to enter examinee information and device settings. Further, the input touchscreen 310 may display the images obtained by the imaging unit, the results of the analysis of the assessment engine, and the device settings. The input touchscreen 310 may be a light emitting diode (LED) screen, a plasma screen, a liquid crystal display (LCD) screen, or an organic light emitting diode (OLED) screen. Further, a physical stimulus element, such as a haptic element, may be provided for receiving input from the examiner. Further, the device 300 may include an audio output, such as a speaker. Furthermore, a communication port 312 may be provided for enabling wired data transfer from and to the device 300. The communication port 312 may also be used to provide external power to the device during operation.

[0054] The working of the examination system 100 is explained in conjunction with Fig. 4, which illustrates an example method 400 for ocular screening. The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any appropriate order to execute the method 400 or an alternative method. Additionally, individual blocks may be deleted from the method 400 without departing from the spirit and scope of the subject matter described herein.

[0055] In an example, at block 402, various examining parameters are received. The examining parameters may include the ambient light, the distance between the examination system 100 and an examinee 142, and the pupil size of the examinee. Further, the examining parameters may be obtained from various sensors, such as an ambient light sensor and a distance sensor.

[0056] At block 404, the imaging unit 106 is configured based on the examining parameters received at block 402. In an example, the imaging unit 106 includes an imaging device 206, such as a digital camera, optical lenses 208, an optical aperture 212, and an optical adjuster 210. Configuring the imaging unit may include adjusting the sensitivity of the camera, adjusting the size and position of the optical aperture 212, and adjusting the relative distance between the various optical lenses 208. For example, the size of the aperture 212 may be configured based on the ambient light sensed by the ambient light sensor to record detailed image data of the examinee's eyes.
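The ambient-light-driven configuration of the imaging unit at block 404 can be illustrated with a small sketch. The lux thresholds, f-numbers, and ISO values below are hypothetical; an actual device would derive such settings from calibration of its imaging device 206.

```python
def configure_imaging(ambient_lux: float) -> dict:
    """Map sensed ambient light to imaging settings: dimmer rooms get a
    wider aperture (smaller f-number) and a higher sensor sensitivity
    (ISO). The breakpoints and values here are illustrative only."""
    if ambient_lux < 50:      # dark room
        return {"f_number": 2.0, "iso": 800}
    if ambient_lux < 300:     # typical indoor lighting
        return {"f_number": 2.8, "iso": 400}
    return {"f_number": 4.0, "iso": 200}  # bright environment
```

The returned settings would then be passed as control signals to the optical adjuster 210 and the camera driver.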

[0057] At block 406, upon configuring the imaging unit 106, a type of engaging-stimuli is determined and provided to the examinee, based upon the examinee information, to obtain the attention of the examinee 142 during the screening procedure. The type of engaging-stimuli to be provided may be determined either manually by the examiner 144, or it may be determined automatically by an inspection engine 122, based on the type of the examinee 142. Further, the engaging-stimuli may be provided in the form of audio patterns and/or visual patterns.

[0058] Once the imaging unit 106 is configured and the attention of the examinee is ensured, at block 408, the imaging unit 106 starts obtaining images of the eyes of the examinee. The eye or pair of eyes under examination may be irradiated by a light beam produced by an illumination unit 104, which may be positioned facing the examinee. In an example, the examination may be performed with the illumination unit 104 kept at a distance of 1-2 meters from the eye of the examinee. However, the distance between the illumination unit 104 and the eyes of the examinee may vary based on the other examining parameters. The light beam, upon reflection from the eyes, is then captured by the imaging unit 106 to obtain an image of the eyes.

[0059] At block 410, the images obtained by the imaging unit 106 at block 408 are assessed. The images may be assessed by an assessment engine 124. In an example, the assessment is carried out by comparing and analysing the obtained images against predetermined ocular data, such as data pertaining to normal eye conditions and various eye disorders. The assessment may be carried out in real-time to determine whether the screening procedure is to be repeated to obtain quality results, and based on the real-time assessment, the imaging unit 106 may be re-configured and a variation of engaging-stimuli may be provided to the examinee. Once quality images have been obtained, the assessment engine 124 assesses the images to detect probable ocular conditions and abnormalities, such as refractive errors, strabismus, cataract, and the presence of foreign bodies. The assessment may be carried out to identify abnormalities, such as dissimilarity in the ocular alignment of the two eyes (based on the Corneal Reflex and Hirschberg's principles), refractive errors, and the presence of foreign bodies in the eye.
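The Hirschberg-based alignment check mentioned above compares, for each eye, the displacement of the corneal light reflex from the pupil centre. A minimal sketch follows, assuming the pupil and reflex centres have already been localised in millimetres. The conversion factor of roughly 22 prism dioptres of deviation per millimetre of decentration is a commonly cited clinical approximation, and the asymmetry threshold is a hypothetical value; neither is taken from the disclosure.

```python
def hirschberg_deviation(pupil_center, reflex_center, pd_per_mm=22.0):
    """Estimate ocular deviation (in prism dioptres) from the displacement
    of the corneal light reflex relative to the pupil centre.

    pupil_center, reflex_center: (x, y) coordinates in millimetres.
    """
    dx = reflex_center[0] - pupil_center[0]
    dy = reflex_center[1] - pupil_center[1]
    decentration_mm = (dx * dx + dy * dy) ** 0.5
    return decentration_mm * pd_per_mm


def alignment_asymmetry(left_dev_pd, right_dev_pd, threshold_pd=10.0):
    """Flag a probable strabismus when the two eyes' estimated deviations
    differ by more than an illustrative threshold."""
    return abs(left_dev_pd - right_dev_pd) > threshold_pd
```

A clinical implementation would additionally correct for gaze direction, kappa angle, and the resolution of the imaging device 206.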

[0060] At block 412, upon assessment, the images obtained by the imaging unit 106 and the results about the presence of ocular conditions and abnormalities in the eye are provided to the examiner for their reference. In an example, the images obtained and the results of the assessment are displayed on the input touchscreen 310. Further, the obtained images and results may be communicated to other computing devices via various communication channels, such as Bluetooth, Wi-Fi, infrared, Near Field Communication (NFC), or ZigBee. In an example, the wireless channels may be implemented as Global System for Mobile Communications (GSM), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS), Long Term Evolution (LTE), and High-Speed Packet Access (HSPA).

[0061] Fig. 5 illustrates an example method 500 of working of the examination system 100 for ocular screening. At block 502, an examining parameter of ambient light is detected in the ocular screening environment. The ambient light may be detected by an ambient light sensor. At block 504, an examining parameter of working distance between the eye of the examinee 142 and the examination system 100 is determined. At block 508, based on the detected ambient light and the determined working distance between the eye of the examinee and the examination system, the illumination unit 104 and the imaging unit 106 of the examining unit(s) 102 are configured. Further, the configuring of the illumination unit 104 may include adjusting the intensity and eccentricity of illumination receivable from the illumination unit 104, and the configuring of the imaging unit 106 may include adjusting the sensitivity of the imaging device, adjusting the position of the optical lenses, and adjusting the position and size of the aperture.

[0062] At block 506, a type of engaging-stimuli to be provided and the timing of providing the engaging-stimuli are determined. At block 510, the engaging-stimuli determined at block 506 is provided to the examinee to obtain the attention of the examinee. The engaging-stimuli may be in the form of audio patterns, visual patterns, or a combination of the two.

[0063] At block 512, upon configuring the illumination unit 104, illumination is provided to irradiate one or both the eyes of the examinee. At block 514, upon configuring the illumination unit 104 and the imaging unit 106 and ensuring the attention of the examinee (after providing the engaging-stimuli), the imaging unit 106 may start obtaining images. At block 516, the obtained images are assessed. The images may be assessed by an assessment engine 124. In an example, the assessment engine 124 may assess the images qualitatively and quantitatively by comparing the data obtained from the images with the predetermined ocular data.
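The qualitative part of the assessment at block 516 can be sketched as a simple exposure-and-contrast score on a grayscale frame. The scoring formula and the acceptance threshold below are hypothetical illustrations; a real assessment engine 124 would additionally compare the image data against the predetermined ocular data.

```python
def image_quality(pixels):
    """Crude quality score for a grayscale frame (pixel values 0.0-1.0):
    penalise under- or over-exposure via the mean brightness and reward
    contrast via the standard deviation. Returns a score in 0.0-1.0."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    exposure = 1.0 - abs(mean - 0.5) * 2.0   # 1 at mid-grey, 0 at the extremes
    contrast = min(1.0, (var ** 0.5) * 4.0)  # saturates once sigma >= 0.25
    return exposure * contrast


def is_acceptable(pixels, threshold=0.3):
    """Decide whether a frame is good enough to proceed to diagnosis."""
    return image_quality(pixels) >= threshold
```

In a device, this decision would feed the 'YES'/'NO' branch taken at decision block 518.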

[0064] At decision block 518, based on the assessment of the obtained images, it is determined whether the quality of the images is up to a required level. If the determined quality is as required (the 'YES' path), the method proceeds to block 520. The images may be assessed quantitatively to analyse various aspects, such as pupil reaction, motility, visual fixation, binocular interaction, and visual acuity. Based on the assessment of the images, the probable ocular defects, ocular conditions, and abnormalities are diagnosed. For example, various conditions, such as reduced visual acuity, astigmatism, strabismus, myopia, hypermetropia, nystagmus, amblyopia, eye infections, cataract, and retinoblastoma, may be diagnosed.

[0065] However, if at decision block 518, the obtained images are not found to be of the required quality (the 'NO' path), the method proceeds back to blocks 506 and 508. As such, the examining unit 102 (the illumination unit 104 and the imaging unit 106) is re-configured at block 508. For example, the intensity and eccentricity of the illumination unit 104 are adjusted once again to ensure that images are accurately obtained by the imaging unit 106. Further, the position of the optical lenses and the position and size of the aperture of the imaging unit 106 may be adjusted again at block 508. At block 506, the type and timing of the engaging-stimuli are determined once again. The type and timing of the engaging-stimuli may be determined either manually by the examiner or automatically by the inspection engine.

[0066] Thereafter, blocks 510 to 516 may be repeated until the obtained images are determined to be of the required quality at decision block 518.
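The repeat-until-quality flow of blocks 506 to 518 can be sketched as a bounded retry loop. The attempt budget and the callable interfaces below are hypothetical, chosen only to show the control flow, not to reflect the disclosed implementation.

```python
MAX_ATTEMPTS = 3  # illustrative attempt budget


def screening_loop(capture, assess, reconfigure, vary_stimulus):
    """Repeat image capture until an acceptable image is obtained or the
    attempt budget is exhausted, re-configuring the examining unit and
    varying the engaging stimulus between attempts.

    capture():        obtains one image from the imaging unit
    assess(image):    returns True when the image meets the quality bar
    reconfigure():    re-configures illumination and imaging settings
    vary_stimulus():  presents a varied engaging stimulus
    """
    for _attempt in range(MAX_ATTEMPTS):
        image = capture()
        if assess(image):
            return image           # quality reached: proceed to diagnosis
        reconfigure()              # 'NO' path: adjust the examining unit
        vary_stimulus()            # and regain the examinee's attention
    return None                    # budget exhausted without a usable image
```

Bounding the number of attempts keeps the screening session short for young examinees whose attention span is limited.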

[0067] Although examples of the present disclosure have been described in language specific to structural features and/or methods, it should be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed and explained as examples of the present disclosure.