

Title:
METHOD AND SYSTEM FOR PERFORMING A VISUAL FUNCTION TEST
Document Type and Number:
WIPO Patent Application WO/2020/225544
Kind Code:
A1
Abstract:
The present techniques relate to a method and system for performing tests of visual function for all ages. The method comprises connecting a user device to a test device and displaying the visual function test on a display of the test device. Then, receiving, via a user interface on the user device, user input in response to the displayed visual function test and adapting the displayed visual function test in response to the received user input. The system and method may use a portable device such as a tablet or smartphone. The device may be positioned on a platform or podium and may thus be static during the testing.

Inventors:
ALLEN LOUISE (GB)
STARKIE STEPHEN (GB)
Application Number:
PCT/GB2020/051087
Publication Date:
November 12, 2020
Filing Date:
May 01, 2020
Assignee:
CAMBRIDGE ENTPR LTD (GB)
International Classes:
A61B3/02; A61B3/00; A61B3/024; A61B3/028; A61B3/032; A61B3/06
Domestic Patent References:
WO2016196803A12016-12-08
WO2017174817A12017-10-12
Foreign References:
JP2007105097A2007-04-26
GB2514529A2014-12-03
US8851678B22014-10-07
Attorney, Agent or Firm:
APPLEYARD LEES IP LLP (GB)
Claims:
Claims

1. A system for performing a visual function test, the system comprising:

a user device which comprises

a user interface for receiving user input in response to a displayed visual function test;

a processor and

a communications module; and

a test device which comprises

a communications module for communicatively coupling to the communications module of the user device;

a display for displaying the visual function test; and

a processor which is configured to

adapt the displayed visual function test in response to the received user input; wherein the system is configured to calculate the distance between the user device and the test device.

2. The system of claim 1, wherein the test device processor is further configured to resize the displayed visual function test based on the calculated distance.

3. The system of claim 1 or claim 2, wherein the user device processor is further configured to determine whether the calculated distance is acceptable for displaying the visual function test and, when it is determined that the calculated distance is outside the acceptable range, the user device is configured to output a signal to the user to adjust the distance between the user device and the test device.

4. The system of any one of the preceding claims, wherein the test device display is configured to display a glyph and the user device processor is configured to calculate at least one of the distance between the user device and the test device and a relative orientation of the user device to the test device from the displayed glyph.

5. The system of any one of the preceding claims, wherein the test device processor is configured to adapt the displayed visual function test a predetermined number of times before a user result is output.

6. The system of any one of the preceding claims, wherein the user device processor is configured to determine an adaptation to the displayed visual function test based on the received user input and

transmit the adaptation to the test device and

wherein the test device processor is configured to adapt the displayed visual function test based on the received adaptation.

7. The system of any one of claims 1 to 5, wherein the test device processor is configured to adapt the displayed visual function test by

receiving user input from the user device, and

determining an adaptation to the displayed visual function test based on the received user input.

8. The system of claim 7, wherein the visual function test is a visual acuity test displaying a plurality of optotypes and wherein the test device processor is configured to determine the adaptation by

determining whether the received user input is correct;

in response to the received user input being correct, decreasing a size of the optotypes; and

in response to the received user input being incorrect, increasing the size of the optotypes.

9. The system of any one of the preceding claims, wherein the user device processor is configured to calculate the relative orientation of the user device to the test device.

10. The system of claim 9, wherein the user device processor is further configured to determine whether the calculated orientation is acceptable for displaying the visual function test.

11. The system of any one of the preceding claims, wherein the user device comprises a display for displaying a second visual function test.

12. The system of any one of the preceding claims, wherein the user device processor is configured to calculate a distance between the user device and the user.

13. The system of claim 12, further comprising a wearable member which comprises a glyph, wherein when the wearable member is worn by a user during testing, the glyph faces the user device.

14. The system of any one of the preceding claims, further comprising at least one occluder for separately testing each eye of a user and wherein the user device processor is configured to identify which eye is being tested.

15. The system of claim 14, wherein the at least one occluder comprises a glyph and the user device processor is configured to identify the glyph and determine which eye is being tested based on the identified glyph.

16. A method for performing a visual function test on a system comprising a user device and a test device, the method comprising:

connecting the user device to the test device;

calculating a distance between the user device and the test device;

displaying the visual function test on a display of the test device;

receiving, via a user interface on the user device, user input in response to the displayed visual function test; and

adapting the displayed visual function test on the test device in response to the received user input.

17. A non-transitory computer readable medium comprising program code which when implemented on a system causes the system to perform the method of claim 16.

18. A system for performing a visual function test, the system comprising:

a user device which comprises

a display for displaying the visual function test;

a user interface for receiving user input in response to a displayed visual function test; and

a processor; and

a wearable member which comprises a glyph,

wherein when the wearable member is worn by a user during testing, the glyph faces the user device and the processor is configured to calculate the distance between the user device and the user from the glyph.

19. The system of claim 18, wherein the visual function test is at least one of a visual field test, a contrast sensitivity test, a near vision test and a colour vision test.

20. The system of claim 18 or claim 19, wherein the processor is configured to calculate the distance by capturing an image of the glyph and fitting a bounding box around the captured image.

Description:
Method and System for Performing a Visual Function Test

Field

[0001] The invention relates to a method and system for performing tests of visual function for all ages.

Background

[0002] Visual function tests are subjective, psychophysical tests. Accuracy depends on the test being performed at the correct distance for each specific test, together with the explanation and/or enthusiasm of the trained individual administering the test. Changes in either of these factors, in addition to the subject’s level of interest, will result in a different measurement outcome. Accuracy of visual function measurements is particularly important in ophthalmology clinics, optometry assessments, vision screening and occupational or driving vision checks. Capacity issues within hospital settings relate to increasing numbers of referrals and staff shortages. Additionally, many non-ophthalmic areas of the hospital do not have access to visual function charts. Ophthalmic clinic workflow bottlenecks occur due to an insufficiency of nurses and/or vision charts available to test each patient prior to the clinic appointment. Optometrists have difficulties with vision testing when performing domiciliary visits and most do not have the charts required to test children under 5 years of age. Nationally recommended child vision screening at 4 to 5 years of age is not carried out in a third of Local Authorities due to cost and, where it is performed, false positive rates are high, possibly due to inaccurate distance measurements and the shyness and unfamiliarity of letter recognition by the child. DVLA and occupational health vision checks are rudimentary due to lack of standardized charts and the absence of assessor training. Pre-literate children and infants are poorly served with visual function tests which are paper-based and degrade with time. As a result, many are either not tested or have inaccurate testing.

[0003] There are various devices for measuring information relating to a patient’s eyes. For example, WO2017/174817 describes a distance measuring system which measures distances between a subject’s eye and one or more objects. A statistical distribution of the viewing distances is then determined. GB 2514529 describes a device for measuring the contrast sensitivity function of a user’s eyes. The device has a user input to allow the user to input their threshold of contrast sensitivity function across at least part of a contrast sensitivity function a plurality of times.

[0004] The present applicant has recognised the need for a different method and system for performing a visual function test.

Summary

[0005] According to the present invention there is provided an apparatus and method as set forth in the appended claims. Other features of the invention will be apparent from the dependent claims, and the description which follows.

[0006] We describe a system for performing a visual function test, the system comprising: a user device which is an electronic device and which comprises a user interface for receiving user input in response to a displayed visual function test; a processor and a communications module; and a test device which is an electronic device and which comprises a communications module for communicatively coupling to the communications module of the user device; a display for displaying the visual function test; and a processor which is configured to adapt the displayed visual function test in response to the received user input.

[0007] We also describe a method for performing a visual function test, the method comprising: connecting a user device to a test device; displaying the visual function test on a display of the test device; receiving, via a user interface on the user device, user input in response to the displayed visual function test; and adapting the displayed visual function test in response to the received user input.

[0008] The system may be configured to calculate the distance between the user device and the test device. For example, the user device may calculate the distance. The test device processor may be further configured to resize the displayed visual function test based on the calculated distance. Resizing the test based on an accurate distance calculation will lead to more accurate results. The system may be configured to prevent the test from commencing until the devices are at an appropriate distance, e.g. the test device processor may be configured to delay display of the visual function test until an indication is received that the test device and user device are appropriately spaced.

[0009] The system and method may use a portable device such as a tablet, personal computer or smartphone. The device may be positioned on a platform or podium and may thus be static during the testing. The system and method may be used to deliver a suite of visual function tests, including visual acuity, visual field and colour and contrast sensitivity, with the potential to add new tests in the future. These tests can be used by infants through to adults. When a child or adult is being tested, the visual function tests typically do not require the input of an examiner and are effectively “self-testing”. Thus, in this example, the user input is the response by the child or adult being tested. When an infant is being tested, the input of a trained examiner, who may for example be observing the infant’s response via streamed webcam video from the test device in front of the child, is required. In this example, the user input is the examiner’s feedback which is input into the user device which alters the testing on the test device. The examiner may thus be holding the user device. In either arrangement, the system may reduce clinical time through self-testing, improve accuracy of tests and reduce unnecessary referrals, resulting in huge cost savings for the NHS. The user device and the test device may be any suitable pair of devices, e.g. a tablet and/or a smartphone which include the necessary components, including a processor.

[0010] The following features apply equally to the method and system. The test device processor may be configured to adapt the displayed visual function test a predetermined number of times before a user result is output. For example, the predetermined number may be a minimum threshold value to improve accuracy of the result. The user device processor may be configured to determine an adaptation to the displayed visual function test based on the received user input and transmit the adaptation to the test device. The test device processor may be configured to adapt the displayed visual function test based on the received adaptation from the user device. Alternatively, the test device processor may be configured to adapt the displayed visual function test by receiving user input from the user device, and determining an adaptation to the displayed visual function test based on the received user input. In other words, the adaptation may be calculated or obtained by either the user device or the test device and communication between the two devices leads to the adaptation of the test.

[0011] The visual function test may be a visual acuity test displaying a plurality of optotypes, e.g. letters, symbols or numbers which are used to test a user’s eye. The adaptation may be determined (by the test device or the user device) by determining whether the received user input is correct. In response to the received user input being correct, the adaptation may decrease a size of the optotypes; and in response to the received user input being incorrect, the adaptation may increase the size of the optotypes. The decrease and increase may be different or the same. For example, where the optotypes have a size on the logMAR scale, the decrease may be by three points on the logMAR scale and the increase may be by one point.
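
The staircase adaptation described in the paragraph above can be sketched as follows. This is a minimal illustration, not the claimed implementation: the 0.1 logMAR unit per "point" and the `adapt_optotype_size` helper name are assumptions.

```python
# Sketch of the optotype size adaptation described above.
# ASSUMPTION: one "point" on the logMAR scale is taken as 0.1 logMAR units.

LOGMAR_STEP = 0.1  # assumed size of one point on the logMAR scale


def adapt_optotype_size(current_logmar: float, answer_correct: bool) -> float:
    """Return the next optotype size (in logMAR) after a user response.

    A correct answer decreases the optotype size by three points (harder);
    an incorrect answer increases it by one point (easier), following the
    example step sizes given in the text.
    """
    if answer_correct:
        return current_logmar - 3 * LOGMAR_STEP  # decrease size by three points
    return current_logmar + 1 * LOGMAR_STEP      # increase size by one point
```

Either device's test module could apply such a rule each time a response is received, until the predetermined number of adaptations has been reached.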

[0012] The system may perform a calibration phase before performing the test. The calculation of the distance may be part of the calibration phase. For example, the user device processor may be further configured to determine whether the calculated distance is acceptable for displaying the visual function test. An acceptable distance may be within a predetermined range, e.g. 1 to 5m, which provides an accurate test result. When it is determined that the calculated distance is outside the acceptable range, the user device may be configured to output a signal (e.g. an audio or visual message) to the user to adjust the distance between the user device and the test device. The distance may also be calculated in real-time during the testing phase to improve accuracy of the test result.

[0013] As part of the calibration phase, or during the testing phase, the user device processor may be configured to calculate the relative orientation of the user device to the test device. The user device processor may be further configured to determine whether the calculated orientation is acceptable for displaying the visual function test. An acceptable orientation may be within a predetermined range, e.g. within a few degrees of direct viewing, which provides an accurate test result.
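
A calibration check of the kind described in the two paragraphs above might look like the following sketch. The 1 to 5 m range and the 5-degree orientation tolerance are the example values from the text; the `calibration_feedback` function and its message strings are invented for illustration.

```python
# Illustrative calibration-phase check; ranges follow the examples above.

ACCEPTABLE_DISTANCE_M = (1.0, 5.0)  # example acceptable range, 1 to 5 m
MAX_ORIENTATION_DEG = 5.0           # assumed "few degrees of direct viewing"


def calibration_feedback(distance_m: float, orientation_deg: float) -> str:
    """Return a user-facing prompt, or 'OK' when testing may begin."""
    lo, hi = ACCEPTABLE_DISTANCE_M
    if distance_m < lo:
        return "Move the user device further from the test device"
    if distance_m > hi:
        return "Move the user device closer to the test device"
    if abs(orientation_deg) > MAX_ORIENTATION_DEG:
        return "Face the test device directly"
    return "OK"
```

The same check could be re-run in real time during the testing phase, as the text suggests.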

[0014] As part of the calibration phase, the user device processor may be configured to identify the test device. Each test device may thus have a unique identification associated therewith. The unique identification may be stored within the user device, e.g. in memory or storage.

[0015] During the calibration phase, the test device display may be configured to display a visual cue, e.g. a glyph. The user device processor may be configured to calculate at least one of the distance between the user device and the test device, the relative orientation of the user device to the test device and the identity of the test device from the displayed visual cue. The user device may be configured to capture an image of the visual cue (e.g. using a camera). The captured image may then be processed as required. For example, the processor may calculate the distance by capturing an image of the glyph and fitting a bounding box around the captured image. The size of the bounding box may represent a fixed distance which is the correct distance between the test device and the user device. The user may be prompted to move the user device backwards or forwards until the glyph is a matching fit in the bounding box. Alternatively, the distance between the test device and the user device may be calculated based on the size of the glyph using standard techniques. For example, to identify the test device, the captured image may be processed to identify the visual cue and to compare with the unique identification for each test device stored within the user device.
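
The bounding-box fit described above can be sketched as a simple ratio test. The 5% tolerance and the `glyph_fits_box` name are assumptions for illustration; the text does not specify a tolerance.

```python
# Sketch of the bounding-box fit check for the displayed glyph.
# ASSUMPTION: a 5% tolerance on the glyph/box size ratio counts as a fit.

def glyph_fits_box(glyph_px: float, box_px: float, tolerance: float = 0.05) -> str:
    """Compare the detected glyph width against the on-screen bounding box.

    Returns a movement prompt, or 'fit' when the devices are at the set
    test distance that the box size represents.
    """
    ratio = glyph_px / box_px
    if ratio < 1.0 - tolerance:
        return "move closer"  # glyph too small: devices too far apart
    if ratio > 1.0 + tolerance:
        return "move back"    # glyph too large: devices too close
    return "fit"
```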

[0016] The user device may also be configured to calculate a distance between the user device and the user. The overall distance between the user and the test device may thus be calculated from the sum of the distance between the user device and the test device and the distance between the user and the user device.
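
The sum described above is trivially:

```python
# Overall user-to-test-device distance, as the sum described above.
# The function name is an assumption for illustration.

def total_test_distance(user_to_user_device_m: float,
                        user_device_to_test_device_m: float) -> float:
    return user_to_user_device_m + user_device_to_test_device_m
```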

[0017] The user device may also comprise a display for displaying a second visual function test. The user device may thus also be used as a test device, e.g. when two devices are not needed or available; or depending on the type of test. For example, the visual function test may be a test which is done near a user, e.g. one or more of a visual field test, a contrast sensitivity test, a near vision test and a colour vision test. The user device may be used independently of the test device.

[0018] We also describe a system for performing a visual function test, the system comprising: a user device which comprises a display for displaying the visual function test; a user interface for receiving user input in response to a displayed visual function test; and a processor; and a wearable member which comprises a glyph, wherein when the wearable member is worn by a user during testing, the glyph faces the user device and the processor is configured to calculate the distance between the user device and the user from the glyph. The wearable device may be in the form of an occluder and may be a pair of glasses.

[0019] The system may further comprise at least one occluder (e.g. a single eye occluder or a pin-hole occluder) for separately testing each eye of a user. During the calibration phase, at least one of the user device processor or the test device processor may be configured to identify which eye is being tested based on the occluder. For example, the at least one occluder may comprise a visual cue, e.g. a glyph and the processor is configured to identify the glyph and determine which eye is being tested based on the identified glyph. During the calibration phase, the user device may also be configured to calculate the distance between the user and the user device and/or the orientation of the user device relative to the user. For example, the processor of the user device may also be able to use the glyph to determine the orientation of the occluder (and hence the user) relative to the user device and to determine the distance from the user device.
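
The glyph-based eye identification described above might be implemented as a simple lookup, as in this sketch. The glyph identifiers and the mapping are invented for illustration; the text only says that the identified glyph determines which eye is being tested.

```python
# Illustrative glyph-to-eye mapping for occluders.
# ASSUMPTION: glyph identifier strings are hypothetical.

OCCLUDER_GLYPHS = {
    "glyph_left_occluded": "right",   # left eye covered -> right eye under test
    "glyph_right_occluded": "left",   # right eye covered -> left eye under test
}


def eye_under_test(identified_glyph: str) -> str:
    """Map an identified occluder glyph to the eye being tested."""
    return OCCLUDER_GLYPHS.get(identified_glyph, "unknown")
```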

[0020] The processor may calculate the distance by capturing an image of the glyph and fitting a bounding box around the captured image. The size of the bounding box may represent a fixed distance which is the correct distance between the user and the user device. The user may be prompted to move the user device backwards or forwards until the glyph is a matching fit in the bounding box. Alternatively, the distance between the user and the user device may be calculated based on the size of the glyph using standard techniques.

[0021] As in the calculation of the distance and/or orientation between the user device and the test device, the calculation of the distance and/or orientation between the user device and the user may be automatic. In other words, no measurement needs to be taken by the user and/or examiner present for the testing. This increases accuracy of the testing.

[0022] The user interface in the user device may be in the form of a touch sensitive display or may be any other suitable interface for a user to input feedback.

[0023] We also describe a non-transitory computer readable medium comprising program code which when implemented on a user device and/or a test device as described above causes the device to perform the method as described above.

[0024] In summary, the system and devices described above may be used in three different ways. The two devices may be paired to provide self-testing for an adult, e.g. for visual acuity. Alternatively or additionally, one of the devices may be used individually for additional testing. Alternatively or additionally, the two devices may be paired to provide testing for babies and infants in which footage of the test subject viewing a test on one device is broadcast to a clinician using the other device.

Brief description of drawings

[0025] For a better understanding of the invention, and to show how embodiments of the same may be carried into effect, reference will now be made, by way of example only, to the accompanying diagrammatic drawings in which:

[0026] Figure 1a is a schematic illustration of a system for performing a visual function test.

[0027] Figure 1b is a block diagram of the components of a user device in the system of Figure 1a;

[0028] Figure 1c is a block diagram of the components of a test device in the system of Figure 1a;

[0029] Figure 2a is a flow chart of one method for determining the distance between the test device and the user device;

[0030] Figure 2b is a schematic illustration of the configuration of the system in the method of Figure 2a;

[0031] Figure 3 shows examples of glyphs which may be used in the configuration stage shown in Figures 2a and 2b;

[0032] Figures 4a and 4b are schematic illustrations of two stages in a visual function test being performed by the system of Figure 1a;

[0033] Figures 5a and 5b are front and rear views of a first occluder for use with the system of Figure 1a;

[0034] Figures 6a and 6b are front and rear views of a second occluder for use with the system of Figure 1a;

[0035] Figure 6c is a perspective view of an alternative occluder;

[0036] Figure 7a is a flow chart of one method for determining the distance between the user and the user device;

[0037] Figure 7b is an illustration of the configuration of the use of the occluder of Figure 6a in the system;

[0038] Figure 8a is a schematic illustration of a different visual function test being performed by the system of Figure 1a;

[0039] Figure 8b is a view of the user device adapted for use in the arrangement of Figure 8a;

[0040] Figure 9 is a flowchart illustrating the steps of the method which may be performed using the system of Figure 1a;

[0041] Figure 10 is a flowchart illustrating some detailed steps for an alternative or additional method performed using the system of Figure 1a;

[0042] Figure 11 is a flowchart illustrating some detailed steps for an alternative or additional method performed using the system of Figure 1a; and

[0043] Figure 12 is a graph illustrating a Bland-Altman Plot for adults.

Detailed description of drawings

[0044] Figure 1a shows a system which can be used for performing a visual function test. As shown, the system comprises a user device 10 and a test device 40 which are at a distance d from one another. One or both of the user and the test devices may be any suitable portable electronic device, e.g. a tablet, smart phone or mobile device or may be any suitable static electronic device, e.g. a smart TV or a desktop computer. A portable electronic device may be statically mounted on a suitable platform, e.g. a podium or the wall, for use in the testing. Even when static devices are used, it is envisaged that the distance between the devices may be changeable. The user and test devices may be the same device or may be compatible devices which are able to communicate with one another.

[0045] The user device 10 comprises a display 38 on which an image 12 is displayed and at least one camera 39. The user device 10 may include a forward facing camera for calculating the distance between the user and the user device and a rear facing camera for calculating the distance between the test device and the user device. Similarly, the test device 40 comprises a display 70 on which an image 42 is displayed and a camera 72. The images 12 and 42 may be related or different as explained in more detail below. One or both of the displays 38, 70 may be a touch sensitive screen which allows a user to enter information into the relevant device by touching an appropriate part of the screen. One or both of the cameras 39, 72 may be a rear or forward facing camera and/or there may be both rear and forward facing cameras. For example, a rear facing camera on the user device 10 may be used to capture an image of the test device 40 and similarly, a forward facing camera on the test device 40 may be used to capture an image of the user device 10. A forward facing camera on either the user device or the test device may be used to capture an image of a user. Such an image can be used to identify what type of occluder is being used (if any) during the visual function test or may be used to support tests in which the user’s eye motion is monitored by a clinician. The image may also be used to measure the distance of the user to the user device and/or test device.

[0046] Figure 1b shows the detailed components of the user device 10. All of the components may be connected by buses as is well known in the art. The user device 10 comprises a processor 22 which may carry out the steps as described below. There may also be memory 24 which may be non-volatile or volatile and which may store data relating to the steps described below, e.g. input responses to the visual function test and/or a calculation of the distance d between the devices. As described above, the user device 10 comprises a display 38, e.g. an LCD, LED display or any suitable known display. The display may also be a touch screen. Alternatively, as illustrated, the user device may comprise a separate user interface 26, for example a keyboard, a mouse, a voice activated input or any other standard user interface by which a user can input responses or information to the user device 10. As described above, the user device 10 also comprises a camera 39 which as described below may take an image of the test device 40 or an image of the user of the user device 10.

[0047] The user device 10 further comprises storage 32 for storing modules which are to be carried out on the device. For example, the storage 32 may comprise an operating system module 34 to enable the device to operate. The storage 32 may also comprise a test module 36 which may process any input from the user in response to the visual function test as described below. There may also be a distance module 37 for determining the distance between the user and test devices and/or the distance between the user and the user device as described below. There is also a communication module 28 which communicatively couples the user device 10 to the test device 40 to enable the user device 10 to communicate with the test device 40 (and vice versa). The communication may be via any suitable protocol, e.g. Wi-Fi (e.g. Wi-Fi direct), Bluetooth, or a multi-peer connectivity framework (e.g. iOS or WebRTC).

[0048] Figure 1c shows the detailed components of the test device 40. Many of the components are the same or similar to those used in the user device 10 and may be connected by buses as illustrated. The test device 40 comprises a processor 52 which may carry out the steps as described below. There may also be memory 54 which may be non-volatile or volatile and which may store data relating to the steps described below, e.g. a calculation of the distance d between the devices. As described above, the test device 40 comprises a display 70 which may also be a touch screen. Alternatively, as illustrated, the test device may comprise a separate user interface 56. As described above, the test device 40 also comprises a camera 72.

[0049] The test device 40 further comprises storage 62 for storing modules which are to be carried out on the device. For example, the storage 62 may comprise an operating system module 64 and a test module 66. There may also be a distance module 68 for determining the distance between the user and test devices as described below. There is also a communication module 58 which communicatively couples the test device 40 to the user device 10 using any suitable protocol.

[0050] It will be appreciated that each of the user device 10 and the test devices 40 may be standard electronic devices having the necessary components to carry out the steps described below. Merely as an example, the tablet sold by Microsoft® called the Surface Pro™ may be suitable because it enables good communication via Wi-Fi direct, has a central front camera and is a suitable size (i.e. above a minimum size). For another example, a smartphone may be used as the user device 10 and/or the test device 40. Also, a personal computer may be used as the user device 10 and/or the test device 40. Thus the components which are illustrated are merely illustrative of the well-known components within such devices and there may be more standard components which are not depicted. The steps which are described in the method below may be contained in a downloadable app which can be downloaded by the user device 10 and/or the test device 40. Optionally, the method may be delivered to the user device 10 and/or the test device 40 via a web browser. For instance, in one arrangement the user device 10 may be a smartphone which downloads the application and the test device may be a personal computer which accesses the method via a web browser, or vice versa. In another arrangement the user device 10 may be a tablet and the test device 40 may be a second tablet. The skilled person will appreciate that any other similar arrangement is viable. Furthermore, it will be appreciated that these example devices may be constructed, partially or wholly, using dedicated special-purpose hardware.

[0051] Terms such as ‘component’, ‘module’ or ‘unit’ used herein may include, but are not limited to, a hardware device, such as circuitry in the form of discrete or integrated components, a Field Programmable Gate Array (FPGA) or Application Specific Integrated Circuit (ASIC), which performs certain tasks or provides the associated functionality. In some embodiments, the described elements may be configured to reside on a tangible, persistent, addressable storage medium and may be configured to execute on one or more processors. These functional elements may in some embodiments include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. Although the example embodiments have been described with reference to the components, modules and units discussed herein, such functional elements may be combined into fewer elements or separated into additional elements.

[0052] Figures 2a and 2b illustrate one method by which the distance d between the user device 10 and the test device 40 may be calibrated. As shown in a first step S200 of Figure 2a and schematically illustrated in Figure 2b, the test device 40 displays a visual cue 124, in the form of a glyph, on its display. The user device 10 captures an image of the test device, including the visual cue, using its camera (step S202).

[0053] The distance between the user device 10 and the test device may then be calculated by processing the image of the visual cue. For example, the image may be processed to determine the translation and orientation of the glyph relative to the camera and hence to determine the distance. Such techniques are known in the art and may include combinations of the following standard techniques to determine the distance: greyscaling, edge detection, thresholding, blob counting, quadrilateral checking, brightness difference checking, quadrilateral straightening, Otsu thresholding, co-planar pose estimation, point re-ordering and cell recognition. The distance calculation may also include accessing a look-up table (or similar) for the device type and the focal length of the camera. The focal length may be calibrated from the nominal focal length of the camera against pre-measured distances to ensure accuracy of the distance calculations.
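
As a non-limiting sketch of the final step, once the glyph has been located in the image, its apparent size can be converted into a distance using the standard pinhole-camera relation. The function name, the known physical glyph size and a focal length expressed in pixels are illustrative assumptions rather than part of the claimed method:

```python
def estimate_distance_mm(glyph_width_px: float,
                         glyph_width_mm: float,
                         focal_length_px: float) -> float:
    """Pinhole-camera estimate: distance = focal length * real width / apparent width.

    glyph_width_px   -- width of the detected glyph in the image (pixels)
    glyph_width_mm   -- known physical width of the displayed glyph (mm)
    focal_length_px  -- calibrated focal length of the camera (pixels)
    """
    if glyph_width_px <= 0:
        raise ValueError("glyph not detected in the image")
    return focal_length_px * glyph_width_mm / glyph_width_px


# A 100 mm glyph imaged at 50 px by a camera with a 2000 px focal
# length lies at 2000 * 100 / 50 = 4000 mm, i.e. a 4 m test distance.
```

The same relation underlies the look-up-table correction mentioned above: a more accurate focal length directly scales the estimated distance.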

[0054] Alternatively, as shown in Figures 2a and 2b, the test distance may be a set distance, e.g. 4 m, and thus the size of the captured image can be used to determine whether or not the distance between the two devices meets the requirement. For example, this may be achieved by the user device displaying the captured image 122 on its display. A bounding box 126 around the visual cue may also be displayed on the user device 10. The bounding box 126 may be sized to ensure that the distance between the two devices is the set distance. The bounding box 126 may be used as a visual signal to the user of the user device 10 to indicate when the user device 10 is the correct distance from and/or at the correct orientation relative to the test device 40 to carry out the visual function test. For example, the bounding box 126 may be highlighted in red to illustrate that the distance or orientation is not correct and may change to a different colour, e.g. green, when the user is correctly positioned for the test. In other words, the user makes a determination as to whether the displayed image fits correctly within the bounding box (S204) and, if not, the user is effectively instructed to change position (S206). The process of capturing the image and fitting the captured image into the box is then repeated. If (or once) there is a correct fit, the system can proceed to the next phase (step S208).

[0055] Additionally or alternatively, the bounding box may be used by the processor of the user device (without displaying either the bounding box or the visual cue) to determine whether the captured image of the visual cue fits exactly within the box (step S204). If there is not a correct fit, the user may be presented with another audio or visual signal regarding distance. For example, a message bar 120 may be displayed on the user device 10 and a message with instructions to the user may be displayed, e.g. “move further away” may be displayed when the user device is too close to the test device and thus the user will change position (step S206). If there is a correct fit, the system can proceed to the next phase (S208), e.g. the relative orientation of the devices may be checked in a similar manner to the distance and a message “straighten the device” may be displayed when the user device is at an angle to the test device. Alternatively, the message “hold steady to start the test” may be displayed when the user device and the test device are the correct distance (and orientation if tested) apart. Thus, the system is able to inform the user that the distance at which the test is run is the correct one.
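
The fit check described above amounts to a size comparison between the detected glyph and the target bounding box. The tolerance value and the mapping to message strings below are illustrative assumptions, not the system's exact logic:

```python
def fit_message(glyph_w_px: float, target_w_px: float, tol: float = 0.05) -> str:
    """Compare the detected glyph width to the target bounding-box width
    and return the instruction to show in the message bar."""
    ratio = glyph_w_px / target_w_px
    if ratio < 1 - tol:          # glyph appears too small -> device too far away
        return "move closer"
    if ratio > 1 + tol:          # glyph appears too large -> device too close
        return "move further away"
    return "hold steady to start the test"
```

Looping on this check until the "hold steady" case is reached reproduces the iterate-until-fit behaviour of steps S204 to S208.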

[0056] Figure 3 illustrates a few glyphs which may be used. There may be multiple test devices, e.g. one for each type of test. A different glyph may be used for each test device so that the user device can correctly identify the test device. Such glyphs can be randomly generated as is well known, e.g. from the 9-bit code space (512 codes). Some of the glyphs may also be fixedly allocated to allow the identification of which eye is being tested as explained in more detail below. In this case, these glyphs are not available to be allocated to the test devices. The use of a glyph for distance detection and to identify the test device and/or occluder being used may be a more effective method than other techniques such as colour segmentation or distance determination using other visual cues in the form of boxes. This is because the glyph may be segmented as part of the detection process. Furthermore, such glyphs are typically less prone to lighting issues and/or background colour issues which may impact other techniques.
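
The random allocation of glyph codes from the 9-bit space, with some codes held back for eye identification, might be sketched as follows. The particular reserved code values are placeholders, not the codes actually used:

```python
import random

# Codes held back for occluder/eye identification (values are illustrative).
RESERVED_CODES = {0b000000011, 0b000001100, 0b000110000, 0b011000000}


def allocate_glyph_codes(n, seed=None):
    """Draw n distinct glyph codes from the 9-bit code space (512 codes),
    skipping the fixed codes reserved for occluder identification."""
    rng = random.Random(seed)
    pool = [code for code in range(512) if code not in RESERVED_CODES]
    return rng.sample(pool, n)
```

Because the reserved codes are excluded from the pool, a randomly generated test-device glyph can never be confused with an occluder glyph.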

[0057] Figures 4a and 4b illustrate a visual function test which tests a user’s visual acuity being performed using the system. As shown, the test device 40 displays the test 134 which comprises a plurality of letters. The sizing of each letter in the test 134 is determined by the test device 40 based on the distance between the user device 10 and the test device 40. Given that the distance between the two devices has been accurately calculated, there is more confidence that the test is being applied accurately because the user is viewing the letters at the expected and correct size. In this example, the user device 10 does not display the test but displays a user interface to allow the user’s response to the test to be input. For example, the user interface may comprise a message bar 120 which displays a message explaining what the user is expected to do, e.g. “tap the central letter shown on the far screen”. It will be appreciated that audio or other visual signals may also be used to communicate the instructions to the user. The user interface also comprises a plurality of letters 132 from which the user can select their response. These letters 132 are relatively large so that the user is easily able to read them, to avoid any false identifications; for example, the letters may be displayed at 1.7 logMAR. In this arrangement, the user interface is in the form of a touch sensitive screen but it will be appreciated that a user could input the response in another way, e.g. manually via a keyboard or orally via speech recognition modules.
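
The letter sizing can be derived from the measured distance using the standard convention that a 0.0 logMAR optotype subtends 5 minutes of arc at the eye, with each logMAR unit scaling that angle by a factor of ten. This is a sketch of the geometry rather than the exact routine used by the system:

```python
import math

ARCMIN = math.pi / (180 * 60)  # one minute of arc, in radians


def letter_height_mm(logmar: float, distance_mm: float) -> float:
    """Physical optotype height for a given logMAR value and viewing distance,
    using the 5-arcmin convention for a 0.0 logMAR letter."""
    angle = 5 * (10 ** logmar) * ARCMIN
    return 2 * distance_mm * math.tan(angle / 2)


# At a 4 m test distance, a 0.0 logMAR letter is roughly 5.8 mm tall,
# and each whole logMAR unit makes the letter ten times larger.
```

Rendering the letter at this physical height for the calculated distance is what keeps the test accurate as the devices move relative to one another.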

[0058] In response to the user’s input, the display on the test device 40 is changed. For example, if the user incorrectly identifies the letter on the test device, the size of the letters 134 may be increased as indicated. As in standard tests, the enlargement may increase the size by three points on the standard logMAR scale. Once the letters are large relative to the size of the display on the test device, they may be displayed in a crowding box 136. Alternatively, if the user responds correctly, the size of the letters may be decreased. As in standard tests, the decrease may be by a single point on the standard logMAR scale. In other words, the increase and decrease in the size of the letters may be different. It will be appreciated that letters are just one suitable form of optotype and other optotypes such as images or numbers may be used depending on the user being tested. For example, simple shapes may be used for testing children. Alternatively, the same shape in different orientations may be used and the user must identify the correct orientation. Examples of other tests which could be used are described, for example, in US8851678. It will also be appreciated by the skilled person that gamification of the test may be implemented. Advantageously, gamification of the test may improve motivation during the test. As a non-limiting example, the test may offer a reward and/or display funny/colourful characters to make the test more appealing to children. Thus, for instance, engagement with children may beneficially be increased.
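
The asymmetric adjustment rule described above (three points up on an error, one point down on a correct response) can be sketched as a simple staircase. The assumption that one point corresponds to 0.1 logMAR is illustrative:

```python
def next_logmar(current: float, correct: bool, step: float = 0.1) -> float:
    """Staircase update: a wrong answer enlarges the optotype by three
    steps; a correct answer shrinks it by one step (step size assumed)."""
    if correct:
        return round(current - step, 2)
    return round(current + 3 * step, 2)
```

Iterating this rule over successive responses converges on the smallest optotype the user can reliably identify, which is the acuity estimate output at the end of the test.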

[0059] Visual function tests may be performed on both eyes together or on each eye individually. For testing each eye separately, occluders such as those illustrated in Figures 5a to 7 may be used. The occluders illustrated in Figures 5a to 6b are flat wearable devices which fit over a user’s nose and are held in place by the user. Alternatively, the occluders may be fashioned onto overglasses 170 as shown in Figure 6c. In the arrangement shown in Figure 6c, the occluder portion 172 is centrally hinged and may be moved between a first position in which it obscures one eye, as shown in Figure 6c, and a second position in which it obscures the other eye.

[0060] The occluders are made from an opaque material. Figures 5a and 5b show opposed sides of a pin hole occluder 150 which blocks the vision in both eyes except through a pin hole 152. It will be appreciated that the size of the pin hole 152 has been exaggerated in the drawings. Each side of the pin hole occluder 150 contains a visual cue 154, 156, again in the form of a glyph. The visual cue 154 shown in Figure 5a is different from the visual cue 156 shown in Figure 5b. When a user holds the pin hole occluder 150 in front of their eyes, the visual cue 154, 156 on the opposed face can be captured and identified by a camera on the user device.

[0061] For example, when the pin hole occluder 150 is held in front of a user’s face in the orientation shown in Figure 5a, the user only has visibility through the pin hole 152 using their left eye and thus the visual cue 156 shown in Figure 5b can be used to confirm that left eye testing using a pin hole occluder is being carried out. Similarly, when the pin hole occluder 150 is held in front of a user’s face in the orientation shown in Figure 5b, the user only has visibility through the pin hole 152 using their right eye and thus the visual cue 154 shown in Figure 5a can be used to confirm that right eye testing using a pin hole occluder is being carried out. In these arrangements, the occluders are reversible but the same principles could be applied to separate right and left eye occluders. In other words, a single visual cue could be applied to each occluder on its rear side, i.e. the side which faces the user device.

[0062] Figures 6a and 6b show opposed sides of a single eye occluder 160 with each side of the occluder 160 containing a visual cue 164, 166. A cutaway section 162 enables visibility through one eye. As described above, when the single eye occluder 160 is held in front of a user’s face in the orientation shown in Figure 6a, the user only has full visibility through their left eye but no visibility through their right eye and thus the visual cue 166 shown in Figure 6b can be used to confirm that left eye testing is being carried out. Similarly, when the single eye occluder 160 is held in front of a user’s face in the orientation shown in Figure 6b, the user only has visibility through their right eye and thus the visual cue 164 shown in Figure 6a can be used to confirm that right eye testing is being carried out.

[0063] In the arrangement shown in Figure 6c, the occluder portion 172 is moveable to cover different eyes. As the occluder portion is moved from the first position to the second position, a different glyph is revealed. Accordingly, the user device can determine which eye is being tested from the revealed glyph. It is also noted that the overglasses may be adapted to be pin-hole occluders by using a similar moveable occluder portion with different glyphs to allow pin-hole testing when necessary.

[0064] Figures 7a and 7b illustrate the calibration which identifies what type of testing is being carried out and the distance from the user to the device. As shown in Figure 7b, the user is holding a single eye occluder 160 in front of their eyes. As shown in Figure 7a, the user device 10 captures an image of the user (step S702) and the single eye occluder 160 and displays this on the display. The visual cue 166 may be identified using standard techniques such as those described above to determine the distance between the user and the user device. The user may be given instructions during the calibration phase using appropriate audio/visual signals. Thus, a bounding box may be overlaid on the captured image and a determination as to whether the captured image fits correctly in the box is made (step S704). The determination may be made by the user as described above or by the user device. If the user device determines that the user is not in the correct position, the user may be encouraged to move (step S706). For example, the message bar 120 may be used to give the user instructions, for example “move closer and hold steady”. If the user is in the correct position, the process may continue to the next phase (step S708), e.g. to begin the testing.

[0065] The display may also show schematic visualisations 168 of the occluder in both orientations (i.e. right or left eye) so that the user can see that the correct occluder has been identified. The visual cue may also be used to ensure that the user is looking directly (i.e. straight) at the screen and is centred relative to the display.

[0066] It will be appreciated that whilst a single eye occluder is illustrated, a similar technique can be used with the pin-hole occluder. Thus, the calibration stage identifies which eye is being tested and which type of occluder is being used. The calibration stage may also be used to determine the distance between the user and the user device using the glyph. The overall distance between the user and the test device can thus be calculated as the sum of the distance between the user and the user device and the distance between the user device and the test device. The calibration stage may thus be performed by the user device 10. During the test stage, the user device may similarly capture an image of the user with the occluder and use the visual cue to determine whether the user is correctly positioned (i.e. at the correct distance and orientation) and maintains the correct position. This checking can be performed by the user device because the distance and orientation between the user device and the test device can be established as described above. Alternatively, this checking process could be carried out by the test device because the test device and the user device may in some cases be interchangeable.

[0067] During both the test and the calibration phase, as described above in relation to Figure 2a and 2b, a bounding box may be applied around the visual cue when it is displayed and the colour of the bounding box may act as a visual signal to the user. For example, a red bounding box may be used to indicate that the user is not correctly positioned for the calibration stage and a green bounding box may be used to indicate that the user is correctly positioned for the calibration stage and also subsequent testing stages.

[0068] Figure 8a is a schematic illustration of a different visual function test being carried out by the system. In this arrangement, the user device 10 is being used by a clinician 214. The test device 40 displays the visual function test, which is a test suitable for a baby or infant 234. The test may be a stimulus such as a black and white grating 134 as illustrated or any other similar test. Typically such tests are displayed on one side of the screen and then the other. In this case, the baby 234 is not able to input any response and thus the clinician 214 inputs a response based on their assessment of the baby 234.

[0069] As shown in Figure 8b, the devices may be connected and adapted to allow a live streaming video of the baby’s response to the visual stimuli, which is captured by a camera on the test device, to be sent to the user device. The live streaming may be done over any suitable communication protocol which has minimal latency, e.g. less than 100 ms, to allow the clinician to accurately assess the patient’s response to the stimulus. In the arrangement shown in Figure 8b, an accessory camera 310 is connected to the test device 40 to capture the baby or infant’s response to the test. For example, as shown, the accessory camera may be mounted centrally on the test device or may alternatively be mounted along the top edge of the test device. The connection between the camera and the test device may be via any suitable connection, e.g. through a USB connection.

[0070] Returning to Figure 8a, to assist in the inputting of the response, an interface may be displayed, e.g. a message bar 120, to prompt the clinician to enter the appropriate information. In response to the input from the clinician, the test being displayed on the test device 40 may be adapted. In this arrangement, the distance between the user device and the test device is not critical to the performance of the test. However, the distance of the baby 234 from the test device 40 may be critical. An occluder or similar device with a visual cue may be held in front of the baby 234 so that the distance of the baby 234 from the test device may be calculated using the visual cue as described above. In other words, the invention has the ability to calculate the distance between the user device 10 and the test device 40 and to change the test properties according to the result of the calculation. For instance, the test may not commence if the test device 40 and the user device 10 are not at the appropriate distance. As another example, the test may not commence if the test device 40 and the test subject are not at the appropriate distance. The appropriate distance may be pre-configured. Also, any of the above-stated test properties can be changed based on the determined distance. The skilled person will appreciate that these are non-limiting examples. Advantageously, the present techniques offer more flexibility and personalisation.

[0071] Figure 9 is a flowchart showing the steps which may be carried out by the system. The first stage is to connect a user device to a test device (S800). As described above, the connection may be via any suitable communication and may be across the internet or a local network, or use any suitable protocol, e.g. Wi-Fi direct for point to point connections. The next sequence of steps may be termed a calibration phase and the steps may be carried out in any order. As illustrated, one step may be to determine the distance between the user and the test device (S802) and, as explained above, this may be calculated by determining the distance between the user and the user device, e.g. by using the front facing camera on the user device to capture an image of a glyph (or other visual cue) on a wearable device worn by the user, and by determining the distance between the user device and the test device, e.g. by using the rear facing camera on the user device to capture an image of a glyph (or other visual cue) displayed on the test device. The distance calculation may also include accessing a look-up table (or similar) for the focal length of the rear-facing camera. The focal length may be calibrated from the nominal focal length of the camera against pre-measured distances to ensure accuracy of the distance calculations. The distance calculation may thus be corrected based on the focal length. In addition to determining the distance, the orientation of the user device relative to the test device and/or the orientation of the user relative to the user device may be determined too. The skilled person will also appreciate that the invention has the ability to change the test properties according to the calculated distance. For example, different test images/symbols may be displayed based on the determined distance.
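
The focal-length calibration against pre-measured distances mentioned above can be sketched by inverting the pinhole relation at one known distance. The function and parameter names are illustrative assumptions:

```python
def calibrate_focal_length_px(glyph_width_px: float,
                              glyph_width_mm: float,
                              measured_distance_mm: float) -> float:
    """Recover the effective focal length (in pixels) from a single
    observation of a glyph of known size at a pre-measured distance."""
    return glyph_width_px * measured_distance_mm / glyph_width_mm
```

In practice such a calibration might average over several pre-measured distances; the single-observation form is shown here only to make the correction step concrete.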

[0072] At step S804, a decision is made as to whether the calculated distance is correct. For accurate test results, the user device needs to be positioned correctly relative to the test device. For example, the calculated distance may be compared to a minimum threshold (e.g. 1 m) and/or to a maximum threshold (e.g. 5 m) to determine that the user (or user device) and the test device are within an acceptable range (e.g. between 1 m and 5 m). If the distance is not within an acceptable range, the user is instructed to change the position of the user device (S806), e.g. by appropriate visual and/or audio signals being generated for the user as described above. The method then loops to the previous stage of determining the distance. Once the distance is correct, the method moves on to another calibration stage. It will be appreciated that a similar iterative loop can be applied to place the user device at the correct orientation relative to the test device, i.e. if the orientation is incorrect, the user is instructed to change the orientation and the determining and decision steps are repeated until the orientation is acceptable.
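
The decision at step S804 amounts to a range check, repeated until the device is correctly placed. The thresholds follow the 1 m and 5 m examples above:

```python
MIN_DISTANCE_M = 1.0   # example minimum threshold from the text
MAX_DISTANCE_M = 5.0   # example maximum threshold from the text


def distance_ok(distance_m: float) -> bool:
    """True when the calculated distance lies in the acceptable range."""
    return MIN_DISTANCE_M <= distance_m <= MAX_DISTANCE_M
```

The same predicate, with orientation thresholds substituted for distance thresholds, serves the orientation loop described at the end of the paragraph.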

[0073] The next calibration phase is shown as the identification of the test device (S808). This is optional and may also be done using the same visual cue which is used for determining the correct relative positions of the user device and test device. As explained above, each visual cue, e.g. glyph, may be unique to a particular test device and thus the visual cue may be detected and then the associated test device may be identified. For example, the user device may store the associations between the visual cues and the test devices in storage or memory. The identification stage may also be used to confirm that the expected devices are being paired because there may be more than one device in close proximity. An error message may be generated if the user device recognises that the test device is not the correct device. It will be appreciated that this stage of the calibration may be done before or simultaneously with the position calculations.

[0074] The next calibration phase is shown as a determination of which eye(s) is being tested (S810). As explained above, this may be done by using a visual cue (e.g. glyph) on an occluder. Each visual cue, e.g. glyph, may be unique to a particular occluder (e.g. right eye pin hole occluder, left eye pin hole occluder, right eye occluder or left eye occluder), and thus the visual cue may be detected and then the associated eye which is being tested may be identified. For example, the user device may store the associations between the visual cues and the occluders in its storage or memory. The visual cue on the occluder may also be used by the user device to determine whether the user is at the correct orientation and/or distance from the user device. It will be appreciated that this stage of the calibration may be done before or simultaneously with the other calibration phases.

[0075] This stage may also be used if both eyes are being tested simultaneously, for example for DVLA testing and for some babies or toddlers who refuse occlusion. If both eyes are being tested, the user can wear a wearable device (e.g. a glyph sticker on the forehead) and this can be detected in the same manner as the glyphs above. Use of such an indicator when two eyes are being tested can also be used to check the distance and orientation between the user and the user device, which may be needed as described above.

[0076] Once calibration is complete, the testing phase may begin. Using the processes explained in the calibration phase, the user device may continually check the distance and/or orientation of the user device relative to the test device to ensure that the user stays within the acceptable parameters throughout the test and/or check the distance between the user and the user device. The visual function test is displayed on the test device (S812). It will be appreciated that continually obtaining accurate distance information may provide more accurate test results, particularly for visual acuity tests, where the size of the optotypes being displayed on the test device may be accurately determined based on the distance information, and for visual field tests, where the visual field parameters change depending on the distance from the subject to the user tablet. For many visual function tests, such as near vision, colour vision and contrast sensitivity assessment, the test may also be displayed on the user device and the user input will be on this device. For distance visual acuity tests, the user inputs (S814) the matching optotype (e.g. letter, symbol, shape) that they can see on the test device.

[0077] When the user device receives the input, a determination of whether or not to adjust the test (S816) is made. This determination may be made by the user device and communicated to the test device. Alternatively, the user input may be communicated to the test device which determines whether any changes are required. The adjusted test is then displayed and user input for the adjusted test is received. In other words, there is iteration of steps S812 and S814. For example, in a visual acuity test, a plurality of user inputs is normally required to accurately determine the user’s results. If the user input is correct, the test may be adjusted to decrease the size of the optotype, e.g. by one point on a logMAR scale. Alternatively, if the user input is incorrect, the test may be adjusted to increase the size of the optotype, e.g. by three points on a logMAR scale. When it is determined that no further adjustments are required, e.g. a user has provided a sufficient number of responses, the test result may be output (S818). The skilled person will also appreciate that the results may be output as a graphical representation compared to normative data, with the ability to plot and monitor changes over time.

[0078] Figure 10 shows more detail of the process for using the user device for testing, e.g. without using the test device. The first shown step is to determine the distance between the user and the user device (step S902). This may be done as described above. Furthermore, if the distance is determined to be incorrect (step S904), the user may be instructed to move the position of the user device (step S906). Once the user device is in the correct position relative to the user, the user device may determine which eye(s) are being tested (step S908), e.g. as described above using wearable members with appropriate visual cues. The steps of the calibration phase may be carried out in any order.

[0079] Once the calibration phase is over, the test may be displayed (step S910) on the user device. The test may be a visual field test, a contrast sensitivity test, a near vision test, a colour vision test or any other test which is conducted at a short distance from the user. During the test phase, user input is received (step S912) and the test may be adjusted (step S914) based on the feedback. Alternatively, if no adjustment is required (e.g. because the test is finished), the result may be output (step S916).

[0080] Figure 11 shows more detail of the process for using the test device for testing babies and infants. The first shown step is to determine the distance between the baby and the test device (step S1002). This may be done as described above. Furthermore, if the distance is determined to be incorrect (step S1004), the adult with the infant may be instructed to move the position of the test device (step S1006). Once the test device is in the correct position relative to the infant, the test device may determine which eye(s) are being tested (step S1008), e.g. as described above using wearable members with appropriate visual cues. The steps of the calibration phase may be carried out in any order.

[0081] Once the calibration phase is over, the test may be displayed (step S1010) on the test device. During the test phase, there is live streaming to the user device which is used by a clinician. User input (i.e. clinician input) is received (step S1012) and the test may be adjusted (step S1014) based on the feedback. Alternatively, if no adjustment is required (e.g. because the test is finished), the result may be output (step S1016).

[0082] Figure 12 is a graph illustrating a Bland-Altman plot for adults. The Bland-Altman plot is a graphical method which compares two measurement techniques. In this case, the plot compares results generated by a current visual acuity test (denoted Chart VA) and by the present invention (denoted DigVis). The upper limit of agreement is 0.28 logMAR and the lower limit of agreement is -0.25 logMAR. The mean bias is +0.009 ±0.09 logMAR. Thus, it can be seen that the present invention (DigVis) generates similar results to the current visual acuity test (Chart VA). This highlights the accuracy of the present invention.
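
The quantities quoted above (mean bias and limits of agreement) follow the usual Bland-Altman definitions and may be computed from paired measurements as follows. The sample data in the test are invented for illustration and are not the study data behind Figure 12:

```python
import statistics


def bland_altman(chart_va, digvis):
    """Mean bias and 95% limits of agreement (bias +/- 1.96 * SD of the
    pairwise differences) for two measurement techniques."""
    diffs = [a - b for a, b in zip(chart_va, digvis)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

A bias near zero with narrow limits of agreement, as reported for DigVis against Chart VA, indicates that the two techniques may be used interchangeably.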

[0083] Various combinations of optional features have been described herein, and it will be appreciated that described features may be combined in any suitable combination. In particular, the features of any one example embodiment may be combined with features of any other embodiment, as appropriate, except where such combinations are mutually exclusive. Throughout this specification, the term “comprising” or “comprises” means including the component(s) specified but not to the exclusion of the presence of others.

[0084] Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.

[0085] All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.

[0086] Although a few preferred embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims. The invention is thus not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.