

Title:
METHOD, SYSTEM AND APPARATUS FOR INVESTIGATING OR ASSESSING EYE OR PUPIL MOVEMENT
Document Type and Number:
WIPO Patent Application WO/2021/217218
Kind Code:
A1
Abstract:
Method, system and apparatus (10) for capturing eye/pupil movement(s). Wearable apparatus (10) includes image capture means (12,18), a light source (16) to illuminate the eye(s). Video recordings can be stored in memory (20) and/or transmitted to a remote location. The apparatus (10) can include on-board image processing. Pupil shape and/or position can be determined e.g. from glints (28). Direction and speed of eye motion can be determined. Head movement (50) and eye motion (52) measurements can be obtained. Measurement by head movement sensors (32), body sensors (34) and/or environment sensors (38) (e.g. gravity) can be factored into an assessment of the user's condition.

Inventors:
LEWKOWSKI KATHERINE ANN (AU)
RAJAN GUNESH P (CH)
DEACON ROBERT JOHN (AU)
Application Number:
PCT/AU2021/050398
Publication Date:
November 04, 2021
Filing Date:
April 30, 2021
Assignee:
NEUROTOLOGIX PTY LTD (AU)
International Classes:
A61B3/113; A61B5/398
Domestic Patent References:
WO2007128034A1 (2007-11-15)
Foreign References:
US20160262608A1 (2016-09-15)
US8690750B2 (2014-04-08)
US20170011263A1 (2017-01-12)
US20160154241A1 (2016-06-02)
Attorney, Agent or Firm:
GRIFFITH HACK (AU)
CLAIMS:

1. A method of assessing changes in eye state using a wearable apparatus covering a wearer’s eyes that detects, measures and/or records eye movement, changes in the eye(s) or movement of features of the eye(s), the method including capturing image data of the wearer's eyes for a period of time, and communicating the captured image data for assessing or informing diagnosis of the absence or presence of a disorder or medical condition.

2. The method of claim 1, wherein a starting point and finishing point of the pupil(s) is used to determine direction and magnitude of eye or pupil movement.

3. The method of claim 2, wherein time taken to go from the starting point to the finishing point is used to determine speed and/or rate of movement.
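
Claims 2 and 3 can be illustrated with a short sketch: given start and finish pupil-centre positions and their timestamps, direction, magnitude and mean speed follow directly. This is an illustrative example only; the function name, coordinate convention and units are assumptions, not part of the claims:

```python
import math

def eye_movement_kinematics(start, finish, t_start, t_finish):
    """Direction, magnitude and mean speed of a pupil displacement.

    `start` and `finish` are (x, y) pupil-centre positions in degrees of
    visual angle (an assumed convention); timestamps are in seconds.
    """
    dx = finish[0] - start[0]
    dy = finish[1] - start[1]
    magnitude = math.hypot(dx, dy)                # size of the movement
    direction = math.degrees(math.atan2(dy, dx))  # bearing, 0 deg = rightward
    speed = magnitude / (t_finish - t_start)      # mean speed (deg/s)
    return direction, magnitude, speed
```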

4. The method of any one of claims 1 to 3, including capturing rotational motion of the eye(s) and/or pupil(s).

5. The method of any one of the preceding claims, wherein time taken for a full or partial rotation is used to determine speed and/or rate of rotation and/or other eye kinematics.

6. The method of any one of the preceding claims, wherein variation/change in angle and/or rate of change of such angle of the pupil(s) relative to a reference axis or point is captured as image data and/or measured.

7. The method of claim 6, wherein the reference axis or point is at a centre of the respective eye(s)/pupil(s) or based on a reference point/axis distant from the eye(s)/pupil(s) at which the eye(s)/pupil(s) is/are required to gaze.

8. The method of any one of the preceding claims, including calibrating the apparatus or set-up/position of the apparatus on a user for investigating changes in the eye(s) and/or the pupil(s) of the user.

9. The method of claim 8, wherein the calibration determines position of the pupil(s) of the user’s eye(s) relative to the apparatus.

10. The method of any one of the preceding claims, including using pixel count of a sensor of a camera or each said camera of the apparatus to determine angle of gaze of the wearer.

11. The method of claim 10, including determining angle of gaze from captured image data and/or image data recordings of eye/pupil position/movement.

12. The method of claim 10 or claim 11, wherein time rate of change of the angle of gaze is determined and/or displayed.
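
Claims 10 to 12 rely on mapping sensor pixels to gaze angle. A minimal sketch, assuming a linear small-angle mapping derived from the camera's field of view (the function names and the linearity assumption are illustrative, not taken from the patent):

```python
def gaze_angle_deg(pupil_px, centre_px, sensor_px, fov_deg):
    """Approximate gaze angle from the pupil's pixel position.

    Offsets from the sensor's optical centre are converted to degrees
    using the camera field of view (linear small-angle approximation).
    """
    px_per_deg = sensor_px / fov_deg
    return (pupil_px - centre_px) / px_per_deg

def gaze_rate_deg_per_s(angle_a_deg, angle_b_deg, dt_s):
    """Time rate of change of gaze angle between two frames (claim 12)."""
    return (angle_b_deg - angle_a_deg) / dt_s
```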

13. The method of any one of the preceding claims, including determining frame position by detecting one or both eyes of a wearer of the apparatus is/are within a frame of reference covered by the at least one image capture/recording apparatus.

14. The method of any one of the preceding claims, including identifying pupil position and/or pupil diameter in a reference region.

15. The method of claim 14, wherein at least one zone outside of the reference region is ignored or rejected or treated as unwanted or unnecessary.

16. The method of claim 14 or claim 15, wherein a previsualisation/detection field within which is captured the eye as the object of interest is used to ensure capturing the eye in image data and unnecessary frame/field beyond the eye is removed/reduced.

17. The method of claim 16, wherein electronic processing is used or the image capture apparatus is zoomed/focused to the eye to remove extraneous areas.

18. The method of any one of the preceding claims, wherein a portion/window of an overall sensor area of the image capture apparatus is used for windowing such that captured image data is derived from a portion of a sensor area.

19. The method of claim 18, wherein the portion/window is moved to centralise with respect to the eye(s)/pupil(s) to keep the eye(s)/pupil(s) in frame.

20. The method of any one of the preceding claims, wherein position of the eye(s)/pupil(s) relative to one another and/or to a reference is used for positioning the apparatus on/relative to the user.

21. The method of claim 20, wherein height of a feature of one said eye relative to a feature of the other said eye and/or to at least one reference is used for positioning the apparatus on/relative to the user.

22. The method of claim 20 or claim 21, wherein positions of corners of the eyes are determined and used to compensate for apparatus position on the face.

23. The method of any one of the preceding claims, wherein one or more of pupil centre, pupil shape or pupil position relative to a horizontal plane/reference is used for calibration/position determination.

24. The method of claim 23, wherein, if the pupils are not aligned on a horizontal plane, degree of pupil misalignment is calculated and used to either realign eye images to a true horizontal plane or as an adjustment parameter in later image post processing and analysis.
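
The realignment in claim 24 can be sketched as follows: the roll angle of the line joining the two pupil centres is measured against true horizontal, and image coordinates are rotated back by that angle. Function names and coordinate conventions here are illustrative assumptions:

```python
import math

def interpupillary_roll_deg(left_pupil, right_pupil):
    """Angle (deg) of the line joining the pupil centres relative to
    horizontal; non-zero suggests the apparatus or head is tilted."""
    dx = right_pupil[0] - left_pupil[0]
    dy = right_pupil[1] - left_pupil[1]
    return math.degrees(math.atan2(dy, dx))

def relevel(point, pivot, roll_deg):
    """Rotate an image coordinate about `pivot` by -roll_deg to restore
    a true horizontal plane."""
    a = math.radians(-roll_deg)
    x, y = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + x * math.cos(a) - y * math.sin(a),
            pivot[1] + x * math.sin(a) + y * math.cos(a))
```

Alternatively, the measured roll can be stored and applied as an adjustment parameter during later post-processing, as the claim also allows.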

25. The method of any one of the preceding claims, including measuring/imaging variation in pupil shape from generally circular to ovoid/ellipsoid and compensating for non-circular irregularity via the image capture means and/or processing means.

26. The method of claim 25, including determining from the pupil shape a confidence factor in the accuracy of the eye kinematics.

27. The method of claim 26, wherein the confidence factor is used during analysis to modify or reject eye kinematic measurements.

28. The method of claim 26 or claim 27, wherein the confidence factor includes one or more of pixel count representing the shape of the pupil image, percentage of eyelid closure obstructing the view of the pupil, size of the pupil, angle of the pupil from the capture apparatus centre.
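
Claims 26 to 28 combine several image-quality cues into a confidence factor. A hedged sketch follows; the particular scores, weighting and threshold are illustrative choices, not specified by the claims:

```python
def pupil_confidence(pixel_count, eyelid_closure_pct, axis_ratio):
    """Combine claim-28 cues into a 0..1 confidence weight.

    `pixel_count`: pixels representing the pupil shape;
    `eyelid_closure_pct`: percentage of the pupil hidden by the eyelid;
    `axis_ratio`: minor/major ellipse axis ratio (1.0 = circular).
    """
    size_score = min(pixel_count / 200.0, 1.0)          # enough pixels to fit a shape
    occlusion_score = 1.0 - eyelid_closure_pct / 100.0  # less eyelid, more confidence
    return size_score * occlusion_score * axis_ratio

def accept_measurement(confidence, threshold=0.5):
    """Reject eye-kinematic samples whose confidence is too low (claim 27)."""
    return confidence >= threshold
```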

29. The method of any one of the preceding claims, including detecting partial or full closure of the eyes/eyelids.

30. The method of claim 29, including using threshold timing to allow for a user blinking.

31. The method of claim 30, wherein blinking is detected if full or partial eye closure is within a threshold time.

32. The method of claim 29, wherein, if full or partial eye closure is beyond a threshold time, an alert occurs and/or recording stops and/or the apparatus shuts down or goes into sleep mode.
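
Claims 30 to 32 hinge on a closure-duration threshold that separates ordinary blinks from sustained eye closure. A minimal sketch with an assumed 0.4 s default (the value and the two-way classification are illustrative, not from the claims):

```python
def classify_closure(closure_duration_s, blink_threshold_s=0.4):
    """Treat short closures as blinks to be ignored; flag longer ones
    for an alert, recording stop or sleep-mode decision (claims 30-32)."""
    if closure_duration_s <= blink_threshold_s:
        return "blink"
    return "alert"
```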

33. The method of any one of the preceding claims, wherein an alert is provided, or the apparatus ceases operating or does not start working until external light is removed or falls to an accepted level within the apparatus.

34. The method of any one of the preceding claims, wherein the at least one image capture apparatus is provided below a plane of straight-ahead/horizontal gaze of the user.

35. The method of any one of the preceding claims, including detecting one or more of head movement, head orientation and/or eye orientation or pupil orientation relative to the head and/or the apparatus.

36. The method of any one of the preceding claims, including detection and/or imaging of presence and/or location of at least one glint in a pupil or both eyes relative to respective pupil centre.

37. The method of claim 36, including identifying the at least one glint and position of the at least one glint relative to an edge of a pupil, iris or centre of a pupil.

38. The method of claim 37, wherein, for a given light source position within the apparatus, position of the glint is used to determine eye and/or pupil centre position.

39. The method of claim 37 or claim 38, wherein distance and/or position of the glint(s) from the respective pupil centre is used to determine apparatus position and/or orientation on the face of the user.

40. The method of any one of claims 37 to 39, wherein distance between two said glints and/or between at least one said glint and the centre of the pupil or other reference is used to compensate for non-ideal apparatus placement/orientation on the user’s head/face with respect to eye position.
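
Claims 36 to 40 use the corneal glint's position relative to the pupil centre. For a fixed light-source position, the glint-to-pupil offset shifts as the apparatus moves on the face, so comparing it against an expected offset can quantify placement error. An illustrative sketch (names and conventions assumed):

```python
import math

def glint_offset(glint, pupil_centre):
    """Vector from pupil centre to a corneal glint, in sensor pixels."""
    return (glint[0] - pupil_centre[0], glint[1] - pupil_centre[1])

def placement_error_px(glint, pupil_centre, expected_offset):
    """Distance between observed and expected glint offsets; large
    values suggest the apparatus has shifted on the wearer's face."""
    dx, dy = glint_offset(glint, pupil_centre)
    return math.hypot(dx - expected_offset[0], dy - expected_offset[1])
```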

41. The method of any one of the preceding claims, including using one or more of environmental metrics, body metrics or head metrics, in addition to information/data obtained or derived from image capture information/data from the apparatus, to assess neurological and/or vestibular condition or activity.

42. The method of claim 41, wherein the environmental metrics include information/data obtained or derived from at least one of: underwater conditions, temperature, ambient temperature, temperature change, air/water pressure and/or humidity, water density/salinity, high altitude, space flight, outer space and space living conditions, reduced/low gravity living, and artificial, centrifuge and/or rotation-induced gravity.

43. The method of claim 42, wherein the environmental metrics include one or more of change in gravity, direction and/or amount of gravity, radiation, atmospheric pressure, hydrostatic pressure, pollen count, oxygen level, presence of one or more gases, and/or the level of any gas or combination of gases.

44. The method of any one of the preceding claims, including assessing effects on a person of high altitude, space flight, off-earth time, low/reduced gravity relative to earth, by detecting/measuring eye position/changes in eye position relative to head position with change in environmental metrics/conditions.

45. The method of claim 41, wherein the body metrics include monitoring/measuring body functions and/or activities, one or more of blood pressure, body temperature, pulse, ECG, EEG, blood glucose, hearing level/ability, lung capacity, exercise sensors and information from body scans/images.

46. The method of claim 41, wherein the body metrics are sensed, measured and/or used in combination with eye kinematics/eye state measured by or derived from the apparatus to determine/assess changes in the body that occur before and/or during neuro-vestibular dysfunction or changes in neuro-vestibular function.

47. The method of any one of the preceding claims, wherein one or more images or image data is used to identify the wearer or features of the wearer for personal identification, authentication, security, access control, authority to control/use equipment, authority to allow others such access/use.

48. The method of any one of the preceding claims, including determining general health, personal health, stress, fatigue or illness by detection/measurement of eye state or changes in eye state.

49. The method of any one of the preceding claims, including assessing data relating to head movement and eye or pupil motion.

50. The method of claim 49, the assessing further including assessing data relating to at least one measured body function and/or at least one environmental factor.

51. The method of claim 50, wherein the at least one body function includes at least one of heart rate, blood pressure, body temperature and oxygen level.

52. The method of claim 50 or claim 51, wherein the at least one environmental factor includes at least one of ambient temperature, air pressure, humidity, location, gravity.

53. The method of any one of claims 49 to 52, including determining at least one threshold, safe range or cut-off.

54. The method of claim 53, wherein the at least one threshold, safe-range or cut-off includes determining one or more of a safe threshold for a person conducting an activity, a threshold for time spent in or presence in a particular gravity or change of gravity.

55. A system for use in assessing changes in eye state, the system including a wearable apparatus for covering a wearer’s eyes that detects, measures and/or records eye movement, changes in the eye(s) or movement of features of the eye(s), the system including image data capture means for capturing image data of motion or at least one feature of the wearer's eyes for a period of time, communication means for communicating the captured image data, and processing means for use in assessment or informing diagnosis, absence or presence of a disorder or medical condition.

56. The system of claim 55, wherein the apparatus is configured to capture a starting point and finishing point of the pupil(s) to determine direction and magnitude of eye or pupil movement.

57. The system of claim 56, wherein the system is configured to capture image data and determine time taken for the pupil(s) to go from the starting point to the finishing point for the system to determine speed and/or rate of movement.

58. The system of any one of claims 55 to 57, including a reference axis or point determined at a centre of the respective eye(s)/pupil(s) or based on a reference point/axis distant from the eye(s)/pupil(s) at which the eye(s)/pupil(s) is/are required to gaze.

59. The system of any one of claims 55 to 58, including pixel count of a sensor of a camera or each said camera of the apparatus to determine angle of gaze of the wearer.

60. The system of any one of claims 55 to 59, including processing means to determine angle of gaze of the eye(s) from image data and/or image data recordings of eye/pupil position/movement.

61. The system of any one of the preceding claims, including a reference region applicable to pupil position and/or pupil diameter.

62. The system of claim 61, including at least one zone outside of the reference region, wherein such at least one zone is ignored or rejected or treated as unwanted or unnecessary during processing.

63. The system of claim 61 or claim 62, including a previsualisation/detection field within which the eye or pupil is captured as an object of interest for capturing the eye in image data and unnecessary frame/field beyond the eye or pupil is removed/reduced.

64. The system of claim 63, including electronic processing or zoom/focus of the image capture apparatus to remove extraneous areas.

65. The system of any one of claims 55 to 64, wherein a portion or window of an overall sensor area of the image capture apparatus is provided for windowing such that captured image data is derived from a portion of a sensor area.

66. The system of claim 65, wherein the portion or window is centralised with respect to the eye(s)/pupil(s) to keep the eye(s)/pupil(s) in frame.

67. The system of any one of claims 55 to 66, including one or more of pupil centre, pupil shape or pupil position relative to a horizontal plane/reference for calibration/position determination.

68. The system of any one of claims 55 to 67, including measurement means for measuring/imaging variation in eye/pupil shape from generally circular to ovoid/ellipsoid, and compensation means for compensating for non-circular irregularity via the image capture means and/or processing means.

69. The system of any one of claims 55 to 68, including one or more of pixel count representing the shape of the pupil image, percentage of eyelid closure obstructing the view of the pupil, size of the pupil or angle of the pupil from the capture apparatus centre as a confidence factor for determining accuracy of eye/pupil kinematics.

70. The system of any one of claims 55 to 69, including a light detector configured to detect ingress of external light and instigate an alert, or to stop or prevent the apparatus from operating until the external light is removed or falls to an accepted level within the apparatus.

71. The system of any one of claims 55 to 70, wherein the at least one image capture apparatus is provided below a plane of straight-ahead/horizontal gaze of the user/wearer.

72. The system of any one of claims 55 to 71, including at least one measurement means to measure one or more of environmental metrics, body metrics or head metrics, in addition to information/data obtained or derived from image capture information/data from the apparatus, to assess neurological and/or vestibular condition or activity.

73. The system of claim 72, wherein the environmental metrics include information/data obtained or derived from at least one of: underwater conditions, temperature, ambient temperature, temperature change, air/water pressure and/or humidity, water density/salinity, high altitude, space flight, outer space and space living conditions, reduced/low gravity living, and artificial, centrifuge and/or rotation-induced gravity.

74. The system of claim 73, wherein the environmental metrics includes one or more of change in gravity, direction and/or amount of gravity, radiation, atmospheric pressure, hydrostatic pressure, pollen count, oxygen level, presence of one or more gases, and/or the level of any gas or combination of gases.

75. The system of claim 73 or claim 74, wherein the body metrics include monitoring/measuring body functions and/or activities, one or more of blood pressure, body temperature, pulse, ECG, EEG, blood glucose, hearing level/ability, lung capacity, exercise sensors and information from body scans/images.

76. The system of claim 75, wherein the body metrics are used in combination with eye kinematics/eye state measured by or derived from the apparatus to determine/assess changes in the body that occur before and/or during neuro-vestibular dysfunction or changes in neuro-vestibular function.

77. The system of any one of claims 55 to 76, wherein one or more images or image data identifies the wearer or features of the wearer for personal identification, authentication, security, access control, authority to control/use equipment, authority to allow others such access/use.

Description:
METHOD, SYSTEM AND APPARATUS FOR INVESTIGATING OR ASSESSING EYE OR PUPIL MOVEMENT

FIELD OF THE INVENTION

[0001] The present invention relates to the investigation or assessment of changes in the eye, such as pupil position, pupil size, eye position and/or eye movement. Such investigation can include measuring and monitoring eye/pupil movement.

[0002] One or more forms of the present invention relate(s) particularly, though not exclusively, to an apparatus, system and/or method for the investigation of motion and/or changes in the eye and pupil, such as eye position, pupil size, pupil position and/or eye movement (such eye motion patterns as in nystagmus), which may be associated with vestibular and other neurological disorders.

BACKGROUND TO THE INVENTION

[0003] The presence of a number of medical and health problems can be detected by changes in the eye and/or its pupil, including changes in the position, movement and dilation of the eye and/or its pupil.

[0004] For example, it has been found that one of the leading causes of dizziness and balance problems is associated with the vestibular system in the inner ear or in the vestibular pathways and control centres of the central nervous system.

[0005] The presence of nystagmus during a dizziness attack can suggest to a clinician that there is a vertiginous component to it, and the direction of the nystagmus may provide a specialist in the field with more specific evidence, such as which ear (or which part of the ear) has the active disease.

[0006] In addition, the pattern of the nystagmus of one or both eyes can also indicate which part of the central nervous system is affected.

[0007] Nystagmus refers to rapid involuntary eye movements presenting as one or both eyes moving from side to side, up and down or rotating, or with discordance between the two eyes. Nystagmus is understood to be caused by neurological or inner ear disorders.

[0008] Dizziness and balance problems constitute a major public health problem. A significant proportion of adults have had an episode of dizziness occurring with enough intensity or frequency to prompt a visit to the doctor. It is one of the most difficult complaints to assess, as it is a subjective symptom of potentially numerous causes. Dizziness is often an episodic symptom, with the frequency of episodes highly variable. As a patient will very rarely have an episode in the clinic, clinicians frequently need to rely solely on the patient's (often unintentionally misleading) report of the symptom.

[0009] Accordingly, accurate diagnosis of balance symptoms is important not only to exclude potentially serious central causes but also to aid successful treatment. Unfortunately, accurate diagnosis is often not possible, or is delayed. Studies have shown that general practitioners (GPs) rarely failed to refer urgent cases but often failed to refer patients with persistent vestibular conditions.

[0010] Part of the difficulty is that incumbent nystagmography apparatus for detecting and recording nystagmus are large, expensive machines that are only available for use in-clinic by specialists. Due to their size, complexity and expense, these machines are limited to laboratory or office-based use for specific tests. They are not suited to monitoring patients engaged in their everyday activities at the time of a dizzy attack.

[0011] Known ‘in-clinic’ apparatus measure eye movements using Videonystagmography (VNG) equipment. These systems are large, expensive and limited to in-clinic use. They are primarily used to record eye movements during tests that are designed to bring on symptoms.

[0012] Dizziness can be severely impairing. It can affect both physical and mental health as well as impede or otherwise affect the ability to work. Furthermore, it is significantly associated with the risk of falling and can lead to avoidance behaviour.

[0013] There are a number of underlying causes to dizziness. These include neurological, inner ear, psychiatric (especially anxiety and depression), chemical and cardiovascular pathologies. Diagnosis of the root cause is critical to exclude life-threatening conditions (such as stroke) and ensure appropriate referral for proper treatment and care for ongoing conditions.

[0014] Unfortunately, dizziness is a non-specific symptom and is often very difficult to assess. It is usually an episodic symptom with a high proportion of people reporting recurrent attacks. Episodes can last from seconds to days, and they frequently occur at home or work where there is often no means to record the symptoms of the event. As a result, the clinician often relies on the patient's account of their history of dizziness, and these accounts can be highly subjective and inconsistent.

[0015] There are four main types of dizziness: vertigo, presyncope, disequilibrium and light-headedness. Vertigo is often a defining symptom in inner ear disorders and neurological conditions but can also be present in other pathologies.

[0016] Vertigo can be objectively verified in the clinic by examining eye movements and determining the presence of a particular type of nystagmus related eye movements.

[0017] Nystagmus can be induced artificially during controlled testing in the clinic to test a limited activity range of the vestibular system.

[0018] Nystagmus can occur during times of dizziness. Capturing eye/pupil movements during an episode differentiates vertigo from the other types of dizziness, and further information such as the direction and speed of the nystagmus can often lead to diagnosis that is more specific. Furthermore, the difference in movement of the nystagmus relative to each eye can further contribute useful information regarding the underlying cause.

[0019] An apparatus and process for investigating nystagmus is discussed in international patent application PCT/AU2005/001671, published as WO2006/047816. However, such an apparatus and method has at least one limitation and/or drawback.

[0020] One of the main problems with obtaining an accurate diagnosis of the cause of dizziness episodes (that can be acute) is that they can be unpredictable and frequently occur away from the medical practitioner, such as at home or work, where there is no equipment to record external symptoms of the event.

[0021] In these circumstances medical practitioners rely on the patient's account of their history of dizziness/vertigo episodes, and these accounts are highly subjective. This situation makes a complex diagnostic case even more challenging, as there are many different underlying pathologies associated with dizziness. For example, dizziness can be due to inner ear disorders, low blood pressure, neurological disorders (such as Parkinson's disease), medications, cancer, anxiety, low iron levels and low blood sugar. In these circumstances, doctors find it very difficult to identify and distinguish the underlying cause and seriousness of the dizziness and are uncertain about what type of specialist the patient needs to see.

[0022] Movement of the eyes or pupils during an episode of dizziness can give important information about the underlying pathology. The main eye and/or pupil movement observed during vertigo or dizziness is nystagmus. Nystagmus is an objective sign of the symptom of vertigo or dizziness and can help in the diagnosis of vestibular (peripheral-inner ear) or neurologic disorders. The absence of nystagmus can help the doctor focus on various other causes.

[0023] Nystagmus includes repetitive, rhythmic, to-and-fro involuntary eye movements. Such movements can include a slow drift of the eye(s) or pupil(s) in one direction (slow phase) followed by a faster correction movement (fast phase) in the opposite direction, with or without a rotational component and with or without opposing nystagmus directions between the two eyes and/or pupils. Nystagmus is classified by the direction of the slow phase, which can be horizontal (left, right), vertical (up, down) or torsional (clockwise, anti-clockwise).

[0024] A single nystagmus is rarely a useful clinical sign, rather a series of similar nystagmus must be observed. For this reason, velocities of all slow waves, measured and plotted over time, can give a useful clinical picture.
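
The slow-phase velocity plot described above can be sketched as follows: differentiate the eye-position trace and discard samples above a velocity threshold, taken here to represent fast-phase beats. The threshold value and the simple finite-difference method are illustrative assumptions, not a clinical algorithm:

```python
def slow_phase_velocities(positions_deg, dt_s, fast_phase_threshold=30.0):
    """Per-sample eye velocities (deg/s) with fast phases removed.

    `positions_deg` is an eye-position trace sampled every `dt_s`
    seconds; velocities at or above the threshold are dropped, leaving
    the slow-phase drift that is plotted over time.
    """
    velocities = [(b - a) / dt_s for a, b in zip(positions_deg, positions_deg[1:])]
    return [v for v in velocities if abs(v) < fast_phase_threshold]
```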

[0025] Patients with nystagmus should preferably be observed for a period of 2 minutes or more to exclude the possibility of alternating nystagmus and to ensure that the current nystagmus are recorded accurately/clearly.

[0026] Several recordings during one episode can track the progress of the symptoms.

[0027] Several recordings can be used to detect recovery nystagmus (change of direction of nystagmus).

[0028] Some nystagmus episodes can last hours or even days.

[0029] Where nystagmus can be recorded and movements tracked, clinicians are provided with an objective indicator of the underlying pathology and ultimately the cause of a patient's dizziness, vertigo or imbalance; this reduces reliance on artificially induced assessment, subjective interpretation and recall of symptoms by the patient, which is often misleading and an inefficient use of time and resources.

[0030] Neurologic and peripheral vestibular inner ear disorders can be differentiated based on the direction of nystagmus; purely vertical nystagmus usually indicates a central (neurological) origin, while horizontal nystagmus is often associated with peripheral (vestibular) conditions.

[0031] A specific example of the diagnostic utility of nystagmus is found in differentiating Meniere's disease from vestibular migraine. Research suggests that vestibular migraine is associated with nystagmus that will not change direction during an episode while those associated with Meniere's disease will change direction in the recovery phase.

[0032] It is with the aforementioned in mind that the present invention has been developed.

[0033] References to prior art in this specification are provided for illustrative purposes only and are not to be taken as an admission that such prior art is part of the common general knowledge in Australia or elsewhere.

SUMMARY OF THE INVENTION

[0034] One or more forms of the present invention was developed with a view to improving investigation of changes in the eye(s) and/or pupil(s), such as for investigating the presence of nystagmus or other eye feature (eye and/or pupil) movement related conditions.

[0035] An apparatus according to one or more forms of the present invention can be used away from the clinic, can be wearable (e.g. temporarily for recording/sensing an episode or for a longer time period, e.g. if a series of episodes occur close together), and does not need a specialist to operate it.

[0036] However, it will be understood that the apparatus may also be used more generally in oculography and for the investigation of other changes of one or more eye features e.g. in the eye, iris or pupil.

[0037] It will be appreciated that the apparatus can be used to monitor treatment outcomes. For example, once a treatment has been started (e.g. surgical, medical or physiotherapy), the apparatus can be used to monitor the effect, progress or outcome of the treatment.

[0038] The apparatus can detect/monitor if there is a change in spontaneous nystagmus or monitor the progress/recovery of a vestibular injury.

[0039] The apparatus can optionally be used to monitor any vestibular side effects after stroke and other neurologic conditions, and after neurosurgical and otologic (ear) surgery (e.g. after vestibular schwannoma surgery or cochlear implantation).

[0040] It will be appreciated that the apparatus can optionally be used to monitor vertigo or dizziness symptoms during clinical trials including pharmaceutical trials.

[0041] In particular, the apparatus can optionally be used as a remote patient monitoring (RPM) apparatus that determines the presence of vertigo or dizziness symptoms in relation to the introduction of a treatment such as those that may treat dizziness and balance problems associated with the vestibular system in the inner ear, vestibular pathways or the central nervous system. Example: pharmaceuticals to treat migraine.

[0042] Furthermore, vertigo and dizziness are common side effects in many drug categories (antihypertensives, antibiotics, diuretics, NSAIDs, antidepressants, antimigraine agents, etc.).

[0043] The apparatus can be used to remotely monitor and analyse vertigo/dizziness as an adverse effect during the administration of medication and drugs.

[0044] Establishing the speed and direction of nystagmus over time is important for determining the underlying mechanisms responsible for the nystagmus. For example, up-beating and down-beating nystagmus are often associated with lesions of the brain, while left-beating and right-beating nystagmus are often associated with peripheral lesions in the inner ear. Therefore, when analysing the captured/recorded eye movements, the clinician needs to know accurately if the direction of the movement represented on the video is true.

[0045] The apparatus may unintentionally or through patient intervention be positioned incorrectly on the face, and the video recording of eye position consequently taken at a non-optimal angle. As a result, the patterns/tracings may not be truly representative of the actual direction of the eye movement.

[0046] One or more forms of the present invention includes position calibration on the user’s head/face.

[0047] Calibration (or compensation) refers to referencing a particular feature of the apparatus/method to a reference or creating a reference with which to base ongoing measurements, recording or use etc. The present invention encompasses various embodiments of calibration, as described herein, such as frame positioning (windowing), apparatus positioning (correct position on user’s face/head), and gaze positioning/pupil position (adjusting/compensating for angle of gaze). Other forms of calibration or compensation are envisaged to fall within the scope of the present invention.

[0048] The apparatus may include a self-contained portable face cover for at least part of a wearer’s face including the eyes, such as a mask, goggles or glasses, for use in the investigation of motion of the eye(s), motion of one or more eye or pupil features and/or changes in the eye or pupil, such as presented during nystagmus.

[0049] The apparatus may include: a light-excluding cover for the eyes; a suitable light source to illuminate the wearer's eyes; a digital image sensor to capture images; a data processing unit for processing the captured images; a storage medium to record the processed images; and, a communications port (wired and/or wireless) for communicating images and commands to an external receiver.

[0050] A method of using the apparatus to investigate changes of the eye is also encompassed within one or more forms of the present invention.

[0051] For example, one or more forms of the present invention may include a method of investigating changes in the eye using an apparatus that detects, measures and/or records eye movement, changes in the eye(s) or movement of features of the eye(s), the method including applying the apparatus, such as a face cover for at least part of the face including the eyes, to a wearer’s face (covering the eyes), capturing images of the wearer's eyes for a period of time after said applying the apparatus; and communicating the captured images from the apparatus to an external receiver for analysis or analysing the images within the apparatus, wherein such analysis of the images is for assessing or informing diagnosis, absence or presence of a disorder or medical condition.

[0052] One or more forms of the present invention includes a method of calibrating the apparatus or set-up/position of the apparatus on a user for investigating changes in the eye(s) and/or pupil(s) of the user, such as nystagmus.

[0053] Such calibration may be conducted to determine the position of the pupil(s) of the user’s eye(s) relative to the apparatus. The position needs to be known so that when the pupil(s) move, the angular degrees of movement can be accurately determined.

[0054] It will be appreciated that nystagmus is categorised by its speed and direction.

[0055] The starting point and finishing point of the pupil(s) may be used to determine direction and magnitude of movement.

[0056] Time taken to go from the eye/pupil motion starting point to the finishing point (e.g. from one to the other) can be used to determine speed and/or rate of movement e.g. as degrees per second.
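The determination described in paragraphs [0055] and [0056] can be sketched as follows. This is a minimal illustration rather than the specification's implementation; the function name and the (horizontal, vertical) gaze-angle representation are assumptions.

```python
import math

def eye_velocity_deg_per_s(start_deg, finish_deg, start_t, finish_t):
    """Magnitude, direction and speed of an eye/pupil movement.

    start_deg/finish_deg: (horizontal, vertical) gaze angles in degrees.
    start_t/finish_t: timestamps in seconds.
    """
    dh = finish_deg[0] - start_deg[0]
    dv = finish_deg[1] - start_deg[1]
    magnitude = math.hypot(dh, dv)                 # magnitude in degrees
    direction = math.degrees(math.atan2(dv, dh))   # direction of movement
    speed = magnitude / (finish_t - start_t)       # degrees per second
    return magnitude, direction, speed
```

A 5-degree excursion completed in half a second would thus report a speed of 10 degrees per second.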

[0057] Rotational motion of the eye/pupil (e.g. turning about a fixed or moving point) can be detected. Time taken for the rotational motion may be used to determine speed and/or rate of rotation and other eye kinematics.

[0058] Variation/change in angle and/or rate of change of such angle of the pupil relative to a reference axis or point can be measured. The reference axis or point can be, for example, positioned at a centre of the eye/pupil or based on a reference point/axis distant from the eye/pupil at which the eye/pupil is required to gaze.

[0059] Changes in angular velocity of eye/pupil movement are relative to each other or to a reference, rather than absolute. It is the pattern of the changes over time that is most useful.

[0060] A slow phase velocity (speed) of nystagmus is usually between 0 degrees/sec and 100 degrees/sec. A fast phase can be more than 100 degrees/sec.
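Using the 100 degrees/sec figure from paragraph [0060], a sampled velocity trace can be segmented into slow and fast phases. The function name is an assumption; only the threshold comes from the description.

```python
def classify_phases(velocities, fast_threshold=100.0):
    """Label each sampled eye velocity (degrees/sec) as a slow or
    fast phase of nystagmus; the fast (corrective) phase typically
    exceeds 100 degrees/sec."""
    return ["fast" if abs(v) > fast_threshold else "slow"
            for v in velocities]
```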

[0061] One or more aspects of the present invention provides a system and/or method of setting up or calibrating the apparatus for use to help ensure correct frame position (eyes correctly positioned for viewing/recording movement of the eyes and/or correct positioning of the apparatus on the user’s head/face and/or correct position within a sensor area of the image to be recorded).

[0062] Preferably, pixel count of a sensor of the camera or each said camera may be used to determine angle of gaze of the user.

[0063] Preferably, determining angle of gaze of a user need not be done in real time, although it may be. Alternatively, the angle of gaze may be post-calculated/post-determined from the recording(s) of eye/pupil position/movement, and may be displayed over time, such as in a graph.

[0064] Angle of gaze may be displayed with respect to time e.g. graphing rate of change of angle of gaze.

[0065] Frame position (frame calibration or windowing): refers to the eye feature position (e.g. pupil position) within the captured video/picture frame. This preferably occurs for each eye.

[0066] Apparatus position: refers to the position of the apparatus (which may be embodied as goggles/cameras) on the user’s head/face. This is in relation to the horizontal plane with respect to the user’s eyes or other features of or around the eyes.

[0067] Pupil position (calibration of gaze or gaze calibration): refers to determining the correct angle of gaze (such as relative from a central position or horizontal) for the positions of the eye. This allows for the correct magnitude of eye movements to be calculated.

[0068] Frame Position (windowing)

[0069] Frame position preferably includes detecting one or both eyes of a wearer of the apparatus is/are within a frame of reference covered by at least one image recording apparatus, such as a camera e.g. a video camera.

[0070] The apparatus may detect, sense or recognise at least one eye feature of a user’s eye(s), e.g. one or both pupils. For example, the apparatus may recognise the iris pattern of a user based on a previous use or recording by the apparatus for the same user.

[0071] One or more other eye features may be recognised by the apparatus, such as pupil position, iris position, movement, patterning and/or colour or a combination of any two or more thereof.

[0072] Pupil position and/or diameter in a reference region may be identified. Zones outside of the region may be ignored or rejected by the apparatus/system/method i.e. treated as unwanted or unnecessary areas.

[0073] It will be appreciated that the eyes/pupils need not be perfectly centralised in the frame/field; rather, the aim is for them to be sufficiently centralised to remove extraneous areas and ensure capture of the eyes/pupils of interest.

[0074] For example, a relatively wide/large visualisation/detection field (e.g. relatively large viewing ‘frame’ within which is captured the eye as the object of interest) can be employed to ensure capturing the eye. In such a large field/frame the eye can be identified and the unnecessary frame/field outside of the eye can be removed/cut down electronically through processing or the image capture apparatus (e.g. a camera) can be zoomed/focused to the eye to remove extraneous areas.

[0075] Reducing the frame/field to cover the areas of interest (e.g. the eye) provides for higher definition imaging in the smaller field/frame area of interest. This could also be done manually e.g. by the clinician in the initial set up of the apparatus with a patient, such as by selecting a desired frame from a larger full frame picture/image.

[0076] Selecting a desired frame rate and/or frame resolution (e.g. pixel count) can be used to determine the definition and file size of the video recording.

[0077] One or more forms of the present invention utilises windowing. For example, a window derived from the image sensor.

[0078] Windowing can reduce file-size and/or increase frame rate. Windowing can, for example, utilise a window of 640x480 pixels of a sensor.

[0079] A portion of an overall sensor area may be used for windowing. That is, image data may be derived from a portion of a sensor area rather than the entire area of the sensor. Alternatively, or in addition, not every pixel across a sensor need be used.

[0080] To reduce overall size of recorded images (stills and/or video), a portion of the overall sensor may be used or selected pixels from across the overall sensor, or a combination of a portion of the sensor and some of the pixels within that portion.

[0081] Position of the eye feature(s) can be moved (such as electronically) to centralise on the sensor.

[0082] Preferably, a reduced image window (e.g. 640x480 pixels) from the image sensor is utilised to reduce file-size and increase frame rate.
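The windowing described in paragraphs [0077]-[0082] can be sketched as selecting a reduced window (e.g. 640x480 pixels) centred on the detected pupil and clamped to the sensor boundary. The function name and the example sensor size are assumptions, not values from the specification.

```python
def window_bounds(pupil_xy, sensor_wh, win_wh=(640, 480)):
    """Top-left corner of a win_wh window centred on the pupil,
    clamped so the window stays within the sensor area. Reading
    out only this window reduces file size and can increase the
    achievable frame rate."""
    px, py = pupil_xy
    sw, sh = sensor_wh
    ww, wh = win_wh
    x = min(max(px - ww // 2, 0), sw - ww)
    y = min(max(py - wh // 2, 0), sh - wh)
    return x, y
```

Re-evaluating the window per frame from the tracked pupil location gives the dynamic re-centring behaviour of paragraphs [0083]-[0084].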

[0083] One or more embodiments envisages using eye feature (e.g. pupil) location relative to the head/window/apparatus to dynamically move the window to centralise with respect to the eyes. This will give the benefit of still having the eye in frame if the apparatus is mispositioned on the head.

[0084] Windowing can also use pupil location to dynamically move the window to centre on the eyes (e.g. during recording). This gives the benefit of still having the eye in frame if the apparatus, such as incorporated in or as goggles/glasses or other headset, is mispositioned.

[0085] Apparatus Position

[0086] Eye feature position or eye position relative to position in the viewing frame of the apparatus (e.g. frame position relative to the eyes/eye feature) can be adjusted based on the determined position of the pupil(s), or any other feature, within the region.

[0087] The apparatus may be positionally adjusted on the face of the wearer/user or the eye/pupil position may be compensated for in the viewing frame by processing within the apparatus. For example, the apparatus may be moved on the face to position the eyes/pupils centrally with respect to viewing alignment of the camera(s).

[0088] Alternatively, or in addition, eye/pupil position can be centralized automatically by the apparatus/method e.g. electronically by moving and/or reducing visualised field/frame area to ensure eyes/pupils are sufficiently centralized.

[0089] Position of the pupils relative to one another and/or to a reference (such as at least one real or virtual horizontal reference, such as a line, with respect to the apparatus) may be used for positioning the apparatus on/relative to the user e.g. apparatus position calibration.

[0090] Height of a feature of one eye, such as a pupil, relative to a feature of the other eye, such as the other pupil, and/or to the at least one reference, may be used for positioning the apparatus on/relative to the user e.g. apparatus position calibration.

[0091] Height of eye features (e.g. pupils) relative to one another and/or to the reference can be used to determine whether the apparatus has moved or is not correctly positioned on the user, such as in a horizontal apparatus position/orientation.

[0092] If the user’s eye features (e.g. pupils) are not at the same height as viewed by the camera(s), that is, the eye features (e.g. pupils) are at different relative height positions, the apparatus can be assumed to be not level.
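The levelness check of paragraphs [0091]-[0092] can be sketched as computing a roll (tilt) angle from the two pupil positions in the image. The function name is an assumption; coordinates are in image pixels.

```python
import math

def apparatus_roll_deg(left_pupil, right_pupil):
    """Estimated roll (tilt) of the apparatus from the two pupil
    positions. If both pupils sit at the same image height the
    apparatus can be assumed level; a height difference implies a
    tilt whose angle follows from the pupils' horizontal separation.
    (In image coordinates y typically increases downwards, so the
    sign convention depends on the camera layout.)"""
    dx = right_pupil[0] - left_pupil[0]
    dy = right_pupil[1] - left_pupil[1]
    return math.degrees(math.atan2(dy, dx))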

[0093] Position of the corners of the eyes (canthi - the inner and outer canthi are, respectively, the medial and lateral ends/angles of the palpebral fissure) may be determined and used to compensate for apparatus position on the face.

[0094] One or more forms of the present invention envisages determining apparatus position/orientation, e.g. non-ideal (non-horizontal with respect to the eyes/pupils) placement of the apparatus, on a user’s head.

[0095] Apparatus position can affect the calibration of the angle of gaze (e.g. pupil position), and therefore determining the position/orientation of the apparatus with respect to the user’s head/face/eyes can be valuable for correcting for incorrect positioning/orientation or compensating for a particular position/orientation to ensure accuracy of any assessment/recording of eye kinematics.

[0096] When the pupils are not aligned on the horizontal plane, one or more embodiments of the present invention encompasses lining the centres of the pupils up by either determining position over the whole video image (realigning) or using the determined true horizontal plane to determine the direction of the nystagmus (e.g. in the analysis), which can be used to inform graphs and/or interpretation.

[0097] This can be achieved by using a correction factor based on the calculated rotation necessary to be aligned. Further discourse on such ‘correction factor’ is provided below.
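The correction factor of paragraph [0097] amounts to rotating measured image points by the negative of the apparatus roll so that they lie on the true horizontal plane. A minimal sketch follows; the function name is an assumption, and with image coordinates (y down) the rotation sign may need to be flipped.

```python
import math

def realign_point(point, centre, roll_deg):
    """Rotate an image point about `centre` by -roll_deg, mapping
    a measurement taken with a tilted apparatus back onto the true
    horizontal plane (standard mathematical axis convention)."""
    a = math.radians(-roll_deg)
    x, y = point[0] - centre[0], point[1] - centre[1]
    return (centre[0] + x * math.cos(a) - y * math.sin(a),
            centre[1] + x * math.sin(a) + y * math.cos(a))
```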

[0098] Gaze calibration/Pupil position

[0099] The ellipticity, and particularly change of ellipticity, of the pupils can be assessed by using an apparatus, system and/or method of the present invention.

[00100] One or both of pupil centre and pupil shape, and/or pupil position relative to a horizontal plane/reference, may be used for calibration/position determination e.g. to set up the apparatus correctly to ensure accurate and useful image capture to aid correct assessment. Such assessment may be used for gaze calibration, apparatus position calibration/determination and/or frame position.

[00101] For example, if the pupils are not aligned on a horizontal plane, the degree of misalignment is calculated and used to either realign the pair of eye images to the true horizontal plane or as an adjustment parameter in later image post processing and analysis. Such assessment can be particularly useful where the user has mispositioned the apparatus on their face.

[00102] To accurately measure/detect the direction of the eye/pupil movements (in particular, the direction of the nystagmus) in relation to head position, it is preferable to know that when the apparatus records eye/pupil movement in a particular direction that this is the true direction of the eyes/pupils relative to the head. If the apparatus is not placed exactly horizontally aligned with the eyes, the apparatus will perceive a direction that is inaccurate, which may need to be compensated for.

[00103] Angle of gaze of the user (such as relative to horizontal or other reference) may be determined. Determination of angle of gaze of the eye(s) for a user/wearer may be based on the projected ovoid/ellipsoid shape of the pupil(s) for a particular patient/user at a given point in time.

[00104] Change in eye/pupil shape from generally circular to more oval/ellipsoid can be measured and compensated for by the image capture means and/or processing means.

[00105] Analysis of the observed pupil shape may be used to determine the confidence in the accuracy of the eye kinematics (‘confidence factor’). The confidence that the centre of the pupil has been accurately calculated may be represented as a real number between 0 and 1 inclusive.
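The ellipticity-based gaze estimate of paragraphs [00103]-[00104] and the confidence factor of paragraph [00105] can be sketched as follows. A circular pupil viewed off-axis foreshortens approximately as the cosine of the gaze angle; the function names, the fit-error input and the combination rule for the confidence factor are assumptions for illustration only.

```python
import math

def gaze_angle_from_ellipse(major_axis, minor_axis):
    """Gaze angle away from the camera axis inferred from the
    projected pupil ellipse: minor/major ~= cos(angle) for a
    circular pupil viewed off-axis."""
    ratio = max(0.0, min(1.0, minor_axis / major_axis))
    return math.degrees(math.acos(ratio))

def confidence_factor(ellipse_fit_error, eyelid_occlusion_frac):
    """Illustrative confidence in [0, 1]: penalise a poor ellipse
    fit and partial eyelid closure (both hypothetical inputs in
    [0, 1]); other artefacts such as pupil size and angle from the
    camera centre could be folded in the same way."""
    c = (1.0 - ellipse_fit_error) * (1.0 - eyelid_occlusion_frac)
    return max(0.0, min(1.0, c))
```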

[00106] This ‘confidence factor’ can be displayed alongside the eye movement graphs, in reports or any other representation of the eye kinematics.

[00107] The confidence factor may be used during analysis to modify or reject eye kinematic measurements by the clinician or the system.

[00108] For example, artefacts contributing to the ‘confidence factor’ may include the pixel count representing the shape of the pupil image, the percentage of eyelid closure obstructing the view of the pupil, the size of the pupil, and the angle of the pupil from the capture apparatus centre.

[00109] The apparatus/method may detect eye closure, for example, blinking, partial or full closure of the eyes/eyelids.

[00110] Threshold timing may be used to allow for a user blinking. For example, if eye closure is within a threshold time, the apparatus detects that blinking is occurring.

[00111] If eye closure is beyond a threshold time and/or an amount of blinking exceeds a threshold limit (count of blinks or total time spent blinking, such as a proportion of eyes open versus eyes closed), the apparatus may provide an alert and/or stop recording and/or shut down/go into sleep mode.
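The blink/closure logic of paragraphs [00110]-[00111] can be sketched as a simple threshold decision. The specific threshold values and the function name are illustrative assumptions, not values from the specification.

```python
def closure_action(closure_s, blink_count, t_blink=0.4, max_blinks=30):
    """Decide how to respond to detected eye closure: closures
    within the blink threshold are treated as normal blinking;
    sustained closure or excessive blinking triggers an alert
    and/or pauses recording. Thresholds here are illustrative."""
    if closure_s <= t_blink and blink_count <= max_blinks:
        return "record"           # normal blinking - keep recording
    if closure_s <= t_blink:
        return "alert"            # blink count exceeds the limit
    return "alert_and_pause"      # sustained closure - alert / stop
```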

[00112] The apparatus being unable to sufficiently detect the eyes/pupils may be due to external light on the camera sensor(s).

[00113] Ambient/exterior light is to be excluded from reaching the wearer’s/user’s eyes so that the user cannot fixate their eyes on any specific feature/object. It will be appreciated that if the wearer/user does fix their gaze on an object, they can override (stop or reduce) the spontaneous eye movements to be captured.

[00114] An alert may be provided, or the apparatus may cease or not start working until the external light is removed or falls to an accepted level within the apparatus.

[00115] Known camera and/or infra-red LED positions on the apparatus may be used to determine, predict and/or estimate motion and/or location of a patient’s eye(s)/pupil(s). For example, known camera and/or infra-red LED positions on the apparatus may provide known reference positions, such as distance and/or angle of separation between camera and LED(s) on the apparatus or relative to the eye(s)/pupil(s).

[00116] It will be appreciated that detection of too much light (light level or wavelength or intensity or light for a period of time, or a combination thereof) reaching a camera/sensor can be used to indicate that the apparatus is not on the face or not correctly positioned e.g. that light is leaking in.

[00117] Preferably at least one said camera may be provided below a plane of horizontal gaze of the user. For example, when a person looks down, the top eyelid often partially closes which may obscure the pupil. However, when a person looks up, their bottom eyelid does not close and obscure the pupil. It is therefore beneficial to provide at least one camera below the plane of horizontal gaze (e.g. relatively low down) and aiming up towards the eye(s).

[00118] The low camera angle image can be morphed using a keystone correction to represent a front view image.
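The keystone correction of paragraph [00118] can be sketched, in its simplest vertical form, as a per-row horizontal rescale that undoes the trapezoidal stretch produced by a low, upward-aiming camera. The function name, the linear model and the distortion strength k are illustrative assumptions.

```python
def keystone_correct(x, y, frame_height, k):
    """Simple vertical keystone correction: rows imaged at a
    steeper angle appear wider, so horizontal coordinates are
    rescaled per-row. k in (0, 1) is an assumed distortion
    strength; a real implementation would use a calibrated
    perspective transform rather than this linear model."""
    scale = 1.0 - k * (y / frame_height)   # shrink stretched rows
    return x * scale, y
```

In practice a full homography (e.g. OpenCV's getPerspectiveTransform/warpPerspective) would be fitted from calibration points rather than this one-parameter model.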

[00119] Head movement and/or head orientation can be detected, such as by at least one accelerometer, camera image, gyroscope, magnetic sensor, or a combination of any two or more thereof, provided by the apparatus.

[00120] Such head movement and/or head orientation, and preferably eye/pupil orientation relative to the head and/or apparatus can be used to help determine a correction factor (such as mentioned above).

[00121] Head movement and/or orientation detection may be used in addition to the eye position/movement recordings. For example, when capturing eye position/movement recordings by the camera(s), head position/orientation can be synchronised with the eye position recordings, which can be used to determine what physical orientation and/or activity the user is doing at the time of recording.

[00122] Head movements may be detected (e.g. by the apparatus) and correlated as a precursor to an episode/attack or during an episode/attack.

[00123] Co-ordinating detection and/or measurement of the head and eye/pupil movements together can be used to detect specific conditions. For example, detecting/measuring the difference between head movement and eye/pupil movement, or first detecting head movement and then detecting eye/pupil movement, can be used to detect specific conditions, such as benign paroxysmal positional vertigo (BPPV).

[00124] BPPV is generally considered to be caused by crystals forming, then aggregating and interfering within a part of the balance system in the inner ear. With BPPV, vertigo is triggered by specific changes in position of the head (e.g. when turning in bed or tilting the head back, looking up).

[00125] The apparatus can detect the specific head movement/s that occurred before the dizziness and then record the eye movements that ensued as a result. Not only will this help diagnose conditions such as BPPV, it can determine the ear and the specific otolith (inside the inner ear’s vestibular labyrinth) that is being affected. This allows for treatment to be specifically tailored to the individual.

[00126] Glint or glints will be understood as being the reflection of light from the eye and detectable as, or indicated as, a visual feature, such as detected/seen by a camera/sensor or another person.

[00127] A glint or glints can be understood to be light reflected off the eye, such as from the pupil or iris and noticeable/detectable as a spot or patch of light in the eye. For example, a glint can be a reflection in the eye from a light source within the apparatus, such as infrared LEDs illuminating one or both eyes.

[00128] Location of at least one glint in a pupil or both eyes (e.g. in one or more pupils and/or irises of the respective eye) relative to respective pupil centre can be used to determine/estimate the centre of the eye or each eye.

[00129] One or more embodiments of the present invention utilises location of glints in a pupil or pupils relative to the respective pupil(s) centre(s) to determine/estimate the location of the centre of a patient’s/user’s eye/pupil.

[00130] Alternatively or in addition, eye geometry, assumptions of eye geometry, and known camera and IR LED locations may be used to improve accuracy and reduce the time to estimate the location, eye kinematics and the centres of a patient's pupil(s)/eye(s).

[00131] Eye geometry can be assumed and may be used to determine/estimate location of a patient’s eye(s)/pupil(s).

[00132] A detection source, such as a camera, can identify the glint and its position on the eye e.g. relative to an edge of a pupil, iris or centre of a pupil, or, from the position of the glint in the eye, the centre of pupil can be determined. Consequently, for a given light source position, such as a light source fixed in position within the apparatus and illuminating the eye, the position of the glint can be used to determine eye and/or pupil centre position.

[00133] Reflection (glints) from the eyes of light from the illumination source within the apparatus may be detected by the at least one camera. Distance and/or position of the glints from the respective pupil centre may be used to determine apparatus position on the face of the user i.e. whether or not the apparatus is correctly positioned.

[00134] Distance between one or more glints and/or between at least one glint and the centre of the pupil may be used to compensate for non-ideal apparatus placement/orientation on the user’s head/face with respect to eye position.

[00135] Alternatively, or in addition, relative positions of glint(s) to pupil centre may be used for gaze calibration.

[00136] For example, if the eye is further from the camera/imaging apparatus, glints in the pupil will appear closer together. Knowing the distance from the camera to the eye and assuming standard or estimated eyeball dimensions allows for the calculation of gaze angle.
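The glint-based gaze calibration of paragraphs [00132]-[00136] can be sketched under a spherical-cornea assumption: the offset between the pupil centre and the glint grows roughly with the sine of the gaze angle. The function name, the pixel-to-millimetre scale input and the corneal radius default (a population average, not a value from the specification) are assumptions.

```python
import math

def gaze_from_glint(pupil_px, glint_px, mm_per_px, corneal_radius_mm=7.8):
    """Rough gaze angle from the pupil-centre-to-glint offset,
    assuming a fixed illumination source, a spherical cornea and a
    known image scale (mm_per_px). Offset ~= R * sin(gaze angle)."""
    dx = (pupil_px[0] - glint_px[0]) * mm_per_px
    dy = (pupil_px[1] - glint_px[1]) * mm_per_px
    offset = min(math.hypot(dx, dy), corneal_radius_mm)
    return math.degrees(math.asin(offset / corneal_radius_mm))
```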

[00137] One or more forms of the present invention provides for the capture of information during controlled testing or simulated episodes of nystagmus, e.g. tests conducted in the clinic that induce nystagmus to assess the current state of the vestibular system.

[00138] One or more forms of the present invention provides an apparatus, system or method that automatically detects eye movements, such as exhibited during nystagmus.

[00139] Statistical learning, machine learning or artificial intelligence (AI) algorithms can be employed to determine or refine the processing used to analyse captured images/image data of eye movements.

[00140] The apparatus, system or method can be arranged and/or configured to detect patterns of nystagmus and report on them (see for example figures 3A, 3B, 3C, 3D, 4A, 4B, and 4C).

[00141] Body and Environmental Metrics

[00142] Eye information includes: any data from captured eye image data (e.g. video or still images) as current eye state (e.g. from individual frames of video or still images) or as movement or change over a period of time (eye state over time).

[00143] Eye state includes: one or more of the position, shape or size of one or both pupils; the condition/state, markings or any irregularities of the iris; irregularities in the sclera or limbal ring; any sign of infection on or in the eyeball, eyelid position, eye lid shape, any markings or infection around the eye; water around the eye; size and/or shape of the caruncle.

[00144] Eye state over time includes: changes in the position of the pupil(s) (e.g. nystagmus), changes in pupil size, pulsing movement of any parts of the eye and the surrounds thereof; eye closure (e.g. drooping eyelids or repetitive blinking); eye twitches; any changes to body parts mentioned in the eye state.

[00145] Augmented analytics includes: statistical learning, machine learning, artificial intelligence (AI) and/or any other statistical, computing or mathematical algorithms that use eye state information, eye kinematics (motion) or any other eye information associated with information from one or more sensors, including environmental sensors (e.g. temperature, pressure, atmospheric pressure, humidity, air/wind speed, magnetic field, direction, orientation (e.g. up, down, prone), such as relative to a ground plane or horizontal/vertical), body movement sensors (such as head movement from accelerometers), and biomedical sensors (such as blood pressure, body temperature).

[00146] Analytics, e.g. augmented analytics, can incorporate sensor information obtained/derived from the apparatus and/or obtained/derived from one or more sensors external to the apparatus.

[00147] Sensor information and/or analytics may be used to determine:

• Eye state in relation to environmental, body and/or head movement, and/or body metrics, or any combination thereof.

• Eye state over time, such as derived eye kinematics, in relation to the environment, body and/or head movement, and/or body metrics.

• Thresholds at which changes to any aspect or aspects of the environment, body, head, body movement, head movement and/or body metrics are associated with or cause changes in eye state over time. For example, a threshold at which reduction or increase in atmospheric pressure induces nystagmus.

• A combination of thresholds at which two or more changes in body state, environmental state and/or body metrics relate to or cause changes in eye state or eye state over time.

• Detection of eye behaviours related to, or that may be symptomatic of, at least one medical condition (e.g. an underlying medical condition exacerbated by a change in environmental or body metrics or a medical condition caused by such a change).

[00148] One or more embodiments of the present invention incorporates monitoring of the environment and/or sensing/providing environmental information, such as to be used in combination with data/information directly from or derived from the apparatus.

[00149] Environmental information/data can include underwater, high altitude, space flight, outer space and space living conditions (e.g. reduced/low gravity living; artificial, centrifuge and/or rotation induced gravity) obtained/derived information/data. Such information/data can include ambient temperature, temperature change, air/water pressure and/or humidity or water density/salinity (e.g. for underwater).

[00150] Sensors for sensing environmental conditions can include detection of gravity, changes in gravity, direction and/or amount of gravity, radiation (e.g. type, level and/or change), atmospheric pressure, hydrostatic pressure, temperature, humidity, pollen count, oxygen level, presence of one or more gases, and/or the level of any gas or combination of gases.

[00151] Environmental and/or body metrics may be combined with measurement/detection of eye state and/or eye kinematics, such as over time, to assess the influence of environmental factors on the neurological or vestibular system. For example, the effects on the neuro-vestibular system of underwater pressure or the effects of reduced/no gravity or changes in pressure/gravity over time.

[00152] For example, for assessing the effects on a person of high altitude and/or space flight, eye position/changes in eye position (such as nystagmus) can be assessed in relation to head position with the change in environmental conditions (reduced/low/changing gravity and/or air pressure and/or temperature).

[00153] Body metrics can include monitoring/measuring body functions and activities. Such body metrics can include one or more of measuring blood pressure, body temperature, pulse, ECG, EEG, blood glucose level, hearing level/ability, lung capacity, exercise sensors and information from body scans/images (MRI, PET, CT, X-ray, photogrammetric).

[00154] Body metrics can be sensed, measured and/or used in combination with eye kinematics/eye state measured by or derived from the apparatus (e.g. from the apparatus - pre or post processing) to determine/assess changes in the body that occur before and/or during neuro-vestibular dysfunction or changes in neuro-vestibular function.

[00155] The combination of body metric information/data with eye state data and/or eye kinematics can be used to determine one or more triggers of vestibular system dysfunction, including the long term and/or short term effects vestibular injuries/changes have on the body. This can help to understand how the vestibular system is affected by changes in the body and how the body is affected by the vestibular system.

[00156] Augmented analytics in combination with body metrics, identified patterns and triggers can help to determine the underlying cause of neurological and/or vestibular dysfunction. For example, to determine that hyperventilation preceded nystagmus eye movements.

[00157] Eye closure information/data can indicate fatigue and the presence of nystagmus indicates balance dysfunction. Together or separately this information can indicate whether a body meets fitness thresholds to perform one or more particular functions. For example, the combined information can be used to assess a patient for fall risk after surgery or the ability to safely perform certain tasks after or during an extended period in space, undergoing space flight, a change in gravity on the body, or living in a low/reduced gravity environment e.g. compared to mean or sea level gravity on earth.

[00158] Sensors on the body (torso, one or more of limbs, hands, feet, etc.) and head can include one or more head movement sensors (such as on the apparatus or external to the apparatus of the present invention), internal of the body (implanted, inserted or ingested), or external of the body (mounted to the body or worn). Sensors may also be external and remote from the body, such as image capture means for photogrammetry.

[00159] Sensors can include accelerometers, gyroscopes and electro-static accelerometers.

[00160] The vestibular system can be triggered by changes to head position (roll, pitch, yaw) and in the normal case results in eye movements opposite the direction of movement. This is called the vestibular ocular reflex and it is a gaze-stabilising reflex. If the vestibular system is compromised (e.g. during a change in gravity) then the vestibular ocular reflex may not act in the expected manner. Measuring both the head movement and corresponding eye movement can be used to assess whether the vestibular system is functioning correctly. Any inconsistency with the response to head movement indicates a malfunctioning vestibular system or a vestibular system working differently/unexpectedly.
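The assessment described in paragraph [00160] is conventionally summarised as a vestibulo-ocular reflex (VOR) gain: the ratio of compensatory eye velocity to head velocity. A minimal least-squares sketch follows; the function name is an assumption, and "gain" as the standard summary statistic is not a term used in the specification itself.

```python
def vor_gain(head_velocities, eye_velocities):
    """Vestibulo-ocular reflex gain estimated over paired head and
    eye velocity samples (deg/s). A healthy reflex moves the eyes
    opposite to the head with gain near 1; a markedly lower gain
    suggests a compromised vestibular system."""
    num = sum(-e * h for e, h in zip(eye_velocities, head_velocities))
    den = sum(h * h for h in head_velocities)
    return num / den if den else 0.0
```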

[00161 ] Body metrics and body/head movement can be used in combination with environmental measurements/information/data to determine/assess the effect on the neuro-vestibular system of the environmental changes/conditions.

[00162] Likewise, detected changes in the state or function of the neuro-vestibular system can be used to assess why a change in the environment/body metric and movement occurred. For example, analysing the vestibular ocular reflex (which considers eye and head movements) in relation to changes in gravity or radiation on the body. Such assessments could be used to determine a gravity threshold where the vestibular ocular reflex becomes affected.

[00163] Security/Identification

[00164] One or more forms of the present invention may obtain eye status information for biometric security/identification purposes.

[00165] One or more images obtained by an embodiment of the present invention can be used to identify the wearer or features of the wearer for identification e.g. for authentication, security, access control, authority to control/use equipment, authority to allow others such access/use etc.

[00166] Portability of a user-wearable apparatus according to an embodiment of the present invention allows the apparatus to be deployed readily without the need for a fixed installation.

[00167] For example, iris imaging within the apparatus can exclude extraneous light and desired illumination can be provided to obtain desired image quality.

[00168] Sclera imaging, limbal ring imaging, eye shape imaging, distance between eyes/pupils, position, size and complexity of blood vessels in the eye, may be used to assess the person for security/identification purposes.

[00169] Health status

[00170] Detection/measurement of eye state or changes in eye state by the apparatus/system of the present invention can be used to assess general health and any condition affecting the subject person, such as drug use, alcohol use, stress, fatigue or illness.

[00171] Such information/data can be combined with information from other sensors, such as external or internal body sensors.

[00172] Sclera imaging, eye shape imaging, distance between eyes/pupils, position, size and complexity of blood vessels in the eye, and any changes thereto, may be used to assess the person for health/change in health purposes.

[00173] One or more embodiments may include assessing data relating to head movement and eye or pupil motion, which may include assessing data relating to at least one measured body function and/or at least one environmental factor. The at least one body function may include at least one of heart rate, blood pressure, body temperature. At least one environmental factor may include at least one of ambient temperature, air pressure, humidity, location.

[00174] One or more embodiments may include determining at least one threshold, safe range or cut-off. The at least one threshold, safe range or cut-off may include determining one or more of a safe threshold for a person conducting an activity, a threshold for time spent in or presence in a particular gravity or change of gravity.

[00175] A further aspect of the present invention provides a system for use in assessing changes in eye state, the system including a wearable apparatus for covering a wearer’s eyes that detects, measures and/or records eye movement, changes in the eye(s) or movement of features of the eye(s), the system including image data capture means for capturing image data of motion or at least one feature of the wearer's eyes for a period of time, communication means for communicating the captured image data, and processing means for use in assessment or informing diagnosis, absence or presence of a disorder or medical condition.

[00176] The apparatus may be configured to capture a starting point and finishing point of the pupil(s) to determine direction and magnitude of eye or pupil movement. The system may be configured to capture image data and determine time taken for the pupil(s) to go from the starting point to the finishing point for the system to determine speed and/or rate of movement.

[00177] The system or apparatus may include use of a reference axis or point determined at a centre of the respective eye(s)/pupil(s) or based on a reference point/axis distant from the eye(s)/pupil(s) at which the eye(s)/pupil(s) is/are required to gaze.

[00178] Pixel count of a sensor of a camera or each said camera of the apparatus can be used to determine angle of gaze of the wearer.
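As a minimal sketch of how a pixel count on the sensor can map to an angle of gaze: under a simple pinhole-camera assumption, the pupil-centre displacement in pixels from the optical axis and a focal length expressed in pixels give the gaze angle via an arctangent. The focal-length value and function name below are illustrative assumptions, not figures from the specification:

```python
import math

def gaze_angle_deg(pixel_offset, focal_length_px):
    """Convert a pupil-centre displacement (pixels from the optical axis)
    into a gaze angle using a pinhole-camera model. `focal_length_px` is
    an assumed calibration constant for the camera, not a value from the
    patent."""
    return math.degrees(math.atan2(pixel_offset, focal_length_px))

# e.g. a 100-pixel offset with an assumed 400-pixel focal length
angle = gaze_angle_deg(100, 400)   # roughly 14 degrees
```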

[00179] Processing means can be provided to determine angle of gaze of the eye(s) from image data and/or image data recordings of eye/pupil position/movement.

[00180] A further feature includes a reference region applicable to pupil position and/or pupil diameter. At least one zone may be outside of the reference region, wherein such at least one zone is ignored or rejected or treated as unwanted or unnecessary during processing.

[00181] A previsualisation/detection field may be provided within which the eye or pupil is captured as an object of interest for capturing the eye in image data, with unnecessary frame/field beyond the eye or pupil removed/reduced.

[00182] Electronic processing or zoom/focus of the image capture apparatus can be provided to remove extraneous areas.

[00183] A portion or window of an overall sensor area of the image capture apparatus may be provided for windowing such that captured image data is derived from a portion of a sensor area. Preferably, the portion or window is central with respect to the eye(s)/pupil(s) to frame the eye(s)/pupil(s) in frame.

[00184] One or more of pupil centre, pupil shape or pupil position may be relative to a horizontal plane/reference for calibration/position determination.

[00185] Measurement means, such as one or more sensors and/or algorithms, can be provided for measuring/imaging variation in eye/pupil shape from generally circular to ovoid/ellipsoid, and compensation means for compensating for non-circular irregularity via the image capture means and/or processing means.

[00186] One or more of pixel counts or patterns can represent the shape of the pupil image, percentage of eyelid closure obstructing the view of the pupil, size of the pupil or angle of the pupil from the capture apparatus centre as a confidence factor for determining accuracy of eye/pupil kinematics.

[00187] A light detector can be configured to detect ingress of external light and instigate an alert, or the apparatus stops operating or is prevented from operating until the external light is removed or falls to an accepted level within the apparatus.

[00188] The apparatus can include or be a wearable apparatus, such as goggles, face mask, headset etc.

[00189] At least one image capture apparatus may be provided below a plane of straight-ahead/horizontal gaze of the user/wearer.

[00190] At least one measurement means may be provided to measure one or more of environmental metrics, body metrics or head metrics, in addition to information/data obtained or derived from image capture information/data from the apparatus, to assess neurological and/or vestibular condition or activity.

[00191] A further aspect of the present invention provides calibration (or compensation) of a wearable apparatus for use in assessing a neurological or vestibular condition, including referencing a particular feature of the apparatus to a reference, or creating a reference on which to base ongoing measurements, recording or use. One or more embodiments of calibration includes frame positioning (windowing), apparatus positioning (correct position on user’s face/head), and gaze positioning/pupil position (adjusting/compensating for angle of gaze). Other forms of calibration or compensation are envisaged to fall within the scope of the present invention. Calibration (or compensation) may include one or more of:

• Eye feature position (e.g. pupil position) within the captured video/picture frame. This preferably occurs for each eye.

• Apparatus position calibration: position of the apparatus (which may be embodied as goggles/cameras) on the user’s head/face. This is in relation to the horizontal plane with respect to the user’s eyes or other features of or around the eyes.

• Pupil position (calibration of gaze or gaze calibration): determining the correct angle of gaze (such as relative from a central position or horizontal) for the positions of the eye. This allows for the correct magnitude of eye movements to be calculated.

[00192] Assessing effects of space travel/time spent off-earth (e.g. reduced gravity environments relative to earth gravity)

[00193] One or more embodiments of the present invention finds application in assessing space travellers prior to, during and/or after space travel or time spent on a non-earth planet, extra-terrestrial body or an off-earth installation, vehicle or habitat.

[00194] Embodiments of the present invention may be employed to assess long term effects of space travel on space travellers returned to earth, or the effects of time spent in space in reduced gravity environments (such as space craft, space stations, non-earth planet habitats e.g. the Moon or Mars).

BRIEF DESCRIPTION OF THE DRAWINGS

[00195] One or more embodiments of the present invention will hereinafter be described with reference to the accompanying Figures, in which:

[00196] Figures 1A to 1C show examples of horizontal, vertical and torsional eye movement with respect to nystagmus.

[00197] Figure 2 shows a chart of nystagmus related to horizontal eye movement (speed and direction) plotted over time on a graph with horizontal eye position on the y-axis and time on the x-axis.

[00198] Figures 3A to 3D show patterns (e.g. speed and direction of nystagmus over time) of captured nystagmus utilising an apparatus/method according to at least one embodiment of the present invention.

[00199] Figures 4A to 4C show an alternative arrangement for displaying captured nystagmus patterns (speed and direction over a time period) using radial stacked histogram charts according to at least one embodiment of the present invention.

[00200] Figures 5A to 5D relate to steps taken for adjusting for misalignment of the apparatus relative to eye position of a patient according to at least one embodiment of the present invention.

[00201] Figure 6 shows a side sectional view of an apparatus according to an embodiment of the present invention.

[00202] Figures 7A and 7B show examples of glints reflected in the eye of a wearer as captured by image capture means of an apparatus according to an embodiment of the present invention.

[00203] Figures 8A and 8B show an eyeball with its pupil in two different locations. Shown are 2 gaze vectors, the lines from the centre of the eye through the centre of the pupils.

[00204] Figures 9A, 9B, 9C and 9D show an eyeball with its 5mm diameter pupil in two locations: centred with respect to the capture apparatus image, and 45 degrees upwards from the capture apparatus image.

[00205] Figure 10 shows an alternative embodiment of the present invention.

[00206] Figure 11 shows an example of an embodiment of the present invention incorporating environmental sensing/measurements.

[00207] Figure 12 shows an example according to an embodiment of the present invention incorporating body sensing/measurement.

[00208] It will be appreciated that one or more embodiments of the present invention can incorporate environmental sensing/measurements and body sensing/measurements.

[00209] Figure 13 shows an example of capturing head movement and eye motion data according to an embodiment of the present invention.

[00210] Figure 14 shows an example of a patient experience ‘storyboard’ flowchart and employment of one or more embodiments of the present invention.

DESCRIPTION OF PREFERRED EMBODIMENT(S)

[00211] Figures 1A to 1C show, by way of example, typical presentations of nystagmus.

[00212] Figure 1A shows nystagmus presenting as horizontal nystagmus e.g. left nystagmus where the eye moves slowly to the patient’s right and rapidly returns horizontally to the left.

[00213] Figure 1B shows nystagmus presenting as vertical nystagmus e.g. down beating nystagmus where the vertical slow eye movement upward is followed by rapid vertical downward rotation.

[00214] Figure 1C shows torsional nystagmus e.g. clockwise nystagmus with the eye slowly rotating anticlockwise followed by a rapid counter torsional clockwise rotation.

[00215] Figure 2 shows a chart of nystagmus related eye movement (e.g. speed and direction of movement) plotted over time on a graph with horizontal eye position on the y-axis (gaze right at the top and gaze left at the bottom) and time on the x-axis. This figure shows left beating nystagmus.

[00216] Nystagmus over time can be plotted in a number of ways. Horizontal and vertical tracings are currently adopted. There are also new ways to display the data. This is driven by knowing exactly the pupil position (due to realignment of the horizontal plane from lining up the pupils or using glints (28, Figs 7A, 7B)) and therefore accurately knowing the direction and speed of the nystagmus.

[00217] The features of nystagmus such as shape, amplitude and frequency can be analysed. Nystagmus is usually described by speed of eye movement (e.g. of the slow phase) and direction of eye movement (e.g. of the fast phase), for example, 16 degrees per second right beating nystagmus. Instead of direction, nystagmus can also be defined in relation to gravity (geotropic: towards earth; ageotropic: away from earth).

[00218] According to one or more embodiments of the invention, captured nystagmus motions can be charted/displayed showing direction and velocity of movement over time.

[00219] For example, Figure 3A shows charted strong right beating nystagmus that reduces in velocity over a minute and a half.

[00220] Figure 3B shows an example of charted captured strong persistent left beating nystagmus.

[00221] Figure 3C shows charted captured mild persistent up-beating nystagmus.

[00222] Figure 3D shows charted captured direction changing nystagmus with respect to time. Moderate right-beating nystagmus for 40 seconds followed by a direction change to left beating nystagmus.

[00223] Figures 4A to 4C show alternative visualisation of nystagmus results as captured by the apparatus/method of the present invention.

[00224] In particular, Figure 4A shows left beating nystagmus (some with slight upward component) of varying velocity.

[00225] Figure 4B shows occasional upward and to the right beating nystagmus of mild to moderate velocity.

[00226] Figure 4C shows strong left beating nystagmus.

[00227] Being able to visualise the pattern (speed and direction) of nystagmus during an attack significantly aids in diagnosing the underlying cause of the attack. The present invention provides a new modality of capturing, measuring and displaying time-of-attack nystagmus, and being able to view and present the pattern (speed and direction) significantly improves the assessment methodology.

[00228] It is important to have the apparatus correctly positioned on the patient’s head/face, or to be able to determine the position of the apparatus on the patient’s face and compensate for sub-optimal positioning, to ensure useable/correct results.

[00229] Figures 5A to 5D show a technique according to one or more embodiments of the present invention that can be employed to correct misalignment in recordings due to incorrect apparatus placement.

[00230] In this example, the patient has pure left beating nystagmus, as shown in Figure 5A. The eyes move slowly to the patient’s right and then make a fast movement back to the patient’s left. In this embodiment, the tracing should show left beating nystagmus with no movement on the vertical scale. The upper tracing shows the horizontal (left and right) movement and the lower tracing represents the vertical (up and down) movement of the eyes.

[00231] Figure 5B shows an example (Case 1) of what happens to the eye movements in Figure 5A when the apparatus is horizontally and vertically accurately aligned with the eyes. In this case, the eye movement recordings show the true direction of eye movement (e.g. right and left movement are true right and left movement). This translates to accurate tracings. Note that there is no vertical component shown in the tracings as there was no vertical eye movement.

[00232] Figure 5C shows an example (Case 2) of what happens to the eye movements in Figure 5A when the apparatus is incorrectly positioned. That is, the apparatus is not horizontally and vertically aligned with the patient’s eyes. In this case the tracings do not reflect the true direction of the nystagmus. Note that there is a vertical component shown in the tracings although no vertical eye movement occurred.

[00233] When the apparatus is not positioned correctly (Case 2), the misalignment (correction factor) can be measured and used to recalculate eye position so the tracings reflect the true direction of the eye movements. This is explained in Figure 5D.

[00234] Figure 5D shows that using the captured images (video images or series of still images), the degree of misalignment is calculated. This can be achieved by using the pupil centres as an indication of the horizontal plane.

[00235] Other methods within the present invention can also be used to determine the horizontal plane. These include determining the position of the corners of the eye or any other feature of the eye and surrounds. For this example the pupil centres are used, as shown in Figure 5D.

[00236] As shown in Steps 1 to 4 of Figure 5D:

• In step 1, the degree of misalignment of the pupils from horizontal is determined.

• In step 2, correction is applied by rotating the image/video back by the degree of misalignment so that the pupils are horizontal and the frame is non-horizontal.

• In step 3, frame correction is applied to centre the pupils within the frame.

• In step 4, the corrected tracings of eye movement during nystagmus are captured.

[00237] As an alternative in steps 2 to 4, the video can be kept the same and the calculated degree of misalignment can be used to recalculate the tracings.
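The alternative of recalculating the tracings rather than rotating the video can be sketched as a plane rotation of each recorded pupil position by the measured tilt. This is a minimal illustration of Step 2 of Figure 5D, assuming the tilt is the angle of the line joining the two pupil centres relative to horizontal; the function name and data layout are hypothetical:

```python
import math

def correct_misalignment(points, tilt_deg):
    """Rotate recorded pupil positions back by the measured goggle tilt
    so that tracings reflect the true horizontal/vertical eye movement.
    `tilt_deg` is the angle of the inter-pupil line from horizontal;
    `points` is a list of (x, y) pupil-centre coordinates."""
    t = math.radians(-tilt_deg)          # rotate back by the tilt
    c, s = math.cos(t), math.sin(t)
    return [(c * x - s * y, s * x + c * y) for x, y in points]
```

For example, a purely horizontal eye movement recorded through goggles tilted 30 degrees appears to carry a spurious vertical component (as in Case 2 of Figure 5C); applying the rotation above removes it, so the corrected tracing shows no vertical movement.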

[00238] It will therefore be appreciated that the present invention envisages many ways to plot the nystagmus. For example, horizontal and vertical tracings can be used.

[00239] Embodiments of the present invention also envisage measuring, plotting and/or assessing nystagmus relative to gravity. Eye motion (optionally also head movement) can be tracked/measured and compared with gravity (gravity direction, strength of gravity, change in strength and/or direction).

[00240] The nystagmus of lateral canal BPPV can be categorized as either always towards the ground (geotropic) or always towards the sky (ageotropic).

[00241] Alternatively, knowing the position (due to realignment of the horizontal plane from lining up the pupils, as well as the calibration data using, for example, glints 28 (Figs 7A, 7B) or the ovoid shape of the pupil), the direction of the nystagmus can be plotted.

[00242] Figure 6 shows a side sectional view of an apparatus 10 according to an embodiment of the present invention.

[00243] In particular, the apparatus 10 is provided as a wearable apparatus, such as goggles or other type of headset.

[00244] When the patient is experiencing an attack of vertigo, dizziness or imbalance they place the apparatus in front of their eyes 13. The apparatus is turned on, such as by an on-off switch 11 provided on the apparatus (or remotely by wireless communication control).

[00245] Two lens(es)/cameras 12 record the eyes/eye features (e.g. movement of the pupils).

[00246] There is preferably a cover for each lens. The cover may be opaque to the wearer/user so that the patient’s eyes are in complete darkness with no internal reflections, in order to avoid any visual distractions. Recording can of course still be made through the opaque cover, i.e. the cover is opaque to the user and not to the lenses/cameras.

[00247] A seal 14, (such as a flexible, foam, plasticized or rubberized seal) is preferably provided around a face side periphery of the apparatus to prevent external light entering the cavity so the patient cannot see any visual distractions.

[00248] A separate, removable adapter (such as a face mask) can be removably attached to the apparatus in order to accommodate different head sizes/face shapes.

The removable adapter (e.g. face mask) can be disposable or cleanable. The removable adapter (e.g. face mask) may include a surround for contact with the user’s face, the surround may be removable from the apparatus.

[00249] Preferably the surround and/or adapter (e.g. face mask) is formed of or includes a flexible material or combination of materials, such as a polymer, e.g. silicone, rubber and/or foam based. The face mask may be shaped so as to fit around the user’s eyes and over their nose e.g. over the bridge of the nose, similar to goggles for use in water or skiing, for a comfortable light-excluding fit e.g. sealing around the contact area with the skin. The adapter (e.g. face mask) can be of a light blocking material to prevent light passing to the eyes and image capture means that would consequently impair image capture of the eyes.

[00250] The patient's eyes are illuminated by a light source 16, such as by one or more infrared LEDs, incorporated as part of/attached to/within the apparatus. This is so the wearer/user cannot fixate on any point, thereby overriding the spontaneous eye movements.

[00251] One or more cameras 18, such as video cameras (e.g. one for each eye, or a single camera and a mirror arrangement), record the patient’s eye movements (nystagmus) during an attack episode e.g. a vertigo attack.

[00252] The video recordings are stored, such as on the memory apparatus 20, which may preferably be mounted within or on the apparatus.

[00253] Preferably the memory apparatus is connected for data transfer with electronic circuitry 22 controlling the apparatus.

[00254] When the patient’s episode/attack has finished, the apparatus is removed from their head.

[00255] The video recordings can be provided to a processing system.

[00256] The recordings are preferably compressed and eye movements logged.

[00257] Processing of the eye movements can occur within/on-board the apparatus.

[00258] Other data, such as the quality of the image, how long the pupil was open, can be stored and can be analysed on the unit or transmitted to be analysed remote from the apparatus, such as at a clinic.

[00259] Capture/recording of eye position/movement (optionally with head position/orientation measurements) can be transmitted to a remote location, such as the ‘cloud’ or remote server, to be processed there.

[00260] The apparatus can be battery powered, such as by an on-board battery 24 and/or can receive power through a power connection 26.

[00261] Eye kinematic measurements, such as gaze angle and rate of movement/change of movement/rate of change of movement, can be calculated and used to assess presence of a medical condition, such as nystagmus or other neurological disorder.

[00262] Angular velocity can be determined by calculating rate of change of gaze angle (e.g. change of angle between gaze vectors for a given time interval).

[00263] Further, angular acceleration can be determined by calculating the rate of change of angular velocity for a given time interval.

[00264] Angular jerk can be determined by calculating the rate of change of angular acceleration for a given time interval.

[00265] Angular jerk measurements can be used to detect when the eye movement transitions from a slow phase to a fast phase and/or when the eye movement changes direction.
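The derivation chain in paragraphs [00262] to [00265] can be sketched as successive finite differences over a sequence of unit gaze vectors: angle between consecutive vectors gives angular velocity, then differencing gives acceleration and jerk. This is a minimal sketch, not the specification's algorithm; function names and the sampling scheme are assumptions:

```python
import numpy as np

def angle_between(v1, v2):
    """Angle in degrees between two unit gaze vectors."""
    cosang = np.clip(np.dot(v1, v2), -1.0, 1.0)
    return float(np.degrees(np.arccos(cosang)))

def kinematics(gaze_vectors, dt):
    """Derive angular velocity, acceleration and jerk from a sequence of
    unit gaze vectors sampled every `dt` seconds, by successive finite
    differences (velocity -> acceleration -> jerk). Illustrative only."""
    angles = [angle_between(a, b) for a, b in zip(gaze_vectors, gaze_vectors[1:])]
    vel = np.array(angles) / dt        # deg/s
    acc = np.diff(vel) / dt            # deg/s^2
    jerk = np.diff(acc) / dt           # deg/s^3
    return vel, acc, jerk
```

A spike in the jerk trace marks a sharp change in eye motion, which is the basis for detecting slow-phase to fast-phase transitions as described above.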

[00266] Definition: ‘within phase' means sub-periods or intervals of time within a traditionally identified phase of a nystagmus e.g. slow phase or fast phase, such as shown by way of example in Figures 2, 5A and 5B.

[00267] Analysis of the eye kinetics within phases (e.g. slow phase) may find eye movement patterns relating to the speed, position, direction, changes of direction, acceleration of the pupil (or other eye features) and other parameters.

[00268] For example, the slow phase of a horizontal right beating nystagmus may have smaller periods of vertical up beating nystagmus, or some other notable feature, contained within the time interval defining the slow phase.

[00269] Eye kinematics including the angular jerk measurements can be used to identify 'within phase' transitions that may not be detected with incumbent clinic eye tracking apparatus. These 'within phase' transitions will lead to new research opportunities for identifying and diagnosing diseases with dizziness symptoms.

[00270] Identifying 'within phase' eye kinematics may lead to better diagnosis of complex cases. This includes cases where multiple diseases, conditions or factors may contribute to or confound dizziness symptoms when using traditional nystagmus analysis.

[00271] Furthermore, ‘within phase’ eye kinetics may lead to further information about the underlying pathology in patients with vestibular dysfunction. For example, it may give information as to the functioning of the vestibulo-cerebellar pathway and/or the vestibular nuclei.

[00272] The present invention includes algorithms to detect traditional nystagmus, their related eye kinematics and ‘within phase’ eye kinetics.

[00273] The apparatus and software include algorithms to detect 'within phase' eye kinematics.

[00274] Figures 8A and 8B show identical pupil locations relative to the image capture means (e.g. camera). They illustrate how differences in the calculated eyeball centre result in errors in the angles between gaze vectors and therefore errors in angular velocity.

[00275] Results/statistics (such as summary or full statistics) of one or more patient episodes of nystagmus based on the eye movements can be provided by the apparatus or as processed remotely and displayed on a local display (local to the apparatus) or transmitted to a remote display (e.g. tablet, pc, web browser client). For example, the apparatus/method may categorise the extent or type of nystagmus (e.g. 6 degrees per second right beating) and display the type or types (or pattern) to give a clinician instant information/knowledge. The apparatus may include or have access to processing means to process and supply a suggested test.

[00276] The apparatus 10 for such remote oculography monitoring and software together provide medical specialists with a unique custom solution to aid their diagnosis of vertigo, dizziness and balance symptoms as a result of neurological and vestibular (inner ear) conditions.

[00277] One or more embodiments of the present invention provides a portable apparatus designed to obtain and/or record a patient’s eye movements during episodes/attacks of dizziness away from the clinic. That is, patients can administer the apparatus themselves when they are at home, work or during recreational activities.

[00278] The medical specialist can review the captured data. This application allows specialists to view and analyse the uploaded patient data as well as collaborate with colleagues regarding the assessment of patients’ symptoms.

[00279] The system allows for the day-to-day administration and collection of patient information. Stored information includes one or more of:

• eye movement video sequences;

• 3D orientation and position of the head during the episodes;

• eye kinematics and tracking data;

• Other eye feature information (e.g. pupil size);

• history;

• questionnaire data; and

• information pertinent to diagnosis.

[00280] A software platform facilitates use of the apparatus and can help to identify patients for specific treatment options and enables research from the captured data including severity, timing and location of episodes.

[00281] Additionally, the system can include monitoring the use of a number of the apparatus, and can oversee the setup and administration of clinics employing the apparatus.

[00282] The apparatus 10 can include goggles configured to monitor and/or capture a patient’s eye/pupil movement(s), such as during episodes of dizziness. When the patient begins to experience an episode, the patient wears the apparatus. The apparatus commences recording the episode automatically or when switched on.

[00283] It is appreciated that users/patients will have the apparatus for 2-4 weeks and clinics will possess a number of the apparatus to deploy to a respective number of users/patients, with potential for spare apparatus being kept on hand.

[00284] Alternatively, the patient or each patient may keep the respective apparatus. The patient could choose to keep the apparatus permanently (e.g. for a cost) and only the specialist/clinician can provide a report/analysis when the apparatus transmits new/updated measurements/data of one or more nystagmus episodes.

[00285] In use, the apparatus captures video of the eye kinematics and preferably patient position (e.g. accelerometer, gyroscope) data.

[00286] Preferably, the apparatus may perform on-board processing that occurs during or after the video capture. For example, the on-board processing may detect if the patient closes their eyes during the recording and/or after the capture may compress the video data and/or log particular events, such as variability in eye movement or eye closure or periods of non-movement between bouts of eye movement.

[00287] The captured data can be sent (streamed, sent in packets or sent as a whole) from the apparatus to a platform e.g. via hard wire or wireless, such as via a mobile phone network.

[00288] If a mobile network is not available, the apparatus can include the facility to store the data on-board to be downloaded e.g. at a later time (such as via USB, internal or removable storage).

[00289] Preferably, software on the platform is cloud based and accessed through a web-app.

[00290] By way of explanatory example, a patient can have custody of the apparatus permanently or for a time period. When the patient senses or begins experiencing an episode of dizziness, the patient puts on the apparatus and commences recording eye movements using the on-board image capture means.

[00291] The patient can receive a visual, audible and/or vibration alert that the apparatus is recording. If the patient shuts their eyes during recording or if ambient light enters the apparatus thereby affecting recording, the apparatus gives an alert to the patient.

[00292] If the eyes are closed and/or light entering continues for a period e.g. 10 seconds, the recording can be halted/paused and the patient notified. Alternatively, if the apparatus successfully records eye movement images for a predetermined threshold period, say, 120 seconds, the apparatus could indicate that successful recording is completed. The patient can cease recording. The captured image data is stored, processed on-board and/or sent for remote processing.
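The pause/complete logic described above can be sketched as a simple per-frame check. The 10-second and 120-second thresholds are the paragraph's own examples; the per-frame "usable" flag (eyes open, no light ingress) and the frame rate are assumed representations, not details from the specification:

```python
def recording_status(frames, bad_limit_s=10, success_s=120, fps=30):
    """Walk per-frame flags (True = usable frame: eyes open, no external
    light detected) and decide whether to pause recording or declare it
    successfully completed. Thresholds follow the example values in the
    text; the flag representation is an assumption."""
    bad_run = 0      # consecutive unusable frames
    good_total = 0   # total usable frames captured
    for ok in frames:
        if ok:
            good_total += 1
            bad_run = 0
            if good_total >= success_s * fps:
                return "complete"   # enough good footage captured
        else:
            bad_run += 1
            if bad_run >= bad_limit_s * fps:
                return "paused"     # halt and notify the patient
    return "recording"
```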

[00293] With respect to the clinic side for deployment of the apparatus, a patient is instructed when and how to use the apparatus. After sufficient episode recordings are completed, the patient returns the apparatus to the clinician or, as otherwise arranged, keeps the apparatus.

[00294] Recorded data may be obtained directly from the apparatus (as processed data, semi-processed data for additional processing, or raw data for subsequent processing) and the obtained data is analysed. The image data can be transmitted from the apparatus for remote processing, such as by hard-wired, local wireless transmission e.g. Wi-Fi and/or mobile phone network link for remote processing and/or storage and/or analysis.

[00295] Episode recording may be monitored in real or near real time by clinicians, such as by automatic detection and transmission of recorded or direct image data as an episode occurs.

[00296] The present invention provides advantages of:

• Reduced contact time with patients and increased likelihood of arriving at a definitive diagnosis.

• Ability to diagnose more patients due to (a) the reduced occurrence of repeat appointments with existing undiagnosed patients, and (b) shorter appointments with all patients.

• Provision of a low-cost assessment option requiring minimal capital expenditure that provides a rapid return on investment.

• Improved patient care.

• Provides a source of income for the clinics.

[00297] For the patient, the value proposition for apparatus/system is:

• A more convenient method of recording their spontaneous nystagmus.

• Shorter waiting times for diagnosis and treatment.

• A quicker path to the correct specialist and an increased likelihood of a correct diagnosis.

• Empowerment of the patient by being actively involved in identifying the cause.

• Symptoms are recorded objectively and not reliant on patient report, therefore providing credibility to the patient’s medical complaint.

• Lower costs due to the need for fewer tests.

• Fewer incidents of misdiagnosis and inappropriate treatment.

• Reduced loss of income due to illness.

• Reduced stress.

[00298] Further benefits include: reduced costs, due to earlier identification of the causes of dizziness in the care chain and hence fewer expensive specialist consultations; efficiencies in the healthcare system, by facilitating remote monitoring of patients' dizziness episodes and allowing clinicians to see more patients in the same amount of time; and a reduction in the volume of 'falls' (and other outcomes that are the result of dizziness), due to earlier diagnosis of the causes of dizziness.

[00299] With reference to Figures 9A to 9D, the reference numerals indicate the eyeball 1, pupil projection onto the capture apparatus image plane 2, actual pupil height 3, projected pupil height 4 and projected pupil shape 5.

[00300] In Figure 9A, with the gaze directed at the image plane, the actual pupil height is 5 mm and the pupil projection is a circle of 5 mm diameter.

[00301] In Figure 9C, where the gaze (from eyeball 1) is 45 degrees upward, the pupil is still 5 mm high; however, the pupil projection is an ellipse 3.54 mm high by 5 mm wide, the projected height being foreshortened by cos 45° (5 mm × 0.707 ≈ 3.54 mm).

[00302] The change in shape of the ellipse projection of the pupil can be used to calculate the gaze angle and thereafter all related eye kinematics.
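The geometric relationship above, where the projected (minor-axis) height of the elliptical pupil projection equals the pupil diameter multiplied by the cosine of the gaze angle, can be inverted to recover the gaze angle. The sketch below is illustrative only; the function name and interface are assumptions, not from the specification.

```python
import math

def gaze_angle_deg(projected_height_mm, pupil_diameter_mm):
    """Recover the gaze angle from the foreshortened height of the
    elliptical pupil projection, using h_proj = d * cos(theta).

    The ratio is clamped to [-1, 1] to guard against measurement noise
    pushing it slightly outside the valid domain of acos.
    """
    ratio = max(-1.0, min(1.0, projected_height_mm / pupil_diameter_mm))
    return math.degrees(math.acos(ratio))
```

With the Figure 9C values, `gaze_angle_deg(3.54, 5.0)` yields approximately 45 degrees, and with the Figure 9A values, `gaze_angle_deg(5.0, 5.0)` yields 0 degrees (gaze directly at the image plane). Tracking this angle frame by frame would then give the "related eye kinematics" (direction and speed of eye motion) mentioned above.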

[00303] Information/data 30 directly or indirectly from the apparatus 10 of the present invention can be combined with sensed information/data from one or more head movement sensors 32 (such as one or more accelerometers), one or more body/body function sensors 34 and/or one or more external sensors 36, or a combination of any two or more thereof.

[00304] One or more external sensors 36 can include one or more environmental sensors 38a and/or one or more remote sensors 38b for remote monitoring of the person or their body factors, e.g. remote cameras for capturing images for assessing behaviour, change in behaviour or activity (fitness, undertaking tasks, interaction with others etc.), temperature, blood pressure - see, for example, Figure 11. For example, one or more environmental factors, such as temperature, air pressure, humidity, altitude, gravity, and/or the rate of change of one or more thereof, can be measured. Measurement of the environmental factor(s) may be factored into assessing patient data, such as eye motion data 42 obtained via the apparatus 10. Analysis may be conducted using cloud computing/storage 44. Analysis in the cloud can examine and/or document the relationship between environmental factors (such as temperature, air pressure (barometric pressure), humidity, gravity etc.) and the presence of eye motion or type of eye motion (such as nystagmus).

[00305] One or more body sensors 34 can include internal, body-mounted or worn sensors, such as a heart rate monitor worn around the chest, a pulse sensor on the wrist, or a blood pressure monitoring cuff on the arm. See, for example, Figure 12. For example, oxygen level and/or heart rate may be measured and optionally provided as data for analysis with eye motion data 42 obtained using the apparatus 10. Eye motion data and the body measurement data can be provided via a cloud data service 44.

[00306] Analysis 46 can be conducted through a cloud computing service. Such analysis may examine and/or document the relationship between body functions (such as heart rate or oxygen level) and eye motion (such as nystagmus). Analysis/results can be displayed 48 for diagnosis/patient assessment.
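One simple form the cloud-side analysis above could take is comparing a body-function measurement during episodes with and without detected nystagmus. The sketch below is a minimal illustration, not from the specification: the function names are assumptions, and the sample values in the usage note are invented.

```python
# Hedged sketch of documenting a relationship between a body function
# (here, heart rate) and the presence of detected eye motion (nystagmus).

def mean(xs):
    return sum(xs) / len(xs)

def compare_by_nystagmus(samples):
    """samples: list of (heart_rate_bpm, nystagmus_present) tuples.

    Returns (mean heart rate with nystagmus, mean heart rate without),
    a summary a clinician could review alongside the displayed results.
    """
    with_n = [hr for hr, present in samples if present]
    without_n = [hr for hr, present in samples if not present]
    return mean(with_n), mean(without_n)
```

For example, given invented samples `[(90, True), (100, True), (70, False), (80, False)]`, the function returns `(95.0, 75.0)`, documenting an apparent association for further clinical assessment.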

[00307] Gravity (G) and/or changes in gravity over time (Gt) can be measured.

[00308] It will be appreciated that, whilst simultaneous measurement of one or more of the body, head and external parameters together with eye state/movement is preferred, it is possible to separately assess a person's physical parameters and to assess gravity earlier or later.

[00309] Sensor information/data and/or information/data from the apparatus 10 can be provided for analysis by an analyser/processor 40.

[00310] Reaching or exceeding one or more threshold values of measured/sensed head metrics, body metrics and/or environmental metrics can be used to determine whether there is an effect on eye state, or a change in eye state, over time.

[00311] Head movement 50 and eye motion characteristics 52 can be measured. Accelerometer data 54 related to head movement can be provided as an input to understanding the relationship between head movement and eye motion. Eye motion (such as a nystagmus pattern) can be detected 56, such as by measurement of eye motion over time.

[00312] The eye motion pattern 58, such as indicating BPPV (benign paroxysmal positional vertigo) can be provided as part of the assessment of the relationship between head movement and eye motion - see 60.

[00313] The cause-and-effect relationship between a specific head movement and a specific eye motion can be assessed 60.

[00314] The specific otolith that is at fault can be assessed. For example, when an otolith is activated by a specific head movement, it either causes an unwanted effect that is detectable or fails to cause an expected eye movement. By knowing the specific head movement and the expected corresponding eye motion, the particular ear can be determined, and the semicircular canal within that ear that is causing the problem can be identified (see 62).
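The assessment step described in paragraphs [00312] to [00314] can be sketched as a lookup from a (head movement, observed eye-motion pattern) pair to the implicated ear and semicircular canal. This is an illustrative sketch only: the table entries and identifiers are placeholders for the concept, not a validated clinical mapping from the specification.

```python
# Hedged sketch: combining a specific head movement with the observed
# eye-motion (nystagmus) pattern to identify the ear and semicircular
# canal at fault, as in the BPPV assessment described above.
# Keys and values are illustrative placeholders.

CANAL_LOOKUP = {
    ("right_dix_hallpike", "upbeat_torsional"): ("right", "posterior"),
    ("left_dix_hallpike", "upbeat_torsional"): ("left", "posterior"),
    ("supine_roll_right", "horizontal_geotropic"): ("right", "horizontal"),
}

def identify_canal(head_movement, eye_motion_pattern):
    """Return (ear, canal) for a recognised combination of head movement
    and eye-motion pattern, or None when the observed pattern does not
    match an expected response (e.g. an expected eye movement is absent)."""
    return CANAL_LOOKUP.get((head_movement, eye_motion_pattern))
```

A None result corresponds to the "fails to cause an expected eye movement" case in paragraph [00314], which is itself diagnostically informative and would be flagged for clinician review rather than discarded.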