Title:
GEOLOCATION OF HEAD-MOUNTED IMAGE SENSOR USING CELESTIAL NAVIGATION
Document Type and Number:
WIPO Patent Application WO/2020/261255
Kind Code:
A1
Abstract:
A system and method for determining the position of a head-mounted image sensor using celestial navigation. A head-mounted image sensor, worn by an operator, captures at least one image of an external scene comprising a celestial view, pursuant with the natural head movements of the operator. A processor receives the orientation of the head-mounted image sensor in a reference coordinate system, extracts parameters of celestial bodies using stored celestial data, the captured image, and the received orientation, and determines the position of the head-mounted image sensor based on the extracted parameters, such as based on the difference between the relative angle and the expected relative angle of a single celestial body or a constellation of celestial bodies in the captured image. The image sensor may be situated in a mobile platform. A default geolocation device, such as an IMU subject to angular drifts, may provide a default geolocation estimate of the mobile platform, and may be monitored, updated or calibrated using the determined position of the head-mounted image sensor.

Inventors:
TREYVAS IGOR (IL)
Application Number:
PCT/IL2020/050611
Publication Date:
December 30, 2020
Filing Date:
June 02, 2020
Assignee:
ELBIT SYSTEMS LTD (IL)
International Classes:
G01C21/02; G01C21/16; G02B27/01
Foreign References:
US20040246463A12004-12-09
US20150330789A12015-11-19
CN108917746A2018-11-30
US20190360811A12019-11-28
US20170131096A12017-05-11
Other References:
DANIEL N. MARTICELLO JR., MARK S. SPILLMAN: "Joint helmet-mounted cueing system accuracy testing using celestial references", Helmet- and Head-Mounted Displays IV, vol. 3689, 12 July 1999 (1999-07-12), pages 126-134, XP055779708, Retrieved from the Internet
Attorney, Agent or Firm:
ELIEZRI INTELLECTUAL PROPERTY PATENT ATTORNEYS AND LAW OFFICE et al. (IL)
Claims:
CLAIMS

1. A system for determining the position of a head-mounted image sensor using celestial navigation, the system comprising:

a head-mounted image sensor, worn by an operator, the head-mounted image sensor configured to capture at least one image of an external scene comprising a celestial view, pursuant with the natural head movements of the operator; and

a processor, configured to receive an orientation of the head-mounted image sensor in a reference coordinate system, to extract parameters of celestial bodies using: stored celestial data; the captured image; and the received head-mounted image sensor orientation, and to determine at least the position of the head-mounted image sensor based on the extracted celestial bodies parameters.

2. The system of claim 1, wherein the head-mounted image sensor is situated in a mobile platform.

3. The system of claim 2, further comprising a default geolocation device, coupled to the mobile platform, and configured to provide a default geolocation estimate of the mobile platform.

4. The system of claim 3, wherein the default geolocation device comprises an inertial measurement unit (IMU), subject to angular drifts.

5. The system of claim 1, further comprising a display, viewable by the operator, the display configured to display visual content relating to the celestial navigation.

6. The system of claim 5, wherein the display is configured to display a synthetic image of a celestial view to the operator in accordance with a default geolocation estimate, and wherein the position of the head-mounted image sensor is determined based on a manual alignment of the operator head to align at least one celestial body in the displayed synthetic image with a corresponding celestial body in an external celestial scene viewed by the operator.

7. The system of claim 1, wherein the processor is configured to determine at least the position of the head-mounted image sensor based on the extracted celestial bodies parameters by determining the difference between the relative angle of a single celestial body in the captured image and an expected relative angle thereof, and determining at least the position of the head-mounted image sensor in accordance with the relative angle difference and the head-mounted image sensor orientation.

8. The system of claim 3, wherein the processor is configured to determine at least the position of the head-mounted image sensor based on the extracted celestial bodies parameters by determining the difference between the relative angle of a constellation of celestial bodies in the captured image and an expected relative angle thereof, compensating the default geolocation device in accordance with the relative angle difference, and determining at least the position of the head-mounted image sensor using the compensated default geolocation device.

9. The system of either of claims 7 or 8, wherein the processor is further configured to extract terrain features in the captured image, and to determine the relative angle of at least one celestial body based on the extracted terrain features.

10. The system of either of claims 7 or 8, wherein the processor is further configured to obtain supplemental information relating to the scene, and to determine the relative angle of at least one celestial body further based on the obtained scene information.

11. The system of claim 10, further comprising at least one element configured to provide the scene information, the element selected from the group consisting of:

a digital elevation model (DEM), comprising terrain images of a geographical region;

a geographic information source, for providing climatological data;

an ambient light sensor, for detecting ambient light information of the scene; and

platform instruments, for providing location measurements of the platform.

12. The system of claim 1, wherein the head-mounted image sensor is configured to capture the image of an external scene through an intermediate optical element, and wherein the captured image is processed to compensate for optical distortions resulting from the intermediate optical element.

13. The system of claim 3, wherein the processor is configured to obtain an updated geolocation estimate by applying a differential weighting to the default geolocation estimate and to a celestial navigation based geolocation estimate.

14. The system of claim 3, wherein the processor is configured to monitor or update or calibrate the default geolocation device, using the determined position of the head-mounted image sensor.

15. A method for determining the position of a head-mounted image sensor using celestial navigation, the method comprising the procedures of:

capturing at least one image of an external scene comprising a celestial view, using a head-mounted image sensor worn by an operator, pursuant with the natural head movements of the operator;

receiving an orientation of the head-mounted image sensor in a reference coordinate system;

extracting parameters of celestial bodies using: stored celestial data; the captured image; and the received head-mounted image sensor orientation; and

determining at least the position of the head-mounted image sensor based on the extracted celestial bodies parameters.

16. The method of claim 15, wherein the head-mounted image sensor is situated in a mobile platform.

17. The method of claim 16, wherein a default geolocation estimate of the mobile platform is provided by a default geolocation device coupled to the mobile platform.

18. The method of claim 17, wherein the default geolocation device comprises an inertial measurement unit (IMU), subject to angular drifts.

19. The method of claim 15, further comprising the procedure of displaying visual content relating to the celestial navigation on a display viewable by the operator.

20. The method of claim 19, wherein a synthetic image of a celestial view is displayed to the operator in accordance with a default geolocation estimate, and wherein the position of the head-mounted image sensor is determined based on a manual alignment of the operator head to align at least one celestial body in the displayed synthetic image with a corresponding celestial body in an external celestial scene viewed by the operator.

21. The method of claim 15, wherein the procedure of determining at least the position of the head-mounted image sensor based on the celestial bodies parameters comprises: determining the difference between the relative angle of a single celestial body in the captured image and an expected relative angle thereof; and

determining at least the position of the head-mounted image sensor in accordance with the relative angle difference and the head-mounted image sensor orientation.

22. The method of claim 17, wherein the procedure of determining at least the position of the head-mounted image sensor based on the celestial bodies parameters comprises:

determining the difference between the relative angle of a constellation of celestial bodies in the captured image and an expected relative angle thereof;

compensating the default geolocation device in accordance with the relative angle difference; and

determining at least the position of the head-mounted image sensor using the compensated default geolocation device.

23. The method of either of claims 21 or 22, wherein the relative angle of at least one celestial body is determined based on the terrain features extracted in the captured image.

24. The method of either of claims 21 or 22, wherein the relative angle of at least one celestial body is further determined based on obtained supplemental information relating to the scene.

25. The method of claim 15, wherein the image of an external scene is captured through an intermediate optical element, and wherein the captured image is processed to compensate for optical distortions resulting from the intermediate optical element.

26. The method of claim 17, further comprising the procedure of obtaining an updated geolocation estimate by applying a differential weighting to a default geolocation estimate and to a celestial navigation based geolocation estimate.

27. The method of claim 17, further comprising the procedure of monitoring or updating or calibrating the default geolocation device, using the determined position of the head-mounted image sensor.

Description:
GEOLOCATION OF HEAD-MOUNTED IMAGE SENSOR USING CELESTIAL NAVIGATION

FIELD OF THE INVENTION

The present invention generally relates to geolocation and navigation systems, particularly for airborne platforms.

BACKGROUND OF THE INVENTION

Modern aircraft navigation systems can be classified into two general categories. Precise navigation may include global navigation satellite systems (GNSS), such as the global positioning system (GPS), GLONASS, BeiDou and Galileo, or ground station systems, such as: very high frequency omni-directional range (VOR), tactical air navigation system (TACAN), and non-directional radio beacons (NDB). Imprecise navigation may include: inertial navigation systems, Doppler or radar systems (which transmit radio signals and analyze the reflected signals), and systems based on verified digital terrain elevation data (DTED) or a digital elevation model (DEM), using image analysis and altitude data. Each existing navigation technique has certain limitations, and may not be applicable or may be subject to significant inaccuracies in certain situations. For example, precise navigation systems rely on some form of physical infrastructure, whether airborne/satellite components or ground components, and such infrastructure is subject to deterioration, tampering, malfunctioning and interference (both intentional and unintentional). Imprecise navigation systems are inherently characterized by inaccuracies and instabilities, where measurement accuracy may deviate over time, such as due to drift errors, noise and other measurement errors.

The reliability and the availability of navigation support infrastructure is a global issue with significant implications. The most commonly used navigation systems nowadays for most applications are satellite-based systems. Considerable efforts have been made to strengthen the reliability of the existing satellite infrastructure and reinforce its ability to withstand potential problems, including the development of duplicate and supplemental elements for backup purposes. Conversely, attempts are made by hostile factions to sabotage navigational infrastructure and interfere with the provision of accurate global navigation capabilities, such as by interference with or obstruction of navigational satellite signals, or by tampering with or damaging the satellites themselves and supporting components. The ability to continuously maintain reliable and accurate geolocation and navigation capabilities is crucial for a wide range of industries and applications in contemporary society. These include both civilian and military applications, ranging from vehicle navigation on land, sea or air; geographic mapping; facilitating emergency services and rescue operations; tracking of objects or persons; and the like.

Celestial navigation is a longstanding form of navigation that involves angular measurements of stars and other celestial bodies in relation to the horizon. A given celestial body is always located directly over a certain point on the surface of the Earth, which represents the geographic position of the celestial body, defined by the latitude and longitude of that point. The measured angle between the celestial body and the visible horizon is directly related to the geographic position of the celestial body and the position of the observer. The general position (GP) of a celestial body can be determined using published tabulated data, such as in nautical almanacs, which are generally updated annually. The concepts behind celestial navigation have been known since the 19th century and have been adapted over time with developed variations such as the intercept method and the ex-meridian method. Celestial navigation has gradually become redundant in view of modern and more convenient satellite-based and ground-based navigation systems, but was commonly used in previous eras for aviation and marine navigation.
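
By way of illustration only (this sketch is not part of the original disclosure), the geometric relationship above can be expressed with the classical sight-reduction formulas: given an assumed observer position and a body's almanac coordinates (Greenwich hour angle and declination), the expected altitude above the celestial horizon and the true azimuth follow directly. All names and values below are ours.

```python
import math

def sight_reduction(lat_deg, lon_deg, gha_deg, dec_deg):
    """Expected altitude (Hc) and true azimuth (Zn) of a celestial body
    for an assumed observer position.
    lat_deg/lon_deg: observer latitude/longitude (north/east positive).
    gha_deg/dec_deg: Greenwich hour angle and declination from an almanac."""
    lat, dec = math.radians(lat_deg), math.radians(dec_deg)
    # Local hour angle: GHA plus east longitude, wrapped to [0, 360) degrees.
    lha = math.radians((gha_deg + lon_deg) % 360.0)
    # Altitude above the celestial horizon.
    sin_hc = (math.sin(lat) * math.sin(dec)
              + math.cos(lat) * math.cos(dec) * math.cos(lha))
    hc = math.asin(sin_hc)
    # Azimuth angle Z, converted to true azimuth Zn measured from north.
    cos_z = (math.sin(dec) - math.sin(lat) * sin_hc) / (math.cos(lat) * math.cos(hc))
    z = math.degrees(math.acos(max(-1.0, min(1.0, cos_z))))
    zn = z if math.sin(lha) < 0 else 360.0 - z
    return math.degrees(hc), zn
```

The difference between a measured altitude and the computed Hc places the observer on a circle of equal altitude around the body's geographic position; intersecting two or more such circles yields a position fix.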

U.S. Patent No. 7,197,381 to Sheikh et al., entitled: "Navigation system and method utilizing sources of pulsed celestial radiation", is directed to a mobile receiver mounted on a spacecraft, satellite or other vehicle for detecting signal pulses generated by pulsars or other celestial objects. A timer generates a timing signal corresponding to the reception and detection of the pulsed celestial radiation, and the timing signal is used to calculate a time offset between predicted and measured pulse reception at the mobile receiver. Incorporating predetermined models of the pulse arrival times within an inertial reference frame, a position offset of the mobile receiver can be determined using the time offset. A digital memory stores information including the positions and pulse timing model parameters of known sources of pulsed celestial radiation with respect to a chosen inertial reference frame.

U.S. Patent No. 8,597,025 to Belenkii et al., entitled: "Celestial weapons orientation measuring system", discloses a miniature celestial direction detection device for determining the pointing direction of a weapon during a simulated discharge. The device is mounted on and aligned with the weapon and includes a camera for imaging at least one celestial object. A processor is programmed with a celestial catalog providing known positions at specific times of celestial objects and algorithms for automatically calculating target direction information based on the inclination measured by an inclinometer and the known positions of the celestial object as provided by the celestial catalog and as imaged by the camera. A simulated enemy target wears a GPS detector and transmitter, and a processor determines if the target is hit or missed by the weapon discharge, based on known locations of the weapon and target and the determined pointing directions of the weapon.

U.S. Patent No. 8,620,582 to Dods et al., entitled: “Device position method and apparatus using celestial objects”, discloses a method for determining the position of an electronic device based upon a celestial object. The electronic device receives an image containing at least one celestial object, and an indicator selects a celestial object on a display. Data indicating a relative angle of the device with respect to the Earth is received from a sensor, and the time when the indicator is overlaid on the celestial object is determined. The position of the electronic device is determined by comparing the location of the celestial object in the image and the relative angle data at the time of indication to a database in response to the indication, and the current position is displayed.

U.S. Patent No. 9,453,731 to Vos et al., entitled: "System and method for determining orientation relative to Earth", discloses determining the orientation of a fixture relative to the Earth using celestial bodies. A sensor with an imaging device is coupled to the fixture and detects a celestial body by identifying candidate regions in an image and eliminating false detections using a template image. A processor receives signals indicative of positions of the detected celestial body relative to the sensor over a period of time, receives information relating to a time and date of the detection time period, and receives information relating to an expected relationship between the detected celestial body and the Earth and the time and date, and determines the orientation of the fixture accordingly. A Kalman filter provides a more accurate orientation determination when determining one of: true north, azimuth, elevation angle, and bank angle.

U.S. Patent No. 9,791,278 to McCroskey et al., entitled: "Navigating with star tracking sensors", is directed to a navigation system and method. A measured direction of celestial objects with respect to a body is determined using star tracking sensors mounted to the body. An expected direction of at least one celestial object with respect to the body is calculated based on a current navigation solution for the body. An updated navigation solution for the body is calculated based on the path of the moving celestial object, and the differences between the expected directions and measured directions of the celestial bodies.

Marticello Jr., D.N., and Spillman, M.S., "Joint helmet-mounted cueing system accuracy testing using celestial references", in Helmet- and Head-Mounted Displays IV, 12 July 1999 (Vol. 3689, pp. 126-134), describes techniques for testing the pointing accuracy of a Joint Helmet-Mounted Cueing System (JHMCS) utilizing celestial bodies as reference points. A JHMCS is designed to cue and verify cueing of high off-axis sensors and weapons in an air-to-air engagement environment, to allow the pilot to slave the sensor/weapon line of sight (LOS) to the pilot LOS. The pointing accuracy is defined as how close the JHMCS computed LOS is to the actual pilot LOS. To test pointing accuracy through the pilot range of motion, truth data must be established at various azimuths and elevations. Surveyed ground locations do not allow testing at different helmet elevations, while airborne targets do not provide measurement precision. Therefore, celestial bodies whose locations are precisely known for a given time and date at a specific location can be used as truth data for testing LOS accuracy.

Chinese Patent No. 103017759A to Hao et al., entitled: "Method to overcome space unorientation and delusions", is directed to a method for overcoming spatial orientation obstacles and illusions so a driver (such as a pilot) can understand his spatial relative position in real-time and avoid traffic accidents. A database file is established of time and reference coordinates corresponding to the spatial coordinates of a celestial body target. The spatial coordinates of an aircraft in flight are obtained by a space locator (e.g., GPS), the aircraft azimuth and attitude are determined by an electronic compass and gyro, and time information is obtained. Reference coordinates and spatial coordinates of a celestial target are retrieved from the database based on the time information. The relative position of the celestial target and the aircraft is calculated based on the spatial coordinates and reference coordinate points of the aircraft. The aircraft position (azimuth and attitude) is determined again, according to which it is determined whether the celestial target would be displayed in the field of view of the aircraft display. If it is determined to be in the field of view, then the aircraft spatial coordinates and the celestial body position are displayed in accordance with the aircraft azimuth and attitude; otherwise it is not displayed and the method reverts back to the initial step.

SUMMARY OF THE INVENTION

In accordance with one aspect of the present invention, there is thus provided a system for determining the position of a head-mounted image sensor using celestial navigation. The system includes a head-mounted image sensor, worn by an operator, and a processor. The head-mounted image sensor is configured to capture at least one image of an external scene comprising a celestial view, pursuant with the natural head movements of the operator. The processor is configured to receive an orientation of the head-mounted image sensor in a reference coordinate system, to extract parameters of celestial bodies using: stored celestial data; the captured image; and the received head-mounted image sensor orientation, and to determine at least the position of the head-mounted image sensor based on the extracted celestial bodies parameters. The head-mounted image sensor may be situated in a mobile platform. The system may further include a default geolocation device, coupled to the mobile platform, and configured to provide a default geolocation estimate of the mobile platform. The default geolocation device may include an inertial measurement unit (IMU), subject to angular drifts. The system may further include a display, viewable by the operator, the display configured to display visual content relating to the celestial navigation. The display may be configured to display a synthetic image of a celestial view to the operator in accordance with a default geolocation estimate, where the position of the head-mounted image sensor is determined based on a manual alignment of the operator head to align at least one celestial body in the displayed synthetic image with a corresponding celestial body in an external celestial scene viewed by the operator. The processor may be configured to determine at least the position of the head-mounted image sensor based on the extracted celestial bodies parameters by determining the difference between the relative angle of a single celestial body in the captured image and an expected relative angle thereof, and determining at least the position of the head-mounted image sensor in accordance with the relative angle difference and the head-mounted image sensor orientation. The processor may be configured to determine at least the position of the head-mounted image sensor based on the extracted celestial bodies parameters by determining the difference between the relative angle of a constellation of celestial bodies in the captured image and an expected relative angle thereof, compensating the default geolocation device in accordance with the relative angle difference, and determining at least the position of the head-mounted image sensor using the compensated default geolocation device. The processor may be further configured to extract terrain features in the captured image, and to determine the relative angle of at least one celestial body based on the extracted terrain features. The processor may be further configured to obtain supplemental information relating to the scene, and to determine the relative angle of at least one celestial body further based on the obtained scene information. The scene information may be provided by: a digital elevation model (DEM), comprising terrain images of a geographical region; a geographic information source, for providing climatological data; an ambient light sensor, for detecting ambient light information of the scene; and/or platform instruments, for providing location measurements of the platform.
The head-mounted image sensor may be configured to capture the image of an external scene through an intermediate optical element, where the captured image is processed to compensate for optical distortions resulting from the intermediate optical element. The processor may be configured to obtain an updated geolocation estimate by applying a differential weighting to the default geolocation estimate and to a celestial navigation based geolocation estimate. The processor may be configured to monitor or update or calibrate the default geolocation device, using the determined position of the head-mounted image sensor.

In accordance with another aspect of the present invention, there is thus provided a method for determining the position of a head-mounted image sensor using celestial navigation. The method includes the procedures of capturing at least one image of an external scene comprising a celestial view, using a head-mounted image sensor worn by an operator, pursuant with the natural head movements of the operator, and receiving an orientation of the head-mounted image sensor in a reference coordinate system. The method further includes the procedures of extracting parameters of celestial bodies using: stored celestial data; the captured image; and the received head-mounted image sensor orientation, and determining at least the position of the head-mounted image sensor based on the extracted celestial bodies parameters. The head-mounted image sensor may be situated in a mobile platform. A default geolocation estimate of the mobile platform may be provided by a default geolocation device coupled to the mobile platform. The default geolocation device may include an inertial measurement unit (IMU), subject to angular drifts. The method may further include the procedure of displaying visual content relating to the celestial navigation on a display viewable by the operator. A synthetic image of a celestial view may be displayed to the operator in accordance with a default geolocation estimate, where the position of the head-mounted image sensor is determined based on a manual alignment of the operator head to align at least one celestial body in the displayed synthetic image with a corresponding celestial body in an external celestial scene viewed by the operator. The procedure of determining at least the position of the head-mounted image sensor based on the celestial bodies parameters may include: determining the difference between the relative angle of a single celestial body in the captured image and an expected relative angle thereof, and determining at least the position of the head-mounted image sensor in accordance with the relative angle difference and the head-mounted image sensor orientation. The procedure of determining at least the position of the head-mounted image sensor based on the celestial bodies parameters may alternatively include: determining the difference between the relative angle of a constellation of celestial bodies in the captured image and an expected relative angle thereof, compensating the default geolocation device in accordance with the relative angle difference, and determining at least the position of the head-mounted image sensor using the compensated default geolocation device. The relative angle of at least one celestial body may be determined based on the terrain features extracted in the captured image. The relative angle of at least one celestial body may be further determined based on obtained supplemental information relating to the scene. The image of an external scene may be captured through an intermediate optical element, where the captured image is processed to compensate for optical distortions resulting from the intermediate optical element. The method may further include the procedure of obtaining an updated geolocation estimate by applying a differential weighting to the default geolocation estimate and to a celestial navigation based geolocation estimate. The method may further include the procedure of monitoring or updating or calibrating the default geolocation device, using the determined position of the head-mounted image sensor.
BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:

Figure 1 is a schematic illustration of a system for determining the position of a head-mounted image sensor using celestial navigation, constructed and operative in accordance with an embodiment of the present invention;

Figure 2 is a schematic illustration of angular measurements determined in the system of Figure 1 deployed in an aircraft, operative in accordance with an embodiment of the present invention;

Figure 3 is an exemplary sequence of displayed views for visually cueing celestial navigation by an aircraft operator, operative in accordance with an embodiment of the present invention; and

Figure 4 is a block diagram of a method for determining the position of a head-mounted image sensor using celestial navigation, operative in accordance with an embodiment of the present invention.

DETAILED DESCRIPTION OF THE EMBODIMENTS

The present invention overcomes the disadvantages of the prior art by providing a system and method for determining the position of a head-mounted image sensor using celestial navigation techniques. The determination is based on celestial parameters extracted from an image with a celestial view, as captured using a head-mounted image sensor worn by a user, which utilizes the natural head movements of the user without altering or disrupting their ordinary behaviour. The system and method may provide for reliable and accurate geolocation of an aircraft or other moving platform containing another navigation system, such as a satellite-based navigation system, that may sometimes be unavailable or subject to inaccuracies, such as due to interference or sabotage attempts or due to inherent errors. The celestial-based geolocation determination of the present invention can be used to monitor another navigation system and to correct or calibrate its output. The system of the present invention may be easily adapted to a variety of different platforms without requiring additional dedicated infrastructure, and may utilize existing equipment, such as commonly found on standard aircraft.

Reference is now made to Figure 1, which is a schematic illustration of a system, generally referenced 110, for determining the position of a head-mounted image sensor using celestial navigation, constructed and operative in accordance with an embodiment of the present invention. System 110 includes a processor 112, a head-mounted image sensor 114, an orientation detector 116, a display 118, a database 120, a user interface 122, and aircraft sensors and flight instruments (ASFI) 124. Processor 112 is communicatively coupled with image sensor 114, with orientation detector 116, with display 118, with database 120, with user interface 122, and with ASFI 124. Orientation detector 116 is further coupled with image sensor 114.

System 110 is generally installed on an aircraft, referenced 100, although some components may reside at a different location and may be accessible to processor 112 through a wireless communication link. The term "aircraft" as used herein should be broadly interpreted to refer to any type of vehicle capable of flight, including but not limited to: an airplane or a helicopter, of any type or model, and encompassing also such aircraft with at least partial autonomous flight control capabilities. System 110 may alternatively be implemented (at least partially) on another type of moving platform other than an aircraft, such as a vehicle, an automobile, a motorcycle, or a ship or marine vessel. Further alternatively, system 110 may be partially installed on a stationary platform communicatively coupled with head-mounted image sensor 114 worn by a user capable of movement independent of a separate platform, such as a pedestrian. The term "user" herein refers to any individual person or group of persons operating the system or method of the present invention. For example, the user may be an aircraft pilot or other flight crew member, where the system is installed (at least partially) in the cockpit of an aircraft. The terms "user" and "operator" are used interchangeably herein.

Head-mounted image sensor 114 is disposed on the head of the user, such as being integrated with an existing pilot helmet, which may include additional components pertaining to aircraft flight assistance. Accordingly, the alignment of image sensor 114 is linked to the general head direction of the user, such that the image sensor field of view shifts in correspondence with the user head movements. Image sensor 114 is configured to capture images of an external scene that includes at least a view of celestial bodies in the sky. For example, image sensor 114 may capture images through a transparent canopy of the aircraft cockpit. Image sensor 114 may be any type of sensor device capable of acquiring an image representation of the scene, including the acquisition of any form of electromagnetic radiation at any range of wavelengths (including visible and non-visible wavelengths). For example, image sensor 114 may be a forward looking infrared (FLIR) camera or a charge-coupled device (CCD) camera. Image sensor 114 is operative to acquire at least one image frame, such as a sequence of consecutive image frames representing a video image, which may be converted into an electronic signal for subsequent processing and/or transmission.

Orientation detector 116 provides measurements of at least the orientation or line-of-sight (LOS) of image sensor 114, which represents a viewpoint of the external scene as imaged by image sensor 114. For example, orientation detector 116 may include a head tracking device configured to determine the real-time head direction of the user. Orientation detector 116 may include one or more devices or instruments configured to measure the orientation, and optionally also the position, of image sensor 114 with respect to a reference coordinate system, including but not limited to: an inertial navigation system (INS); an inertial measurement unit (IMU); motion sensors or rotational sensors (e.g., accelerometers, gyroscopes, magnetometers); a compass; a rangefinder; a camera; and the like. Orientation detector 116 may also be embodied by a combination of complementary sensors, such as a wide-angle camera supplemented with a narrow-angle camera with overlapping fields of view. Orientation detector 116 may be a pre-existing sensor or instrument of aircraft 100 utilized for other purposes but adapted for the additional application of providing orientation measurements of image sensor 114. Orientation detector 116 may provide orientation measurements with respect to a reference coordinate system defined in relation to aircraft 100, or with respect to alternative reference coordinates not linked to aircraft 100.

Display 118 displays an image to the user, such as visual information or instructions relating to the navigation of aircraft 100. Display 118 may be a transparent or "see-through" display device, such that the user can simultaneously observe the displayed image overlaid in the foreground onto a background view of the external environment viewable through the display. Display 118 may be embodied by a fixed display, such as a head-up display (HUD) or a head-down display (HDD) integrated in the cockpit of aircraft 100. Alternatively, display 118 may be a head-mounted display (HMD) embedded within a helmet or other wearable apparatus worn by the user (and further integrated with head-mounted image sensor 114).

Database 120 contains relevant information which may be used by system 110, such as known general positions of celestial bodies included in publications of nautical almanacs (e.g., "Air Almanac"). Database 120 may include additional information that may assist with navigation, such as maps or models of different geographic areas, including 3D geographic models, such as a digital elevation model (DEM) or digital terrain model (DTM). Database 120 may further include weather or climate information, or flight routes of aircraft 100. Database 120 may be located externally to aircraft 100 but communicatively coupled with system 110, such that system 110 may receive information from database 120 while aircraft 100 is in flight. Database 120 may be truncated according to operation parameters or according to the real-time position of aircraft 100 (i.e., only the required sections of the relevant maps/models/almanacs may be provided).

User interface 122 allows the user to control various parameters or settings associated with the components of system 110. For example, user interface 122 can allow the user to provide instructions or select parameters relating to the navigation of aircraft 100. User interface 122 may include a cursor or touch-screen menu interface, such as a graphical interface, configured to enable manual input of instructions or data. User interface 122 may also enable the user to communicate with external sources.

Aircraft sensors and flight instruments (ASFI) 124 includes various devices configured to measure or detect real-time flight information associated with aircraft 100, such as: position, attitude, location, heading, altitude, airspeed, velocity, rate of turn indication, slip-skid indication, course deviation indication, and the like. ASFI 124 may incorporate various onboard flight instruments or other diagnostic tools. For example, ASFI 124 may include at least one aircraft navigation system, which may include components or applications associated with a global navigation satellite system (GNSS), such as: GPS, GLONASS, BeiDou and Galileo, or a ground station navigation system, such as: VOR, TACAN and NDB. ASFI 124 may also detect relevant environmental information, such as: temperature, pressure, wind speed and wind direction, and the like. System 110 may also receive flight information from external data sources, such as a ground radar (e.g., at an airport or air traffic control station), or an automatic dependent surveillance-broadcast (ADS-B) system.

Processor 112 generally performs any data processing required by system 110, and may receive information or instructions from other components of system 110. In particular, processor 112 may determine navigational parameters of aircraft 100 and head-mounted image sensor 114, and direct the displaying of a navigation image respective of the real-time flight location of aircraft 100, as will be discussed further hereinbelow.

The components and devices of system 110 may be based in hardware, software, or combinations thereof. It is appreciated that the functionality associated with each of the devices or components of system 110 may be distributed among multiple devices or components, which may reside at a single location or at multiple locations. For example, the functionality associated with processor 112 may be distributed between multiple processing units. Processor 112 may be part of a server or a remote computer system accessible over a communications medium or network, or may be integrated with other components of system 110, such as incorporated with a computer in aircraft 100. System 110 may optionally include and/or be associated with additional components not shown in Figure 1, for enabling the implementation of the disclosed subject matter. For example, system 110 may include a power supply (not shown) for providing power to various components, and may further include a memory or storage unit (not shown) for temporary storage of images or other data.

The term "image" as used herein may refer to a video image or a plurality of image frames presented in sequence. In accordance with an embodiment of the present invention, a displayed navigation image may be a video image that is dynamically updated to correspond to the real-time updated flight location of aircraft 100.

The term "repeatedly" as used herein should be broadly construed to include any one or more of: "continuously", "periodic repetition" and "non-periodic repetition", where periodic repetition is characterized by constant length intervals between repetitions and non-periodic repetition is characterized by variable length intervals between repetitions.

The term "celestial body" as used herein refers to any naturally occurring and observable physical entity or structure, or group of entities or structures, located outside the Earth's atmosphere in the observable universe. Examples of celestial bodies may include, but are not limited to: a star, a star cluster, the Moon, the Sun, a planet, a planetary system, a nebula, an asteroid, and a comet.

The operation of system 110 will now be described in general terms, followed by specific examples. Reference is made to Figure 2, which is a schematic illustration of angular measurements determined in the system of Figure 1 deployed in an aircraft, operative in accordance with an embodiment of the present invention. Processor 112 receives at least one image of a celestial scene, referenced 130, captured using head-mounted image sensor 114, which operates in conjunction with the natural head movements of the user wearing the head-mounted sensor. For example, a pilot or flight crew member situated in the cockpit of aircraft 100 and wearing a flight helmet on which image sensor 114 is integrated, naturally performs typical gestures and motions, including head movements, in the course of operating aircraft 100. The captured image includes at least a sufficient view of the sky containing one or more observable celestial bodies, but may also depict non-celestial features, such as ground terrain. For example, the image may represent a perspective view of the external environment above the horizon as seen through the cockpit canopy while aircraft 100 is in flight. When image sensor 114 is inside aircraft 100, optical distortions arising from the canopy or windshield may need to be compensated for, such as via image rendering techniques.

Processor 112 receives orientation measurements of image sensor 114 from orientation detector 116, which may be a device configured to determine the real-time head direction of the user, such as an IMU, one or more motion/rotational sensors, or an optical sensor (camera). Orientation detector 116 may also be a pre-existing aircraft instrument adapted for this additional purpose. The orientation measurements are provided with respect to a reference coordinate system, such as reference coordinates defined in relation to aircraft 100. Referring to Figure 2, orientation detector 116 provides an orientation of head-mounted image sensor 114 indicated as "angle a", representing an angle respective of aircraft reference coordinates. The orientation measurements may alternatively be provided with respect to other reference coordinates that are not linked to aircraft 100. The orientation measurements provided by orientation detector 116 may subsequently be transformed to another reference coordinate system, such as into global reference coordinates defined in relation to the Earth, for example, using information relating to the global position and orientation of aircraft 100 as obtained using an aircraft IMU or other navigational system of ASFI 124.
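
As a concrete illustration of this coordinate transformation (the conventions here, intrinsic yaw-pitch-roll Euler angles and a North-East-Down Earth frame, are our assumptions and are not specified in the source), the head-tracker output can be composed with the aircraft attitude roughly as follows:

```python
from scipy.spatial.transform import Rotation as R

def sensor_orientation_in_earth_frame(aircraft_ypr_deg, head_ypr_deg):
    """Compose the aircraft attitude (from the aircraft INS/IMU) with the
    head-tracker output (sensor orientation relative to the airframe) to get
    the image-sensor orientation in Earth-referenced (NED) coordinates.
    Both inputs are (yaw, pitch, roll) in degrees, intrinsic Z-Y-X order."""
    r_earth_aircraft = R.from_euler('ZYX', aircraft_ypr_deg, degrees=True)
    r_aircraft_sensor = R.from_euler('ZYX', head_ypr_deg, degrees=True)
    return r_earth_aircraft * r_aircraft_sensor  # Earth <- aircraft <- sensor

# Example: aircraft heading 090, head turned 30 degrees left, pitched 20 up.
los = sensor_orientation_in_earth_frame([90, 0, 0], [-30, 20, 0])
print(los.as_euler('ZYX', degrees=True))  # sensor yaw/pitch/roll in NED
```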

Processor 112 then extracts geometric parameters of one or more celestial bodies in the captured image. The parameter extraction may utilize available celestial information stored in database 120, and may further utilize the orientation measurements of image sensor 114. In particular, processor 112 may determine the relative angle in the captured image of a single celestial body (e.g., an individual star) or a group of celestial bodies (e.g., a star constellation) with respect to the horizon or a zero-pitch reference line. Processor 112 may further obtain the known general position of the corresponding celestial bodies, in relation to some reference coordinate system, from celestial information stored in publicly available sources such as nautical almanacs accessible from database 120. The published general position data can be used to determine the relative angle of the celestial bodies with respect to a common reference coordinate system, such as respective of the horizon. The extracted geometric parameters of the celestial bodies may also include additional pertinent information, such as the size, shape, brightness intensity, color attributes, velocity or motion profile, distinguishing characteristics, and the identification of the relevant celestial body.
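
One plausible way to recover the relative angle of an imaged celestial body, assuming a simple pinhole camera model with known intrinsics (the source does not commit to a particular camera model; all names here are ours), is to back-project the detected star centroid into a line-of-sight ray and rotate it into an Earth-referenced frame:

```python
import numpy as np

def star_elevation_from_pixel(u, v, fx, fy, cx, cy, r_earth_sensor):
    """Convert a detected star centroid (pixel u, v) into an elevation angle
    above the local horizontal. fx, fy, cx, cy are pinhole intrinsics of the
    head-mounted camera; r_earth_sensor is a 3x3 rotation matrix taking
    sensor-frame vectors into an Earth-referenced NED frame."""
    # Unit line-of-sight ray in the sensor frame (x right, y down, z forward).
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray /= np.linalg.norm(ray)
    # Rotate into the Earth frame using the tracked sensor orientation.
    ray_ned = r_earth_sensor @ ray
    # Elevation above the horizontal plane: NED 'down' is the 3rd component.
    return np.degrees(np.arcsin(-ray_ned[2]))
```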

Based on the extracted parameters of the celestial bodies, the position and orientation of head-mounted image sensor 114 can be determined. The position and orientation can be determined respective of global coordinates (i.e., defined in relation to the Earth), using the orientation of image sensor 114 in relation to aircraft 100, and the coordinates of aircraft 100 in relation to the Earth (obtained using an aircraft navigational system of ASFI 124). The determined position and orientation can be used for navigation of aircraft 100 on which image sensor 114 is situated, such as to calibrate, correct, or back up an existing navigation system of aircraft 100. For example, processor 112 may determine the position and orientation of image sensor 114 from the difference between the actual relative angle of a celestial body (in the captured image) and its expected relative angle (as obtained from stored celestial information). In another example, processor 112 may use the difference between the actual relative angle and the expected relative angle of a selected celestial body grouping, such as a constellation of stars, to serve as an updater to an existing aircraft navigation system such as an IMU, similar to a GPS updater. This serves to limit the effect of the IMU angular drift by using a more stable angular source, enabling the IMU to eventually produce a more precise position output than it would without the updater. Processor 112 may calculate an updated position using the IMU data and angular updates and determine the global location coordinates of image sensor 114 or aircraft 100, or may simply determine the divergence with respect to estimated location coordinates, as obtained from a default aircraft navigation system. The divergence from the estimated location can then be directly provided as a correction to the navigation system or indicated to the aircraft operator, such as a visual indication on display 118, for manually correcting the navigational system or for providing situational awareness of the drifted position.
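
For the single-body case, a minimal sketch of such a position update, in the spirit of the classical intercept method and under a small-displacement, flat-Earth approximation (our simplification, not a procedure specified by the source), could look like this:

```python
import math

def intercept_correction(lat_deg, lon_deg, measured_alt_deg,
                         expected_alt_deg, azimuth_deg):
    """One-body position update: the altitude residual (measured minus
    expected) moves the assumed position toward or away from the body's
    azimuth; 1 arc-minute of altitude corresponds to ~1 nautical mile."""
    intercept_nm = (measured_alt_deg - expected_alt_deg) * 60.0
    # Decompose the shift along the azimuth into north/east components,
    # then convert nautical miles to degrees of latitude/longitude.
    d_lat = (intercept_nm / 60.0) * math.cos(math.radians(azimuth_deg))
    d_lon = ((intercept_nm / 60.0) * math.sin(math.radians(azimuth_deg))
             / math.cos(math.radians(lat_deg)))
    return lat_deg + d_lat, lon_deg + d_lon
```

Corrections from two or more bodies (or repeated sightings over time) can be intersected or filtered to refine the fix.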

System 110 may utilize supplementary information to assist the determination of the geolocation coordinates (position and orientation) of image sensor 114. For example, the captured image may include ground terrain features, which may be compared with known geographic information stored in database 120, such as a digital terrain model (DTM), to assist with the geolocation. For example, when head-mounted image sensor 114 captures a forward-facing image of the external environment of aircraft 100 corresponding to the aircraft operator LOS, the imaged view will generally include both a portion of the sky and a portion of the land, above and below the horizon. Image sensor 114 may also capture primarily ground terrain images under certain scenarios, such as when aircraft 100 is flying at a relatively low altitude. Accordingly, system 110 may obtain and utilize different types of images in different situations, such as ground terrain images during the day and celestial images at night, or celestial images at high altitude flights and ground terrain images at low altitude flights. In general, system 110 may use only ground terrain images, or may use only celestial images, or may use both types of images, for determining the geolocation, depending on the situation.

System 110 may also obtain supplementary information relating to the imaged scene, such as: the ambient lighting (e.g., obtained from an ambient light sensor), weather or climate data (e.g., obtained from a weather or climatological model), and the date and time (e.g., obtained from an internal clock), in order to adapt the extraction or determination of celestial body parameters in accordance with the information as necessary. For example, the identification of a celestial body in the captured image may be facilitated by comparing with stored images and/or celestial information obtained under different conditions than those of the captured image, such as during different ambient lighting or a different time of day.

System 110 may use the determined geolocation (position and orientation) coordinates of image sensor 114 for navigating aircraft 100. Navigational information may be provided visually, such as by displaying image content on display 118 viewable by the user operating aircraft 100. For example, navigational information may be displayed in the form of synthetic (virtual reality) image content overlaid onto a view of the external scene, which may be a physical view of the external environment seen through a transparent display, or may be a synthetic image of the environment, e.g., obtained using a synthetic vision system (SVS) based on images captured by an aircraft camera or sensor in real-time. The images may be displayed in alignment with the user LOS as detected via orientation detector 116. The displayed image may include supplementary symbols, graphics or text, associated with relevant portions of the environment, in order to provide notable information or to emphasize significant features. Further displayed information may include flight instructions, navigational parameters, weather and climate data (e.g., instrument meteorological conditions (IMC)), and other important information relating to the real environment of the aircraft. Various notifications or warnings may also be displayed, such as via a text message or symbol representation, or in audible form.

The geolocation determined from the extracted celestial bodies parameters may be used to supplement or update or calibrate a geolocation measurement obtained from an existing navigation system. For example, ASFI 124 may include one or more navigational systems or instruments configured to provide geolocation coordinates of aircraft 100, such as a satellite-based system, a ground-based system, an IMU, a Doppler-based system, a DTED-based system, or any combination thereof. System 110 may obtain a first geolocation estimate from the existing navigation system, and obtain a further celestial-based geolocation estimate, and then determine a final geolocation based on a combination of the two estimates, such as using a differential weighting. For example, the geolocation estimate obtained from the existing navigation system and the geolocation estimate obtained from the celestial determination may each be assigned a relative weighting according to relevant factors, such as the expected accuracy or reliability level thereof, which may vary under different conditions. Alternatively, the geolocation estimate of the existing navigation system may be used as a default baseline, and may be corrected or calibrated using the celestial-based geolocation determination. For example, the celestial-based geolocation can be used to correct for inherent errors in the existing navigation system, such as IMU angular drifts, that may accumulate over time. Additionally, the celestial-based geolocation can be used to determine and reveal other navigational system errors arising from malicious intervention, such as due to geolocation spoofing or GPS spoofing. The present invention may also provide for reinforcement and monitoring in the event of a malicious attack. For example, if hostile factions attempt to take control of the aircraft, such as by executing a cyber-attack such as a GPS spoofing attack, then an alert can be raised immediately, by comparing the default (spoofed) geolocation determination with the celestial-based determination, which may be executing in the background as a backup.
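
The source does not commit to a particular weighting scheme; one simple realization of the differential weighting would be inverse-variance fusion of the two estimates, sketched here with purely illustrative numbers:

```python
import numpy as np

def fuse_geolocation(default_est, default_var, celestial_est, celestial_var):
    """Differentially weight the default (e.g., IMU/GNSS) estimate and the
    celestial-navigation estimate by their expected error variances; the
    less reliable source under current conditions contributes less."""
    w_default = 1.0 / np.asarray(default_var, dtype=float)
    w_celestial = 1.0 / np.asarray(celestial_var, dtype=float)
    fused = ((w_default * np.asarray(default_est)
              + w_celestial * np.asarray(celestial_est))
             / (w_default + w_celestial))
    return fused

# Example: a drifting IMU (large variance) versus a tight celestial fix.
print(fuse_geolocation([32.10, 34.90], [0.04, 0.04],
                       [32.07, 34.86], [0.01, 0.01]))
```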

Display 118 may also be used to provide cues for the position determination, even without using an image captured by image sensor 114. Based on the detected orientation of image sensor 114, the orientation of aircraft 100, and the estimated position and time, it is possible to determine the angular location (azimuth and elevation) of the celestial bodies that should appear in the present field of view along the aircraft operator LOS, and to generate a corresponding synthetic image. The image will be inaccurate due to the inaccuracies in the existing aircraft navigation system, such as due to angular drifts in the IMU. The image can be displayed on the head-mounted display of the operator in conjunction with an external celestial view. The operator can provide a manual indication with his head movements by pausing or freezing the synthetic image of the celestial scene and moving his head to align the drifted synthetic celestial bodies with the corresponding celestial bodies in the actual sky view. Based on the head movements required to align the celestial bodies in the synthetic and external images, the relative angle correction of the celestial bodies can be determined, from which the position and orientation of image sensor 114 and/or the aircraft can be determined, and/or the inaccuracies in the existing navigation system can be corrected (e.g., calibrating the IMU angular drifts). In this manner, the use of the image sensor is effectively obviated.
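
A minimal sketch of recovering the angular correction from this manual alignment, assuming the head tracker reports LOS orientations as quaternions (an assumption; the source does not specify the tracker output format):

```python
from scipy.spatial.transform import Rotation as R

def alignment_correction(los_at_freeze_quat, los_after_align_quat):
    """The head rotation performed between freezing the synthetic celestial
    image and aligning it with the real sky equals the angular error of the
    default navigation solution; return it as (yaw, pitch, roll) offsets.
    Quaternions are in scipy's (x, y, z, w) convention."""
    r_freeze = R.from_quat(los_at_freeze_quat)     # LOS when image was frozen
    r_aligned = R.from_quat(los_after_align_quat)  # LOS after manual alignment
    correction = r_freeze.inv() * r_aligned        # rotation between the two
    return correction.as_euler('ZYX', degrees=True)
```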

Reference is now made to Figure 3, which is an exemplary sequence of displayed views for visually cueing celestial navigation by an aircraft operator, operative in accordance with an embodiment of the present invention. Displayed images 141, 142, 143 represent visual instructions to the aircraft operator (e.g., pilot) for calibrating the aircraft flight path by using the celestial-based navigation to correct inaccuracies in the default aircraft navigational system. The initial image 141 represents a synthetic view of an external celestial scene displayed to the operator on a head-mounted display (HMD), in accordance with the aircraft orientation, the operator LOS, and the estimated position and time obtained from the aircraft IMU (or other default aircraft navigational system). As a result of IMU angular drifts, the celestial view of image 141 displayed according to the estimated operator LOS is offset in relation to the real external celestial view (as seen through the HMD). The subsequent image 142 depicts the operator freezing the synthetic celestial view on the HMD and shifting his head direction and/or LOS such that the displayed (synthetic) celestial bodies align or match up with the equivalent celestial bodies present in the real external view. In effect, the operator manually indicates the LOS shift required to align the synthetic and real celestial bodies, which corresponds to the divergence or IMU inaccuracy that requires calibration. In the final image 143, the divergence in the aircraft navigation requiring calibration ("3 miles, 280 degrees") is symbolically displayed to the operator, where the divergence is determined from the manually indicated operator LOS shift required to align the synthetic celestial view with the real celestial view.

It will be appreciated that the present invention may provide reliable and accurate geolocation capabilities in a moving platform, such as an aircraft, and can be utilized in situations where other navigation systems are unavailable or subject to inaccuracies, such as due to interference or sabotage or due to inherent errors that may accumulate over time. The geolocation is based on existing aircraft equipment which is generally already available on standard aircraft, and thus does not require procurement and installation of new equipment. The celestial-based geolocation of the present invention utilizes the natural head movements of the aircraft operator, without requiring modifications or alterations to his regular behavior and without disrupting or diminishing his capacities during the demanding flight process. For example, the image sensor configured to capture the celestial view images is integrated with an existing head-mounted device worn by the user, such as a flight helmet, and does not require a dedicated mechanism for controlling the movement and operation of the image sensor. Furthermore, there is no need for an interface with navigation systems or avionics in the aircraft, as the results may be displayed directly to the user, such as on a head-mounted display integrated with the head-mounted sensor. The system may be easily adapted to a variety of different platforms, which generally already contain most of the required system components; there is no need for dedicated infrastructure or accommodation of specialized standards, and the system may be applied worldwide. The present invention provides the ability to monitor and correct inaccuracies in other commonly used navigation systems, such as satellite-based systems, and provides an efficient and precise geolocation which is not subject to inherent errors such as drifting errors. Finally, the system is effectively passive and does not require actively transmitting or radiating harmful signals, which could also potentially lead to unwanted exposure of the aircraft in a covert mission.

Reference is now made to Figure 4, which is a block diagram of a method for determining the position of a head-mounted image sensor using celestial navigation, operative in accordance with an embodiment of the present invention. In procedure 172, at least one image of an external scene including a celestial view is captured using a head-mounted image sensor, pursuant with the natural head movements of the user. Referring to Figures 1 and 2, head-mounted image sensor 114 captures an image of the external scene of aircraft 100, where image sensor 114 operates in conjunction with the natural head movements of the user. The captured image includes a sky view 130 containing at least one observable celestial body, but may also contain non-celestial features such as ground terrain features.

In procedure 174, an orientation of the head-mounted image sensor is received in a reference coordinate system. Referring to Figures 1 and 2, orientation detector 116 provides orientation measurements of image sensor 114 with respect to a reference coordinate system, such as angle α defined in relation to aircraft 100.
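By way of illustration only, the frame composition implied by procedure 174 may be sketched as follows using the SciPy rotation utilities; the Euler-angle conventions and numeric values are assumptions of the example and are not prescribed by the patent.

```python
from scipy.spatial.transform import Rotation as R

# Assumed example values: yaw/pitch/roll Euler angles in degrees.
aircraft_attitude = R.from_euler("zyx", [120.0, 2.0, -1.0], degrees=True)   # aircraft w.r.t. Earth
sensor_in_aircraft = R.from_euler("zyx", [-15.0, 30.0, 0.0], degrees=True)  # sensor w.r.t. aircraft

# Composing the two rotations expresses the head-mounted sensor
# orientation in an Earth-fixed frame, as needed for comparing the
# captured celestial view against stored celestial data.
sensor_in_earth = aircraft_attitude * sensor_in_aircraft
print(sensor_in_earth.as_euler("zyx", degrees=True))
```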

In procedure 176, parameters of at least one celestial body are extracted using: the captured image; stored celestial data; and the received head-mounted image sensor orientation. Referring to Figures 1 and 2, processor 112 extracts geometric parameters of one or more celestial bodies in the captured image. For example, processor 112 may determine the relative angle in the captured image of a single celestial body (e.g., an individual star) or a group of celestial bodies (e.g., a star constellation) with respect to the horizon or a zero-pitch line, and may further obtain the general position of the corresponding celestial bodies from information stored in database 120.
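For illustration, one possible realization of this relative-angle extraction is sketched below under a simple pinhole-camera model; the camera parameters, the detected pixel, and the detector reading are hypothetical values, and the patent does not prescribe this particular model.

```python
import math

# Assumed camera parameters (hypothetical; not specified by the patent).
FOCAL_LENGTH_PX = 1500.0          # focal length, in pixels
PRINCIPAL_POINT = (640.0, 512.0)  # image centre (cx, cy), in pixels

def pixel_to_angles(px, py):
    """Angular offset (azimuth_off_deg, elevation_off_deg) of a detected
    star from the optical axis, under a pinhole-camera model."""
    az_off = math.degrees(math.atan2(px - PRINCIPAL_POINT[0], FOCAL_LENGTH_PX))
    el_off = math.degrees(math.atan2(PRINCIPAL_POINT[1] - py, FOCAL_LENGTH_PX))
    return az_off, el_off

# Adding the sensor elevation reported by the orientation detector gives
# the star's observed elevation relative to the horizon.
sensor_elevation_deg = 25.0                # assumed detector output
_, el_off = pixel_to_angles(700.0, 400.0)  # assumed star pixel
observed_elevation_deg = sensor_elevation_deg + el_off
print(round(observed_elevation_deg, 2))
```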

In procedure 178, at least the position of the head-mounted image sensor is determined based on the extracted celestial body parameters. Procedure 178 may be performed by: determining the difference between the relative angle of a celestial body in the captured image and the expected relative angle thereof (sub-procedure 180), and determining the position of the image sensor in a global coordinate system, in accordance with the relative angle difference and the received head-mounted image sensor orientation (sub-procedure 182). Referring to Figures 1 and 2, processor 112 determines at least the position, and optionally the orientation, of image sensor 114 from the difference between the actual relative angle of a celestial body (in the captured image) and the expected relative angle of the same celestial body (as obtained from stored celestial information). The position and orientation may be determined in relation to a global coordinate system, such as using the orientation of image sensor 114 in relation to aircraft 100 and the coordinates of aircraft 100 in relation to the Earth. Additional information may also be used to supplement the geolocation determination, such as terrain features in the captured image which may be compared with stored terrain data, or ambient light, weather or time information relating to the imaged scene which may influence the properties of the extracted features. The determined position and orientation may be used to calibrate or correct a geolocation estimate obtained using another navigation system of aircraft 100, such as a GPS or IMU. For example, the updated orientation of image sensor 114 may be determined based on the updated position obtained from the corrected IMU.
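A minimal sketch of sub-procedures 180 and 182 follows, assuming a single celestial body and the classical intercept (Marcq St. Hilaire) formulation, in which the elevation difference in arcminutes is taken as an intercept distance in nautical miles along the body's azimuth; all names and values are illustrative, and the flat-Earth position update is an approximation suitable only for small intercepts.

```python
import math

NM_PER_DEG_LAT = 60.0  # one degree of latitude ~ 60 nautical miles

def update_position(lat_deg, lon_deg, observed_el_deg, expected_el_deg,
                    body_azimuth_deg):
    """Shift an assumed (lat, lon) by the intercept implied by the
    elevation difference: arcminutes of elevation -> nautical miles
    along the body's azimuth (toward the body when observed > expected)."""
    intercept_nm = (observed_el_deg - expected_el_deg) * 60.0
    az = math.radians(body_azimuth_deg)
    dlat = (intercept_nm * math.cos(az)) / NM_PER_DEG_LAT
    dlon = (intercept_nm * math.sin(az)) / (
        NM_PER_DEG_LAT * math.cos(math.radians(lat_deg)))
    return lat_deg + dlat, lon_deg + dlon

# Example: star observed 0.05 deg higher than expected, azimuth 280 deg.
print(update_position(32.0, 34.8, 30.05, 30.00, 280.0))
```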

Procedure 178 may alternatively be performed by: determining the difference between the relative angle of a celestial body group in the captured image and the expected relative angle thereof (sub-procedure 184), compensating the angular drift of an inertial measurement unit in accordance with the relative angle difference (sub-procedure 186), and determining the position of the image sensor in a global coordinate system using the compensated IMU. Referring to Figures 1 and 2, processor 112 calibrates or corrects the output of an existing aircraft navigation system, such as an IMU of ASFI 124, based on the difference between the relative angle and the expected relative angle of a constellation of stars or other celestial body grouping, and determines the position and orientation of image sensor 114 using the calibrated/corrected IMU. Additional information may also be used to supplement the geolocation determination, such as terrain features in the captured image which may be compared with stored terrain data, or ambient light, weather or time information relating to the imaged scene which may influence the properties of the extracted features. The determined position and orientation may be used to calibrate or correct a geolocation estimate obtained using another navigation system of aircraft 100, such as a GPS or IMU. For example, the updated orientation of image sensor 114 may be determined based on the updated position obtained from the corrected IMU, or can be precisely determined at a stage when a GPS signal still exists and then used later when the GPS signal is absent.
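By way of example only, sub-procedure 186 may be illustrated as a solution to Wahba's problem over the matched star directions of the constellation; the SVD-based solver below is one standard formulation assumed for this sketch, not the patent's prescribed algorithm, and all names and values are illustrative.

```python
import numpy as np

def estimate_drift(predicted, observed):
    """predicted, observed: (N, 3) arrays of matched unit vectors toward
    the stars of a constellation. Returns the 3x3 rotation matrix that
    best maps the predicted vectors onto the observed ones."""
    B = observed.T @ predicted
    U, _, Vt = np.linalg.svd(B)
    d = np.sign(np.linalg.det(U @ Vt))  # guard against reflections
    return U @ np.diag([1.0, 1.0, d]) @ Vt

# Self-check with three synthetic stars rotated by a known 0.5-degree
# drift about the z-axis; the solver recovers the drift exactly.
rng = np.random.default_rng(0)
stars = rng.normal(size=(3, 3))
stars /= np.linalg.norm(stars, axis=1, keepdims=True)
a = np.radians(0.5)
Rz = np.array([[np.cos(a), -np.sin(a), 0.0],
               [np.sin(a),  np.cos(a), 0.0],
               [0.0,        0.0,       1.0]])
drift = estimate_drift(stars, stars @ Rz.T)
print(np.allclose(drift, Rz))  # True
```

The recovered rotation would then be applied as a multiplicative correction to the IMU attitude before the position of the image sensor is re-derived from the compensated IMU.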

In procedure 190, visual feedback is displayed to a user wearing the head-mounted image sensor. Referring to Figures 1 and 3, display 118 displays navigational information to the aircraft operator, such as visual instructions for maneuvering aircraft 100 along a flight route in accordance with a celestial-based aircraft geolocation determined by system 110. Display 118 may provide visual cues for the operator to manually calibrate inaccuracies resulting from angular drift in the aircraft IMU. In particular, a synthetic celestial view is displayed on an HMD worn by the operator in accordance with the IMU geolocation estimate and operator LOS, where the synthetic view is offset relative to the real external celestial view seen through the HMD (image 141) due to the IMU inaccuracies. By shifting his head and LOS orientation to align or match up the celestial bodies in the synthetic view with those of the real view (image 142), the operator provides a manual indication of the divergence in the IMU, which is then symbolically displayed to the operator (image 143), allowing for correction and calibration of the divergence. The method of Figure 4 is generally implemented in an iterative manner, such that at least some of the procedures are performed repeatedly, in order to provide a dynamic display that changes in accordance with the real-time maneuvering of aircraft 100 by the aircraft operator (e.g., pilot).

While certain embodiments of the disclosed subject matter have been described, so as to enable one of skill in the art to practice the present invention, the preceding description is intended to be exemplary only. It should not be used to limit the scope of the disclosed subject matter, which should be determined by reference to the following claims.