

Title:
A SYSTEM AND METHOD FOR DETECTING LAUNCHED MISSILES
Document Type and Number:
WIPO Patent Application WO/2013/108253
Kind Code:
A1
Abstract:
A system for detecting a flying target, the target rotating about its longitudinal axis and having a beacon coupled thereto. The rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength. The system includes a sub-system configured to receive optical signals of a scene and acquiring therefrom, at a given frame rate, a digital signal that includes a succession of frames each including a plurality of pixels. The system further includes a processor configured to process selected frames of the succession including searching for pixels having characteristics that correspond to the target characteristics, and if identified, indicating on the target.

Inventors:
TIDHAR GIL (IL)
SHOAM EFRAIM (IL)
BAUM TOMER (IL)
Application Number:
PCT/IL2013/050046
Publication Date:
July 25, 2013
Filing Date:
January 16, 2013
Assignee:
ELTA SYSTEMS LTD (IL)
International Classes:
F41G3/14; F41H11/02; G01S3/784
Foreign References:
US 5332176 A, 1994-07-26
US 5999652 A, 1999-12-07
US 6677571 B1, 2004-01-13
Other References:
None
Attorney, Agent or Firm:
REINHOLD COHN AND PARTNERS (P.O.Box 13239, 62 Tel Aviv, IL)
CLAIMS:

1. A method for detecting a flying target; the target rotating about its longitudinal axis and having at least one beacon coupled thereto; the rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength, the method comprising:

(a) receiving optical signals of a scene and acquiring therefrom, at a given frame rate, a digital signal that includes a succession of frames each including a plurality of pixels;

(b) processing selected frames of said succession including searching for at least one pixel having pixel characteristics that correspond to said target characteristics, and if identified, indicating said target.

2. The method according to Claim 1, wherein said selected frames are said succession of frames.

3. The method according to any one of the preceding Claims, wherein said pixel characteristics refer to a window embracing more than one pixel.

4. The method according to Claim 3, wherein said pixel characteristics are evaluated with respect to the summed value of the pixels in the window.

5. The method according to any one of the preceding Claims wherein said pixel characteristics include a pixel modulation frequency that is proportional to said optical modulation frequency and said frame rate.

6. The method according to any one of the preceding Claims, wherein said pixel characteristics include said given wavelength.

7. The method according to Claim 6, wherein said pixel characteristics are tested for complying with said wavelength before obtaining said digital signal.

8. The method according to any one of the preceding Claims, wherein said pixel characteristics include the pixel's intensity value.

9. The method according to any one of the preceding Claims, wherein said pixel characteristics include identifying a pattern of pixels that correspond to a flight trajectory of the target.

10. The method according to any one of the preceding Claims, wherein said processing includes, with respect to at least some frames selected from among said succession of frames:

(a) performing image non-uniformity and bad pixel correction for compensating for intrinsic optical sensor and optics related errors;

(b) computing image registration for motion correction to said frames;

(c) applying a narrow band pass temporal filter to the registered frames for coarse identification of pixels each having pixel characteristics that are likely to represent target's pixel characteristics;

(d) applying a High Pass Filter (HPF) to said identified pixels, for analyzing spatial distribution in said pixels and applying a score to pixels such that the better the score the more likely the pixel's characteristics resemble target's pixel characteristics;

(e) applying Constant False Alarm Rate (CFAR) for forwarding to a frequency comparison computation stage only pixels having a sufficiently high pixel characteristics score;

(f) applying said frequency comparison for each candidate pixel that was forwarded by said CFAR for determining, in a more accurate manner than said stage (c), whether said pixel characteristics are likely to comply with the target's pixel characteristics, where a better score indicates better correspondence; and

(h) indicating on target including pixel (x,y) location and time of the detection event.

11. The method according to Claim 10, wherein following said (f), testing pixel's characteristics for identifying a pattern that corresponds to flight trajectory of the target and indicating on oncoming target including pixel (x,y) location and time of the detection event.

12. The method according to any one of the preceding Claims wherein said missile is a SAGGER missile.

13. The method according to any one of the preceding Claims, wherein said protected object is a moving object.

14. The method according to any one of Claims 1 to 13, wherein said protected object is a stationary object.

15. The method according to any one of the preceding Claims, wherein said indicating includes pixel (x,y) location as well as time of the detection.

16. The method according to any one of the preceding Claims, further comprising, detecting if said flying target is oncoming towards a protected object.

17. The method according to Claim 16, further comprising, in the case of indication of said oncoming target, activating counter-measure means for destroying the target.

18. A system for detecting a flying target; the target rotating about its longitudinal axis and having at least one beacon coupled thereto; the rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength, the system comprising:

a sub-system configured to receive optical signals of a scene and acquiring therefrom, at a given frame rate, a digital signal that includes a succession of frames each including a plurality of pixels;

a processor configured to process selected frames of said succession including searching for at least one pixel having pixel characteristics that correspond to said target characteristics, and if identified, indicating on said target.

19. The system according to Claim 18, wherein said processor is further configured to detect if said flying target is oncoming towards a protected object.

20. A system according to any of Claims 18 or 19, wherein said subsystem includes an optics sub-system for receiving oncoming signals and a cluster of optical sensors for acquiring a succession of video frames at a given frame rate.

21. The system according to Claim 20, wherein said optics sub-system includes a spectral filter module for obtaining maximum Signal to Noise (S/N) ratio between target and background, and wherein the objective lens module is configured in accordance with certain parameters, including the desired detection range, such that the larger the focal length, the larger the detection distance.

22. The system according to any one of Claims 20 or 21, wherein said cluster is a camera.

23. The system according to Claim 22, wherein said camera includes a sensor module sensitive to visible light configured to acquire the received optical signal and convert it to a digital signal; said camera further including a Frame Grabber and Proximity Electronics module configured to capture video at a given frame rate for feeding to said processor.

24. The system according to Claim 23, wherein said sensor utilizes CMOS or CCD.

25. The system according to Claim 23, wherein said CMOS camera is a Bayer camera.

26. The system according to any one of Claims 18 to 25, further comprising a database for storing target characteristics of various targets.

27. A computer program product for detecting a flying target; the target rotating about its longitudinal axis and having at least one beacon coupled thereto; the rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength; the computer program product embodying a computer readable storage medium storing a computer program for executing the following stages including:

(a) acquiring at a given frame rate, a digital signal that includes a succession of frames each including a plurality of pixels;

(b) processing selected frames of said succession including searching for at least one pixel having pixel characteristics that correspond to said target characteristics, and if identified, indicating said target.

28. A method for detecting a flying target; the target rotating about its longitudinal axis and having at least one beacon coupled thereto; the rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength, the method comprising:

(a) receiving optical signals of a scene and acquiring therefrom, a digital signal that includes at least one pixel;

(b) processing said pixel including identifying pixel characteristics that correspond to said target characteristics, and if identified, indicating on said target.

Description:
A SYSTEM AND METHOD FOR DETECTING LAUNCHED MISSILES

FIELD OF THE INVENTION

The invention is generally in the field of detecting launched missiles.

BACKGROUND OF THE INVENTION

There are numerous available anti-vehicle (such as a tank) guided missiles, e.g. the known AT-3 SAGGER. The description below will occasionally refer to Fig. 1, illustrating a typical launching scene of a SAGGER missile towards a target. The missile's primary mode of operation is from a ground-mounted "suitcase" launcher. As shown in Fig. 1, the missile is wire guided and the launch may take place such that there is no line-of-sight (LOS) 101 between the physical location of the missile's launch site 102 and the target (vehicle 103). It is normally desired that a detection (typically electro-optical) system mounted on (or in the vicinity of) the designated target would detect the threatening missile during launch (e.g. by detecting the launch incident flare 104), thus allowing ample time for the target vehicle to apply counter-measure means and/or maneuver to avoid a hit.

However, electro-optical detection systems which are designated to detect an oncoming threat cannot rely on a launch-incident flare during launch since, as specified above, in many operational scenarios there is no LOS (101) between the incident flare and the electro-optical detection system (fitted on or in the vicinity of the protected vehicle - not shown in Fig. 1). In the particular case of the SAGGER, even if LOS exists, the flare's intensity is relatively low, hindering the detection system from detecting it from a large distance, say of a few kilometers (which, in many cases, is the actual range-to-target of the launched SAGGER).

There is thus a need in the art to detect anti-vehicle missiles, such as the SAGGER, during flight. There is a further need in the art to detect anti-vehicle missiles based on their unique characteristics during flight.

SUMMARY OF THE INVENTION

In accordance with an aspect of the presently disclosed subject matter, there is provided a method for detecting a flying target; the target rotating about its longitudinal axis and having at least one beacon coupled thereto; the rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength, the method comprising:

(a) receiving optical signals of a scene and acquiring therefrom, at a given frame rate, a digital signal that includes a succession of frames each including a plurality of pixels;

(b) processing selected frames of the succession including searching for at least one pixel having pixel characteristics that correspond to the target characteristics, and if identified, indicating the target.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the selected frames are the succession of frames.

In accordance with an embodiment of the presently disclosed subject matter, there is yet further provided a method, wherein the pixel characteristics refer to a window embracing more than one pixel.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the pixel characteristics are evaluated with respect to the summed value of the pixels in the window.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method wherein the pixel characteristics include a pixel modulation frequency that is proportional to the optical modulation frequency and the frame rate.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the pixel characteristics include the given wavelength.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the pixel characteristics are tested for complying with the wavelength before obtaining the digital signal.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the pixel characteristics include the pixel's intensity value.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the pixel characteristics include identifying a pattern of pixels that correspond to a flight trajectory of the target.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the processing includes, with respect to at least some frames selected from among the succession of frames:

(a) performing image non-uniformity and bad pixel correction for compensating for intrinsic optical sensor and optics related errors;

(b) computing image registration for motion correction to the frames;

(c) applying a narrow band pass temporal filter to the registered frames for coarse identification of pixels each having pixel characteristics that are likely to represent target's pixel characteristics;

(d) applying a High Pass Filter (HPF) to the identified pixels, for analyzing spatial distribution in the pixels and applying a score to pixels such that the better the score the more likely the pixel's characteristics resemble target's pixel characteristics;

(e) applying Constant False Alarm Rate (CFAR) for forwarding to a frequency comparison computation stage only pixels having a sufficiently high pixel characteristics score;

(f) applying the frequency comparison for each candidate pixel that was forwarded by the CFAR for determining, in a more accurate manner than the coarse narrow band pass temporal filtering stage, whether the pixel characteristics are likely to comply with the target's pixel characteristics, where a better score indicates better correspondence; and

(h) indicating on target including pixel (x,y) location and time of the detection event.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method wherein, following the frequency comparison stage, the pixel's characteristics are tested for identifying a pattern that corresponds to the flight trajectory of the target, and an oncoming target is indicated, including pixel (x,y) location and time of the detection event.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method wherein the missile is a SAGGER missile.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the protected object is a moving object.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the protected object is a stationary object.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, wherein the indicating includes pixel (x,y) location as well as time of the detection.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, further comprising detecting if the flying target is oncoming towards a protected object.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a method, further comprising, in the case of indication of the oncoming target, activating counter-measure means for destroying the target.

In accordance with an aspect of the presently disclosed subject matter, there is further provided a system for detecting a flying target; the target rotating about its longitudinal axis and having at least one beacon coupled thereto; the rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength, the system comprising:

a sub-system configured to receive optical signals of a scene and acquiring therefrom, at a given frame rate, a digital signal that includes a succession of frames each including a plurality of pixels; a processor configured to process selected frames of the succession including searching for at least one pixel having pixel characteristics that correspond to the target characteristics, and if identified, indicating on the target.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a system, wherein the processor is further configured to detect if the flying target is oncoming towards a protected object.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a system, wherein the subsystem includes an optics sub-system for receiving oncoming signals and a cluster of optical sensors for acquiring a succession of video frames at a given frame rate.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a system, wherein the optics sub-system includes a spectral filter module for obtaining maximum Signal to Noise (S/N) ratio between target and background, and wherein the objective lens module is configured in accordance with certain parameters, including the desired detection range, such that the larger the focal length, the larger the detection distance.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a system, wherein the cluster is a camera.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a system, wherein the camera includes a sensor module sensitive to visible light configured to acquire the received optical signal and convert it to a digital signal; the camera further includes a Frame Grabber and Proximity Electronics module configured to capture video at a given frame rate for feeding to the processor.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a system, wherein the sensor utilizes CMOS or CCD.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a system, wherein the CMOS camera is a Bayer camera.

In accordance with an embodiment of the presently disclosed subject matter, there is further provided a system, further comprising a database for storing target characteristics of various targets.

In accordance with an aspect of the presently disclosed subject matter, there is further provided a computer program product for detecting a flying target; the target rotating about its longitudinal axis and having at least one beacon coupled thereto; the rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength; the computer program product embodying a computer readable storage medium storing a computer program for executing the following stages including:

(a) acquiring at a given frame rate, a digital signal that includes a succession of frames each including a plurality of pixels;

(b) processing selected frames of the succession including searching for at least one pixel having pixel characteristics that correspond to the target characteristics, and if identified, indicating the target.

In accordance with an aspect of the presently disclosed subject matter, there is further provided a method for detecting a flying target; the target rotating about its longitudinal axis and having at least one beacon coupled thereto; the rotation of the target substantially blocks and unblocks the light emanated from the beacon, giving rise to target characteristics that include modulated optical signals at a given wavelength, the method comprising:

(a) receiving optical signals of a scene and acquiring therefrom, a digital signal that includes at least one pixel;

(b) processing the pixel including identifying pixel characteristics that correspond to the target characteristics, and if identified, indicating on the target.

BRIEF DESCRIPTION OF THE DRAWINGS

In order to understand the invention and to see how it may be carried out in practice, a preferred embodiment will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:

Fig. 1 is a typical launching scene of a SAGGER missile towards a target;

Fig. 2 is a schematic illustration of a BEACON fitted on a flying SAGGER missile that rotates about its longitudinal axis;

Fig. 3 is an exemplary operational scenario of detecting a threat missile during flight, in accordance with certain embodiments of the invention;

Fig. 4 is a generalized system architecture, in accordance with certain embodiments of the invention; and

Fig. 5 is a generalized sequence of operations, in accordance with certain embodiments of the invention.

DETAILED DESCRIPTION OF THE INVENTION

In accordance with an aspect of the invention, the detection is based on the fact that during the missile's flight it rotates at a well-defined frequency. Attention is now directed to Fig. 2, showing a schematic illustration of a red BEACON 20 fitted on a SAGGER missile 21 that rotates about its longitudinal extent during flight.

The red beacon 20 serves as a visual indication allowing the operator to wire-guide the missile towards the target.

The rotation of the missile substantially blocks and unblocks the light emanated from the beacon, giving rise to a modulated optical signal at a given wavelength, forming part of target characteristics.

Note that the term wavelength should be construed as embracing also a wavelength range.

Thus, as shown in snapshot 201 of Fig. 2, the light emanated from the beacon is blocked, whereas when the missile rotates through a segment of 90 degrees clockwise the beacon's light is unblocked (see 20 in snapshot 202). When the missile rotates through an additional 90-degree segment 203 the beacon's light is still unblocked, whereas when the missile rotates through yet another 90-degree segment 204, the beacon's light is blocked again. This sequence of substantially blocking and unblocking gives rise to a modulated optical signal which depends on the rotation of the missile, including its rotation rate and the number (as well as the length) of segments in each rotation during which the light is substantially blocked (i.e. not necessarily fully blocked) and unblocked (i.e. not necessarily fully unblocked).

Note that there are further parameters that affect the modulated signal. For example, the closer the missile is to the electro-optical system, the stronger the intensity of the received signal. By way of another example, the spatial orientation of the missile affects the signal intensity: shortly after launch, the missile's orientation is more vertical, facilitating a better LOS between the beacon and the electro-optical system and thereby increasing the intensity of the modulated signal as received by the electro-optical system of the invention, whereas as the missile proceeds along its flight trajectory it changes its orientation towards a flatter trajectory, partially concealing the LOS between the beacon and the receiver system and thus possibly reducing the intensity of the modulated signal when received by the electro-optical system. Obviously, other parameters may affect the modulated signal; thus, for example, utilization of more beacons may affect the frequency and intensity of the modulated signal.
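For illustration only, the following minimal Python sketch models such a blocked/unblocked optical signal under assumed parameters: a single beacon, a 50% blocked fraction per revolution and an arbitrary 10 Hz roll rate; none of these values are specified by the present description.

```python
import numpy as np

def beacon_modulation(rotation_hz, duration_s, sample_rate_hz,
                      blocked_fraction=0.5, peak_intensity=1.0):
    """Sketch of the on/off optical signal produced by a rotating missile
    whose body blocks the beacon for part of each revolution.
    rotation_hz and blocked_fraction are assumed, illustrative values."""
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    phase = (t * rotation_hz) % 1.0            # position within one revolution
    unblocked = phase >= blocked_fraction      # beacon visible for part of each turn
    return t, peak_intensity * unblocked.astype(float)

# Example: a 10 Hz roll rate sampled at 120 Hz (well above twice the modulation rate).
t, signal = beacon_modulation(rotation_hz=10.0, duration_s=1.0, sample_rate_hz=120.0)
```

With one beacon and one blocked segment per revolution, the fundamental modulation frequency equals the roll rate; additional beacons or segments would change it, as noted above.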

As will be explained in greater detail below, in accordance with certain embodiments, an electro-optical system is configured to detect a modulated digital signal (that is proportional also to the specified optical signal modulation frequency) and apply further processing on the modulated digital signal for providing an indication of an oncoming threat/target (that is possibly aimed at a protected object) in good time before the missile hits the protected object. In the latter case the detection system may be located on/in or near the protected object.

Note that the term "protected object" refers to a target object whether moving (such as a vehicle, e.g. tank) or stationary, such as a building.

Note that in accordance with certain embodiments the target is not necessarily an oncoming target, namely it is not aimed at a protected object. In the latter case, the detection system of the invention may be located at a different location than the designated hitting area of the target.

As specified above, the electro-optical detection system is not bound by the number of illumination devices such as a BEACON fitted on the flying object, and/or by the location thereof, and/or by a specific operational wavelength of the light of the optical signal emanated from the beacon.

Turning now to Fig. 3, it illustrates a typical operational scenario for detecting a missile during flight, in accordance with certain embodiments of the invention. While not shown in Fig. 3, the scenario starts with a missile launch. The electro-optical system 30, mounted on or in the vicinity of the protected object, is operative to receive optical signals indicative of real-world scene(s) (including clutter) at a distance of, say, a few kilometers. The signals are received and a digital signal that includes a succession of frames is acquired therefrom and processed, and in case it complies with predetermined signal characteristics that correspond to the target characteristics, namely unique characteristics of a true threat/target during flight (e.g. of a SAGGER missile), an appropriate indication is provided. Note that the signal gets stronger as the missile approaches the detection system (locations 31 to 34).

In accordance with certain embodiments, the specified signal characteristics include wavelength and modulation characteristics (such as proportional frame rate and optical signal modulation frequency of a target). As will be explained in greater detail below, the digital signal includes a succession of streaming video (frames) that are acquired at a given frame rate from the received optical signal and are processed by an image processing sub-system.

A true threat/target may be reflected as an area (consisting of one or more pixels, in a series of frames of the succession) having, with respect to each pixel, certain pixel characteristics. Note that in accordance with certain embodiments, due to known per se inherent limitations of the electro-optical system, such as lens defocusing and diffraction, the digitized received signal may extend over more than one pixel, and, in the latter case, known per se compensation techniques may be applied, such as a static or dynamic size window embracing a few pixels and, e.g., summing the values of the (grey levels of) pixels in the window. These may still be regarded as pixel characteristics.
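A minimal sketch of the window-summing compensation mentioned above, assuming a fixed 3x3 window; the window size, and whether it is static or dynamic, are implementation choices not fixed by the description.

```python
import numpy as np

def window_value(frame, x, y, half_size=1):
    """Sum the grey levels in a small window centred on (x, y); the summed
    value is then treated as a single 'pixel' when evaluating pixel
    characteristics. half_size=1 gives a 3x3 window (illustrative choice)."""
    y0, y1 = max(0, y - half_size), min(frame.shape[0], y + half_size + 1)
    x0, x1 = max(0, x - half_size), min(frame.shape[1], x + half_size + 1)
    return float(frame[y0:y1, x0:x1].sum())
```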

In accordance with certain embodiments, pixel characteristics include a given wavelength (e.g. indicative of a given color - corresponding to red in the case of the SAGGER) and a pixel modulation frequency that is proportional to the specified optical modulation frequency and to the frame rate. Note that a given wavelength may, in certain embodiments, also encompass a wavelength range.

In accordance with certain embodiments, pixel characteristics may further include, among others, a series of pixels (35) on a series of frames (of said succession) having a given pattern that corresponds to flight trajectory of the threat/target (e.g. from position 31 to position 34).

Note that the specified series of pixels, the resulting pattern, as well as the four snapshots of the flying missile's position, are provided for illustrative purposes only, and are by no means binding.

Turning now to Fig. 4, there is shown a generalized system architecture 40, in accordance with certain embodiments of the invention.

As shown, incoming signals (41, representative of the scene) are received by optics sub-system 42 and, after having been incident on the optics sub-system 42, are acquired as a succession of video frames at a given frame rate by the camera sub-system 43 and thereafter processed by image processing sub-system 44, which provides as an output an indication, e.g. whether or not a true threat/target missile (e.g. SAGGER) is approaching the protected object.

Turning first to optics sub-system 42, it includes spectral filter module 45 and objective lens module 46. The spectral filter transmission is optimized such that a maximum Signal to Noise (S/N) ratio is achieved between target and background. The S/N ratio is improved since the camera's filter blocks any radiation which is outside the designated wavelength (spectral band) of the target. In this way the modulated optical signal which emanates from the target, as received by the optics sub-system 42, is not averaged out by the clutter optical signal emanated in other spectral bands. Target in this case refers to the beacon having, by this example, spectral emission of highest intensity in a wavelength that corresponds to red light. The objective lens 46 is selected in accordance with certain parameters, including the desired detection range, such that the larger the focal length, the larger the detection distance. Note, however, that a longer detection distance is achieved at the penalty of a narrower detection sector.
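The trade-off between focal length, detection range and sector width can be illustrated with elementary first-order optics; the pixel pitch, sensor width and focal lengths below are arbitrary assumptions, not values taken from the description.

```python
import math

def ifov_rad(pixel_pitch_m, focal_length_m):
    """Instantaneous field of view of one pixel (small-angle approximation)."""
    return pixel_pitch_m / focal_length_m

def fov_deg(sensor_width_m, focal_length_m):
    """Full horizontal field of view of the sensor."""
    return math.degrees(2.0 * math.atan(sensor_width_m / (2.0 * focal_length_m)))

# Doubling the focal length halves the per-pixel IFOV (the distant beacon fills
# more of a pixel, supporting a longer detection range) but also halves the
# covered sector -- the penalty noted above. Numbers are illustrative only.
for f in (0.025, 0.050):                      # 25 mm vs 50 mm lens (assumed)
    print(f, ifov_rad(5e-6, f), fov_deg(0.01, f))
```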

The camera further includes sensor module (47) (utilizing e.g. CMOS or CCD), sensitive to visible light, which acquires the received optical signal and converts it to a digital signal, and Frame Grabber and Proximity Electronics module (48) that enables capturing video at a given frame rate for feeding to the image processing sub-system (44). The latter includes Image Processor module (49) and User/External output module (401). As will be explained in greater detail with reference to Fig. 5 below, Image Processor module (49) is responsible for pixel processing. The output module 401 handles messaging or any other output format to the user or to an (e.g. external) system, and in case such an indication is representative of an oncoming threat/target, various actions may be activated, such as applying counter-measure means for destroying the oncoming threat/target. In accordance with certain embodiments a DB may be utilized to store target characteristics, e.g. in case the electro-optical system is designated to detect any of those in the repertoire of possible threats/targets.

Note that the system architecture is not bound by the system, sub-systems and/or modules illustrated and explained with reference to Fig. 4. Thus, for example, a cluster of optical sensors (of which a camera is an example) may be utilized, or, in accordance with certain embodiments, a single optical sensor is utilized.

Attention is now drawn to Fig. 5 illustrating a generalized sequence of operations, in accordance with certain embodiments of the invention.

The video streaming (51) is generated by the camera sub-system 43 as a succession of video frames that are acquired at a given frame rate. By one embodiment, this rate is at least double the modulation frequency of the optical signal emanated from a target, e.g. 120 Hz. The so acquired frames are fed to Image Processor module (49) and undergo various processing stages as illustrated in Fig. 5. In stage 52, the incoming video streaming (frames) undergoes known image non-uniformity and bad pixel correction for compensating for intrinsic camera and optics related errors, all as known per se. In addition, the video streaming undergoes image registration for motion correction, as a preparatory step for subsequent image processing, all as known per se. The motion correction may be applied globally to the entire image, or different corrections may be applied to different image portions, all as known per se. The registration between frames is performed in order to eliminate or reduce undue effects such as camera motion (e.g. if the latter is fitted on a mobile platform).
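As a rough illustration of stage 52, the sketch below replaces known bad pixels with a neighbourhood median and estimates a global translation between frames by phase correlation; the description leaves the exact non-uniformity correction and registration methods open, so both choices here are assumptions.

```python
import numpy as np

def correct_bad_pixels(frame, bad_mask):
    """Replace flagged bad pixels with the median of their 3x3 neighbourhood
    (one simple form of bad-pixel correction)."""
    out = frame.astype(float).copy()
    for y, x in zip(*np.nonzero(bad_mask)):
        y0, y1 = max(0, y - 1), min(frame.shape[0], y + 2)
        x0, x1 = max(0, x - 1), min(frame.shape[1], x + 2)
        good = frame[y0:y1, x0:x1][~bad_mask[y0:y1, x0:x1]]
        if good.size:
            out[y, x] = np.median(good)
    return out

def estimate_global_shift(ref, cur):
    """Estimate a whole-image translation between two frames by phase
    correlation, standing in for the global motion-correction step."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(cur))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx
```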

After registration between frames, and as will be explained in greater detail below, there follows a processing stage attempting to identify (pixel) characteristics in a series of frames.

In stage 53, a narrow band pass temporal filter (e.g. an IIR filter), with a pass band proportional to the target rotation frequency, is applied to the video streaming outputted from computational stage 52.

This will serve as a coarse stage for confining the data of interest to data that is likely to be indicative of the sought threat/target (e.g. the SAGGER).
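A minimal sketch of this coarse stage, assuming a low-order Butterworth band-pass as the IIR filter, a (T, H, W) stack of registered frames, and an a priori known pass band around the expected pixel modulation frequency; the keep_fraction threshold is an illustrative stand-in for whatever criterion is actually used to form the candidate set.

```python
import numpy as np
from scipy.signal import butter, lfilter

def temporal_bandpass(frames, frame_rate_hz, f_lo_hz, f_hi_hz, order=2):
    """Apply a narrow band-pass IIR filter (a low-order Butterworth, one
    possible choice) along the time axis of a (T, H, W) stack of registered
    frames; the pass band brackets the expected pixel modulation frequency."""
    nyq = 0.5 * frame_rate_hz
    b, a = butter(order, [f_lo_hz / nyq, f_hi_hz / nyq], btype='bandpass')
    return lfilter(b, a, frames.astype(float), axis=0)

def coarse_candidates(frames, frame_rate_hz, f_lo_hz, f_hi_hz, keep_fraction=0.001):
    """Keep the pixels whose filtered temporal energy is highest; these form
    the candidate set passed on to the finer stages (threshold is illustrative)."""
    filtered = temporal_bandpass(frames, frame_rate_hz, f_lo_hz, f_hi_hz)
    energy = np.sum(filtered ** 2, axis=0)
    return np.argwhere(energy >= np.quantile(energy, 1.0 - keep_fraction))  # (y, x) list
```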

Note that if a true event is encountered and the target of interest is at a relatively large distance (of a few kilometers), it is likely to fall in a single pixel (representing an area of a few square meters) or a few (say 2 to 4) adjacent pixels in a Focal Plane Array (FPA) that stores a given frame (of the succession of acquired frames). If, on the contrary, the target is detected at a shorter distance, it may embrace a few pixels. Note that in accordance with certain embodiments, due to known per se inherent limitations of the electro-optical system, such as lens defocusing and diffraction, the digitized received signal may extend over more than one pixel, and, in the latter case, known per se compensation techniques may be applied, such as a static or dynamic size window embracing a few pixels and, e.g., summing the values of the (grey levels of) pixels in the window.

Pixel characteristics may include wavelength, which by this example is a priori determined by employing a spectral filter (see e.g. 45 in Fig. 4) adapted to transfer an optical signal received only at the desired wavelength. Pixel characteristics may further prescribe a pixel modulation frequency that is proportional to the specified optical signal modulation frequency and to the specified frame rate. Thus, for instance, if a true target emerges in a given pixel (at X,Y location of the FPA), and further assuming a frame rate of 120 Hz and a modulated optical signal frequency < 0.5 * frame rate, then it is expected that for each one of a first series of frames, the pixel value (at the specified X,Y location) will have a grey level value that corresponds to unblocked energy radiated from the beacon, and thereafter, in each one of a subsequent second series of frames, the specified pixel (at the X,Y location) will have a grey level value that corresponds to blocked energy. Then, in a subsequent third series of frames, the specified pixel (at X,Y location) will again have a grey level value that corresponds to unblocked energy radiated from the beacon, and so forth, in compliance with the optical signal modulation (see e.g. Fig. 2). The length of each one of the specified series of frames (whether identical or different) depends not only upon the optical signal modulation frequency, but also on the frame rate. Note that in accordance with certain embodiments, the signal is not devoid of any external effects and therefore the grey levels that correspond to "blocked" and "unblocked" states may be "contaminated" by various effects, but will still be discernable one from the other.
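To make the alternating-run behaviour concrete, the following sketch generates the grey-level sequence one would expect at a true target pixel, under assumed values: a 120 Hz frame rate, a 10 Hz modulation, a 50% duty cycle and arbitrary "unblocked"/"blocked" grey levels.

```python
import numpy as np

def expected_pixel_sequence(frame_rate_hz, modulation_hz, n_frames,
                            unblocked_level=200.0, blocked_level=20.0):
    """Expected grey-level sequence at a true target pixel: alternating runs
    of 'unblocked' and 'blocked' values whose length follows from the frame
    rate and modulation frequency. Grey levels and duty cycle are assumed."""
    assert modulation_hz < 0.5 * frame_rate_hz, "sampling condition must hold"
    t = np.arange(n_frames) / frame_rate_hz
    unblocked = np.sin(2.0 * np.pi * modulation_hz * t) >= 0.0   # 50% duty cycle assumed
    return np.where(unblocked, unblocked_level, blocked_level)

# e.g. a 120 Hz frame rate and a 10 Hz modulation give runs of about 6 frames.
seq = expected_pixel_sequence(120.0, 10.0, 60)
```

Here each unblocked or blocked run lasts roughly frame_rate / (2 * modulation_hz) = 6 frames, illustrating how the run length depends on both quantities.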

As may be recalled, at a certain stage, when the target advances (or when the camera is fitted on a mobile platform, or both), it may be reflected in a different pixel location of the FPA (see e.g. Fig. 3), where initially the target was detected in pixel 301 and later on, when the target proceeded in its flight trajectory, it emerged in neighboring pixel 302. As will be explained in greater detail below, pixel characteristics also address this characteristic by identifying a given pattern that corresponds to a flight trajectory of a threat/target.

Reverting to stage 53 of Fig. 5, it should be noted that a Narrow Band Pass filter is a fast processing filter (e.g. the specified IIR). The latter is used considering the fact that many pixels are processed in each frame of the incoming streaming video (at the specified frame rate of, say, 120 Hz).

The net effect of using a fast processing filter is that many false positive indications may occur (i.e. pixel locations seemingly representing a true target); however, the number of false positive identified pixels is considerably smaller than the entire matrix of pixels in the FPA, alleviating the burden on the subsequent processing stage to identify pixels being representative of a true target. Recall also that at this stage only pixels of the desired wavelength are processed (possibly indicative of the sought beacon light), due to the a priori filtering of the spectral filter module 45. Note that only candidate pixels that successfully passed the Narrow Band Pass filter processing will be fed to the subsequent HPF processing. Note, incidentally, that in accordance with certain embodiments the specified wavelength analysis of the pixel characteristics may be implemented, instead of by a preliminary filter (e.g. 45 of Fig. 4), by different elements, say by a sensor (e.g. 47, being for example a Bayer sensor) or, in accordance with certain other embodiments, by the signal processor 49.

Having undergone narrow band-pass filtering, there follows an application of a known per se High Pass Filter (HPF) for analyzing spatial distribution and eliminating "large" events, considering the fact that the digitized signal that corresponds to the sought target falls in one or a small number of pixels. Large events may be, for example, a signal extending over a large cluster of pixels, which may represent a real-life scenario of the traffic lights of a large lorry that has traversed the FOV of the camera. The specified HPF may discard all pixels that do not comply with a true event scenario. At the end of the HPF stage, a score is given to the so-processed signal. The higher the score, the more the pixel's characteristics correspond to a true target's characteristics (and obviously do not belong to a "large event"). A low score may be indicative of, e.g., a pixel modulation frequency that is not sufficiently close to the target's pixel modulation frequency and/or a pixel possibly belonging to a cluster of many pixels indicative of "a large event".
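One possible (assumed) realisation of such a spatial test is a difference-of-box high-pass over the band-pass energy map, which rewards point-like responses and penalises extended "large events"; the window sizes are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_score(energy_map, inner=3, outer=15):
    """Crude spatial high-pass: band-pass energy averaged over a small window
    minus the local background averaged over a larger window. Point-like
    events (one or a few pixels) score high; extended 'large events' score
    low. Window sizes are illustrative."""
    e = energy_map.astype(float)
    return uniform_filter(e, size=inner) - uniform_filter(e, size=outer)
```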

The pixel data with their associated scores are subjected to a Constant False Alarm Rate (CFAR) threshold (whose operation is generally known per se), which will forward to the next processing stage only those pixels having a sufficiently high score, constituting a candidate alarm list.
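A sketch of a cell-averaging CFAR over the score map, assuming square guard and training regions and an illustrative scale factor; the description only states that a CFAR-type threshold is applied, so this particular variant is an assumption.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def ca_cfar_mask(score_map, guard=2, train=8, scale=5.0):
    """Cell-averaging CFAR sketch: each pixel is compared against 'scale'
    times the mean score of surrounding training cells (a square annulus
    that excludes a small guard region). All parameters are illustrative."""
    score = score_map.astype(float)
    n_big = (2 * (guard + train) + 1) ** 2
    n_small = (2 * guard + 1) ** 2
    big = uniform_filter(score, size=2 * (guard + train) + 1) * n_big
    small = uniform_filter(score, size=2 * guard + 1) * n_small
    background = (big - small) / (n_big - n_small)
    return score > scale * background      # True where a candidate alarm is raised
```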

Stage 54 includes a finer and more accurate analysis in order to refine the candidate alarm list to include pixels having pixel characteristics that better correspond to the characteristics of a true threat/target. The current fine computational stage (54) applies more computationally demanding, and therefore more accurate, processing, and can thus determine pixel characteristics with higher accuracy, giving rise to pixels (from among the candidate pixels in the candidate alarm list) having a sufficiently good score. Typically, although not necessarily, the candidate alarm list may include only a few tens or hundreds of pixels per frame (depending, among others, on the processing capacity) and, accordingly, in the current fine processing stage 54, a more computationally demanding processing is feasible for each frame (in the succession of frames according to the video frame rate) in order to identify pixel(s) that comply with the target's pixel characteristics.

Note that the processing of a relatively small number of pixels (in the candidate alarm list) in the current fine processing stage is in contrast to the previous coarse stage (53) where many pixels were processed in each frame in order to construct the candidate alarm list.

Thus, in accordance with certain embodiments, in stage 54 the more computationally demanding frequency comparison (e.g. FFT) is applied for each candidate pixel (per frame) in order to ascertain, in a more accurate manner, whether it complies with the target's pixel characteristics. Similar to stage 53 discussed above, stage 54 also involves inter-frame processing in order to find an X,Y location having pixel characteristics (by this embodiment, a pixel modulation frequency that corresponds to the target's pixel modulation frequency). Each pixel is assigned a score depending on how closely it matches the target's pixel modulation frequency. The resulting pixels' scores are compared to a threshold which may be fixed or different (e.g. different thresholds for different parts of the frame), and only those pixels whose scores, e.g., exceed or equal the threshold will be further processed, as they have better pixel characteristics (including wavelength and modulation characteristics). Note that in accordance with certain embodiments, pixel characteristics do not necessarily refer only to the pixel's modulation frequency but possibly also to other parameters such as pixel intensity, all as discussed in detail above.
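A minimal sketch of the finer frequency comparison for one candidate pixel, using an FFT of its temporal series and scoring the fraction of spectral energy near the expected target modulation frequency; the tolerance band and the energy-fraction score are assumptions, not a metric stated in the description.

```python
import numpy as np

def frequency_score(pixel_series, frame_rate_hz, target_hz, tol_hz=1.0):
    """Finer check for one candidate pixel: the fraction of the spectral
    energy of its temporal series that lies within tol_hz of the expected
    target modulation frequency (tolerance and metric are assumptions)."""
    series = np.asarray(pixel_series, dtype=float)
    series -= series.mean()
    spectrum = np.abs(np.fft.rfft(series)) ** 2
    freqs = np.fft.rfftfreq(series.size, d=1.0 / frame_rate_hz)
    total = spectrum.sum()
    return float(spectrum[np.abs(freqs - target_hz) <= tol_hz].sum() / total) if total else 0.0
```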

Having determined a candidate true alarm list, and before moving on to identify the target's flight trajectory, it can already be indicated at this stage that a target of interest has been detected. The next step would reveal the target's flight trajectory, e.g. for determining an oncoming target aimed at a protected object(s). Bearing this in mind, there commences stage 55, which concerns testing whether the pixel's characteristics are indicative of a sought target flight trajectory. This may include analyzing pixels in the alarm list resulting from the previous calculation for identifying a pattern that corresponds to the flight trajectory of the threat/target (see e.g. Fig. 3). To this end, in accordance with certain embodiments, a so-called Kalman Filter is utilized. By this embodiment, the pixels that passed this processing stage are included in a true alarm list.
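By way of illustration, a small constant-velocity Kalman filter over the successive (x, y) detections of one candidate can serve as such a trajectory test: detections that are well predicted by a smooth track yield small innovations and are kept, whereas erratic detections are rejected. The state model, noise levels and score are assumptions, not the specific filter prescribed by the description.

```python
import numpy as np

def smooth_track_kalman(xy_detections, q=1e-2, r=1.0):
    """Constant-velocity Kalman filter over successive (x, y) detections of
    one candidate (needs at least two detections). Returns the final state
    [x, y, vx, vy] and a mean normalised-innovation score: a small score
    suggests a smooth, trajectory-like track. q and r are illustrative."""
    F = np.eye(4); F[0, 2] = F[1, 3] = 1.0        # constant-velocity transition
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0  # we observe position only
    Q, R = q * np.eye(4), r * np.eye(2)
    x = np.array([*xy_detections[0], 0.0, 0.0], dtype=float)
    P = np.eye(4)
    innovations = []
    for z in xy_detections[1:]:
        x, P = F @ x, F @ P @ F.T + Q              # predict
        y = np.asarray(z, dtype=float) - H @ x     # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x, P = x + K @ y, (np.eye(4) - K @ H) @ P  # update
        innovations.append(float(y @ np.linalg.solve(S, y)))
    return x, float(np.mean(innovations))
```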

In accordance with certain embodiments, the characteristics of the flight trajectory of a threat/target (e.g. a flying SAGGER) may stipulate that the threat/target flies towards the protected object or in dangerous vicinity thereto.

Whilst the description has focused on detection of a single flying threat/target, obviously the system is capable of providing indications of a few true events, e.g. a salvo of missiles flying simultaneously or one after the other. In both cases, each missile (whether in the salvo or in the series) is represented as a series of true pixels that comply with the specified characteristics.

In the next stage (56) (operable whether or not stage 55 is applied), an indication, e.g. a detection message and alarm of a true encountered event, is yielded, which may include for example the pixel (x,y) location (of those included in the true alarm list) as well as the time of the detection. In response to the specified indication, an action may be invoked, e.g. activation of counter-measure means.

It should be noted that while various embodiments have referred to pixel characteristics, they do not necessarily refer to a single pixel. For instance, as has been explained above, due to inherent limitations of the lens, the signal may be smeared across a few pixels, and accordingly a pixel window (either of static or dynamic size as the case may be) may embrace a few pixels, say 4, whose value, in one embodiment, may be summed. In accordance with certain embodiments, the summed values in a pixel window may be regarded as a pixel when evaluating pixel characteristics.

Note that in accordance with certain embodiments a single optical sensor is utilized (rather than a cluster of optical sensors). In that case the sequence of operations described with reference to Fig. 5 will be applied mutatis mutandis (e.g. not utilizing streaming video, image frames, HPF, CFAR, etc.).

Note that the system architecture in Fig. 4 is provided for illustrative purposes only and is by no means binding. Accordingly, the system architecture may be modified by consolidating two or more blocks/modules/units/systems and/or by modifying at least one of them and/or by deleting at least one of them and replacing one or more others, all as required and appropriate, depending upon the particular implementation.

Note that the flow chart illustrating a sequence of operations in Fig. 5 is provided for illustrative purposes only and is by no means binding. Accordingly, the operational stages may be modified by consolidating two or more stages and/or by modifying at least one of them and/or by deleting at least one of them and replacing one or more others, all as required and appropriate, depending upon the particular implementation.

Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "applying", "receiving", "detecting", "searching", "obtaining", "indicating" and "acquiring" or the like, include actions and/or processes of a computer that manipulate and/or transform data into other data, said data represented as physical quantities, e.g. such as electronic quantities, and/or said data representing the physical objects. The term "computer" should be expansively construed to cover any kind of electronic device with data processing capabilities, including, by way of non-limiting example, a personal computer, a server, a computing system, a communication device, a processor (e.g. a digital signal processor (DSP), a microcontroller, a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), etc.), any other electronic computing device, and/or any combination thereof.

The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general purpose computer specially configured for the desired purpose by a computer program stored in a computer readable storage medium.

The various elements of the embodiment described above may be combined with different embodiments and/or aspects of the invention described above.

It will also be understood that the system according to the presently disclosed embodiments may be implemented, at least partly, as a suitably programmed computer. Likewise, the presently disclosed embodiments contemplate a computer program being readable by a computer for executing the disclosed method. The presently disclosed embodiments further contemplate a machine-readable memory tangibly embodying a program of instructions executable by the machine for executing the disclosed method.

The present invention has been described with a certain degree of particularity, but those versed in the art will appreciate that various alterations and/or modifications may be applied without departing from the scope of the following Claims: