Title:
SYSTEM AND METHOD FOR GUIDANCE OF A PROJECTILE
Document Type and Number:
WIPO Patent Application WO/2011/101843
Kind Code:
A1
Abstract:
A system and a method operative in daylight or at night for launching and guiding a projectile (117) toward a selected target are disclosed. The projectile is equipped with a computer (119) and is provided with a reference image of the target (116) prior to launch. After launch, the projectile derives images of the target with image acquisition means (122) disposed thereon. A computer program first recognizes the image of the target on the post-launch images by comparison with the reference image, and second, guides the projectile by relating the image of the target to at least one border of the image. The computer program initiates explosion either on impact or in proximity of the target.

Inventors:
ROTKOPF MENAHEM (IL)
YOSUB SHAY (IL)
AHARON OREN (IL)
Application Number:
PCT/IL2011/000156
Publication Date:
August 25, 2011
Filing Date:
February 15, 2011
Assignee:
RAFAEL ADVANCED DEFENSE SYS (IL)
ROTKOPF MENAHEM (IL)
YOSUB SHAY (IL)
AHARON OREN (IL)
International Classes:
F41G7/22; G01S3/786
Domestic Patent References:
WO2009104112A2, 2009-08-27
Foreign References:
US5947413A, 1999-09-07
US6488231B1, 2002-12-03
GB2105941A, 1983-03-30
US6487953B1, 2002-12-03
Attorney, Agent or Firm:
LOWY Avi, GLUCKSMAN-LOWY (Haifa, IL)
Claims:
Claims

1. A method for guidance and control of a projectile (117) launched toward a selected target (116), the method comprising:

providing and operating image acquisition devices (111, 122) for deriving scenery images including the target, the images having image borders (BRD),

the method being characterized by comprising the steps of:

a. deriving an image after projectile launch and saving that image as a trajectory image,

b. searching for the target on the trajectory image and when found, saving the trajectory image as a current reference image,

c. deriving a disposition of the target relative to at least one border of the current reference image,

d. comparing the disposition of the target on the current reference image with an a priori selected guidance disposition relating the target to at least one border and guiding the projectile to acquire the selected guidance disposition on a next trajectory image, and repeating steps a to d.

2. The method according to Claim 1, further comprising the step of:

guiding the projectile in operative association with a type of explosion initiation selected prior to launch.

3. The method according to Claim 2, further comprising the step of:

selecting the type of explosion initiation from a group including initiation on impact with the target, initiation delayed after impact with the target, and explosion initiation in proximity of the target.

4. The method according to Claim 1, further comprising the steps of:

deriving an initial image prior to projectile launch,

providing a projectile computer (119) operating an image comparison computer program for recognizing the target by comparison of a trajectory image with the initial image.

5. The method according to Claim 1, further comprising the step of:

providing lighting means (118) for allowing use of daylight image acquisition means (122D) in insufficient lighting conditions.

6. The method according to Claim 1, further comprising the steps of:

providing a range finder (112) for deriving a range separating a projectile launching platform (110) from the target and launching the projectile according to attack data,

providing a projectile computer (119) using the attack data for converting the range into time to target, and

starting to derive images after projectile launch at one or both of a predetermined range and a predetermined point in time.

7. The method according to Claim 6, further comprising the steps of:

guiding the projectile according to a type of explosion initiation selected prior to launch, and

initiating explosion when the target on a trajectory image resides in a guidance disposition at the time to target.

8. The method according to Claim 6, further comprising the steps of:

guiding the projectile according to a type of explosion initiation selected prior to launch as an explosion on impact on the target, and

after launch:

providing guiding means (123) operative for guiding the projectile into a guidance disposition where the target is centered between vertical borders (LVB, RVB) and horizontal borders (TB, BB).

9. The method according to Claim 6, further comprising the steps of:

launching the projectile for overshooting the range to the target and in accordance with a type of explosion initiation selected prior to launch as an explosion in proximity of the target, and

after launch:

providing guiding means (123) operative for guiding the projectile into a guidance disposition where the target is adjacent at least one border (BRD) of a trajectory image, and

selecting the at least one border from a group consisting of a left vertical border (LVB), a right vertical border (RVB), and a bottom border (BB).

10. The method according to Claim 6, further comprising the steps of:

launching the projectile for overshooting the range of the target and operative in accordance with a type of explosion initiation selected prior to launch as an explosion in proximity of the target, and

after launch:

providing guiding means (123) operative for guiding the projectile into a guidance disposition wherein the target is centered between vertical borders (LVB, RVB) of a trajectory image.

11. The method according to Claim 1, further comprising the steps of:

providing at least one computer program stored in digital form in a computer memory (M) that is readable and executable by a computer (119) disposed in the projectile,

operating the at least one computer program for controlling guidance according to: an initial image derived prior to launch and to a succession of trajectory images derived after launch, and

a type of explosion initiation selected prior to launch.

12. A computer program for operating the method steps of claim 1.

13. A system configured for guidance and control of a ballistic projectile (117) launched toward a selected target (116), the system comprising:

a platform image acquisition device (111),

a projectile image acquisition device (122) disposed on the projectile,

projectile guiding means (123) configured for guiding the projectile on trajectory,

a projectile computer (119) disposed in the projectile and operatively coupled to the projectile image acquisition device and to the guiding means, the projectile computer operating at least one computer program stored in a memory (M) operatively coupled thereto, and

a rangefinder (112) for deriving a range as a distance separating the platform from the target,

the system being characterized by comprising:

the platform image acquisition device being operated prior to launch to derive a scenery image as an initial image including the target,

the projectile image acquisition device being operated after launch to derive a plurality of scenery images as a sequence of trajectory images including the target, and

the projectile computer being loaded with the initial image and being configured to analyze trajectory images in sequence to:

search for and recognize the target on a trajectory image by image comparison with the initial image, and when found, to save the trajectory image as a current reference image,

derive a disposition of the target relative to at least one border (BRD) of the current reference image,

compare the disposition of the target on the current reference image with an a priori selected guidance disposition wherein the target has a selected relation with at least one border, and

guide the projectile to acquire the selected guidance disposition on a next current reference frame.

14. The system according to Claim 13, further comprising:

explosion initiation means (124) being coupled to the at least one projectile computer and configured to initiate a type of explosion initiation selected prior to launch.

15. The system according to Claim 14, wherein:

the type of explosion initiation is selected from a group including initiation on impact with the target, initiation delayed after impact with the target, and initiation in proximity of the target.

16. The system according to Claim 13, further comprising:

the projectile computer operating the at least one computer program to run an image comparison computer program configured to recognize the target on at least one trajectory image by comparison with the initial reference image.

17. The system according to Claim 13, further comprising:

lighting means (118) being provided to permit use of daylight image acquisition means (122D) in insufficient lighting conditions.

18. The system according to Claim 13, further comprising:

the projectile image acquisition device being controlled by the projectile computer to start derivation of trajectory images at one or both of a predetermined range and a predetermined point in time.

19. The system according to Claim 13, further comprising:

the projectile computer being provided with attack data and operated to:

convert the range into a time to target, and

to initiate explosion when the target on a current reference image resides in a guidance disposition at the time to target.

20. The system according to Claim 13, further comprising:

the projectile being guided according to a type of explosion initiation selected prior to launch as an explosion on impact on the target, and

the projectile guiding means being operated to guide the projectile into a guidance disposition wherein the target on a reference image is centered between vertical borders (LVB, RVB) and horizontal borders (TB, BB).

21. The system according to Claim 13, further comprising:

the projectile being launched to overshoot the range to the target according to a type of explosion initiation selected prior to launch as an explosion in proximity of the target, and

after launch:

the projectile guiding means (123) being operated to guide the projectile into a guidance disposition wherein the target on a current reference image has a selected relation with at least one border (BRD) of the reference image.

22. The system according to Claim 21, further comprising:

the at least one border being selected from a group consisting of a left vertical border (LVB), a right vertical border (RVB), and a bottom border (BB).

23. The system according to Claim 21, further comprising:

after launch:

the projectile computer (119) being provided with attack data for converting the range into time to target, and

a warhead (126) being initiated in explosion at a time shorter than the time to target.

24. The system according to Claim 21, further comprising:

after launch:

the projectile guiding means (123) being operated to guide the projectile into a guidance disposition wherein the target is centered between vertical borders (LVB, RVB) of the reference image.

Description:
SYSTEM AND METHOD FOR GUIDANCE OF A PROJECTILE

Technical Field

The present invention relates to a method and a system for the engagement and destruction of a target by use of a projectile launched from a platform, and more particularly, to a controllably guided projectile operating image derivation means and driven by computer program(s).

Background Art

The attack of a target by a projectile is well known per se. For example, US Patent No. 6,487,953 to McIngvale et al. recites "A fire control system for a short range guided missile in which the system is contained on a vehicle such as a track vehicle that can be easily camouflaged from the enemy and including video means for acquiring a target desired to be destroyed and utilizing this video information as reference information in an automatic target handoff correlator which correlates the reference information from video signals from a missile that was launched from the vehicle and guided by a missile control computer to the field of view of the selected target with the automatic target handoff correlator comparing the reference video signals and the seeker video signals from the missile to place the target in the center of the field of view of the seeker and once this is done a missile autotracker is commanded to lock-on and guide the missile to the selected target."

However, it would be much more cost-effective to engage a target by use of a low-cost projectile instead of requiring the use of a more expensive guided missile.

Summary of Invention

Technical Problem

The problem is the lack of cost-efficiency when the need arises, for lack of other means, to use an expensive guided missile to destroy a relatively cheap target.

Solution to Problem

The solution to the problem is provided by a ballistic projectile less sophisticated but much cheaper than a guided missile. Unlike a guided missile, the launched guidable projectile does not require an expensive gimbaled seeker head. Guidance is provided by comparing image frames including the target, such as an initial image frame and a plurality of successively derived trajectory image frames. The guidable projectile is guided toward the target by comparison of the distance on the image that separates the image of the target from a border of the image. This means that a prior image is compared with a next image including the target, and the projectile is guided toward the target according to the difference of that distance between the two images. Thereafter, the before-last derived image may become the reference image to be compared with the last derived image.
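The border-distance comparison may be sketched as follows (Python; the pixel coordinates, frame size, and function name are illustrative assumptions introduced here, not disclosed implementation details):

```python
def distance_to_borders(target_x, target_y, frame_w, frame_h):
    """Pixel distances from the target's position on the image to each border."""
    return {
        "left": target_x,              # distance to left vertical border (LVB)
        "right": frame_w - target_x,   # distance to right vertical border (RVB)
        "top": target_y,               # distance to top border (TB)
        "bottom": frame_h - target_y,  # distance to bottom border (BB)
    }

# A prior (reference) frame is compared with the next frame including the target:
prev = distance_to_borders(300, 260, 640, 480)  # target position on the prior frame
curr = distance_to_borders(320, 240, 640, 480)  # target position on the next frame
drift = {k: curr[k] - prev[k] for k in prev}    # change in border distances
```

The guiding means would then be commanded so as to reduce the difference between the measured disposition and the selected guidance disposition.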

Advantageous Effects of Invention

One advantage is the provision of a weapon system much less expensive than existing equipment. Another advantage is that the image acquisition device 122 on board the projectile need not be gimbaled, since fixed optics suffice. Furthermore, simple fixed optics with only a narrow field of view meet the requirements.

It is an object of the present invention to provide a method and a system for guiding a projectile (117) launched toward a selected target (116) by providing image acquisition devices (111, 122) for deriving scenery images including the target, the images having image borders (BRD). The projectile may be stabilized in roll, but if not, at least one computer program is operated to stabilize the derived images and to appropriately operate the guiding means (123) of the projectile. The method comprises the steps and means for a) deriving an image after projectile launch and saving that image as a trajectory image, b) searching for the target on the trajectory image and when found, saving the trajectory image as a current reference image, and c) deriving a disposition of the target relative to at least one border of the current reference image. Furthermore, the method and the system comprise the steps and means to d) compare the disposition of the target on the current reference image with an a priori selected guidance disposition which relates the target to at least one border, and to guide the projectile to acquire the selected guidance disposition on a next trajectory image. Thereafter, the steps a) to d) are repeated.
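The loop of steps a) to d) can be sketched as follows (Python; `derive_image`, `recognize`, `steer`, and `done` are hypothetical stand-ins for the projectile camera, the image comparison program, the guiding means (123), and a termination condition, none of which are specified at this level of detail):

```python
def guidance_loop(initial_image, guidance_disposition,
                  derive_image, recognize, steer, done):
    """Sketch of steps a)-d). The disposition is taken here as a single
    number (e.g. pixel distance of the target to one border)."""
    reference = None
    while not done():
        trajectory_image = derive_image()           # a) derive a trajectory image
        target_disposition = recognize(trajectory_image, initial_image)
        if target_disposition is None:              # b) target not found -> next frame
            continue
        reference = trajectory_image                # b) save as current reference image
        # c)+d) compare the target's disposition with the a priori selected
        # guidance disposition and steer to acquire it on the next frame
        steer(guidance_disposition - target_disposition)
    return reference
```

The loop repeats until the termination condition (impact or proximity initiation) is reached, each found frame becoming the new reference.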

It is another object of the present invention to provide a method and a system for guiding the projectile in operative association with a type of explosion initiation selected prior to launch. It is yet an object of the present invention to provide a method and a system for deriving an initial image prior to projectile launch, and for providing a projectile computer (119) operating an image comparison computer program for recognizing the target by comparison of a trajectory image with the initial image.

It is one more object of the present invention to provide a method and a system for providing lighting means (118) allowing use of daylight image acquisition means (122D) in insufficient lighting conditions.

It is yet one more object of the present invention to provide a method and a system that provide a range finder (112) for deriving a range separating a projectile launching platform (110) from the target, then launching the projectile according to attack data, providing a projectile computer (119) using the attack data for converting the range into time to target, and last, starting to derive images after projectile launch at one or both of a predetermined range and a predetermined point in time.

It is yet still one more object of the present invention to provide a method and a system for guiding the projectile according to a type of explosion initiation selected prior to launch, and for initiating explosion when the target on an image resides in a guidance disposition at the time to target.

It is a further object of the present invention to provide a method and a system for guiding the projectile according to a type of explosion initiation selected prior to launch as an explosion on impact on the target, and after launch, providing guiding means (123) operative for guiding the projectile into a guidance disposition where the target is centered between vertical borders (LVB, RVB) and horizontal borders (TB, BB).

It is moreover an object of the present invention to provide a method and a system for launching the projectile for overshooting the range to the target in accordance with a type of explosion initiation selected prior to launch as an explosion in proximity of the target.

Then, after launch, providing guiding means (123) operative for guiding the projectile into a guidance disposition where the target is adjacent at least one border (BRD), and selecting the at least one border from a group consisting of a left vertical border (LVB), a right vertical border (RVB), and a bottom border (BB).

It is an additional object of the present invention to provide a method and a system for launching the projectile for overshooting the range of the target and operative in accordance with a type of explosion initiation selected prior to launch as an explosion in proximity of the target. Then, after launch, providing guiding means (123) operative for guiding the projectile into a guidance disposition wherein the target is centered between vertical borders (LVB, RVB).

It is yet an additional object of the present invention to provide a method and a system for providing at least one computer program stored in digital form in a computer memory (M) that is readable and executable by a computer (119) disposed in the projectile, and for operating the at least one computer program for controlling guidance according to an initial image derived prior to launch and to a succession of images derived after launch, and according to a type of explosion initiation selected prior to launch.

It is further an additional object of the present invention to provide a computer program for operating the method steps of claim 1.

It is still an additional object of the present invention to provide a system configured for guidance and control of a ballistic projectile (117) launched toward a selected target (116). The system comprises a platform image acquisition device (111) disposed on a platform (110), a projectile image acquisition device (122) disposed on the projectile, and projectile guiding means (123) configured for guiding the projectile on trajectory. The system further comprises a projectile computer (119) disposed in the projectile and operatively coupled to the projectile image acquisition device and to the guiding means, where the projectile computer operates at least one computer program stored in a memory (M) operatively coupled thereto. Finally, a rangefinder (112) is provided for deriving a range as a distance separating the platform from the target. The system comprises the platform image acquisition device being operated prior to launch to derive a scenery image as an initial image including the target, and the projectile image acquisition device being operated after launch to derive a plurality of scenery images as a sequence of trajectory images including the target. The system further comprises the projectile computer being loaded with the initial image and being configured to analyze each trajectory image in sequence to search for and recognize the target on a trajectory image by image comparison with the initial image, and when found, to save the trajectory image as a current reference image. The projectile computer is further configured to derive a disposition of the target relative to at least one border (BRD) on the current reference image, and to compare the disposition of the target on the current reference image with an a priori selected guidance disposition where the target is related to at least one border. 
Furthermore, the projectile computer is also configured to guide the projectile to acquire the selected guidance disposition on a trajectory image to be saved as a next current reference frame.

Brief Description of Drawings

Non-limiting embodiments of the invention will be described with reference to the following description of exemplary embodiments, in conjunction with the figures. In the figures, identical structures, elements, or parts that appear in more than one figure are preferably labeled with a same or similar number.

Fig. 1 is a block diagram of a first phase, phase one, starting from before launch and including launch of the projectile,

Fig. 2 is a block diagram of a second phase, phase two, extending from projectile launch to target destruction,

Figs. 3a and 3b are a flowchart of phase one and phase two operating in day mode and with an impact type of warhead initiation,

Figs. 4a and 4b are a flowchart for phases one and two, including various options for day and night modes of operation as well as impact and proximity types of explosion initiation,

Fig. 5 is a schematic presenting the variables associated with the calculation of equations 1 and 2 for deriving the time to proximity warhead explosion initiation, and

Fig. 6 shows an image frame with borders.

Description of Embodiments

The embodiments of the present invention refer to a system and a method for launching and guiding a trajectory-controllable projectile towards a target. The projectile is able to autonomously acquire the target, for the destruction thereof by explosion of a warhead.

Fig. 1 is a block diagram of a first phase, phase one, starting from before launch and including target selection and launch of the projectile, while Fig. 2 relates to a second phase, phase two, extending from projectile launch to target destruction.

Fig. 1 shows a target 116, and a platform 110 whereon components of the system are disposed, which platform may be operated by an operator 115. The operator 115 may man the platform 110 or operate the platform by remote control. The platform 110 from which the projectile is launched is possibly a weapons system that may be operated on land, from water, or from the air. Handheld launchers are also included. The projectile may be barrel-launched or self-propelled by a rocket motor, and the angle of inclination at launch may range from close to 0° and up to almost 90°. For example, guns or cannons firing in low trajectories, as well as mortars shooting in high trajectories are suitable as launchers.

The platform 110, such as a tank 110 for example, preferably has a platform-mounted image acquisition device 111, or a remotely disposed one, selected for example as a narrow field of view day or night camera 111. In addition, the platform 110 may have a rangefinder 112, such as a laser rangefinder, and a projectile launcher 114. Both the platform camera 111 and the rangefinder 112 may be coupled in unidirectional communication with the projectile 117 prior to launch, as described hereinbelow.

If desired, the range separating the platform 110 from the target 116 may be derived by means other than the rangefinder 112, such as from maps or tables available on the platform 110 or communicated thereto from one or more sources external of the platform 110. This means that the operator 115 may derive the range with means disposed on the platform 110, or receive the range from an external source.

Typically, a platform day image acquisition device 111D is used to derive images of the target 116 when good illumination conditions prevail, whereas a platform night image acquisition device 111N is used to derive images of the target 116 in poor illumination conditions or in darkness. The platform may be equipped with both a day image acquisition device 111D and a night image acquisition device 111N. Alternatively, the platform may be equipped with a single day or night image acquisition device 111, in which case a day to night image conversion tool or a night to day image conversion tool may be used accordingly, to enable effective operation in any given illumination conditions. The rangefinder 112 for deriving the distance from the platform 110 to the target 116 is coupled to the projectile 117, to allow a computer in the projectile 117, specified as projectile computer 119, to be loaded with the derived range distance. The derived range may be used by the projectile computer 119 to perform computations, prior to launch, while the projectile 117 still resides in the launcher 114, but also after launch.

The platform image acquisition device 111 may be a narrow Field of View, FOV, device configured to derive image frames of the scene including the selected target. In the description hereinbelow, a frame is accepted as containing an image, which is surrounded by borders. For the sake of brevity, reference may be made to an image having borders.

The platform image acquisition device 111 or platform camera 111, which may be used to derive an image of the target 116, is coupled to the projectile 117 for loading of the derived image into the projectile 117 as an initial image. The derived initial image of the target 116 may be used thereafter in phase two, after launch. The projectile launcher 114 is commanded and controlled by the operator 115, for launching an adequately selected autonomously guidable projectile 117 in controllable trajectory towards the target 116. The various elements and components of the projectile 117 are shown in Fig. 2.

The platform image acquisition device 111, or platform camera 111 may be a narrow-angle optical device having an optical axis that is pointed towards or at a predetermined angle to an object of interest, such as the target 116 for example. The optical axis of the platform camera 111, which may be seen as a virtual cross hair, or simply cross hair, may be aimed at and on the target or adjacent thereto.

If desired, the initial image may be derived away from the projectile launching platform 110 and be supplied thereto for loading into the projectile 117. In other words, the platform image acquisition device 111 is not necessarily disposed on the projectile-launching platform. For example, an initial image taken by a remotely piloted airborne vehicle, or from a satellite in space, may be communicated to and loaded into a projectile 117, such as a mortar bomb. That option is required when using a projectile that is launched at an angle of, say, more than 45°. This means that the operator 115 either derives the reference image with the help of an image acquisition device disposed on the platform 110, or receives the reference image to be stored in the projectile 117 from an external source.

Fig. 2 is a schematic block diagram illustrating the main elements and components of the projectile 117 and the mutual relation between the projectile after launch and the target 116. The projectile computer 119 is coupled to the projectile guiding means 123, to the lighting means 118, and to the explosion initiation means 124. Furthermore, the computer 119 is coupled in bidirectional communication with a projectile image acquisition device 122, or projectile camera 122, or camera 122 for short.

The projectile computer 119 is configured for operation and activation of the projectile camera 122 while on trajectory, and derivation of a plurality of scenery image frames in sequence, including the target 116, which images are referred to as trajectory images. These trajectory image frames are loaded in sequence as input into the memory M, which is coupled to the computer 119, but is not shown in the Figs. The memory M stores at least one computer program in digital form that is readable and executable by the computer 119. In turn, the projectile computer 119 operates the projectile guiding means 123 in association with the initial image and with the plurality of trajectory image frames, to direct and guide the projectile to the target 116, and to command the explosion initiation means 124 to explode the warhead 126, not shown in the Figs.

Thereby, once launched from the platform 110, the projectile 117 is autonomous and self-guided, and operates in a fire-and-forget mode of target engagement, or target attack.

As described in detail hereinbelow, the lighting means 118, which are commanded by the projectile computer 119, become operative to illuminate the target 116, when necessary.

Typically, a projectile day image acquisition device 122D may be used to derive trajectory image frames of the target 116 in well-illuminated lighting conditions, and a projectile night image acquisition device 122N may be used to derive trajectory image frames of the target 116 in insufficient lighting conditions, such as in obscurity or in the dark. When a day camera equips the projectile 117, projectile lighting means 118 may be used to light the target 116 during night operation, for example.

The projectile guiding means 123 are configured to enable guidance by trajectory modification and/or correction when in flight. The explosion initiation means 124, for example a fuze 124 or an appropriately configured electronic circuit 124, are configured to explode the warhead 126, to destroy the target. The projectile computer 119 is configured to be provided with, and store, the necessary data and instructions for command and control of the projectile 117.

A series of operations that must be performed prior to launch of the projectile 117 is provided as an example. First, the operator 115 sights a scene and then selects a target 116 in that scene.

Second, the operator sets a mode of operation selected as either a day mode or a night mode, according to the existing illumination conditions. In well-illuminated lighting conditions, the preferable mode is the day mode, whereas in darkness, the preferable mode is the night mode.

Third, the operator 115 activates the appropriate platform camera 111, either the day camera 111D or the night camera 111N, according to the selected mode of operation, to derive an image of the scene including the selected target 116. When applicable, the operator may activate both the day camera 111D and the night camera 111N, or the operator may activate the single available camera 111 and if necessary, operate a day to night or a night to day image conversion tool accordingly. Furthermore, the operator 115 triggers the rangefinder 112 to obtain the range to the target 116. Optionally, the range to the target 116 may be derived by other means, such as indicated on maps, compiled in listings, or as received from any other source.

Fourth, according to the nature of the target 116, the operator 115 selects an appropriate type of warhead explosion initiation, for example proximity initiation or impact initiation. With proximity initiation, the warhead 126 of the projectile 117 is detonated near the target 116, whereas with impact initiation, the warhead is detonated by impact of the projectile 117 on the target 116. Optionally, the explosion may be delayed by adding a predetermined delay of time after impact. For soft targets such as personnel and light vehicles, proximity initiation may be more effective; for other targets 116, impact initiation may be suitable; whereas for armored vehicles or for fortified structures, initiation with an additional delay after impact may be favored. Delaying initiation of the warhead 126 when operating in an impact type of engagement may improve the effectiveness of destruction for certain types of targets. The delayed initiation enables the warhead 126 to further penetrate into the target 116, and to explode when embedded deep within the target.
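As an illustration only, the operator's choice of initiation type according to the nature of the target might be tabulated as follows (Python; the target category labels are assumptions introduced here, not part of the disclosure):

```python
from enum import Enum

class Initiation(Enum):
    IMPACT = "impact"
    DELAYED_IMPACT = "delayed impact"   # predetermined delay added after impact
    PROXIMITY = "proximity"

def select_initiation(target_kind):
    """Map a hypothetical target category to an initiation type,
    following the rationale given above."""
    if target_kind in ("personnel", "light vehicle"):
        return Initiation.PROXIMITY        # soft targets
    if target_kind in ("armored vehicle", "fortified structure"):
        return Initiation.DELAYED_IMPACT   # penetrate first, then explode
    return Initiation.IMPACT               # other targets
```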

Fifth, a projectile 117 equipped with an appropriate image acquisition device 122 is selected according to the chosen mode of operation and loaded into the launcher 114. For day mode, the selected projectile 117 may be equipped with a day camera 122D, whereas for night mode the selected projectile 117 may be equipped with a night camera 122N. Alternatively, when the projectile 117 is equipped with both a day camera 122D and a night camera 122N, the appropriate camera 122 may be selected and activated by the projectile computer 119. Another alternative for night operation is to select a projectile equipped with a day camera 122D and with lighting means 118 configured to light the target 116, so as to enable the day camera 122D to provide image frames taken in good illumination conditions, in which image frames the target may be detected.

The series of operations to be performed prior to projectile launch is given by way of example; these operations may be performed in any preferred sequence and are not limited to any specific sequential order.

Following accomplishment of the prior-to-projectile-launch preparations by the operator 115, but still prior to launch, data and commands have to be communicated to the projectile 117. It is up to the operator 115 to command the loading of the projectile computer 119 with attack data, including the image of the target 116, the range to the target, and the selected type of explosion initiation of the warhead. The selected kind of projectile camera 122 may also be entered into the projectile computer 119 if the projectile 117 is equipped with more than one camera 122. Alternatively, when on trajectory, the projectile computer 119 may automatically select the appropriate camera 122 according to lighting conditions, or both the day camera 122D and the night camera 122N may be operated.

Accordingly, the projectile computer 119 receives and stores the attack data, namely the image of the target 116, the range to the target 116, the selected mode of operation, and the selected type of warhead initiation. Thereafter, the image of the target 116 will serve as an initial image frame to be compared with a projectile image, which is an image derived by the projectile image acquisition device 122 during flight. The range from the launching platform to the target 116 enables the projectile computer 119 to compute the time of flight that will elapse between projectile launch and arrival at the target 116. This time of flight is referred to hereinbelow as the time to target.

Thereafter, the post-launch time of activation of the projectile camera 122 may be computed with reference to the time to target. It is preferable to operate the projectile camera 122 after the peak of the trajectory, when the projectile is in a nose-down attitude. Evidently, with a narrow-FOV projectile image acquisition device 122, a nose-up attitude of the projectile 117 will probably not allow images of the target to be derived. Typically, for the example where the platform 110 is a tank, the range from the launching platform to the target 116 runs from one to five km. If desired, the projectile image acquisition device 122 may be operated as late as when separated by only about 150 m from the target 116. Both the time to target and the time to activation of the projectile camera 122 may be computed according to the range to the target 116, to the ballistic data of the projectile 117, or to attack data as described hereinbelow.
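As a rough illustration of this computation, the sketch below assumes a constant average velocity taken from the ballistic data; the function and parameter names are hypothetical, and a real implementation would integrate the full ballistic trajectory rather than use a single average speed.

```python
def activation_times(range_to_target_m, avg_velocity_mps, activation_fraction=0.6):
    """Hypothetical helper: estimate the time to target and the camera
    activation time from the range and an assumed constant average
    velocity. The fraction, chosen greater than 0.5, places camera
    activation past the peak of the trajectory as preferred above."""
    time_to_target = range_to_target_m / avg_velocity_mps
    time_to_activation = activation_fraction * time_to_target
    return time_to_target, time_to_activation

# e.g. a 3 km shot at an assumed 300 m/s average velocity
tt, ta = activation_times(3000.0, 300.0)
```

The fraction could equally be derived from the computed trajectory peak; a fixed value is used here only to keep the sketch self-contained.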

The ballistic data of the projectile 117, or the attack data, may be loaded a priori into the projectile computer 119, or be loaded into the projectile just prior to launch. Time to target and time to activation of the projectile camera 122 may also be computed on the platform 110 or be loaded into the projectile, prior to launch.

Thereafter, the projectile 117 is fired from the projectile launcher 114 in the direction of the target 116.

As described hereinabove, two phases of operation may be considered: a first phase starting prior to and including launch of the projectile 117, and a second phase lasting from projectile launch to target destruction.

Phase two of the engagement of a target 116 during daytime and with impact initiation of the warhead 126 is now described as an example.

In phase two, after launch, the projectile computer 119 activates the projectile camera 122, or if applicable, the two projectile cameras 122, at a predetermined time of activation to derive a succession of trajectory image frames, which are loaded sequentially into the projectile computer 119. The projectile computer 119 runs at least one computer program, such as an image comparison program using the initial image, to detect the image of the target 116 on a trajectory image frame derived by the projectile camera 122. The computer 119 may skip all the images that do not include an image of the target 116; such images, empty of the target, are not referred to in the description.
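The disclosure does not fix the image comparison algorithm. One plausible realization, shown purely as a sketch, is normalized cross-correlation of the stored initial image (used as a template) against each trajectory frame; the function names and the detection threshold are illustrative assumptions.

```python
import numpy as np

def find_target(frame, template, threshold=0.8):
    """Search a trajectory frame for the reference image by normalized
    cross-correlation; one possible image comparison program, not the
    one specified in the disclosure. Returns the (row, col) of the best
    match, or None if no score reaches the threshold."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -1.0, None
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            w = frame[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w * w).sum()) * t_norm
            if denom == 0:
                continue  # flat window: no correlation defined
            score = (w * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos if best_score >= threshold else None
```

A flight implementation would use an optimized matcher, but the brute-force loop above makes the comparison explicit.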

If the projectile 117 is equipped with both a day camera 122D and a night camera 122N, then an image comparison algorithm may be performed on both the day image and the night image, to increase the probability of detecting the target and the reliability of the detection. Trajectory image frame derivation is repeated in sequence and each frame is searched until the image of the target 116 is detected. Once the image of the target is detected in a trajectory frame, that frame is stored in memory M, not shown in the Figs., and becomes the current reference image. After guidance of the projectile is performed, if necessary, a next trajectory frame is acquired and analyzed, and if the target 116 is found thereon, becomes the next reference frame. In the present example of initiation by impact on the target 116, when the image of the target 116 is detected in a reference image frame and is found not to be in the center thereof, the projectile computer 119 forwards trajectory correction commands to the projectile guiding means 123, so that the image of the target 116 will become centered in a next reference image frame. In other words, the projectile guiding means 123 guides the projectile 117 toward the center of the target 116, toward which the optical axis is aimed. This means that the cross hair is on the target 116. Else, if the image of the target 116 is found to be already centered in the reference frame, no projectile guidance action is taken and sequential trajectory image frame derivation continues.

Since an image frame as shown in Fig. 6 includes an image surrounded by borders BRD, centering of an image is meant with reference to the horizontal and vertical borders of the image. An image has a left vertical border LVB and a right vertical border RVB, as well as a top border TB and a bottom border BB forming the horizontal borders. Centering thus refers to the image of the target 116, taken for example as the center of gravity of the target, which may be selected as a single pixel, being disposed midway between the vertical borders, respectively LVB and RVB, and midway between the horizontal borders, respectively TB and BB.
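By way of illustration only, the centering test can be expressed as the offset of the selected target pixel from the frame center defined by those borders; the coordinate and sign conventions here are assumptions, not taken from the disclosure.

```python
def centering_error(target_px, frame_width, frame_height):
    """Horizontal and vertical offset of the target pixel (e.g. its
    center of gravity) from the frame center defined by the borders
    LVB/RVB and TB/BB. A zero error means the virtual cross hair sits
    on the target. Illustrative sketch only."""
    col, row = target_px
    dx = col - (frame_width - 1) / 2.0   # positive: target right of center
    dy = row - (frame_height - 1) / 2.0  # positive: target below center
    return dx, dy
```

The guidance correction of step 314 would then act to drive both offsets toward zero in subsequent reference frames.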

Operations on the trajectory image of the target 116 are preferably simplified, but not necessarily so, by referring to one pixel of the entire target image, such as for example the center of gravity of the image. Such a single pixel, or a group of pixels if desired, may be provided as a built-in option, or be selected by the operator 115.

Sequential trajectory frame derivation, while driving or maintaining the image of the target 116 at the center of the reference image frame, is repeated successively until the warhead is initiated by impact on the target. This means that the cross hair is disposed and maintained on the target 116.

Day mode and impact type of initiation

Figs. 3a and 3b are an exemplary flowchart showing one possible sequence of operation for phase one and phase two, in day mode and with impact type of warhead initiation.

Step 301 is the start of the sequence, which continues to step 302.

In step 302, the operator 115 selects a target 116 and sets an impact type of warhead initiation, which automatically sets a guidance disposition, whereafter flow control continues to step 303, where the operator 115 sets a day mode of operation. The guidance disposition is a specific relation between the image of the target 116 and at least one border BRD of the reference image. For impact initiation, the target 116 will be disposed in the center between the borders of the reference frame. Thereafter, the sequence continues in step 304.

In step 304, the operator 115 uses the platform day camera 111D to derive an initial image of the target 116. In addition, the operator 115 also operates the rangefinder 112 to derive the distance from the platform 110 to the target 116. Next, flow continues to step 305.

In step 305, the operator 115 commands the download and saving of target data into the memory M of the projectile computer 119. Target data includes: the initial image of the target 116, the range to the target, the selected day mode of operation, and the impact type of warhead initiation, thus including the guidance disposition. Target data may also include complete trajectory data according to the launch conditions of the projectile 117 and to ambient atmospheric data, either as computed or as loaded from firing tables, together with the angle of launch and with muzzle velocity or firing charge details. The various types of data that may be loaded into the projectile 117 are referred to as the target data or the attack data. Sequence flow now continues to step 306.

In step 306, the projectile computer 119 computes the time to target 116 and the time to activation of the projectile camera 122. Alternatively, the time to activation of the camera 122 is computed on the platform 110 and loaded into the projectile 117. Preferably, the time from launch to activation of the projectile camera 122 is more than half the time to target. Control now passes to step 307.

In step 307, the operator 115 commands launch of the projectile 117 towards the target 116, and the projectile computer 119 starts to count or measure the time elapsed from the moment of launch. Next, comes step 308.

In step 308, the projectile computer 119 compares the time elapsed from projectile launch to the camera activation time computed in step 306, and when the activation time is reached, the sequence continues to step 310. Else, step 308 is repeated. In step 310, the projectile computer 119 activates the projectile camera 122 and the sequence continues to step 311a.

In step 311a, the projectile camera 122 derives a first trajectory image frame, which is loaded into and saved in the memory M of the projectile computer 119. The projectile computer 119 now runs an image comparison computer program to search for and detect the image of the target 116 in the first trajectory image frame, by comparison with the initial image. Then the flow continues to step 311b.

In step 311b, if the image of the target 116 is found in the first trajectory image frame, then the sequence continues to step 311c; else, control flow returns to step 311a. It is noted that a trajectory image wherein the target 116 is not detected is disregarded and that a next trajectory image is automatically derived, even though not shown as such in Figs. 3a and 3b.

In step 311c, the first trajectory image frame containing the image of the target 116 is saved in memory M, to become the current reference image. Then the sequence continues to step 311d.

In step 311d, a next trajectory image is derived, is checked for detection therein of the target 116, and if found, is stored in memory M as the current reference frame. The procedure now continues to step 313.

In step 313, the computer 119 checks the guidance disposition of the image of the target 116 relative to at least one border, or here, to the four borders BRD of the reference frame derived in step 311d. If the image of the target 116 is detected in the center between the four borders BRD, thus in the middle between the two vertical borders, namely the left vertical border LVB and the right vertical border RVB, and in the middle between the two horizontal borders, which are the top border TB and the bottom border BB, then control flows to step 315. This means that the virtual cross hair of the guidance disposition is right on top of the image of the target 116, or that the optical axis of the projectile camera 122 is oriented to hit the target in the center thereof. However, if the image of the target 116 does not appear in the center of the reference frame, then step 314 commands a correction of the trajectory of the projectile 117, so as to drive the image of the target to the center of the next reference frame(s). Thereafter, control flow returns to step 311d for another loop. In step 315, if the projectile computer 119 is still operative, the sequence returns to step 311d. Else, if the computer is no longer operative, this indicates that explosion by impact has occurred, and the sequence ends in step 319.
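The loop of steps 311d through 315 can be sketched in outline as below. The callback functions, frame dimensions, and pixel tolerance are all hypothetical stand-ins, since the disclosure does not specify how frame acquisition, detection, or steering are implemented.

```python
def guidance_loop(acquire_frame, detect_target, steer, computer_alive,
                  frame_w, frame_h, tolerance_px=2):
    """Impact-mode sketch of steps 311d-315: acquire a frame, look for
    the target, and steer until the computer stops responding, which is
    taken as explosion on impact. All callbacks are hypothetical."""
    while computer_alive():                     # step 315
        frame = acquire_frame()                 # step 311d
        pos = detect_target(frame)
        if pos is None:
            continue                            # target lost: next frame
        dx = pos[0] - frame_w // 2              # step 313: check disposition
        dy = pos[1] - frame_h // 2
        if abs(dx) > tolerance_px or abs(dy) > tolerance_px:
            steer(dx, dy)                       # step 314: correct trajectory
```

In flight, `computer_alive` would simply be the continued execution of the program; it is made explicit here so the loop terminates in simulation.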

Operation at night with a night platform camera 111N and with a night projectile camera 122N is evidently possible since the derived images are compatible night images. In contrast, when operating at night with a night platform camera 111N and with a projectile day camera 122D, the target has to be illuminated to allow good quality target images. In this last case, image conversion may be necessary to achieve image compatibility, whereby effective image frame comparison may be performed. Image conversion may be achieved by operation of an appropriate image processing computer program.

Three different exemplary schemes are available when operating at night with a night platform camera 111N in association with a projectile day camera 122D and with lighting means 118.

First, image conversion of the platform's night target image, for transformation into a day image, may be performed on the platform 110, which transformed image is then loaded into the projectile computer 119 prior to launch.

Second, the night image of the target 116 may be loaded into the projectile computer 119 and be converted on board of the projectile 117 to become a day image.

Third, the initial image, here the platform night image of the target 116, may be loaded into the projectile 117, and each one of the trajectory day image frames derived by the projectile day camera 122D may be converted to become a night image. Any of these schemes makes it possible to compare initial images, whether derived on the platform 110 or elsewhere, with trajectory images derived by the projectile 117.

Figs. 4a and 4b depict an exemplary flowchart for phase one and phase two, including various options of day and night modes of operation, as well as impact and proximity types of warhead explosion initiation. Figs. 4a and 4b refer to a projectile 117 equipped with a day camera 122D and with lighting means 118 for operation at night. Appropriate image conversion is performed prior to launch on board of the platform 110.

Optional modes of operation of the steps of Figs. 4a and 4b are first described in general, whereafter Figs. 4a and 4b are used to illustrate an exemplary control flow for various operational options.

General mode of operation

In Figs. 4a and 4b, step 401 starts an exemplary sequence, which continues to step 402.

In step 402, the operator 115 selects a target 116 and accordingly sets the type of warhead explosion initiation, for example either impact type or proximity type, which automatically sets a guidance disposition. The example is limited to two types of explosion initiation, even though many other options are practical. Control now flows to step 403.

In step 403, the operator 115 sets a mode of operation for either day or night operation. For day mode, control flows to step 404 but for night mode control flows to step 404a.

In step 404, the operator 115 may use the platform day camera 111D to derive an initial image of the target 116. In addition, the operator 115 may also use the rangefinder 112 to derive the distance from the platform 110 to the target 116. The sequence continues to step 405.

In step 404a, the operator 115 may operate the platform night camera 111N to derive an initial night image of the target 116. Thereafter, an initial day image may be derived out of the night image, using a night-image to visible-light-image conversion computer program. In addition, the operator 115 may also trigger the rangefinder 112 to derive the range from the platform 110 to the target 116. The sequence continues to step 405.

In step 405, the operator commands download of target data into the projectile memory M. The terms target data and attack data are interchangeable and were described hereinabove with respect to step 305 of the day mode and impact type of initiation. Control flows next to step 406.

In step 406, the projectile computer 119 calculates the time to target and the time to activation of the camera 122. If desired, the time for activation of the camera 122 is loaded by the operator 115. Alternatively, the time to projectile camera activation is computed on board of the platform 110 and loaded into the projectile 117. The time of activation of the lighting means 118 is preferably the same as the time for activation of the camera 122, but may be loaded by the operator 115, or be computed onboard the platform 110 or onboard the projectile 117. The sequence continues to step 407.

In step 407, the operator 115 commands launch of the projectile 117 towards the target 116. The projectile computer 119 begins to count or measure the time elapsed from the moment of launch. The sequence continues to step 408.

In step 408, the projectile computer 119 compares the time elapsed from projectile launch to the activation time of the camera 122 as computed in step 406 or as preloaded by the operator 115, and when the time of activation is reached, the sequence continues to step 409. Else, step 408 is repeated.

In step 409 the projectile computer 119 decides operation according to the day or night mode selected in step 403. For day mode the sequence continues to step 410, but for night mode, the sequence continues to step 410a.

In step 410, the projectile computer 119 activates the projectile camera 122 and then the sequence continues to step 411a.

In step 410a, the projectile computer 119 activates the day camera 122D and the lighting means 118. Then the sequence continues to step 411a.

In step 411a, the camera 122 derives a first trajectory frame, which is loaded and saved in the memory M. The projectile computer 119 now analyzes the trajectory image frame by running an image comparison computer program to detect the image of the target 116 in the derived trajectory frame, by comparison with the priorly derived initial image already resident in memory M. Then flow continues to step 411b.

In step 411b, if the image of the target 116 is detected in the first trajectory image frame, by image comparison with the initial frame as analyzed by the projectile computer 119, then the trajectory image is saved in memory M as the current reference frame. Next, the sequence continues to step 411c; otherwise control flow returns to step 411a.

Step 411c is an optional step for an optional operation, wherein for example the projectile trajectory may be corrected as described hereinbelow. Control then passes to step 411d. In case the image of the target 116 is not found in one out of the plurality of successive trajectory images, control returns to step 411d, even though not shown as such in Figs. 4a and 4b.

In step 411d, the camera 122 derives a next-in-sequence trajectory image, which is loaded into the memory M of the projectile computer 119. Then the projectile computer 119 runs an image comparison computer program to detect the target 116 in the last derived trajectory image by comparison with the current reference image. When the image of the target 116 is detected, the last derived trajectory image frame is saved to become the current reference image. Then the sequence continues to step 412.

It is noted that after a few more procedure steps following step 411d, control returns to step 411d for a new loop in which a next trajectory frame is derived and checked for detection of the target 116 therein; if the target is found, the trajectory frame is stored in memory M, and so on. Thereby, the reference image is successively replaced.

In step 412 control flows according to the type of explosion initiation selected in step 402, which was loaded into memory M in step 405. For impact initiation, the sequence continues to step 413, but for proximity initiation, the sequence continues to step 413a.

In step 413, the computer 119 checks the guidance disposition of the target 116 relative to at least one border BRD, or to the four borders of the current reference image frame derived in step 411c. If the image of the target 116 is detected in the center between the four borders BRD, thus in the middle between the two vertical borders, namely the left vertical border LVB and the right vertical border RVB, and in the middle between the two horizontal borders, which are the top border TB and the bottom border BB, this means that the virtual cross hair of the guidance disposition is right on top of the image of the target 116, or that the optical axis of the projectile camera 122 is oriented to hit the target in the center thereof. Then control flows to step 415. However, if the image of the target 116 does not appear in the center of the frame, then step 414 commands a correction of the trajectory of the projectile 117, so as to drive the image of the target to the center of the next reference frame(s). Thereafter, control flow returns to step 411d for another loop.

In step 415, if the computer has not been destroyed by impact and is still operative, the sequence returns to step 41 Id. Else, with the computer 119 inoperative, it is accepted that impact has occurred, and therefore, the sequence comes to an end in step 419.

Initiation of explosion of the warhead 126 in proximity of the target 116 may be achieved in many ways, according to different guidance dispositions and algorithms, even though only limited examples thereof are described hereinbelow.

Time initiation is one exemplary option that takes advantage of the range to the target 116 derived prior to launch by use of the rangefinder 112. The range being known, together with the target data or attack data, it is possible to derive the time to target. For initiation of explosion in proximity of the target 116, the time to target has to be adjusted according to the desired proximity distance separating the point of explosion from the target. The projectile 117 is thus guided toward the target 116 according to the guidance disposition for impact on the target, but the explosion occurs in close proximity to the target after the lapse of the time to target minus some amount of time. Thus, the projectile 117 may be guided along an impact explosion initiation trajectory for explosion after the time to target less a predetermined amount of time.
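A minimal sketch of this time-initiation option, under the assumption of a roughly constant terminal closing speed; the names are illustrative, not taken from the disclosure.

```python
def proximity_fuze_time(time_to_target_s, standoff_m, closing_speed_mps):
    """Time-initiation sketch: detonate early by the time needed to
    cover the desired standoff distance. Assumes a roughly constant
    terminal closing speed, an approximation of the ballistic data."""
    lead_time_s = standoff_m / closing_speed_mps
    return time_to_target_s - lead_time_s

# explode about 20 m short of the target on a 12 s flight, closing at 250 m/s
t_explode = proximity_fuze_time(12.0, 20.0, 250.0)
```

The projectile computer would compare its elapsed-time counter against this value, in the same manner as the camera activation check of step 408.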

Initiation of explosion in proximity may also be achieved by setting the guidance disposition in overshoot, for example, meaning that the projectile 117 is launched to a range greater than the distance separating the platform from the target 116. In such a case, according to an appropriate guidance disposition, the trajectory of the projectile 117 will avoid impact on, but pass in proximity over and above, the target 116. A narrow-angle projectile camera 122 approaching the target from above will thus see the image of the target grow steadily, which image will disappear below the bottom border of the reference image when the projectile comes close to the target and passes thereover. Disappearance of the target 116 out of the field of view of the projectile camera 122 is thus a signal for triggering explosion initiation in proximity. This means that the virtual cross hair may be aimed above the target 116, or that the optical axis is set at a small angle relative to the straight line to the target. An explosion initiation command may thus be delivered according to the reference angle of the target 116 relative to the optical axis of the camera 122.

When operating in the proximity type of target engagement by guidance disposition for range overshoot, improvements of the precision of the proximity distance away from the target 116 may be obtained by use of a camera 122 having a wide-angle field of view in the vertical direction. With such a wide-angle vertical field of view camera, the projectile computer 119 may calculate the time to initiate explosion of the warhead 126 according to geometric relations, such as provided by equations 1 and 2 hereinbelow.

Fig. 5 presents a schematic illustration as an example in relation to equations 1 and 2 described hereinbelow. In equation 1, X is the distance from the projectile 117 to the target 116, and H is the height of the projectile 117 above the terrain surface S. The height H is approximated according to the range separating the platform 110 from the target 116, and to the attack data of the projectile 117. The angle β, taken within the field of view of the projectile camera 122, allows the calculation of the distance X from the projectile 117 to the target 116 with a good degree of accuracy. Thereafter, since the velocity V of the projectile is known from the attack data, the Estimated Time of Arrival, or ETA, is derived with the help of equation 2.

X = H / tan β    (Equation 1)

ETA = X / V    (Equation 2)
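Assuming the relations X = H / tan β for equation 1 and ETA = X / V for equation 2, which are a reconstruction consistent with the definitions of X, H, β, and V above rather than a quotation of the original formulas, the computation may be sketched as:

```python
import math

def distance_to_target(height_m, beta_rad):
    # Equation 1 (as reconstructed): X = H / tan(beta)
    return height_m / math.tan(beta_rad)

def eta_seconds(distance_m, velocity_mps):
    # Equation 2 (as reconstructed): ETA = X / V
    return distance_m / velocity_mps

# illustrative values: 200 m height, a 45 degree angle, 250 m/s velocity
x = distance_to_target(200.0, math.radians(45.0))
eta = eta_seconds(x, 250.0)
```

The onboard computer would evaluate these with H approximated from the attack data and β measured from the target's position in the wide-angle vertical field of view.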

Further improvement of the precision of target acquisition may be obtained by evaluating the ETA, or the distance to the target 116, according to the rate at which the image of the target descends in the trajectory images derived by the projectile camera 122.

As an alternative example, the guidance disposition of the projectile 117 may be aimed to miss the target 116 either to the left or to the right thereof, and explosion may be commanded at the respective side of the target. To this end, for example with a narrow lateral field of view camera 122, the image of the target may be kept adjacent to the left or right vertical border, respectively LVB and RVB, of the current reference image, and explosion initiation may be commanded when the target 116 disappears from the current image, in the same manner as described hereinabove with reference to the guidance disposition for overshoot. The virtual cross hair may thus be set off the target 116, either to the left or to the right, in a guidance disposition appropriate for explosion on impact or in slight overshoot. This means that the optical axis is pointed at an angle away from initiation on impact. If desired, a time delay may be added to delay the explosion.

As an option for explosion in proximity, it is possible to guide the projectile 117 in a selected guidance disposition set for impact with the target 116 and to initiate a timed explosion. To this end, the time to target minus a predetermined amount of time may provide the time to explosion at which the projectile 117 will be near the target 116.

When explosion initiation is operative in relation to a border BRD of a reference image, care is taken to prevent explosion due to any disturbance, such as a momentary loss of the target or a momentary perturbation of the projectile attitude. Since the range to the target is known, the time to target may easily be derived, and that time to target may be used to prevent a premature explosion. This means that the time to target may be used in conjunction with the guidance disposition for the explosion initiation command.

Another option of guidance disposition for initiation of explosion in proximity may include sampling at various points in time along the trajectory, to derive a prediction of the time when the projectile 117 will reach a desired distance from the target 116, and to then command initiation of explosion, with or without an additional time delay.

Besides the examples described hereinabove, many additional guidance dispositions and computer programs may achieve explosion initiation in proximity of the selected target 116, such as for example, according to the rate of growth of the target 116 on the image.

In step 413a, for guidance disposition in proximity initiation by overshoot for example, the computer 119 checks and analyzes the disposition of the image of the target 116, e.g. the center of gravity thereof, relative to the vertical borders, namely LVB and RVB, of the current reference image frame, as derived in step 411d. With the projectile 117 being launched in a flat trajectory for example, and when past the peak of the trajectory, the image of the target 116 is expected to descend toward the bottom border BB in each successive reference image frame in comparison with the previous image frame. If analysis of the image of the target 116 detects that the target is disposed in the lower bottom portion of the image, and also centered between the vertical borders thereof, namely the left vertical border LVB and the right vertical border RVB, then control flows to step 416a. If not so, then control flows to step 414a.
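A minimal sketch of the step 413a check might look as follows; the lateral tolerance and the fraction defining the "lower portion" of the frame are assumptions, since the disclosure does not quantify either.

```python
def overshoot_disposition_ok(target_px, frame_w, frame_h,
                             lateral_tol_px=2, lower_fraction=0.25):
    """Step 413a check (illustrative): the target image must be centered
    between the vertical borders LVB/RVB and lie in the lower portion of
    the frame, near the bottom border BB. Thresholds are assumptions."""
    col, row = target_px
    centered = abs(col - (frame_w - 1) / 2.0) <= lateral_tol_px
    low = row >= frame_h * (1.0 - lower_fraction)
    return centered and low
```

When this check passes, control would flow to step 416a to compute the time to explosion; otherwise step 414a issues a lateral trajectory correction.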

In step 414a the computer 119 commands a trajectory correction of the projectile 117, which correction is intended to drive the image of the target 116 to appear centered between the vertical borders, respectively LVB and RVB, in the next reference frame to be stored. Next, control flow returns to step 411d.

In step 416a, the projectile computer 119 calculates the time to explosion initiation, based for example on the geometrical calculations described hereinabove with reference to Fig. 5, and begins to count time. Then, the sequence continues to step 417a.

In step 417a, the projectile computer 119 compares the time calculated in step 416a to the elapsed counted time, and when that calculated time is reached or has lapsed, control flows to step 418a. Otherwise, the sequence loops again through step 417a. Steps 416a and 417a may be replaced by other steps related to explosion initiation. For example, explosion initiation may be commanded at the moment when the descending image of the target 116 disappears below the bottom border BB of the current reference frame, or following an additional time delay thereafter. Then, control flows to step 418a.

In step 418a the computer 119 outputs an explosion initiation command signal to activate explosion of the warhead 126 by proximity. The sequence then ends in step 419.

Figs. 4a and 4b are now used to describe examples of the different command flow paths taken along the flowchart in association with various combinations of day or night modes of operation, and of impact or proximity warhead initiation. The following example refers to day mode operation with proximity explosion initiation.

Day mode and proximity type of initiation

In Figs. 4a and 4b, step 401 starts an exemplary sequence, which continues to step 402.

In step 402, the operator 115 selects a target 116 and sets a proximity type of warhead explosion initiation, which, in this case, automatically sets a guidance disposition for explosion in proximity of the target 116. Then the process flow continues to step 403.

In step 403, the operator sets a day mode of operation. The sequence continues to step 404.

In step 404, the operator 115 may use the platform day camera 111D to derive an initial image of the target 116. In addition, the operator 115 also uses the rangefinder 112 to derive the distance from the platform 110 to the target 116. Next, flow continues to step 405.

In step 405, the operator commands download of target data into the memory M of the projectile 117. Target data and attack data are interchangeable terms and were described hereinabove with respect to step 305 of the day mode and impact type of initiation. Sequence flow now continues to step 406.

In step 406, the projectile computer 119 may compute or receive the time to target and the time to activation of the projectile camera 122. If desired, the time to activation of the projectile camera 122 is loaded by the operator 115. Alternatively, the time to camera activation is computed on the platform 110 and loaded into the projectile 117. Control flow now passes to step 407.

In step 407, the operator 115 commands launch of the projectile 117 towards the target 116, and the projectile computer 119 begins to count or measure the time elapsed from the moment of launch. Next comes step 408.
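The computation in step 406 can be sketched as follows. The application does not specify the model used; a minimal sketch is to estimate time to target from the measured range and an average projectile speed, and to activate the camera a short margin before expected arrival. All names and the margin value here are illustrative assumptions, not taken from the application.

```python
def activation_times(range_m: float, avg_speed_mps: float,
                     camera_margin_s: float = 2.0) -> tuple:
    """Return (time_to_target, camera_activation_time) in seconds.

    Hypothetical step 406 sketch: time to target from range / average speed,
    camera activated camera_margin_s seconds earlier (clamped at launch).
    """
    time_to_target = range_m / avg_speed_mps
    # Activate the camera shortly before expected arrival so that
    # trajectory frames are available for guidance.
    camera_activation = max(0.0, time_to_target - camera_margin_s)
    return time_to_target, camera_activation
```

For example, a 3000 m range at an assumed 300 m/s average speed gives a 10 s time to target and camera activation at 8 s after launch.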

In step 408, the projectile computer 119 compares the time elapsed from projectile launch to the activation time of the camera 122 as computed in step 406 or as preloaded by the operator 115, and when the activation time is reached, the sequence continues to step 410. Else, step 408 is repeated.

In step 410, the projectile computer 119 activates the projectile camera 122 and the sequence continues to step 411a.

In step 411a, the camera 122 derives a first trajectory frame, which is loaded into the memory M. The projectile computer 119 now analyzes the trajectory frame by running an image comparison computer program to detect the image of the target 116 in the derived trajectory frame, by comparison with the priorly derived initial image already resident in memory M. Then the flow continues to step 411b.

In step 411b, if the image of the target 116 is detected in the first trajectory image frame, by image comparison with the initial frame as analyzed by the projectile computer 119, then the trajectory image is saved in memory M as the current reference frame. Next, the sequence continues to step 411c, otherwise control flow returns to step 411a.

Step 411c is an optional step for an optional operation, such as wherein the projectile trajectory may be corrected as described hereinbelow for example. Control then passes to step 411d. In case the image of the target 116 is not found in one out of the plurality of successive trajectory images, control returns to step 411d even though not shown as such in Figs. 4a and 4b.

In step 411d, the projectile camera 122 derives a next in sequence trajectory image, which is loaded into the memory M of the projectile computer 119. Then the projectile computer 119 runs an image comparison computer program to detect the target 116 in the last derived trajectory image by comparison with the reference image. When the image of the target 116 is detected, the last derived trajectory image frame is saved to become the current reference image. Then the sequence continues to step 413a.
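The application does not specify which image comparison computer program is used in steps 411a-411d. One common choice for recognizing a target patch inside a larger frame is normalized cross-correlation; the brute-force sketch below, using NumPy, is an illustrative stand-in for whatever matcher the projectile computer 119 actually runs, and the threshold value is an assumption.

```python
import numpy as np

def ncc_detect(frame, template, threshold=0.8):
    """Return (found, (row, col)) for the best normalized cross-correlation
    match of template within frame. Brute-force sketch for small grayscale
    arrays; a flight computer would use an optimized equivalent."""
    fh, fw = frame.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            w = frame[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * t_norm
            if denom == 0:
                continue  # flat window: correlation undefined, skip
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_score >= threshold, best_pos
```

When the score clears the threshold, the trajectory frame would be stored as the new current reference image, as the step describes.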

Some examples of explosion in proximity of the target 116 provided hereinabove with reference to the general mode of operation are applicable. In step 413a, for guidance disposition in proximity initiation by overshoot for example, the computer 119 checks and analyzes the disposition of the image of the target 116, e.g. the center of gravity thereof, relative to the vertical borders, namely LVB and RVB, of the current reference image frame, as derived in step 411d. With the projectile 117 being launched in a flat trajectory for example, and when past the peak of the trajectory, the image of the target 116 is expected to descend toward the bottom border BB in each successive reference image frame in comparison with the previous image frame. If analysis of the image of the target 116 detects that the target is disposed in the lower portion of the image, and also centered between the vertical borders thereof, namely the left vertical border LVB and the right vertical border RVB, then control flows to step 416a. If not, control flows to step 414a.
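The disposition test of step 413a can be sketched as a simple geometric check on the target's center of gravity in the frame. The fractional thresholds below are illustrative assumptions; the application only states "lower portion" and "centered between the vertical borders".

```python
def proximity_disposition(cx, cy, width, height,
                          lower_frac=0.75, center_tol_frac=0.05):
    """Hypothetical step 413a check for proximity-mode guidance.

    (cx, cy) is the target's center of gravity in pixels, origin at the
    top-left corner, so a larger cy means closer to the bottom border BB.
    Returns "initiate" when the target sits in the lower portion of the
    frame and is centered between LVB and RVB; else "correct" (step 414a).
    """
    centered = abs(cx - width / 2) <= center_tol_frac * width
    low = cy >= lower_frac * height
    return "initiate" if centered and low else "correct"
```

For a 640x480 frame, a target centered horizontally and near the bottom border triggers the initiation branch, while one still high in the frame yields a correction command.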

In step 414a, the computer 119 commands a trajectory correction of the projectile 117, which correction is intended to drive the image of the target 116 to appear centered between the vertical borders, respectively LVB and RVB, in the next reference frame to be stored. Next, control flow returns to step 411d.

In step 416a, the projectile computer 119 calculates the time to explosion initiation, based for example on geometrical calculations described hereinabove with reference to Fig. 5, and begins to count time. Optionally, the time to target less a short amount of time is derived. Then, the sequence continues to step 417a.
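Steps 416a and 417a reduce to computing an initiation time and polling elapsed time against it. The sketch below uses the optional rule stated in the text, time to target less a short lead; the lead value and function names are illustrative assumptions.

```python
def initiation_time(time_to_target_s, lead_s=0.1):
    """Hypothetical step 416a: initiate slightly before the computed time
    to target, so the warhead 126 explodes in proximity rather than on
    impact. The lead value is illustrative."""
    return max(0.0, time_to_target_s - lead_s)

def initiation_due(initiation_s, elapsed_s):
    """Hypothetical step 417a: True once the counted elapsed time reaches
    or exceeds the initiation time; otherwise the comparison loop repeats."""
    return elapsed_s >= initiation_s
```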

In step 417a, the projectile computer 119 compares the time calculated in step 416a to the counted elapsed time, and when that calculated time is reached or has lapsed, control flows to step 418a. Otherwise, the sequence loops again through step 417a.

Steps 416a and 417a may be replaced by other steps related to explosion initiation. For example, explosion initiation may be commanded at the moment when the descending image of the target 116 disappears below the bottom border BB of the frame, or following an additional time delay thereafter. Then, control flows to step 418a.

In step 418a, the computer 119 outputs an explosion initiation command signal to activate explosion of the warhead 126 by proximity. The sequence then ends in step 419.

Figs. 4a and 4b are now used to describe a further example of operation, namely operation in night mode and with the impact type of initiation.

Night mode and impact type of initiation

In Figs. 4a and 4b, step 401 starts an exemplary sequence, which continues to step 402.

In step 402, the operator 115 selects a target 116 and accordingly sets the impact type of warhead initiation which automatically sets a guidance disposition. Control now flows to step 403.

In step 403, the operator sets a night mode of operation. Then control flows to step 404a.

In step 404a, the operator 115 activates the platform night camera 111N to derive an initial night image of the target 116. Thereafter, a day image is derived out of the night image, using a night-image to visible-light image conversion algorithm. In addition, the operator 115 also commands the rangefinder 112 to derive the range from the platform 110 to the target 116. The sequence continues to step 405.

In step 405, the operator commands download of target data into the memory M. Target data and attack data are interchangeable terms, and were described hereinabove with respect to step 305 of the day mode and impact type of initiation. The sequence continues to step 406.

In step 406, the projectile computer 119 calculates the time to target and the time to activation of the camera 122D and of the lighting means 118. If desired, the time for activation of the projectile camera 122 is loaded by the operator 115. The time for activation of the lighting is preferably the same as the time for activation of the camera 122. Alternatively, the time to camera and lighting activation is computed on the platform 110 and loaded into the projectile 117. The sequence continues to step 407.

In step 407, the operator 115 commands launch of the projectile 117 towards the target 116 and the projectile computer 119 begins to count or measure the time elapsed from the moment of launch. The sequence continues to step 408.

In step 408, the projectile computer 119 compares the time elapsed from projectile launch to the activation time of the camera 122 as computed in step 406 or as preloaded by the operator 115, and when the activation time is reached, the sequence jumps to step 410a. Else step 408 is repeated.

In step 410a, the projectile computer 119 activates the day camera 122D and optionally, at the same time, the lighting means 118. Then the sequence continues to step 411a.

In step 411a, the camera 122 derives a first trajectory frame, which is loaded into the memory M. The projectile computer 119 now analyzes the trajectory frame by running an image comparison computer program to detect the image of the target 116 in the derived trajectory frame, by comparison with the priorly derived initial image already resident in memory M. Then control flow continues to step 411b.

In step 411b, if the image of the target 116 is detected in the first trajectory image frame, by image comparison with the initial frame as analyzed by the projectile computer 119, then the trajectory image is saved in memory M as the current reference frame. Next, the sequence continues to step 411c, otherwise control flow returns to step 411a.

Step 411c is an optional step for an optional operation, such as wherein the projectile trajectory may be corrected as described hereinbelow for example. Control then passes to step 411d. In case the image of the target 116 is not found in one out of the plurality of successive trajectory images, control returns to step 411d even though not shown as such in Figs. 4a and 4b.

In step 411d, the camera 122D derives a next in sequence trajectory image, which is loaded into the memory M of the projectile computer 119. Then the projectile computer 119 runs an image comparison computer program to detect the target 116 in the last derived trajectory image by comparison with the current reference image. When the image of the target 116 is detected, the last derived trajectory image frame is saved to become the current reference image. Then the sequence skips step 412 and continues to step 413.

It is noted that after a few more procedure steps following step 411d, control returns to step 411d for a new loop in which a next trajectory frame is derived, checked for detection of the target 116 therein, and if the target is found, the trajectory frame is stored in memory M, and so on. Thereby, the reference image is successively replaced.

In step 413, the computer 119 checks the guidance disposition of the target 116 relative to at least one border BRD, or to the four borders BRD, of the current reference image frame derived in step 411d. If the image of the target 116 is detected in the center between the four borders BRD, thus in the middle between the two vertical borders, namely the left vertical border LVB and the right vertical border RVB, and in the middle between the two horizontal borders, which are the top border TB and the bottom border BB, this means that the virtual cross hair of the guidance disposition is right on top of the image of the target 116, or that the optical axis of the projectile camera 122 is oriented to hit the target in the center thereof. Then control flows to step 415. However, if the image of the target 116 does not appear in the center of the frame, then step 414 commands a trajectory correction of the projectile 117, so as to drive the image of the target to the center of the next reference frame(s). Thereafter, control flow returns to step 411d for another loop.
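The impact-mode check of steps 413 and 414 amounts to measuring the target's offset from the frame center and nulling it. The sketch below returns the offsets that the guiding means 123 would be commanded to cancel; the tolerance fraction and all names are illustrative assumptions.

```python
def impact_correction(cx, cy, width, height, tol_frac=0.02):
    """Hypothetical step 413/414 sketch for impact initiation.

    (cx, cy) is the target image center in pixels, origin at the top-left
    corner. Returns None when the target is centered between the four
    borders (continue toward impact, step 415); otherwise returns the
    (dx, dy) offsets a trajectory correction should null (step 414).
    """
    dx = cx - width / 2    # positive: target right of the frame center
    dy = cy - height / 2   # positive: target below the frame center
    if abs(dx) <= tol_frac * width and abs(dy) <= tol_frac * height:
        return None
    return dx, dy
```

A centered target in a 640x480 frame yields no correction, while a target 80 pixels to the right yields a horizontal correction command.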

In step 415, if the computer has not been destroyed by impact on the target and is still operative, the sequence returns to step 41 Id. Else, with the computer 119 inoperative, it is accepted that impact has occurred, and therefore, the sequence comes to an end in step 419.

Figs. 4a and 4b are now used to describe a further example of operation, namely operation in night mode with the proximity type of initiation, with a night camera 111N on the platform 110 and with a daylight camera 122D and lighting means 118 aboard the projectile 117.

Night mode and proximity type of initiation

In Figs. 4a and 4b, step 401 starts an exemplary sequence, which continues to step 402.

In step 402, the operator 115 selects a target 116 and sets the proximity type of warhead initiation, which automatically sets a guidance disposition. Then the flow continues to step 403.

In step 403, the operator sets a night mode of operation. The sequence continues to step 404a.

In step 404a, the operator 115 may operate the platform night camera 111N to derive an initial night image of the target 116. Thereafter, an initial day image is derived out of the night image, using a night image to visible light image conversion computer program. Evidently, conversion to visible light is superfluous when a night image acquisition device 122N is carried by the projectile 117. In addition, the operator 115 also commands the rangefinder 112 to derive the range from the platform 110 to the target 116. The sequence continues to step 405.

In step 405, the operator commands download of target data into the projectile memory M. Target data and attack data are interchangeable terms, and were described hereinabove with respect to step 305 of the day mode and impact type of initiation. The sequence continues to step 406.

In step 406, the projectile computer 119 calculates the time to target and the time to activation of the camera 122D and of the lighting 118. If desired, the time for activation of the camera 122D is loaded by the operator 115. Alternatively, the time to projectile camera and lighting activation is computed on board the platform 110 and loaded into the projectile 117. The time of activation of the lighting means 118 is preferably the same as the time for activation of the camera 122D, but may be loaded by the operator 115, or be computed onboard the platform 110 or onboard the projectile 117. The sequence continues to step 407.

In step 407, the operator 115 commands launch of the projectile 117 towards the target 116 and the projectile computer 119 begins to count or measure the time elapsed from the moment of the launch. The sequence continues to step 408.

In step 408, the projectile computer 119 compares the time elapsed from projectile launch to the activation time of the camera 122 as computed in step 406 or as preloaded by the operator 115, and when the time of activation is reached, the sequence continues to step 410a. Else, step 408 is repeated.

In step 410a, the projectile computer 119 activates the day camera 122D and the lighting means 118. Then the sequence continues to step 411a.

In step 411a, the day camera 122D derives a first trajectory frame which is loaded into the projectile memory M. The projectile computer 119 now analyzes the trajectory image frame by running an image comparison algorithm to detect the image of the target 116 in the derived trajectory frame, by comparison with the priorly derived initial image already resident in memory M. Then the flow continues to step 411b.

In step 411b, if the image of the target 116 is detected in the first trajectory image frame, by image comparison with the initial frame as analyzed by the projectile computer 119, then the trajectory image is saved in memory M as the current reference frame and the sequence continues to step 411c; otherwise flow control returns to step 411a.

Step 411c is an optional step for an optional operation, such as wherein the projectile trajectory may be corrected as described hereinbelow for example. Control then passes to step 411d. In case the image of the target 116 is not found in one out of the plurality of successive trajectory images, control returns to step 411d even though not shown as such in Figs. 4a and 4b.

In step 411d, the camera 122 derives a next in sequence trajectory image, which is loaded into the memory M of the projectile computer 119. Then the projectile computer 119 runs an image comparison computer program to detect the target 116 in the last derived trajectory image by comparison with the current reference image. When the image of the target 116 is detected, the last derived trajectory image frame is saved to become the current reference image. Then the sequence skips step 412 and continues to step 413a.

It is noted that after a few more procedure steps following step 411d, control returns to step 411d for a new loop in which a next trajectory frame is derived, checked for detection of the target 116 therein, and if the target is found, the trajectory frame is stored in memory M, and so on. Thereby, the reference image is successively replaced.

Some examples of explosion in proximity of the target 116 provided hereinabove with reference to the general mode of operation are applicable. In step 413a, for guidance disposition in proximity initiation by overshoot for example, the computer 119 checks and analyzes the disposition of the image of the target 116, e.g. the center of gravity thereof, relative to the vertical borders, namely LVB and RVB, of the current reference image frame, as derived in step 411d. With the projectile 117 being launched in a flat trajectory for example, and when past the peak of the trajectory, the image of the target 116 is expected to descend toward the bottom border BB in each successive reference image frame in comparison with the previous image frame. If analysis of the image of the target 116 detects the target to be disposed in the lower portion of the image, and also centered between the vertical borders thereof, namely the left vertical border LVB and the right vertical border RVB, then control flows to step 416a. If not, control flows to step 414a.

In step 414a, the computer 119 commands a trajectory correction of the projectile 117, which correction is intended to drive the image of the target 116 to appear centered between the vertical borders, respectively LVB and RVB, in the next reference frame to be stored. Next, control flow returns to step 411d.

In step 416a, the projectile computer 119 calculates the time to explosion initiation, based for example on geometrical calculations described hereinbelow with reference to Fig. 5, and begins to count time. Optionally, time to target less a short amount of time is derived. Then, the sequence continues to step 417a.

In step 417a, the projectile computer 119 compares the time calculated in step 416a to the counted elapsed time, and when that calculated time is reached or has lapsed, control flows to step 418a. Otherwise, the sequence loops again through step 417a.

Steps 416a and 417a may be replaced by other steps related to explosion initiation. For example, explosion initiation may be commanded at the moment when the descending image of the target 116 disappears below the bottom border BB of the current reference frame, or following an additional time delay thereafter. Then, control flows to step 418a.

In step 418a, the computer 119 outputs an explosion initiation command signal to activate explosion of the warhead 126 by proximity. The sequence then ends in step 419.

The method and the system for guidance of a projectile 117 have been described hereinabove. Guidance is achieved by use of optics such as image acquisition devices 111 and 122, and is driven by at least one computer software program operated by a computer 119. Prior to launch of the projectile 117, an initial image of the selected target 116 is derived, possibly from the platform 110. The initial image has image borders BRD, and shows the disposition of the target relative to at least one border of the initial image. The initial image is then loaded into the projectile and saved in memory M.

On trajectory, a series of sequential trajectory images of the selected target 116 is derived, which sequential images are referred to as the trajectory images. In flight, a trajectory image is compared with the initial image to find and recognize the target on the trajectory image. When the target 116 is found on a trajectory image, that trajectory image is stored in memory M in the projectile 117 to become a current reference image.

For guidance of the projectile 117 toward the selected target 116, the guiding means 123 will be operated to correct the trajectory of the projectile, thus to "drive" the target shown in the current reference image toward an a priori chosen guidance disposition. The process of repeatedly deriving a trajectory image for storing as a new reference image for guidance of the projectile 117 recurs in sequence for each trajectory image on which the target is found. The guidance disposition is chosen according to the type of explosion initiation selected for engagement of the target 116. For example, for explosion initiation by impact on the target 116, the guiding means 123 will drive the target into a guidance disposition where the target is disposed in the center of the image, thus in the middle between the left vertical border LVB and the right vertical border RVB, and in the middle between the top border TB and the bottom border BB. In contrast, for explosion initiation in proximity of the target 116, the engagement disposition may be different. For example, for explosion initiation in proximity to the target, the engagement disposition may call for the target to be kept adjacent to a border BRD.

The guidance disposition is chosen according to the type of explosion initiation selected for engagement of the target 116. For example, for explosion initiation by impact on the target 116, the guidance disposition has the target disposed in the center of the image, thus in the middle between the left vertical border LVB and the right vertical border RVB, and in the middle between the top border TB and the bottom border BB. The guiding means 123 will thus strive to drive the target on the current reference image to become centered. In contrast, for explosion initiation in proximity of the target 116, the guidance disposition may be different. For explosion initiation in proximity of the target, the guidance disposition may have the target, for example, in a disposition adjacent to a border BRD, or in a disposition that is dynamic, where the target is centered between the left vertical border LVB and the right vertical border RVB and descends gradually toward the bottom border BB of the trajectory images.
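The closed guidance loop summarized above can be sketched as a skeleton in which each hardware or image-processing element is a caller-supplied callable: derive a trajectory image, recognize the target against the current reference, store the frame as the new reference when found, then either correct the trajectory or initiate the warhead. Every name below is an illustrative stand-in, not a function defined by the application.

```python
def guidance_loop(derive_frame, detect_target, disposition_reached,
                  correct, initiate, max_frames=1000):
    """Skeleton of the described guidance cycle.

    derive_frame() -> trajectory image; detect_target(frame, reference)
    -> (found, target_position); disposition_reached(position) -> bool;
    correct(position) commands a trajectory correction; initiate() fires
    the warhead. Returns True if initiation occurred within max_frames.
    """
    reference = None
    for _ in range(max_frames):
        frame = derive_frame()
        found, target_pos = detect_target(frame, reference)
        if not found:
            continue            # keep deriving frames until the target appears
        reference = frame       # trajectory image becomes current reference
        if disposition_reached(target_pos):
            initiate()          # chosen guidance disposition attained
            return True
        correct(target_pos)     # drive target toward the chosen disposition
    return False
```

The same skeleton covers both initiation types: only the `disposition_reached` predicate changes, centering for impact versus the lower-border condition for proximity.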

Industrial Applicability

The method and the system described hereinabove are applicable in the weapon systems industry.

It will be appreciated by persons skilled in the art, that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description.

Reference Signs List

FR frame
LLB left lateral border
M memory
RLB right lateral border
BB bottom border
TB top border
S terrain surface
110 platform or tank
111 platform image acquisition device
111D platform day image acquisition device
111N platform night image acquisition device
111ND day and night image acquisition device
112 rangefinder
114 projectile launcher
115 operator
116 target
117 projectile
118 lighting means
119 projectile computer
122 projectile image acquisition device or camera
122D projectile day image acquisition device
122N projectile night image acquisition device
123 projectile guiding means
124 explosion initiation means
126 warhead
301-319 procedure steps in Figs. 3A and 3B
401-419 procedure steps in Figs. 4A and 4B