Title:
FIREARM SIGHT HAVING UHD VIDEO CAMERA
Document Type and Number:
WIPO Patent Application WO/2012/068423
Kind Code:
A2
Abstract:
A sighting apparatus (10) for a firearm (08) capable of firing at least a first and second projectile out of a firearm (08) barrel includes a video camera (30) having a sufficient frame rate and resolution to be capable of tracking the path of the first projectile when shot from the firearm and capturing the video image of the flight path. The series of video images includes at least a first image taken of a target (70) containing field (42) that is captured at a time before and generally concurrently with the firing of the first projectile, and additional images taken of a target (70) containing field (42) that are captured before and generally concurrently with the projectile flying to and reaching the distance of the target. A video display screen (40) is provided for the user to employ to sight the target (70) and aim the firearm. The video display (40) includes a display of an image of the target (70) containing field (42) and a reticle (60) positioned at the center of the display (40) to permit the user to aim the firearm (08) by positioning the reticle (60) over the target (70). A processor (50) includes an input interface in communication with the camera (30) for enabling the processor to receive captured images from the camera (30), an output interface in communication with the video display (40) for enabling the processor (50) to deliver information to the video display (40) to enable the video display (40) to display images of the target area, a memory for storing captured images, and a computer program for operating the processor (50) to process image information captured by the camera.
The software and processor (50) process the first image and the additional video images to determine the flight path of the projectile and the point where the projectile impacts the target field (42) or passes by the intended target (70) impact point, and adjust for the variance between the two points by moving or dragging the image of the target field (42) so that the impact point, or the point where the projectile passes the intended target point, is centered under the reticle (60) in preparation for the next shot, improving the accuracy of the next shot.

Inventors:
RUDICH DAVID (US)
Application Number:
PCT/US2011/061288
Publication Date:
May 24, 2012
Filing Date:
November 17, 2011
Assignee:
RUDICH DAVID (US)
International Classes:
G06G7/80
Foreign References:
US 7810273 B2 (2010-10-12)
US 5026158 A (1991-06-25)
US 3295128 A (1966-12-27)
Attorney, Agent or Firm:
INDIANO, E. Victor (7845 Rough Cedar Lane Suite 30, Indianapolis IN, US)
Claims:
What is Claimed is: 1. A sighting mechanism (10) for a firearm (08) comprising:

a UHD digital video camera (30) arranged on a firearm (08) parallel to its barrel (322) which records a target sighting field (42),

a video screen (40) arranged in a sighting field (42) of a marksman operating the firearm (08) and arranged to display a target image (70) that is recorded by the UHD video camera (30),

an integrated digital computer (50) unit having a video input interface for digital image data of the UHD video camera (30) and having an output interface for the viewing screen (40), whereby, aside from the target image (70) recorded by the UHD video camera (30), the viewing screen (40) displays information for the marksman that supports the aiming and is calculated by the computer unit (50) as a function of the data that is incoming by means of the input interface,

wherein the digital computer unit (50) comprises an image processing computer that allows at least a selectable image portion of the image data received from the UHD video camera (30) to be superimposed in a pixel precise fashion and in real-time to form a target image to be displayed on the screen (40), and the digital computer unit (50) comprises a ballistics computer that can be used to position the target image (70) displayed on the screen and a reticule (60) that is either faded into said target image (70) or situated on the screen (40) with respect to each other in an automatic manner and in real time according to the data that is incoming through the input interfaces, such that the position of the image of a real point of impact (80) of the most recently fired projectile from the firearm on the target, or the point where the projectile passes the intended point of impact, is automatically moved or dragged so that it is centered under the fixed position of the reticle (60) in preparation for the next shot. 2. The sighting mechanism (10) according to claim 1, wherein the sighting mechanism (10) is arranged to be used on various types of weapons and weapon systems (08) and is movable from a weapon or weapon system of a first type to a weapon or weapon system of a second type without having to make changes to the sighting mechanism (10) and without having to input any data to the sighting mechanism whatsoever.

3. A sighting mechanism (10) for a firearm, comprising:

at least a digital video camera (30) arranged on a firearm (08) parallel to its barrel (322) which records a target sighting field (42),

a video screen (40) arranged in a sighting field (42) of a marksman operating the firearm (08) and displaying a target image (70) that is recorded by the video camera (30), a digital computer unit (50) having a video input interface for digital image data of the video camera (30) and having an output interface for the video screen (40),

whereby, aside from the target field (42) image recorded by the video camera (30), the video screen (40) displays information for the marksman that supports the aiming and is calculated by the computer unit (50) as a function of the data that is incoming by means of the input interface, and

wherein the digital computer unit (50) comprises an image processing computer (50) that allows at least a selectable image portion of the image data received from the video camera (30) to be displayed in a pixel precise fashion and in real-time to form a target field image on the video screen (40), the digital computer unit (50) further comprises a ballistics computer that can be used to position the target field (42) image displayed on the screen (40), and a reticule (60) faded into the target image (70) and situated at the center of the screen (40) directly over the point of impact (80) of the last projectile that was fired, or the point where the projectile passed an intended point of impact, by means of the image being moved in an automatic manner and in real time according to the data that is incoming through the input interfaces, such that the position of the reticule (60) in the target image coincides with a real point of impact of a projectile from the firearm (08) on the target (70). 4. A sighting apparatus (10) for a firearm (08) capable of firing at least a first and second projectile out of a firearm barrel (322), the sighting apparatus (10) comprising:

(a) a video camera (30) having a sufficient frame rate and resolution to be capable of tracking the path of the first projectile when shot from the firearm (08) and capturing a series of images, the series of images including

(i) at least one first image taken of a target containing field that is captured at a time before and generally concurrently with the firing of the first projectile, and

(ii) at least one second image taken of a target containing field that is captured before and generally concurrently with the projectile reaching the distance of the target (70).

(b) a video display screen (40) for the user to employ to sight the target (70) and aim the firearm (08), the video display including a display (40) of an image of the target (70) containing field (42) and a reticle (60) positioned to permit the user to aim the firearm (08) by positioning the reticle (60) over the target,

(c) a processor (50) including

(i) an input interface in communication with the camera (30) for enabling the processor to receive captured images from the camera (30),

(ii) an output interface in communication with the video display (40) for enabling the processor to deliver information to the video display (40) to enable the video display (40) to display images of the target area.

(iii) a memory for storing captured images, and

(iv) a computer program for operating the processor to process image information captured by the camera (30),

wherein the software and processor process the first image and the second image to determine a spatial difference between a position of the target (70) relative to the reticle (60) in the first image, and a position of the projectile relative to the reticle (60) in the second image, and correcting for deviations from linearity in the path of the projectile between the firearm (08) and the target (70) by adjusting the relative position of the reticle (60) and target (70) displayed on the video display (40) to improve the likelihood of the second projectile striking the target (70). 5. The sighting apparatus (10) of Claim 4 wherein the video camera (30) comprises an ultra high definition video camera (30) and at least one image taken comprises an image taken immediately prior to the firing of the first projectile. 6. The sighting apparatus (10) of Claim 4 wherein the video camera (30) continuously captures images in a time span beginning prior to the firing of the first projectile and ending after the first projectile has had sufficient time to travel to the target (70), further comprising a sensor for sensing movement of the firearm resulting from the firearm firing a projectile. 7. The sighting apparatus (10) of Claim 6 wherein the sensor is in communication with the processor (50) for delivering firearm (08) firing information relating to firearm (08) movement resulting from firing the first projectile, for causing the processor (50) to select and store at least one image captured prior to the receipt of the firearm firing information for use as the initial image or images. 8. The sighting apparatus (10) of Claim 4 wherein the firearm (08) fires a plurality of projectiles, wherein the first projectile is selected from one of the plurality of projectiles and the second projectile is selected from any one of the plurality of projectiles other than the first projectile. 9. The sighting apparatus of claim 4, further comprising a mounting member for fixedly coupling at least one of the camera (30), processor (50) and video display (40) to the firearm (08). 10. The sighting apparatus of claim 4

wherein the firearm (08) is capable of firing at least a first and second and third projectile out of a firearm barrel,

wherein the series of images includes at least one image taken of a target containing field that is captured at a time before and generally concurrently with the second projectile reaching the distance of the target (70), and

wherein the software and processor process the second image and the third image to determine a spatial difference between a position of the target (70) relative to the reticle (60) in the second image, and a position of the projectile relative to the reticle (60) in the third image, and correcting for deviations from linearity in the path of the second projectile between the firearm and the target (70) by adjusting the relative position of the reticle (60) and target (70) displayed on the video display to improve the likelihood of the third projectile striking the target (70). 11. The sighting apparatus (10) of Claim 4 wherein the software includes an image recognition function for recognizing an impact point made by the first projectile. 12. The sighting apparatus (10) of Claim 11 wherein the software employs the recognized impact point made by the first projectile as the position of the projectile in the second image for adjusting the relative position of the reticle (60) and the target (70) displayed on the video display.

13. The sighting apparatus (10) of Claim 12 wherein the software employs the recognized impact point and position of the target (70) in the first image to determine the spatial distance and directional relationship between the position of the target relative to the reticle (60) in the first image, and the position of the projectile relative to the reticle (60) in the second image, for adjusting the relative position of the reticle (60) and target (70) displayed on the video display (40) to improve the likelihood of the second projectile striking the target. 14. The sighting apparatus (10) of Claim 13, wherein the software employs the image recognition function for recognizing a lack of an impact point made by the first projectile,

wherein the software further includes a projectile trajectory determination feature for determining the trajectory of the first projectile on at least a portion of its path during an interval between the firing of the projectile and the capture of the second image. 15. The sighting apparatus (10) of Claim 4, wherein the software includes a projectile trajectory determination function for determining the trajectory of the first projectile on at least a portion of its path during an interval between the firing of the projectile and the capture of the second image. 16. The sighting apparatus (10) of Claim 15 wherein the series of images captured by the video camera (30) include a sufficient number of images captured in a time interval between the capturing of the first image and the capturing of an image at a point generally concurrent with the projectile reaching the distance of the target to permit the projectile trajectory determination function to determine the trajectory of the first projectile. 17. The sighting apparatus (10) of Claim 16 wherein the projectile trajectory determination function determines the spatial distance and directional relationship between the position of the target relative to the reticle in the first image, and the trajectory of the first projectile, for adjusting the relative position of the reticle (60) and target displayed on the video display (40) to improve the likelihood of the second projectile striking the target.

18. The sighting apparatus (10) of Claim 17 wherein the projectile trajectory determination function includes an extrapolation function to extrapolate the path of the first projectile between a point where the camera (30) loses sight of the first projectile and a point generally concurrent with the projectile reaching the distance of the target (70), for permitting the sighting apparatus to estimate the position of the projectile at a point generally concurrent with the projectile reaching the distance of the target (70).

19. A sighting apparatus (10) for a firearm (08) capable of firing at least a first and second projectile out of a firearm barrel (322), the sighting apparatus (10) comprising:

(a) a video camera (30) having a sufficient frame speed rate and resolution to be capable of tracking the path of a projectile when shot from the firearm (08) and capturing a series of images, the series of images including

(i) at least a first image taken of a target containing field (42) that is captured at a time before and generally concurrently with the firing of the projectile, and

(ii) additional images taken of a target containing field (42) that are captured before and generally concurrently with the projectile reaching the distance of the target.

(b) a video display screen (40) for the user to employ to sight the target and aim the firearm (08), the video display (40) including a display of an image of the target containing field (42) and a reticle (60) positioned to permit the user to aim the firearm (08) by positioning the reticle (60) over the target (70),

(c) a processor (50) including

(i) an input interface in communication with the camera (30) for enabling the processor (50) to receive captured images from the camera (30),

(ii) an output interface in communication with the video display (40) for enabling the processor (50) to deliver information to the video display (40) to enable the video display (40) to display images of the target area (42).

(iii) a memory for storing captured images, and

(iv) a computer program for operating the processor to process image information captured by the camera (30),

wherein the software and processor process the images to determine a spatial difference between a position of the intended target (70) centered under the fixed reticle (60) when a shot is taken and the point where the projectile that is fired impacts the target field or passes by the intended target point, and automatically moves or drags the target field so that the actual point of impact, or the point where the projectile passes the intended target point, is centered under the fixed reticle (60) in preparation for the next shot to improve the accuracy of the next shot. 20. The sighting apparatus (10) of Claim 19, wherein the software includes a projectile trajectory determination function for visually recording, determining and then plotting the trajectory of a projectile on at least a portion of its path during an interval between the firing of the projectile and the completion of the projectile's flight path to or past the intended target (70). 21. The sighting apparatus (10) of Claim 20 wherein the series of images captured by the video camera include a sufficient number of images captured in a time interval between the capturing of the first image and the capturing of one or more additional images to permit the projectile trajectory determination function to determine the trajectory and the point of impact on the target field of the first projectile or the point where the projectile passed by the intended target point. 22. The sighting apparatus (10) of Claim 21 wherein the projectile trajectory determination function corrects the spatial distance and directional relationship between the position of the intended target point centered under the reticle (60) in the first image, and the point of impact of a projectile or the point where the projectile passed by the intended target point, by moving or dragging the image of the target field (42) so that the actual point of impact of the projectile, or the point where the projectile passes the intended target point, is centered under the fixed reticle (60) in order to improve the accuracy of the next shot. 23. The sighting apparatus (10) of Claim 22 wherein the projectile trajectory determination function includes an extrapolation function to extrapolate the path of the first projectile between a point where the camera loses sight of the first projectile and a point generally concurrent with the projectile reaching the distance of the target (70), for permitting the sighting apparatus to estimate the position of the projectile at a point generally concurrent with the projectile reaching the distance of the target (70).

Description:
FIREARM SIGHT HAVING UHD VIDEO CAMERA

I. Technical Field of the Invention.

[001] The present invention relates to firearms, and more particularly, to a firearm system having a sighting mechanism that enables the user to achieve a better target hit rate by enabling the user to correct for such things as distance, weather conditions, windage and gravity.

II. Background of the Invention

[002] It is often difficult for firearms to achieve a high degree of accuracy in hitting their targets when the firearms solely employ an optical sighting mechanism, such as open "iron" sights or a sighting telescope. This difficulty is caused in particular by various influences having an increasing impact on the ability to accurately aim the rifle as the distance from the rifle to the target increases. One influence on the inaccuracy of a projectile is that a projectile travels along a ballistic trajectory that is determined by the design and fabrication of the firearm.

[003] The type of ammunition used also influences the trajectory of a projectile. Moreover, for the same ammunition, the cartridge temperature and barrel temperature at the time of discharging each projectile both influence the course of the projectile's trajectory. For the reasons stated above, it is useful to provide a sighting mechanism for a firearm that is capable of making corrections that take into account the existing circumstances that influence the trajectory of the projectile. Preferably, the device's corrections can be made automatically and virtually instantaneously.

[004] Several attempts have been made to overcome the problems discussed above. Examples of such attempts are shown in United States Patent Publication No. 2005/0268521 A1; US Patent No. 5,026,158, to Golubic; US Patent No. 6,070,355, to Day; US Patent No. 7,926,219 B2; US Patent No. 7,292,262 B2, to Towery et al.; US Patent Pub. No. US 2010/0251593 A1, to Backlund et al.; EP 0966647 B1; DE 101 05 036 A1; DE 42 18 118 C2; U.S. Pat. No. 6,449,892 B1; U.S. Pat. No. 5,675,112 A; and U.S. Pat. No. 7,810,273 B2. Although the above-mentioned devices likely perform their intended duties in a workmanlike manner, room for improvement exists.

[005] It is therefore one object of the present invention to provide a sighting mechanism that provides for accurate aiming by the marksman, while being simple to operate and quick to actuate.

III. SUMMARY OF THE INVENTION

[006] A sighting apparatus for a firearm capable of firing at least a first and second projectile out of a firearm barrel includes a video camera having a sufficient frame rate and resolution to be capable of tracking the path of each projectile when shot from the firearm and capturing a series of images. The series of images includes at least a first image taken of a target containing field that is captured at a time before and generally concurrently with the firing of the first projectile, and additional images taken of a target containing field that are captured before and generally concurrently with the projectile reaching the distance of the target. A video display screen is provided for the user to employ to sight the target and aim the firearm. The video display includes a display of an image of the target containing field and a reticle positioned to permit the user to aim the firearm by positioning the reticle over the target. A processor includes an input interface in communication with the camera for enabling the processor to receive captured images from the camera, an output interface in communication with the video display for enabling the processor to deliver information to the video display to enable the video display to display images of the target area, a memory for storing captured images, and a computer program for operating the processor to process image information captured by the camera. The software and processor process the first image and the additional images to determine a spatial difference between the position of the intended target point centered under the reticle in the first image and a position of the projectile relative to the intended target point in the second image, and correct for deviations from linearity in the path of the projectile between the firearm and the target by moving the relative position of the image of the target field so that the impact point is centered under the reticle displayed on the video display to improve the accuracy of the next shot.
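The image-shift correction summarized above can be sketched in code. The following Python fragment is a minimal illustration only, not the patent's implementation; the pixel-coordinate convention, the frame contents, and the function names are assumptions introduced here for clarity.

```python
# Hypothetical sketch: compute the offset between the fixed reticle and the
# observed impact point, then translate the displayed target-field image so
# the impact point lands under the reticle. Names and shapes are assumptions.
import numpy as np

def compute_offset(reticle_xy, impact_xy):
    """Pixel offset that would move the impact point onto the reticle."""
    return (reticle_xy[0] - impact_xy[0], reticle_xy[1] - impact_xy[1])

def shift_target_field(frame, offset):
    """Translate the target-field image by `offset` (dx, dy) pixels.
    Vacated border pixels are zeroed; real hardware would refill them
    from the camera's wider capture area."""
    dx, dy = offset
    h, w = frame.shape[:2]
    shifted = np.zeros_like(frame)
    # source/destination windows clipped to the frame bounds
    src_x0, src_x1 = max(0, -dx), min(w, w - dx)
    src_y0, src_y1 = max(0, -dy), min(h, h - dy)
    dst_x0, dst_y0 = max(0, dx), max(0, dy)
    shifted[dst_y0:dst_y0 + (src_y1 - src_y0),
            dst_x0:dst_x0 + (src_x1 - src_x0)] = frame[src_y0:src_y1,
                                                       src_x0:src_x1]
    return shifted

# Example: reticle at frame centre, impact observed 12 px right and 5 px low.
frame = np.arange(100 * 100).reshape(100, 100)
offset = compute_offset((50, 50), (62, 55))   # → (-12, -5)
corrected = shift_target_field(frame, offset)
```

After the shift, the pixel that showed the impact point sits at the reticle position, so the next shot aimed with the reticle compensates for the observed miss.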

[007] One feature of the present invention is that a high speed, ultra high definition digital video camera ("UHD camera") can be mounted on a firearm parallel to its barrel that records a target sighting field and each projectile in flight. Alternately, the firearm sight can be monitored wirelessly or via a wired peripheral operatively linked to a UHD camera.

[008] A preferred embodiment can include a digital computer or processor having an input interface for the ultra high definition video camera and having an output interface for the video screen, whereby the digital computer unit determines the moment that the recoil of the firearm from a discharge of a shot abruptly alters the incoming image field, while determining the point of impact of the projectile or the point where the projectile passes the intended point of impact. These determined point(s) are compared to the point of the center of the reticle on the image field immediately before the disruption caused by the recoil, calculated by the computer unit as a function of the data that is incoming by means of the input interface, in preparation for the next shot.

[009] Another feature of the present invention is that a digital computer or processor is incorporated into the UHD camera for recording and digitally controlling the video input, and/or the digital computer or processor is operatively connected to the firearm sight image gathering apparatus. The image input from the firearm sight can be controlled so that a fixed reticle in the firearm sight is superimposed over the target field. The target field image is moved with respect to the fixed reticle in order to align the actual point of impact of a projectile or the point where the projectile passed by the intended point of impact with the central position of the reticle.

[0010] Where the UHD camera does not detect an actual point of impact or the point where the projectile passes the intended point of impact, the processor determines the track path of the last projectile fired and provides a solution where the projectile impact would have been, or the point where the projectile passed by the intended point of impact and shifts the position of the image field in the sighting device accordingly. If and to the extent that the UHD camera cannot track the projectile from the muzzle of the firearm all of the way to the final destination of the projectile, the computer extrapolates from the trajectory, the angle, and the speed of the projectile to the extent that the UHD camera can track the projectile, as well as any discernable impact that the projectile may make on the target field to determine its precise point of impact or the point where the projectile passes the intended point of impact.
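The extrapolation step described above can be illustrated with a simple curve fit over the tracked projectile positions. This is only a sketch under stated assumptions: the linear-drift plus quadratic-drop model, the `(time, x, y)` pixel-track format, and the function name are illustrative simplifications, not the patent's ballistic solution.

```python
# Hypothetical sketch: fit the visible portion of a projectile track and
# extrapolate its pixel position to the time it reaches the target distance.
import numpy as np

def extrapolate_impact(track, t_arrival):
    """Fit horizontal drift (linear in time) and vertical drop (quadratic,
    gravity-like) to tracked (t, x_px, y_px) points, then predict the
    projectile's pixel position at `t_arrival`."""
    t = np.array([p[0] for p in track])
    x = np.array([p[1] for p in track])
    y = np.array([p[2] for p in track])
    cx = np.polyfit(t, x, 1)   # x(t) = a*t + b
    cy = np.polyfit(t, y, 2)   # y(t) = a*t^2 + b*t + c
    return float(np.polyval(cx, t_arrival)), float(np.polyval(cy, t_arrival))

# Example: synthetic track with constant drift and accelerating drop,
# observed only for the first 0.3 s before the camera loses the round.
track = [(t, 50 + 3.0 * t, 40 + 0.5 * t * t) for t in (0.0, 0.1, 0.2, 0.3)]
impact = extrapolate_impact(track, 1.0)   # ≈ (53.0, 40.5)
```

The predicted position would then drive the same image-shift correction as a directly observed impact point.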

[0011] Applicant believes that superior weapon firing accuracy is achieved by moving the image of the target field automatically to align the actual point of impact of the last projectile fired, or the point where the projectile passed by the intended point of impact, with the center of the reticle, the reticle being fixed in the sighting device. Projectile firing causes a recoil signature that can be distinguished from other types of target field image movement in a video camera. Recoil can be accommodated in adjusting the movement of the target field by programming the device to select an image with the reticle displayed the instant before recoil occurs, so that the actual point of impact, the projected point of impact, or the point where the projectile passed by the intended point of impact is used in order to move the image of the target field to place the point directly at the center of the reticle, perfectly sighting in the sighting device and the firearm to enhance the accuracy of the next shot.
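Selecting the pre-recoil reference frame could be sketched as follows. The mean-absolute-difference motion metric, the threshold value, and the function name are assumptions for illustration; the patent's actual recoil-signature detection is not specified at this level of detail.

```python
# Hypothetical sketch: find the last steady frame before recoil abruptly
# alters the incoming image field, as described in paragraph [0011].
import numpy as np

def pre_recoil_frame(frames, threshold=20.0):
    """Return the index of the last frame captured before frame-to-frame
    global motion jumps past `threshold`, i.e. before recoil disturbs the
    image. Falls back to the final frame if no jump is found."""
    for i in range(1, len(frames)):
        motion = np.mean(np.abs(frames[i].astype(float) -
                                frames[i - 1].astype(float)))
        if motion > threshold:
            return i - 1          # last steady frame before the recoil jump
    return len(frames) - 1        # no recoil detected in this buffer

# Example: three steady frames, then a recoil-disturbed frame.
steady = np.full((8, 8), 100, dtype=np.uint8)
shaken = np.full((8, 8), 180, dtype=np.uint8)   # large global change
frames = [steady, steady.copy(), steady.copy(), shaken]
idx = pre_recoil_frame(frames)   # → 2
```

The selected frame supplies the reticle-centered reference against which the impact point is later compared.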

[0012] Preferably, the computer in the sighting device is programmed so that if and to the extent that the UHD camera cannot track the projectile from the muzzle of the firearm all of the way to the final destination of the projectile, the computer extrapolates from the trajectory, the angle, and the speed of the projectile as well as any discernable impact that the projectile may make on the target field to determine its precise point of impact or the point where the projectile passes the intended point of impact. A digital computer or processor preferably has an interface for the ultra high definition video camera to input data to the processor. The processor has an output interface for the video screen.

[0013] The processor is programmed so that the digital computer unit determines the moment that the recoil of the firearm from a discharge of a shot abruptly alters the incoming image field, while determining the point of impact of the projectile or the point where the projectile passes the intended point of impact, and compares it to the point of the center of the reticle on the image field immediately before the disruption caused by the recoil, calculated by the computer unit as a function of the data that is incoming by means of the input interface, in preparation for the next shot. The digital computer unit is programmed to correct the variance between the point of impact (or the point where the projectile passes the intended point of impact) and the intended point of impact.

[0014] This variance is corrected by centering the image of the point of impact or the point where the projectile passes the intended point of impact on the video screen directly under the center of the fixed reticle in preparation for the next shot thereby perfectly sighting in the sighting device and the firearm.

[0015] In the event that there is no point of impact on the target field, an integrated distance measuring instrument such as a laser range finder, a measuring transducer or a distance determining algorithm utilizing the known size of an object in the target field is utilized to calculate the point where the projectile passes the intended point of impact. If and to the extent that the UHD camera cannot track the projectile from the muzzle of the firearm all of the way to the final destination of the projectile, the computer extrapolates from the trajectory, the angle, and the speed of the projectile to the extent that the UHD camera can track the projectile as well as any discernable impact that the projectile may make on the target field to determine its precise point of impact or the point where the projectile passes the intended point of impact.
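The known-object-size ranging approach mentioned above can be sketched with the standard pinhole-camera relation. The focal length and object dimensions below are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: estimate range from the known real-world size of an
# object in the target field and its apparent size in pixels, using the
# pinhole-camera relation  distance = f_px * real_size / apparent_size_px.
def range_from_known_size(focal_px, real_size_m, apparent_size_px):
    """Distance to an object of known size, in the same units as real_size_m."""
    return focal_px * real_size_m / apparent_size_px

# Example: a 0.5 m target spanning 25 px, with an assumed 4000 px focal
# length, sits about 80 m away.
distance = range_from_known_size(4000.0, 0.5, 25.0)   # → 80.0
```

The resulting range would feed the trajectory extrapolation that locates the point where the projectile passes the intended point of impact.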

[0016] A further feature of the present invention is that a digital computer or processor having as an input an interface for the ultra high definition video camera and having an output interface for the video screen is provided. The digital computer unit determines the moment that the recoil of the firearm from a discharge of a shot abruptly alters the incoming image field while determining the point of impact of the projectile or the point where the projectile passes the intended point of impact. This is compared to the point of the center of the reticle on the image field immediately before the disruption caused by the recoil, calculated by the computer unit as a function of the data that is incoming by means of the input interface. In preparation for the next shot, the video screen displays a corrected position of the target image under a superimposed reticle, calculated by the computer unit as a function of the data that is incoming by means of the input interface.

IV. Brief Description of Drawings

[0017] Fig. 1 is a highly schematic diagrammatic view of a sighting mechanism mounted on a firearm according to the invention;

[0018] Fig. 2a is a side view of a typical rifle and a typical prior art rifle mounted "scope" sighting system;

[0019] Fig. 2b is a side schematic view of a typical rifle with a sighting device of the present invention mounted to the weapon;

[0020] Fig. 3a is a perspective view of a typical military style weapon with an embodiment of the present invention mounted thereon;

[0021] Fig. 3b is a perspective view of a typical military style weapon having another embodiment of the present invention mounted thereon;

[0022] Fig. 4 is another highly schematic view of the sighting mechanism of the present invention;

[0023] Fig. 5 is a schematic view illustrating the targeting features and aspects of the present invention;

[0024] Fig. 6 comprises a flow chart depicting the logic sequence used by the processor to determine whether an adjustment should be made to the sight; and

[0025] Figs. 7a-d are sequential drawings depicting the sighting device of the system and targets, as the device moves through its adjustment process.

V. Detailed Description

[0026] A. An Overview of the Present Invention.

[0027] A sighting mechanism of the present invention is characterized in that a high speed, ultra high definition digital video camera is arranged on the firearm in such a manner that it has a lens capture area disposed parallel to the barrel of the firearm so that the camera can and does capture the target field, the area surrounding the target field, and the flight path of a fired projectile on a video screen. An integrated digital computer unit is in communication with the camera. The computer has a video input interface for receiving digital image data from the video camera. In essence, the integrated digital computer unit comprises a digital image processing computer that allows a selectable image portion of the image data received from the video camera to be superimposed in a pixel precise fashion and in real time to form a target image and an image of the projectile in flight and to be displayed on the screen.

[0028] The digital computer can be used to position the target image displayed on the screen and a reticle that is situated on and at the center of the screen in an automatic manner and in real time based upon the data that is being received from the camera through the input interface such that the position of the point of impact on the target image or the point where the projectile passes the intended point of impact is directly under the reticle at the center of the video screen. In the event that there is no point of impact on the target field, an integrated distance measuring instrument such as a laser range finder, a measuring transducer or a distance determining algorithm utilizing the known size of an object in the target field is utilized to calculate the point where the projectile passes the intended point of impact.
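The third of the ranging options mentioned above, estimating distance from the known size of an object in the target field, follows from the pinhole camera model. A hedged Python sketch; the focal length and object sizes are illustrative assumptions, not values from the disclosure:

```python
def distance_from_known_size(focal_px: float, real_size_m: float,
                             apparent_size_px: float) -> float:
    """Pinhole-camera range estimate: an object of known real size that
    spans apparent_size_px pixels in the image lies at approximately
    focal_px * real_size_m / apparent_size_px metres from the lens."""
    if apparent_size_px <= 0:
        raise ValueError("object must span at least one pixel")
    return focal_px * real_size_m / apparent_size_px
```

For example, with a 1000-pixel focal length, a 1.8 m tall target spanning 18 pixels is roughly 100 m away.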

[0029] If and to the extent that the UHD camera cannot track the projectile from the muzzle of the firearm all of the way to the final destination of the projectile, the computer extrapolates from the trajectory, the angle and the speed of the projectile (to the extent that the UHD camera can track the projectile) as well as any discernable impact that the projectile may make on the target field to determine its precise point of impact or the point where the projectile passes the intended point of impact. By so determining where the projectile hits, or passes, one can then determine the variation between the point at which the gun is aimed and the point at which the projectile hits, to thereby determine the variance in the projectile caused by such things as humidity, barometric pressure, gravity, distance, and wind.

[0030] The sighting mechanism of the invention is believed to allow for very precise target striking accuracy since the ultra high definition digital video camera and the pixel precise digital image superimposition in real time provide for very high image quality at high resolution with low thermal, digital, and pixel noise levels, and thus yield a very high quality real image of the target. Preferably, the camera provides not only ultra high definition resolution, but also captures frames at a very high speed (e.g., 300 frames per second or greater).

[0031 ] The present invention provides the potential to correct for substantially all material parameters influencing the trajectory of the projectile automatically and quickly. Preferably, the integrated digital computer unit displays the image field immediately prior to the sudden movement of the image field caused by recoil of the firearm from a discharged shot. The integrated digital computer unit then instantaneously determines the point of impact of the projectile that is fired or the point where the projectile passes the intended point of impact from the data that is inputted from the high speed, ultra high definition video camera. The position of the target image is then adjusted so that the point of impact on the image screen or the point where the projectile passes the intended point of impact is directly under the reticle that is centered on the video screen.
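The repositioning step described above amounts to translating the displayed target-field image by the vector from the observed impact point to the screen center. A minimal sketch under assumed (row, column) pixel coordinates; `np.roll` wraps edge pixels, where a production sight would pad or crop, but the translation itself is the same:

```python
import numpy as np

def recenter_on_impact(image: np.ndarray, impact_rc: tuple) -> np.ndarray:
    """Shift the target-field image so that the observed impact point
    (row, col) lands under the reticle at the centre of the frame."""
    center_r, center_c = image.shape[0] // 2, image.shape[1] // 2
    dr = center_r - impact_rc[0]  # rows to shift the image down
    dc = center_c - impact_rc[1]  # columns to shift the image right
    return np.roll(image, shift=(dr, dc), axis=(0, 1))
```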

[0032] In the event that there is no point of impact on the target field, an integrated distance measuring instrument such as a laser range finder, a measuring transducer or a distance determining algorithm utilizing the known size of an object in the target field is utilized to calculate the point where the projectile passes the intended point of impact. If and to the extent that the UHD camera cannot track the projectile from the muzzle of the firearm all of the way to the final destination of the projectile, the computer extrapolates the likely trajectory from the trajectory, angle, and speed information gathered over the portion of the projectile's flight that the UHD camera is able to track. Additionally, any information relating to any discernable impact that the projectile may make on the target field can be added to the extrapolated values to determine a very close approximation of the precise point of impact or the point where the projectile passes the intended point of impact.
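One simple way to extrapolate from a partially tracked flight is to fit a constant-gravity (quadratic) ballistic model to the tracked samples and evaluate it at the time the projectile reaches the target. This is an illustrative sketch, not the patented algorithm, and it ignores drag and wind:

```python
import numpy as np

def extrapolate_height(times, heights, t_target: float) -> float:
    """Fit a quadratic to the portion of the flight the camera could
    track, then evaluate the fitted trajectory at the target time."""
    coeffs = np.polyfit(times, heights, deg=2)   # h(t) ~ a*t^2 + b*t + c
    return float(np.polyval(coeffs, t_target))
```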

[0033] Through this process, the firearm should be sighted in perfectly for the next shot, and perfectly corrected for all variables that affect the trajectory of the projectile. The video screen in the sighting field of the marksman then shows both the real time target as a real time image and the reticle in a clear display. The marksman advantageously has no need to interpret, assess, or analyze data displayed to him, but rather can focus solely on aiming the firearm, since the correction of the position of the reticle relative to the target image is carried out automatically.

[0034] Through the use of the present invention, the target and the reticle are optically visualized significantly better and more simply than the view one receives through a sighting telescope, which cannot provide automatic digital correction of the position of the reticle relative to the image of the target and which cannot correct for any influences on the trajectory of the projectile. The digital computer unit integrated into the sighting mechanism processes the incoming data and uses it to calculate the position of the reticle relative to the image of the target on the video screen such that the real point of impact of the projectile on the target or the point where the projectile passes by the intended point of impact coincides with the position of the center of the reticle on the image of the target on the screen.

[0035] The marksman operating the firearm can therefore rely on the image on the screen and does not need to correct the direction of the firearm based on his own experience or his own perception of environmental parameters such as wind, humidity, distance and the like. Accordingly, many of the inherent variables that impact a shot are accounted for to thereby increase the hitting accuracy for any firearm upon which the sighting device is mounted, as the primary variable remaining to be accounted for is the steadiness of the hands of the marksman operating the firearm, or the support upon which the firearm is placed.

[0036] Since no environmental sensing devices are required with the present invention, no firearm or ammunition related data needs to be inputted, no mechanical adjustment or adjustment by motor(s) of parts of the sighting mechanism is required, and no mechanical effort is required. Thus, cost savings are achieved along with a reduction or elimination of the sensitivity of the device to wear, tear, and damage. The sighting mechanism can advantageously be used without any adjustment or prior input of data pertaining to any firearm, ammunition, or firearms system upon which the sighting mechanism is mounted.

[0037] B. Detailed description of the drawings.

[0038] A sighting mechanism 10 is shown schematically in Figs. 1 and 4 as being mounted to a firearm such as rifle 20. The mechanism 10 includes an ultra high definition digital video camera 30 with a digital processor 50 integrated into the camera 30 or the mounting base of the camera and wirelessly connected to the video output and the viewing screen 40 of the camera. The sighting mechanism 10 is attached to a firearm 20 above the barrel that is partially schematically shown in Fig. 1.

[0039] The sighting mechanism includes a mounting system that enables it to be mounted on the firearm. Preferably the mounting system includes a universal type mounting adaptor so that the sighting mechanism 10 can be used on various types of firearms and weapon systems and is movable from a firearm or weapon system of a first type to a firearm or weapon system of a second type without having to make changes to the sighting mechanism and without having to input any data to the sighting mechanism whatsoever.

[0040] The high speed, ultra high definition digital video camera 30 is arranged so that the lens is positioned to be parallel to the barrel 22, so that the images captured by the UHD camera 30 are generally along the path that a projectile fired out of the barrel will take.

[0041] The video camera 30 is connected to the integrated computer unit 50 by means of a suitable input interface 33. Accordingly, the camera 30 delivers images of an aimed-for target 70, Fig. 4, whereby at least a portion of the image is digitally superimposed in the computer unit 50 in a pixel precise fashion and in real time. Accordingly, a good and clear image of the target 70, Fig. 4, is attained even if the target distance is large.

[0042] Moreover, the sighting mechanism comprises a viewing screen 40 that displays a portion of the image of the target field 42 that is recorded by the high speed, ultra high definition video camera 30 and is inputted into the computer unit 50 and displayed on the display screen 40 such that a marksman or weapons user has a good view of the target 70. A reticle 60 is faded into the target field 42 or otherwise placed on the center of the display screen 40.

[0043] Turning now to Figs. 3a and 3b, the operator of the weapon 320, 321 aims the weapon 320, 321 by positioning the weapon in such a way that the reticle 360, 361 displayed in the display screen 340, 341 is centered on the target 370, 371 that the operator of the weapon 320, 321 wishes to hit. In the Fig. 3a embodiment, the display screen 340 is mounted adjacent to the weapon so that movement of the gun 320 will be isolated from the display screen 340. In Fig. 3b, the display screen 341 is fixedly coupled to the weapon 321.

[0044] Once the operator has aimed the weapon 320, 321 and acquired his target 370, 371, the operator is ready to fire the weapon 320, 321. Once the operator fires the weapon 320, 321, the processor 350, 351 detects that a shot has been fired. The processor 350, 351 records the video image taken by the camera 330, 331 just prior to the shot being fired. In order to do this, the camera 330, 331 is constantly capturing images, and the processor 350, 351 is constantly recording a cache of video and maintaining it in memory. The processor 350, 351 does not need to retain a large amount of data recorded prior to the shot, but rather only enough so that it will have video of the target and reticle position immediately prior to the shot being fired. Other images captured prior to the firing of the shot may be discarded or dumped from memory.
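This "keep only the frames just before the shot" behaviour is naturally implemented as a fixed-capacity ring buffer. A minimal sketch; the 32-frame capacity is an assumption, not a value from the disclosure:

```python
from collections import deque

class FrameCache:
    """Retain only the most recent frames, so the images taken just
    before the shot survive while older frames are dumped automatically."""

    def __init__(self, capacity: int = 32):
        self._frames = deque(maxlen=capacity)

    def push(self, frame):
        self._frames.append(frame)   # the oldest frame is evicted when full

    def snapshot(self):
        """Return the retained frames, oldest first."""
        return list(self._frames)
```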

[0045] Once the processor 350, 351 has detected that a shot has been fired, the processor 350, 351 starts recording to ensure that it has saved captured images taken by the camera 330 immediately prior to the shot being fired, thereby ensuring that an appropriate number of such "just before the shot" images are not lost by being dumped. The processor 350, 351 continues to record and save captured images of the flight of the projectile and, if applicable, images that capture the impact of the bullet in the target field 42. Once the processor 350, 351 has recorded the flight of the projectile and/or the impact of the projectile in the target field, the processor 350, 351 can then calculate whether the projectile struck an object in the field, or traveled to the destination that was intended, by comparing the recorded video images to the position of the reticle on the target taken immediately prior to the shot.

[0046] Fig. 5 shows that the operator aligned the reticle 60 on the target 70 and fired the weapon. The images captured immediately prior to the shot show that the reticle was centered on the target 70. After the shot, the projectile traveled along the actual projectile path 92. By comparing the intended projectile path 90 to the actual projectile path 92, the processor 50 can calculate the deviation between the actual projectile path 92 and the intended projectile path 90 and, through processing by the software driven processor 50, can use this information to correct the centering of the reticle 60 accordingly.

[0047] This correction of the reticle would, in a preferred embodiment, adjust the position of the image displayed on the display screen 40 relative to the reticle. For example, if the user was sighting on the target's head, but the actual path of the projectile 92 deviated such that the projectile struck the target thirty inches (76.2 cm) below the target's head by striking the target 70 in the navel, the position of the reticle 60 relative to the target would be adjusted to account for this thirty inch (76.2 cm) deviation at the target position. When so adjusted, when the user next sighted in on the head of the target, the changed relative position of the reticle 60 and image 42 would cause the user to actually be aiming thirty inches (76.2 cm) above the head of the target, even though the user has the cross-hairs of the reticle 60 squarely on the target's head. This deviation between the actual and corrected images on the display builds in the projectile's projected thirty inch (76.2 cm) drop, to thereby cause the projectile to hit the target squarely in the head, which was the point upon which the user sighted.
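Converting a real-world miss such as that thirty inch (0.762 m) drop into an on-screen shift follows the same small-angle pinhole relation used for ranging. A hedged sketch; the focal length and range are illustrative assumptions:

```python
def miss_to_pixels(focal_px: float, miss_m: float, range_m: float) -> float:
    """Convert a real-world miss distance observed at a known range into
    the on-screen pixel shift that compensates for it, via the
    small-angle pinhole relation shift_px ~ focal_px * miss_m / range_m."""
    if range_m <= 0:
        raise ValueError("range must be positive")
    return focal_px * miss_m / range_m
```

With an assumed 1000-pixel focal length and a 100 m target, the 0.762 m drop maps to a shift of about 7.6 pixels on the display.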

[0048] Turning now to Figs. 7a-7d, Fig. 7a represents a picture of the sighted target 70 immediately prior to a shot from the weapon 20 being fired. Fig. 7b represents a picture of the sighted target after the shot was fired and after the projectile impacted the target field 42. In Fig. 7b it will be noticed that the point of impact 80 does not line up with the center of the reticle 60 as desired. The processor 50 compares the point of impact 80 with the position of the center of the reticle 60 and re-adjusts the position of the target field image with relation to the reticle 60 on the display screen. Fig. 7c depicts the recorded image of a shot fired after the processor 50 has adjusted the reticle 60 position for the next shot. As shown, the processor 50 uses either the path or the point of impact as a reference point to re-adjust the field of view in relation to the reticle for the next shot.

[0049] Fig. 6 shows a flow chart of a logic process that the processor 50 can use to determine if an adjustment to the reticle 60 position is needed. As illustrated in the diagram, an adjustment to the relative position of the image and reticle is only made if the point of impact of the previously fired projectile, or the path of the previously fired projectile, differs from the intended point of impact or the intended flight path. If the path or point of impact is different than intended, then the processor will make the necessary adjustments to correct the position of the target field in relation to the reticle.

[0050] Turning now to Figs. 1, 1a, 3a and 3b, various placements of the various components of the device will now be discussed.

[0051] As best shown in Figs. 1, 2b and 3b, all of the primary components of the device 10, including the UHD camera 30, processor 50 and display screen 40, are mounted onto an upper surface of the firearm 08. This is a similar configuration to the placement of the camera 331, processor 351 and video display 341 of Fig. 3b. This placement has many advantages, as through the use of compact dedicated electronics, the sighting mechanism "package" can be made small enough so as to not interfere significantly with the operation of the weapon and can be very portable, since the entire device 10 is carried around with the weapon. Additionally, having all of the components in one place creates a neat and tidy package for the user.

[0052] Alternately, one or more of the components can be separated from the gun. As shown in Fig. 3a, the camera 330 and processor 350 are mounted to the gun 320. However, the video display screen 340 is mounted separately from the gun and is operatively coupled to the gun 320 through either a hard wired connection or, preferably, a wireless communication link, such as Bluetooth.

[0053] One of the benefits of separating the video display 340 from the gun is that it permits a larger video display screen 340 to be used than one whose size is constrained by the need to place it on top of the gun 320. More importantly, the placement of the video screen 340 on a separate mounting away from the gun 320 isolates the video display screen 340 from gun movement, which may have benefits in reducing the processing difficulties encountered in processing the image information taken by the camera to arrive at the re-positioned image.

[0054] The computer unit 50 compares the relative positions of the reticle 60 over the image of the target 70 immediately prior to the computer or an integrated accelerometer making the determination that the recoil from a shot has caused the field of view of the target image to be abruptly shaken or altered. The computer 50 compares the position of the reticle 60 over the target 70 image immediately prior to the shot being fired with the point that the computer unit 50 determines, from the video input from the ultra high definition video camera 30, is the actual point of impact 80 of the projectile that is fired or the point where the projectile passes the intended point of impact. The computer unit 50 then rectifies the discrepancy between the two positions by shifting the position of the image of the target field so that the point of impact or the point where the projectile passes the intended point of impact is directly under the center of the reticle 60. The sighting mechanism 10 and firearm 20 are thereby perfectly sighted in for the next shot to be fired at the target field 42.
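The accelerometer branch of that determination can be as simple as scanning the bore-axis acceleration samples for the first recoil spike. A minimal sketch; the 20 g threshold is an assumption, not a figure from the disclosure:

```python
def first_shot_index(samples_g, threshold_g: float = 20.0):
    """Scan bore-axis accelerometer samples (in g) for the first recoil
    spike; return its index, or None if no shot is detected."""
    for i, a in enumerate(samples_g):
        if abs(a) > threshold_g:
            return i
    return None
```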

[0055] Figs. 7a-7d are exemplary monitor output images from a weapon sight made in accordance with an embodiment of the present invention. Fig. 7a shows the target field image 42 and reticle 60 position immediately prior to a shot being fired. In Fig. 7b, an uncorrected target field image is shown immediately after the shot, in which the center of the reticle 60 is shown with respect to an impact point 80 where the projectile passes by the intended target 70 (i.e., the X shows the impact position or the point where the projectile passed by the intended target in the two dimensional image of a projectile monitored by the gun sight).

[0056] Fig. 7c is the corrected image from Fig. 7b. To make the correction, the system of the present invention 10 moves the image field 42 placement on the display screen so that the point of impact or the point 80 (Fig. 7b) where the projectile passed by the intended target 70 of the last projectile fired is aligned with the center of the reticle 60. Once so positioned, a user firing his second shot (Fig. 7c) can aim the gun at the center of the target 70. The position of the image has been shifted to account for the deviation in the projectile path caused by factors such as humidity, distance, wind, barometric pressure, etc. Therefore, aiming the gun at the center of the "viewed, shifted" target will cause the fired projectile to strike the spot 80 at which the user was aiming. In an alternate embodiment, a cursor can show how far the impact position of the prior projectile has been shifted in the image field.

[0057] Turning now to Fig. 6, a flowchart is shown that helps to illustrate the operation of the device. Flowchart box 600 comprises the first step in the process, wherein the gun fires its projectile. Box 600 contemplates the shot fired as the first shot that the user takes at the target 70.

[0058] Turning now to box 610, the first decision point occurs when a determination is made as to whether the projectile hit within the target area 42. This is determined through the camera, which is taking pictures of the target area so that the device 10 can get a fix on the spot 80 impacted by the projectile. These images are forwarded to the processor 50 for processing. The results of these captured and processed images can be displayed on the video display 40, wherein the user can make a visual determination of whether the projectile hit the object 70 within the target area 42 that the user can see.

[0059] If the projectile did hit something within the target area 42, the next decision box 620 seeks to determine whether the projectile hit the actual target 70.

[0060] A determination of whether the projectile hit the target 70 begs the decision of whether an additional shot is necessary. If the projectile hit the target 70, as shown in box 630, there is no need to continue the procedure by taking a second shot, since the target 70 has been hit. Since the target has been hit, and there is no need for a second shot, there is no need to adjust the relative positions of the reticle 60 and the target 70. Even if the user decides to take a second shot, the fact that the projectile hit the target suggests that no further adjustment is necessary between the position of the reticle 60 and the target 70.

[0061] On the other hand, if the projectile did not hit the target, as shown at box 632, the processor goes through its calculations to determine the difference in position between the point at which the rifle was aimed and the point at which the projectile hit (whatever it hit) to make an adjustment in the relative position of the reticle 60 and target 70. The adjustment is made so that on the second shot, the user can sight the weapon directly on the target and hit the target, since the deviation in the projectile's path will be taken into account and adjusted for when resetting and adjusting the relative positions of the reticle 60 and target 70.

[0062] Turning back to decision box 610, if the projectile did not hit within the target area, the processor 50 and camera 30 will then have no impact point to capture images of, record, and process in the processor 50.

[0063] As there is no image of the place where the projectile hit, the processor is then employed to calculate the projectile path. As described above, the projectile path is calculated by mathematically processing the image of the projectile that is shown in the images captured by the camera 30 during the time after the projectile is fired, until such time as either the projectile hits its impact point or some other predetermined time has passed.

[0064] The above is shown at decision box 634. The next decision box 636 asks the question of whether the projectile path is aligned with the target. If the projectile path is aligned with the target 70, it is highly likely that the projectile hit the target, but that the impact mark made by the projectile is not visible or recognizable by the camera 30 and processor 50. In that case, one moves to box 638, which stops the process, as there is no need for adjustment.

[0065] Since the target 70 was likely hit by the projectile, there likely is no need to adjust for a second shot. However, even if a second shot is desired, the fact that the projectile likely hit the target 70 suggests that the current alignment will serve well to enable the user to hit the target with a second shot, since there exists relatively little or no deviation between the target sighted in the reticle and the point impacted by the projectile.

[0066] It will be appreciated that this scenario could also describe the second projectile fired by the weapon. For example, if the user fired the rifle the first time, and the projectile hit within the target area 42 but did not hit the target 70, the processor would be required to readjust the sight, as shown at box 632. Assuming this adjustment was made, the gun, on firing the second time, could have launched the projectile along a path that enabled the projectile to hit the target, although the projectile impact spot was not seen. This would then suggest that the adjustment made at box 632 was a correct adjustment, and that any further shot (if so desired) could be made as the target was properly "sighted in".

[0067] On the other hand, if the projectile path did not align with the target, one then arrives at the decision point of box 640. At such a point, the processor 50 readjusts the relative position of the reticle 60 and the image so that the user, on a subsequent shot, can sight the target such that it is in the middle of the reticle, thereby hitting the target with the deviations in projectile path already accounted for through the processor and alignment.
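The decision sequence walked through above, boxes 610 through 640, can be condensed into a small predicate. A hedged Python sketch of one reading of the Fig. 6 logic, not code from the disclosure:

```python
def needs_adjustment(hit_in_field: bool, hit_target: bool,
                     path_aligned: bool) -> bool:
    """Condensed Fig. 6 decision sequence: adjust the reticle/image
    alignment only when the shot neither struck the target nor flew
    along a path aligned with it."""
    if hit_in_field:
        return not hit_target        # boxes 620 / 630 / 632
    return not path_aligned          # boxes 634 / 636 / 638 / 640
```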

[0068] In an alternate embodiment, a cursor can be shown in the image field to indicate the prior shot, a series of shots, or a tracer pattern. Software and systems for tracking a target in a video monitor are used extensively in weapons systems. These include Cursor On Target or "CoT" technologies, mapping technologies, global positioning systems, etc., and can be used to monitor multiple targets, multiple weapons and projectile tracking histories. Various software and hardware systems have been developed, some of great sophistication and expense, e.g., U.S. Patent 5,686,690. Although good at what they do, such systems still require significant training for use, are quite bulky and/or heavy, etc. While it is possible to have a gun mount that would automatically adjust azimuth and elevation to fix on a target, this is impractical for maximum individual mobility.

[0069] While such prior art systems are impractical, aspects of the target sighting and tracking technology incorporated into the prior art can be applied by one of skill in the art without undue experimentation in creating a weapon sight and weapon system in accordance with the present invention. For example, technologies for moving an image with respect to a point in an image field are known in other, non-related, non-analogous applications such as in Internet mapping programs. In such programs, moving a cursor over a map causes the image to be re-centered with respect to the cursor.

[0070] In the alternative, an image can be viewed from a fixed point while the image is moved with respect to the fixed point. Image processing and Graphical User Interface (GUI) technology is included in a wide variety of commercially available computing systems, and video cameras, even low cost models, include editing capabilities that allow for the superimposition of markings.

[0071] Use of the present invention with different weapons can be accomplished by placing a weapon in a fixed mount, establishing a firing monitor on the weapon to detect when the weapon is fired, and measuring the displacement associated with firing under different conditions and with different ammunition. While an image data gathering device can be fixed to the weapon or placed in a known position with respect to the weapon, processing of the data therefrom can be done remotely.

[0072] Data can be transmitted to a processor wirelessly, and more than one image data gathering device may be used so that the track of a projectile can be better monitored. For example, an ultra high definition, high speed camera can be used to collect image data, and this data used in accordance with the embodiments described above. A second such camera could be used to help provide depth of field and to help calculate distance to target. Further, the present invention can be used with technologies that enhance human vision, such as infrared imaging, thermal imaging, filtering, etc.
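Two cameras a known distance apart make the distance-to-target calculation a standard triangulation: range is proportional to the baseline and inversely proportional to the pixel disparity between the two views. An illustrative sketch under those textbook assumptions; the specific focal length, baseline, and disparity values are not from the disclosure:

```python
def stereo_range(focal_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Stereo triangulation: two cameras a known baseline apart see the
    target shifted by disparity_px pixels between their images, giving
    range ~ focal_px * baseline_m / disparity_px."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For example, a 1000-pixel focal length and a 0.5 m baseline with a 5-pixel disparity place the target at roughly 100 m.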

[0073] As is apparent from the foregoing specification, the invention is susceptible of being embodied with various alterations and modifications which may differ particularly from those that have been described in the preceding specification and description. It should be understood that I wish to embody within the scope of the patent warranted hereon all such modifications as reasonably and properly come within the scope of my contribution to the art.