Title:
TARGET TRACKING SYSTEM
Document Type and Number:
WIPO Patent Application WO/2022/225598
Kind Code:
A1
Abstract:
A system and method for tracking a target. An acquisition and track sensor (ATS) includes a wide field of view (FOV) camera system continuously capturing an image of the target. The ATS can be a mid-wave infrared tracker. The ATS identifies the target for tracking and cues the target for tracking. A fine track sensor (FTS) includes a narrow FOV camera system. The FTS can be a short-wave infrared tracker. The FTS continuously re-centers the narrow FOV around the target after the target is cued by the ATS. The ATS and FTS include independent steering mechanisms for respective camera systems.

Inventors:
BRAUNREITER DENNIS (US)
FREEMAN MATTHEW (US)
DOUGLAS DAVE S (US)
BONTRAGER PAUL (US)
MORALES ANIBAL (US)
Application Number:
PCT/US2022/016647
Publication Date:
October 27, 2022
Filing Date:
February 16, 2022
Assignee:
RAYTHEON CO (US)
International Classes:
G01S3/786; F41G3/16; F41G5/08; F41G7/22; F41H13/00; G06T7/246; H04N5/232
Domestic Patent References:
WO2020008465A12020-01-09
Foreign References:
US20070075182A12007-04-05
US20210103056A12021-04-08
US5341142A1994-08-23
EP2738513A12014-06-04
Other References:
VAHEY MICHAEL ET AL: "Parallel processing architectures for image processing systems", PROCEEDINGS OF SPIE; [PROCEEDINGS OF SPIE ISSN 0277-786X VOLUME 10524], SPIE, US, vol. 10279, 25 April 1995 (1995-04-25), pages 102790E - 102790E, XP060094360, ISBN: 978-1-5106-1533-5, DOI: 10.1117/12.204205
Attorney, Agent or Firm:
MARAIA, Joseph M. et al. (US)
Claims:
CLAIMS

WHAT IS CLAIMED IS:

1. A target tracking system for tracking a target comprising: an acquisition and track sensor (ATS) including a wide field of view (FOV) camera system continuously capturing an image of the target, the ATS configured to identify the target for tracking and cue the target for tracking; and a fine track sensor (FTS) including a narrow FOV camera system, wherein the FTS continuously re-centers the narrow FOV camera system around the target after the target is cued by the ATS, wherein the ATS and FTS include independent steering mechanisms for respective camera systems.

2. The system of claim 1, wherein: the target tracking system is a high energy laser (HEL) tracking system; the ATS is a mid-wave infrared (MWIR) tracker; and the FTS is a short-wave infrared (SWIR) tracker.

3. The system of claim 2, wherein the SWIR tracker includes: a centroid tracker configured to filter clutter and noise from the images, determine a centroid of the target, and generate a plurality of recursive reference images of the target based on the images; and a correlator tracker to determine a shift between successive recursive reference images of the target and generate an offset for an aimpoint for the target.

4. The system of claim 2, further comprising: a target state estimator, the target state estimator configured to estimate a current state of the target by fusing data from the correlator tracker, the centroid tracker, and the MWIR, wherein recursive reference images are updated based on the estimated current state of the target.

5. The system of claim 2, wherein: the SWIR tracker includes a high rate SWIR camera system and a low rate SWIR camera system; and the MWIR camera system includes a low rate MWIR camera system.

6. The system of claim 1, wherein the FTS is configured to track a plurality of features of the target, including a plurality of segments of the target.

7. The system of claim 6, wherein: the system is a high energy laser (HEL) tracking system including a HEL beam; and the system is configured to point the HEL beam to the cued target based on the FTS.

8. The system of claim 7, wherein the system is configured to run a battle damage assessment mode, wherein the system is configured to: detect when, after the target is struck by the HEL beam, kinematics of the target deviate from expected dynamics of the target; detect when one or more of the segments of the target become spatially separated from the target and determine target damage; and provide a target damage assessment indicator based on the target damage determined.

9. The system of claim 8, wherein: the system is configured to reassess the target after the HEL beam is turned off to determine the target damage, the target damage based in part on a distance between spatially separated segments of the target.

10. The system of claim 1, wherein the ATS is configured to cue the target for tracking when the target features meet a plurality of predetermined criteria, including: target size, target shape, target velocity, signal-to-noise ratio, and an amount of time appearing.

11. A method of tracking a target with a target tracking system comprising: operating an acquisition and track sensor (ATS) including a wide field of view (FOV) camera system to continuously capture an image of the target, identify the target for tracking, and cue the target for tracking; and operating a fine track sensor (FTS) including a narrow FOV camera system to continuously re-center the narrow FOV around the target after the target is cued by the ATS, wherein the ATS and FTS include independent steering mechanisms for respective camera systems.

12. The method of claim 11, wherein: the target tracking system is a high energy laser (HEL) tracking system; the ATS is a mid-wave infrared (MWIR) tracker; and the FTS is a short-wave infrared (SWIR) tracker.

13. The method of claim 12, wherein operating the FTS includes: operating a centroid tracker to filter clutter and noise from the images, determine a centroid of the target, and generate a plurality of recursive reference images of the target based on the images; and operating a correlator tracker to determine a shift between successive recursive reference images of the target and generate an offset for an aimpoint for the target.

14. The method of claim 12, further comprising: estimating, with a target state estimator, a current state of the target by fusing data from the correlator tracker, the centroid tracker, and the MWIR; and updating the recursive reference images based on the estimated current state of the target.

15. The method of claim 12, wherein: the SWIR tracker includes a high rate SWIR camera system and a low rate SWIR camera system; and the MWIR camera system includes a low rate MWIR camera system.

16. The method of claim 11, further comprising tracking a plurality of features of the target with the FTS, including a plurality of segments of the target.

17. The method of claim 16, wherein the system is a high energy laser (HEL) tracking system including a HEL beam, the method further comprising: pointing the HEL beam at the target, after the target is cued, based on the SWIR tracker.

18. The method of claim 17, further comprising: running the system in a battle damage assessment mode, wherein the system does the following: detects when, after the target is struck by the HEL beam, kinematics of the target deviate from expected dynamics of the target; detects when one or more of the segments of the target become spatially separated from the target; determines target damage; and provides a target damage assessment indicator based on the target damage determined.

19. The method of claim 18, further comprising: turning the HEL beam off; reassessing the target after the HEL beam is turned off to determine the target damage, the target damage based in part on a distance between spatially separated segments of the target.

20. The method of claim 11, wherein the ATS cues the target for tracking when the target features meet a plurality of predetermined criteria, including: target size, target shape, target velocity, signal-to-noise ratio, and an amount of time appearing.

Description:
TARGET TRACKING SYSTEM

FIELD OF THE TECHNOLOGY

The subject disclosure relates to object detection and more particularly to target tracking using high energy laser (HEL) systems.

BACKGROUND OF THE TECHNOLOGY

High energy laser (HEL) weapons systems require a fine tracking system to point the laser beam accurately on a target. A wide area mid-wave infrared camera system is required to acquire and track the target, either autonomously or assisted by the operator, to cue and point the fine tracking system by closing a track loop around a gimbaled sensor. The wide area camera field of view contains other targets of interest, which may be masked by background clutter if they are small (as in the case of mortars and UAS) and which possess varying illumination conditions. The wide area tracking system must continuously track during the HEL engagement for re-acquisition, battle damage assessment, and acquisition of other targets once the primary target is destroyed. Targets under track will maneuver, emit HEL beam heating effects, have varying illumination, and break apart.

Current infrared (IR) and visible tracking algorithms and systems are designed for either open loop or closed loop tracking of targets, but they are not optimal for HEL mission requirements. For example, while the target is heating, image artifacts appear and obscure the target shape, affecting computer vision and/or traditional correlation tracker techniques. Although these trackers can track multiple targets, they do not track multiple pieces of a target that become spatially separated and move independently as the target breaks apart during destruction. This makes it impossible to accurately assess damage to a target. As such, these trackers are designed to track a target generally, but are not equipped to track the target in a way that allows for an indication of battle damage or target kill.

Further, engaging fast moving targets such as rockets and mortars requires the HEL weapon system to automatically acquire, track, and provide a target aimpoint to point the HEL beam. The target under track will maneuver, be relatively stationary or moving in a cluttered background, emit HEL beam heating effects, and have varying illumination passively or from an active illuminator, all of which impact correlation tracker performance by integrating negative effects into the target signature used for correlation, causing track break lock. In the case of mortars and rockets, the target can be quite small in pixel dimensions, which limits the use of traditional computer vision techniques that require larger target dimensions. Most significantly, as the target heats up from HEL effects, the hot spot of the target will cause traditional tracking techniques to lose lock on target features and track the hot spot created by the HEL beam, causing track drift and eventual break lock on the target aimpoint. Current semi-active or passive target imaging trackers are based on frame-to-frame correlation to an integrating or fixed target reference and will break track lock in cluttered backgrounds and under target aspect change. Break lock occurs from feeding back corrupted target information to the integrated target reference, which corrupts the correlation measurement between the target and the reference.
Some approaches to mitigate the negative effects in the target reference update include SNR-based thresholding of the target prior to integration in the reference, as well as methods to estimate the rotation and scale of the target from the image data using computer vision (CV) techniques. Most CV techniques, however, are not suitable for small targets and dynamic intensity variations, and cause track break lock and corrupted track estimates. Further, break lock occurs when the hot spot from HEL heating biases the tracked target position and causes the tracker to lose lock on aimpoint features, which eventually drifts the tracked position off of the target. Others have attempted to solve this problem by eliminating the hot spot from the image or finding a track position on the target that is not near the expected heating point on the target. The approaches that eliminate a hot spot from the image do not account for temporal effects such as hot spot size growth and intensity increase, or dynamic range compression and saturation due to intensity increases, all of which cause tracked point corruption and break lock. Approaches that utilize target features offset from the hot spot are limited and can be inconsistent in location and SNR, further causing break lock during HEL heating.

HEL weapon systems require precision and very high speed target aimpoint tracking. Precision tracking approaches including centroid based tracking and correlator based tracking are known for performance and robustness in clutter and noise. A correlation tracker estimates the shift between a reference map of the target and the current input image. These shifts provide pointing to filtering algorithms for a variety of system laser beam pointing and tracking functions. The most efficient computational implementation for estimating the cross correlation of a target and reference map is done with a 2D FFT. For state of the art HEL tracking applications, the target area is typically separated from the background to reduce background clutter from corrupting the estimate. The size and shape of the target can change over the duration of the engagement. The 2D FFT sizes used in implementation must be adapted to the target size to minimize computation time and latency for high-speed image tracking and subsequent laser beam pointing. The FFT correlator needs to execute the 2D FFT operation in real time in a processing-constrained real time device at very high execution (frame) rates. Extra execution time impacts the algorithm processing timeline and prevents the tracking mission.

Prior attempts have accepted the execution time penalty of creating a size-changing 2D FFT plan in real time as an unavoidable cost, or selected a non-optimal predicted plan size to minimize the plan creation cost. Other approaches create oversized FFTs that cover all the incoming signal sizes, with the results extracted after execution. Still other attempts have created a series of plans based on conjecture and selected the closest one, without guarantees for optimal size or execution. Finally, approaches have reduced the computational load of the correlator using binary or thresholded representations of the targets, which are susceptible to noise and clutter.

SUMMARY OF THE TECHNOLOGY

In at least one aspect, the subject technology relates to a target tracking system for tracking a target. The system includes at least one imaging system configured to continuously capture a plurality of images of the target.
The system includes a centroid tracker configured to run a first tracker mode to filter clutter and noise from the images, determine a centroid of the target, and generate a plurality of recursive reference images of the target based on the images. The system includes a correlator tracker configured to run a second tracker mode to determine a shift between successive images of the target and generate an offset for pointing the target tracking system at the target. The correlator tracker utilizes the recursive reference images when the first tracker mode is running. The system is configured to track the target by running the first tracker mode and the second tracker mode simultaneously and monitoring the tracker modes for a failure. When a failure is detected in one of the tracker modes, the system runs only the tracker mode where no failure was detected.

In some embodiments, the system includes a high energy laser (HEL) adjustment module. The HEL adjustment module is configured to run two stages. A first stage detects changes in intensity within a spatial pattern of the target within the image due to heating effects to identify a HEL affected area. After the change in intensity is detected, a second stage turns a HEL adjustment on to remove the heating effects within the spatial pattern of the target within the image. In some cases, when the HEL adjustment is on, the system is further configured to track the target using only areas of the target not within the HEL affected area. In some embodiments, the system is further configured to update the recursive reference image based on the HEL affected area. In some cases, when the HEL adjustment module is on, the system is configured to continuously monitor a peak intensity of the target in the image and update parameters of the system based on the peak intensity of the target in the image. In some embodiments, when the HEL adjustment module is on, the imaging system is configured to adjust sensor gains based on the peak intensity of the target in the image to keep the peak intensity of the target in the image within a predetermined range.

In some embodiments, the correlation tracker runs at a relatively high speed with respect to a speed of the centroid tracker. In some cases, the system includes a target state estimator configured to estimate a current state of the target by fusing data from the correlator tracker and centroid tracker. The centroid tracker and the correlator tracker can be configured to subsequently rely on the estimated current state of the target.

In some embodiments, the system includes a line of sight manager configured to check for irregular motion of the target and determine an aimpoint for the target. When the first tracker mode and the second tracker mode are operational, the aimpoint is based on the centroid location from the centroid tracker adjusted by the shift from the correlator tracker and the update from the HEL module. The system can also include a HEL, and the line of sight manager can be configured to determine a second offset to the aimpoint for pointing the HEL. The system can then point the HEL at the target based on the second offset and the aimpoint.

In some embodiments, the system utilizes gates outlining boundaries within the image within which the system looks for the target. The system can continuously resize the gates based on a shift between successive recursive reference images. In some cases, the gates comprise two concentric areas.
The first concentric area is a buffer around an estimated extent of the target size used to determine the target position. The second concentric area is an annulus around the first concentric area. The system can estimate background based on the second concentric area.

In at least one aspect, the subject technology relates to a method of tracking a target with a target tracking system. The method includes continuously capturing, with at least one imaging system, a plurality of images of the target. The method includes running a first tracker mode using a centroid tracker, the first tracker mode including filtering clutter and noise from the images, determining a centroid of the target, and generating a plurality of recursive reference images of the target based on the images. The method also includes running a second tracker mode using a correlator tracker, the second tracker mode including determining a shift between successive images of the target and generating an offset for pointing the target tracking system at the target, wherein the correlator tracker utilizes the recursive reference images when the first tracker mode is running. Finally, the method includes tracking the target with the system by running the first tracker mode and the second tracker mode simultaneously and monitoring the tracker modes for a failure, and, when a failure is detected in one of the tracker modes, running only the tracker mode where no failure was detected.

In some embodiments, the method includes running a high energy laser (HEL) adjustment module in two stages. A first stage includes detecting changes in intensity within a spatial pattern of the target within the image due to heating effects to identify a HEL affected area. After the change in intensity is detected, a second stage includes turning a HEL adjustment on to remove the heating effects within the spatial pattern of the target within the image. In some cases, after turning the HEL adjustment on, the method includes tracking the target using only areas of the target not within the HEL affected area. The recursive reference image is then updated based on the HEL affected area. In some embodiments, after turning the HEL adjustment on, the method includes continuously monitoring a peak intensity of the target in the image and updating parameters of the system based on the peak intensity of the target in the image. In some cases, after turning the HEL adjustment on, the method includes adjusting sensor gains of the imaging system based on the peak intensity of the target in the image to keep the peak intensity of the target in the image within a predetermined range.

In some embodiments, the correlation tracker runs at a relatively high speed with respect to a speed of the centroid tracker. The method can include estimating, with a target state estimator, a current state of the target by fusing data from the correlator tracker and centroid tracker. The centroid tracker and the correlator tracker can also be configured to subsequently rely on the estimated current state of the target. In some embodiments, the method includes checking for irregular motion of the target and determining an aimpoint for the target with a line of sight manager. When the first tracker mode and the second tracker mode are operational, the aimpoint is based on the centroid location from the centroid tracker adjusted by the offset from the correlator tracker and the update from the HEL module.
The method can further include determining a second offset to the aimpoint for aiming a HEL, and aiming the HEL at the target based on the second offset and the aimpoint. In some embodiments, the method includes determining a plurality of gates outlining boundaries within the image within which the system looks for the target. The gates are then continuously resized based on a shift between successive recursive reference images. In some cases, the gates include two concentric areas. The first concentric area is a buffer around an estimated extent of the target size used to determine the target position. The second concentric area is an annulus around the first concentric area. The system can estimate background based on the second concentric area.

In at least one aspect, the subject technology relates to a target tracking system for tracking a target with an ATS and FTS. The ATS includes a wide field of view (FOV) camera system continuously capturing an image of the target. The ATS is configured to identify the target for tracking and cue the target for tracking. The FTS includes a narrow FOV camera system. The FTS continuously re-centers the narrow FOV camera system around the target after the target is cued by the ATS. The ATS and FTS include independent steering mechanisms for respective camera systems.

In some embodiments, the target tracking system is a high energy laser (HEL) tracking system. The ATS can be an MWIR tracker and the FTS can be an SWIR tracker. The SWIR tracker can include a centroid tracker configured to filter clutter and noise from the images, determine a centroid of the target, and generate a plurality of recursive reference images of the target based on the images. The SWIR tracker can also include a correlator tracker to determine a shift between successive recursive reference images of the target and generate an offset for an aimpoint for the target. In some embodiments, the system can include a target state estimator. The target state estimator can be configured to estimate a current state of the target by fusing data from the correlator tracker, the centroid tracker, and the MWIR, and the recursive reference images can be updated based on the estimated current state of the target. In some cases, the SWIR tracker includes a high rate SWIR camera system and a low rate SWIR camera system and the MWIR camera system includes a low rate MWIR camera system.

In some embodiments, the FTS is configured to track a plurality of features of the target, including a plurality of segments of the target. The system can be a HEL tracking system and include a HEL beam. The system can then point the HEL beam to the cued target based on the FTS. In some embodiments, the system is configured to run a battle damage assessment mode. In the battle damage assessment mode, the system detects when, after the target is struck by the HEL beam, kinematics of the target deviate from expected dynamics of the target. In the battle damage assessment mode, the system also detects when one or more of the segments of the target become spatially separated from the target and determines target damage. In the battle damage assessment mode, the system then provides a target damage assessment indicator based on the target damage determined. In some cases, the system is configured to reassess the target after the HEL beam is turned off to determine the target damage, the target damage being based in part on a distance between spatially separated segments of the target.
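As a minimal sketch of how a battle damage assessment indicator could be derived from the cues described above, the code below combines a kinematic-deviation check with the maximum separation between tracked target segments. The segment centroids, velocity tolerance, and separation threshold are illustrative assumptions, not values from the application.

```python
import numpy as np

def damage_indicator(segment_centroids, expected_velocity, measured_velocity,
                     separation_threshold_px=20.0, velocity_tolerance=0.25):
    """Illustrative battle damage assessment sketch.

    segment_centroids: (N, 2) array of tracked segment centroids (pixels).
    expected_velocity / measured_velocity: (2,) pixel-rate vectors.
    Returns the two cues named in the text: kinematic deviation and the
    maximum spatial separation between tracked segments.
    """
    # Kinematic cue: deviation of measured motion from the expected dynamics.
    deviation = np.linalg.norm(measured_velocity - expected_velocity)
    kinematic_anomaly = deviation > velocity_tolerance * np.linalg.norm(expected_velocity)

    # Structural cue: largest pairwise distance between tracked segments.
    max_separation = 0.0
    for i in range(len(segment_centroids)):
        for j in range(i + 1, len(segment_centroids)):
            d = np.linalg.norm(segment_centroids[i] - segment_centroids[j])
            max_separation = max(max_separation, d)

    damaged = kinematic_anomaly or max_separation > separation_threshold_px
    return {"kinematic_anomaly": bool(kinematic_anomaly),
            "max_separation_px": float(max_separation),
            "damage_indicated": bool(damaged)}
```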
In some embodiments, the ATS is configured to cue the target for tracking when the target features meet a plurality of predetermined criteria, including: target size, target shape, target velocity, signal-to-noise ratio, and an amount of time appearing.

In at least one aspect, the subject technology relates to a method of tracking a target with a target tracking system. The method includes operating an ATS including a wide field of view (FOV) camera system to continuously capture an image of the target, identify the target for tracking, and cue the target for tracking. The method also includes operating an FTS including a narrow FOV camera system to continuously re-center the narrow FOV around the target after the target is cued by the ATS, wherein the ATS and FTS include independent steering mechanisms for respective camera systems.

In some embodiments, the target tracking system is a high energy laser (HEL) tracking system, the ATS is an MWIR tracker, and the FTS is an SWIR tracker. In some cases, operating the FTS includes operating a centroid tracker to filter clutter and noise from the images, determine a centroid of the target, and generate a plurality of recursive reference images of the target based on the images. Operating the FTS can also include operating a correlator tracker to determine a shift between successive recursive reference images of the target and generate an offset for an aimpoint for the target. In some embodiments, the method includes estimating, with a target state estimator, a current state of the target by fusing data from the correlator tracker, the centroid tracker, and the MWIR. The recursive reference images can then be updated based on the estimated current state of the target. In some cases, the SWIR tracker includes a high rate SWIR camera system and a low rate SWIR camera system, and the MWIR camera system includes a low rate MWIR camera system.

In some embodiments, the method includes tracking a plurality of features of the target with the FTS, including a plurality of segments of the target. In some cases, the system is a HEL tracking system including a HEL beam. The method can include pointing the HEL beam at the target, after the target is cued, based on the SWIR tracker. In some embodiments, the method includes running the system in a battle damage assessment mode. In the battle damage assessment mode, the method includes detecting when, after the target is struck by the HEL beam, kinematics of the target deviate from expected dynamics of the target. In the battle damage assessment mode, the method can include detecting when one or more of the segments of the target become spatially separated from the target. In the battle damage assessment mode, the method includes determining target damage and providing a target damage assessment indicator based on the target damage determined. In some embodiments, the method includes turning the HEL beam off. After the HEL beam is turned off, the target can be reassessed to determine the target damage, the target damage based in part on a distance between spatially separated segments of the target. In some embodiments, the ATS cues the target for tracking when the target features meet a plurality of predetermined criteria, including: target size, target shape, target velocity, signal-to-noise ratio, and an amount of time appearing.
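Before turning to the drawings, a rough sketch of the dual-mode behavior described in the summary above: the tracker blends the centroid and correlator outputs while both modes are healthy and regresses to the surviving mode when one reports a failure. The health flags and blending weight below are assumptions for illustration, not values from the application.

```python
def select_track_position(centroid_pos, correlator_pos,
                          centroid_ok, correlator_ok, blend=0.5):
    """Return the track position given the health of each tracker mode.

    centroid_pos / correlator_pos: (row, col) estimates from each mode.
    centroid_ok / correlator_ok: failure-monitor flags for each mode.
    blend: illustrative weighting when both modes are operational.
    """
    if centroid_ok and correlator_ok:
        # Both modes running simultaneously: blend the two estimates.
        return tuple(blend * c + (1.0 - blend) * k
                     for c, k in zip(centroid_pos, correlator_pos))
    if centroid_ok:
        # Correlator failure (e.g. correlation drift): regress to centroid mode.
        return centroid_pos
    if correlator_ok:
        # Centroid failure (e.g. missed detection): regress to correlator mode.
        return correlator_pos
    # Neither mode healthy: caller should treat this as track loss.
    return None
```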
BRIEF DESCRIPTION OF THE DRAWINGS

So that those having ordinary skill in the art to which the disclosed system pertains will more readily understand how to make and use the same, reference may be had to the following drawings.

FIG.1 is a block diagram of a target tracking environment including a target tracking system in accordance with the subject technology.

FIG.2 is a block diagram of the tracking architecture for a target tracking system in accordance with the subject technology.

FIG.3 is a block diagram of functions carried out by a fine track sensor (FTS) tracker as part of the tracking system of FIG.2.

FIG.4 is an exemplary image generated by the FTS tracker in accordance with the subject technology.

FIG.5 is a block diagram of a process of adaptively sizing and executing an FFT within the target tracking system in accordance with the subject technology.

FIG.6 is a block diagram of functions carried out by the acquisition and track sensor (ATS) tracker as part of the tracking system of FIG.2.

DETAILED DESCRIPTION

The subject technology overcomes many of the prior art problems associated with target detection systems. The advantages, and other features of the systems and methods disclosed herein, will become more readily apparent to those having ordinary skill in the art from the following detailed description of certain preferred embodiments taken in conjunction with the drawings which set forth representative embodiments of the present invention. Like reference numerals are used herein to denote like parts. Further, words denoting orientation such as “upper”, “lower”, “distal”, and “proximate” are merely used to help describe the location of components with respect to one another. For example, an “upper” surface of a part is merely meant to describe a surface that is separate from the “lower” surface of that same part. No words denoting orientation are used to describe an absolute orientation (i.e. where an “upper” part must always be at a higher elevation).

In addition to providing autocuing track functions in the ATS to cue the FTS narrow field of view camera, the tracker disclosed herein includes a multi-object trackfile incorporating laser-on detection in the FTS, multiple feature trackers within the FTS tracker to create a consistent and robust track associated with the same target, and a target state estimator fusing both FTS and ATS track outputs for accuracy and track consistency. The system also includes a mode selection that turns on HEL-resistant tracker features in the ATS, including gate size lock and gain control on system and algorithm processing. An example would be to switch to a correlation-only track or centroid-only track once the target features are obscured by HEL heating effects. A third feature is a battle damage assessment mode that detects when the target kinematics deviate from the expected dynamics, as well as when measured features displace in an unexpected manner. The FTS tracker also runs continuously in the background and re-centers the field of view of the narrow tracking sensor when track is lost. This is also an indicator for battle damage assessment. The system can track various segments of the target, and determine damage to the target in part based on a distance between separated segments of the target. Notably, where used herein, the ATS may be a mid-wave infrared (MWIR) tracker while the FTS may be a short-wave infrared (SWIR) tracker.
However, in some cases, the ATS and FTS can generally operate in an EM waveband of optical sensors outside of SWIR or MWIR. The system includes an intra-target tracker that tracks multiple features and dynamics of the target, instead of multiple targets, to assess battle damage impact; a target state estimator that fuses the wide field of view and narrow field of view track sensor outputs to assess track performance and provide indicators for battle damage assessment; an adaptive learning system that selects the tracker type depending on the state of the HEL effects during lase, including the ability to switch on and off different track features that are more resistant to HEL effects on the tracker output; and adaptive clutter filtering that removes background for both stationary and moving targets.

The tracker disclosed herein uses both correlation and centroid tracking algorithms, with both modes running simultaneously and regression to a single mode if one fails. The centroid tracker has both clutter and noise filtering to aid target segmentation for recursively updating the reference template for the correlation tracker, preventing correlation tracker drift. The correlation tracker then provides the high speed offset for laser beam pointing using the recursive image updated by the centroid tracker. The recursive image update adapts for HEL heating effect mitigation and illumination variations. Automatic aimpoint selection is done based on generalized estimates of target shape from a library of aimpoint locations using a computer. The tracker system disclosed herein includes interleaved and parallel centroid, target reference, and high speed correlation trackers to prevent track drift and provide robustness to track loss from artifacts (clutter and noise), and can default to either mode. The system tracks small, large, and overfilled targets in the field of view. The system utilizes clutter and noise filtering using bandpass and spatial noise adaptive filters, background statistical estimates for adaptive thresholding, position feedback from predicted target dynamics, and dynamically updated spatial target gates. The system addresses operation with zero-velocity and moving targets. A recursive target reference is dynamically updated for correlation tracking and aimpoint for target maneuvers and/or aspect changes. The system uses automatic gain learning for the target mask reference integration used in correlation to compensate for intensity variations from illumination and HEL heating effects. A line of sight manager optimally blends centroid and correlation track positions, integrates aimpoint offsets, and detects errors in track position. Aimpoint estimation is carried out by morphologically filtering target feature estimates without requiring stored target models of target aspect changes. The target reference tracker keeps the target reference position centered to keep the laser on boresight and minimize residual tracker drifts.

The tracker disclosed herein also accounts for the HEL heating effects, applying an algorithm which utilizes two stages. The first is a “HEL ON” detector (or HEL detection module) that detects intensity changes from heating effects and sends a flag to the system and the tracker algorithm that the HEL is ON.
Once the HEL is detected as ON (i.e. HEL effects are identified on the target), the HEL ON detector notifies the imaging system, and the imaging system can use this information to adjust sensor gains to ensure that saturation effects are minimized by keeping the peak intensity of the target within the range of the A/D converter for as long as possible. The second part of the algorithm then cores out the HEL affected area on the target, and uses the non-HEL-heated portion of the target for tracking, including in the recursive reference. The peak intensity of the target after filtering while tracked is continuously monitored, and the tracker algorithm parameters adaptively update to mitigate any residual heating effects on the tracker. Therefore, the subject technology includes a method that integrates (either at a system level and/or an algorithm level) a process and/or algorithm that detects the presence of HEL heating effects and adaptively adjusts the tracker algorithm and system parameters to be resistant to those effects. Additionally, the system disclosed herein provides adaptive spatial filtering techniques that remove the spatial target heating effects in the target spatial pattern to maintain track, without drift, and retain track lock by tracking on target features that are not affected. The adaptive spatial filtering techniques and temporal detection of the HEL beam integrate together for a complete closed loop HEL heating effects mitigation algorithm solution for tracking.

Further, optimized FFT libraries require an a priori computational path, i.e. a plan that contains the execution profile, memory layout, and sizes of transforms, so that optimal paths are followed for the particular FFT operation and size. The planning step is process intensive, and creating plans in a real time loop is prohibitive. This negative impact increases when FFT sizes change to adapt to FFT correlator needs on a frame-to-frame basis, requiring different plans for each size change. Every plan created also needs to be deconstructed, imposing additional time penalties. The system disclosed herein solves the problem of creating and using optimal 2D FFT plans for dynamically changing transform sizes, with minimal impact on the execution time of the correlator, by using an algorithm that exhaustively searches for optimal sizes, automatically selecting the most appropriate FFT sizes outside of the main execution loop. To that end, the subject technology includes an algorithm that exhaustively searches prime power factors (PPF) and powers of 2 (PO2), mathematically yielding optimally sized FFTs for all sizes. The algorithm automatically selects the 2D FFT sizes most appropriate for the two dimensions of the input signal, outside of the main execution loop, and creates all the necessary data structures for use during real time operation. The algorithm can also be tuned and optimized for any 2D FFT implementation, including hardware, software, and firmware. Further, the algorithm can be extended to 3D, 4D, and ND cases where the image or signal representation extends in multiple dimensions. The algorithm for automatically selecting 2D FFTs can also be applied as part of other known processes which employ 2D FFTs, and can be used in other environments where FFTs are applied.

Referring now to FIG.1, a target tracking environment is shown. A target 100 is being launched from a target launch area 102. The target 100 can be a military target, such as a mortar, missile, drone or the like, or a non-military projectile.
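Before walking through the figures, a loose sketch of the FFT plan pre-sizing idea described above: acceptable transform lengths are computed once, outside the real-time loop, and the current gated target size is rounded up to the nearest pre-planned length. The choice of primes 2, 3, 5, and 7 and the size cap are assumptions for illustration, not the application's specific search.

```python
import numpy as np

def fast_fft_sizes(max_size=4096, primes=(2, 3, 5, 7)):
    """Precompute FFT lengths that factor entirely into small primes."""
    sizes = []
    for n in range(2, max_size + 1):
        m = n
        for p in primes:
            while m % p == 0:
                m //= p
        if m == 1:
            sizes.append(n)
    return np.array(sizes)

# Built once, outside the real-time tracking loop.
PLANNED_SIZES = fast_fft_sizes()

def planned_fft_shape(gate_rows, gate_cols):
    """Round the current gated target area up to pre-planned FFT dimensions."""
    i = min(np.searchsorted(PLANNED_SIZES, gate_rows), len(PLANNED_SIZES) - 1)
    j = min(np.searchsorted(PLANNED_SIZES, gate_cols), len(PLANNED_SIZES) - 1)
    # Requests larger than the cap are clamped to the largest planned size.
    return int(PLANNED_SIZES[i]), int(PLANNED_SIZES[j])
```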
A HEL target tracking system 104 in accordance with the teachings herein seeks to acquire the target 100 during flight, and damage or destroy the target 100 using a HEL beam. The tracking blocks shown herein and described below represent steps of tracking the target 100 during the target's trajectory. The system 104 includes a target cueing system 106 and a beam control system 108 working in concert. The target cue at any given time is the presentation of internal position “truth” as given to the tracker system 104 from external sources. This can be a one-time update event or can be provided continuously to the system 104. Therefore, target cues are used as “truth” input (measurement updates) relied on to build target tracks, as will be discussed in more detail herein. Targets can be cued after a number of features are observed and meet predetermined criteria, including a predetermined target size, target shape, target velocity, signal-to-noise ratio, and amount of time (i.e. time appearing in the images). The target cueing system 106 can utilize an ATS (e.g. an MWIR) tracking system while the beam control system 108 utilizes an FTS (e.g. an SWIR) tracking system, as discussed in more detail below.

The target cueing system 106 initially seeks out the target 100, rejecting clutter and classifying the target 100 to initiate the target track with a radar track 110 at block 112. After a radar handover to the beam control system 108, the beam control system 108 begins a low bandwidth gimbal track 114 and detects the target 100 with a wide field of view (WFOV) acquisition and track sensor at block 116. The beam control system 108 then pulls in and centers the target at block 118. At block 120, the acquisition and track sensor then determines edge and size features for tracking. The beam control system 108 then begins a high bandwidth precision track 122. During the precision track 122, the system 108 uses a narrow FOV track using a fine tracking sensor (FTS). The beam control system 108 classifies the target using FTS image features, kinematics, an acquisition and track signature, and LOS geometry, at block 124. At block 128, the FTS centers the target, determines a target ID, and selects an aimpoint on the target. Optionally, at block 130, beacon illuminator (BIL) atmospheric compensation can be applied. In the final block 132, the beam control system 108 can engage the target 100 with a HEL based on the aimpoint determined. Further, FTS aimpoint maintenance can be carried out at block 132 as the HEL engages the target 100, including target jerk detection, FTS shape change detection, FTS debris field separation, and compensation for HEL effects on the target 100. The functions of the system 104 to effectively track the target 100 are described in more detail herein.

Referring now to FIG.2, a block diagram showing the architecture of a tracking system 200 in accordance with the subject technology is shown. The tracking system 200 can be used to track a target for engaging with a HEL, in accordance with the system 104 discussed previously. The system 200 includes an ATS track system 202 which initially looks for, and finds, the target, similar to the target cueing system 106. Once the target is located in the ATS tracker, the ATS system 202 centers the target at a boresight of the system using a gimbal controller 224.
Tracking and closed loop tracking sensor pointing is passed to an FTS, or SWIR, tracker for high speed, precision tracking over a relatively small field of view, since the general location of the target is already known. Fine sensor pointing is controlled by the FSM controller 246. At a high level, inputs are provided to the FSM controller 246, which then manages the line of sight at high rates (tens of kilohertz). The FSM controller 246 provides target angle positions through the high speed FTS controller. The system 200 can include a mission processor unit (MPU) 248, which includes processing designed to control the actions of the system 200 based on a current operation objective (e.g. target elimination) and other beam director states and modes needed for particular engagements. The MPU 248 can include a processor and memory configured to execute instructions to carry out the processing tasks of the ATS system 202 and FTS system 204 described herein.

In particular, the ATS system 202 cues the target using a detector to generate an image filtered based on various criteria. The detector can be a Robinson filter, or another target filter or machine-learned target shape detector. Detections in the image are screened based on expected size, cue gating, and persistence in time until enough confidence is established to start a track on the target-like object. Potential targets can also be screened based on correlation to cue expectations given kinematic features. To that end, the ATS system 202 generates an ATS video 208, which can be between 60-120 Hz. The ATS system 202 can process the video 208 with a spatial filter 206, a moving target indicator (MTI) clutter suppression module 212, and a low SNR target integration module 214. The spatial filter 206 is designed to apply adaptive spatial filtering to remove spatial scene clutter that affects the ability to detect the target. This allows the ATS system 202 to maintain a target track, without drift, and retain track lock by tracking target features that are not affected. The spatial filter 206 also integrates up the signal to boost SNR as necessary to detect low SNR targets. A number of detections in the image are identified at block 216 and selectively cued as the target, at block 218, based on an association with expected target parameters, such as the position and velocity of the detection compared to the expected position and velocity of targets of interest. To identify targets from background, a default size for a target can be set, with the Robinson detector (or another spatial filter or machine-learned target detector) rejecting detections that are too small, for example. The ATS system 202 can include a Kalman filter 220 to filter the image position estimates of the target position for a smoothed tracked target location in the image. After a target is identified and determined to meet expected criteria by the ATS system 202, the target track can be initialized and the gimbal is commanded to close-loop track around the ATS system 202 target track. The tracked target location in the ATS system 202 is used as a cue to the FTS system 204 to begin high speed fine target tracking.

Referring now to FIG.6, a flowchart 600 shows the functions of the ATS system 202, from target detection at block 602 to selection of the most likely target for engagement with the HEL at block 612. The ATS system 202 screens detections for target-like objects over a predefined interval of time during the track acquisition stage, so that target tracks can be initialized.
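A minimal sketch of the kind of size, cue-gating, and persistence screening just described, under assumed thresholds (minimum pixel area, gate radius, and required number of consistent frames are illustrative, not values from the application):

```python
def screen_detections(detections, cue_position, min_area_px=4,
                      cue_gate_px=50.0, required_hits=5):
    """Screen per-frame detections before initializing a target track.

    detections: list of dicts with 'centroid' (row, col), 'area', and 'hits'
                (number of consecutive frames the detection has persisted).
    cue_position: expected target position from the external cue.
    Returns the detections confident enough to start a track.
    """
    confirmed = []
    for det in detections:
        large_enough = det["area"] >= min_area_px                  # size screening
        dr = det["centroid"][0] - cue_position[0]
        dc = det["centroid"][1] - cue_position[1]
        inside_gate = (dr * dr + dc * dc) ** 0.5 <= cue_gate_px    # cue gating
        persistent = det["hits"] >= required_hits                  # persistence in time
        if large_enough and inside_gate and persistent:
            confirmed.append(det)
    return confirmed
```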
At block 602, a number of detections are identified in a current image. At block 604, the detections from block 602 are associated in time using a consistency metric to track the position of the target in the image. Each detection in block 602 is then compared to a previous detection, and the most likely match is used to update the tracked object positions in block 606. Block 606 is used to drop unwanted target tracks, including tracks that are false alarms from clutter or tracks of targets that disappear from the field of view. Block 610 updates the kinematic parameters of the target tracks (position, velocity, and time) with estimation filtering using the Kalman filter. The ATS system 202 can use a Kalman filter with a kinematic dynamic model in pixel space for time and measurement updates. At block 604, time updates are provided for existing tracks using the Kalman prediction step, in which "F" is a state transition matrix and "x" is a 6-state estimate of the track; the covariance update uses Joseph's form for numerical stability. At block 606, the algorithm prunes target tracks whose information is outdated. New detections are associated with existing current tracks at block 608. Detections can be matched with tracks based on a feature distance measured between all pairs (not necessarily a Euclidean distance). A measurement update is then run at block 610. Finally, at block 612, the most likely track for a target is selected by comparing target tracks to the target cue, with targets being selected based on proximity to the cue. The Mahalanobis distance is used to find the number of standard deviations from the centroid of a track to the centroid of a cue, and the target track that minimizes this distance metric D is selected for a given cue. When a target track has sufficient likelihood of being associated with a target cue (based on the strength of correlation to the cue), the detection is declared a target. If no target is declared, the system defaults to a cue track, where targets are still being acquired and no target has been declared. If there is no cue, target confirmation is insufficient due to missing measurements, and the last known target track is propagated until measurements are available. Within the ATS system 202, detections are provided in azimuth and elevation coordinates in pixel space relative to a focal plane array of the imaging system. Using navigation data, the tracker can track in a coordinate frame such as ECEF (earth centered, earth fixed): the unit line-of-sight is rotated to ECEF, and pixel coordinates can be replaced with ECEF angles and passed to the FTS system 204.
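The published application presents the time update, measurement update, and Mahalanobis association as equations that are not reproduced in this text. The sketch below shows the standard forms those descriptions imply, with an assumed 6-state constant-velocity model in pixel space; it is illustrative, not the application's exact formulation.

```python
import numpy as np

def time_update(x, P, F, Q):
    """Kalman prediction: propagate the 6-state track estimate and its covariance."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def measurement_update(x_pred, P_pred, z, H, R):
    """Kalman correction using the Joseph-form covariance update for stability."""
    S = H @ P_pred @ H.T + R                      # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    I_KH = np.eye(len(x_pred)) - K @ H
    P_new = I_KH @ P_pred @ I_KH.T + K @ R @ K.T  # Joseph form
    return x_new, P_new

def mahalanobis_distance(track_centroid, cue_centroid, S):
    """Number of standard deviations between a track centroid and a cue centroid."""
    d = track_centroid - cue_centroid
    return float(np.sqrt(d @ np.linalg.inv(S) @ d))
```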
Referring again to FIG.2, the FTS system 204 disclosed herein is a high-speed target tracker responsible for steering a high-speed mirror to the target aimpoint while the target is maneuvering and/or in apparent line of sight motion due to atmospheric and platform jitter. The FSM controller 246 directs the high-speed mirror commanded by the FTS system 204, which guides the HEL to provide lethal effects on the target, similar to the beam control system 108. The high-speed FTS system 204 provides modes to acquire the target from a cue or operator-designated image location, engage in track once acquired, automatically determine an aimpoint (e.g. block 238), and then provide an offset 244 for the laser beam location around the automatically determined aimpoint through a line of sight (LOS) pointing manager utilized by the FSM controller 246 for pointing to the target.

The tracking architecture for the combined ATS and FTS tracking subsystem is broken down in detail in FIG.2. Block 226 shows the high-speed FTS tracking flow of processing from the FTS image to generating a command to the FSM for pointing the HEL beam. The functions within FIG.2 are detailed in FIG.3. The tracking modes of the FTS system 204 are broken down in detail in FIG.3. In brief overview, the FTS system 204 generates a video feed of captured images, which can be between 400-800 Hz, for example. The FTS system 204 uses correlation tracking algorithms (path 231) and centroid tracking algorithms (path 233), with both modes normally running simultaneously and regression to a single mode if one fails. For example, the system 200 may switch to a correlation-only track or a centroid-only track once the target features are obscured by HEL heating effects. Actions on the centroid tracking path 233 are described herein as carried out by a centroid tracker, while actions on the lower path are described as carried out by a correlator tracker. The centroid tracker path 233 filters the captured image with clutter filter 228. The centroid tracker path 233 can make further changes using a HEL adjustment module 230 when HEL effects are detected on the target. This is advantageous, as the heat from the HEL causes the target to heat up significantly after it is engaged, which causes high intensity outputs and large illumination variations in the detected images, making it difficult to continue tracking the target. Making adjustments to account for the HEL effects allows the HEL to effectively engage the target until the target is eliminated.

There are a number of circumstances that represent possible failures which can be observed, resulting in regression to a single tracker mode. Failure modes in the correlation tracker 300 (discussed in more detail below) include correlation drift due to target signal corruption from noise and signature variations. In this case, the centroid tracker will properly threshold the target and segment the target from the background, providing an absolute position reference on the target location. Failure modes in the centroid tracker 302 (discussed in more detail below) include missed target detections in the clutter filtering and target detection blocks in block 236 of FIG.2. In this case, the correlation tracker will continue to correlate against the previously stored target signature reference, enabling continued track updates. Failure mode examples include: (1) the HEL effect providing positive feedback to the tracker, so that the tracker tracks the HEL effect instead of the target; (2) the high speed correlator decorrelating, so that the actual track drifts away from the target; and (3) the centroid aimpoint tracking becoming highly unstable, so that track is broken.

Within the HEL adjustment module 230, the detector determines whether the target is experiencing HEL effects at block 232, based on increased intensity and illumination variations within an image. If the target is experiencing HEL effects, the HEL ON detector (block 232) triggers the system 200 to make further modifications to account for the HEL effects within the HEL ON module 234.
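A minimal sketch of the two-stage HEL adjustment just described: the first stage flags HEL ON from an abrupt intensity rise inside the target gate, and the second stage cores out the HEL-affected pixels so that only unaffected target features feed the tracker and the recursive reference. The rise factor and coring fraction are assumptions for illustration.

```python
import numpy as np

def hel_on_detector(gated_frame, baseline_peak, rise_factor=3.0):
    """Stage 1: flag HEL heating from a sudden intensity increase in the gate."""
    return float(gated_frame.max()) > rise_factor * baseline_peak

def core_out_hel(gated_frame, target_mask, core_fraction=0.8):
    """Stage 2: mask out the HEL-affected area, keeping unaffected target pixels.

    target_mask: boolean mask of the segmented target within the gate.
    Returns the mask of pixels to track on and the HEL-affected mask.
    """
    peak = gated_frame[target_mask].max()
    hel_area = target_mask & (gated_frame >= core_fraction * peak)
    track_mask = target_mask & ~hel_area
    return track_mask, hel_area
```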
In block 236, the integrated target signature used for correlation is detected with a threshold and segmented. The detected integrated target signature is used by the aimpoint algorithms in block 238 for determining the location to point the HEL beam for lethality. In particular, once the HEL ON module 234 is activated, the imaging system of the system 200 can adjust sensor gains to ensure saturation effects are minimized by keeping the peak intensity of the target within the range of the A/D converter for as long as possible. At block 240, a centroid estimate provides an absolute reference of the target location in the image, which is used for the target aimpoint and for integrating the recursive image used by the high speed correlator. The recursive reference image can be relied on in subsequent tracking, and is used as a reference template by the correlator 229 as well as the template in block 236 for target aimpoint determination. Further processing can be done, at block 242, to eliminate unwanted background noise, clutter, jitter, or the like, using an Alpha-Beta filter, Kalman filter, or other filter. The updated recursive reference image used for the aimpoint provides a high speed offset for the HEL, at block 244, which can be relied on by the FSM controller 246 for offset pointing the HEL to the target aimpoint, added to the frame-to-frame shifts provided by the centroid and correlation trackers, to engage the target. A target state estimator 222 can fuse data from the FTS system 204 (e.g. from the correlator tracker and centroid tracker) and from the ATS system 202 to ensure accuracy and consistency. The system 200 can then rely on the fused data for subsequent tracks. The system 200 continuously tracks the targets and dynamically updates the recursive target reference for correlation tracking and aimpoint as described above, ensuring the system 200 updates for target maneuvers and/or aspect changes.

The combination of the ATS system 202 and FTS system 204 allows for multiple targets to be cued simultaneously, and allows a split object track to be opened up with multiple tracks associated with the same target. This allows separate pieces of a target to be tracked even as the target is engaged by a HEL and destroyed, causing the target to break into multiple pieces. The system 200 can assess battle damage to the target when the kinematics of the target deviate from expected dynamics, and/or when features of a target are displaced in an unexpected manner. The system 200 can then provide an indicator of battle damage based on the battle damage assessment. Notably, while parts of the system 200 are shown and described herein as functional blocks or modules, it should be understood that the functions disclosed therein can be carried out using a specifically configured system designed to carry out the functions described.

Referring now to FIG.3, the functions of the FTS system 204 are shown and described in more detail. The tracking functions of the FTS system 204 are further broken into high speed and low speed target tracking paths 300, 302. These paths 300, 302 generally correspond to the FTS paths 231, 233, respectively. The high-speed tracker path 300 provides target position updates at the current frame rate to the line of sight (LOS) pointing manager 314. The high-speed tracker path 300 is the main path to provide LOS updates to a high-speed mirror directed by the FSM controller 246 when a target is being tracked.
The low speed tracker path 302 provides a periodic update to the target aimpoint, temporally integrates an image-centered target reference for correlation for the high-speed path, and detects, segments, updates, and initializes the target tracking gates. Exemplary target tracking gates can be seen in FIG.4.

Referring now to FIG.4, an image 400 of a target 406 obtained by the FTS system 204 is shown. The image 400 includes two concentric areas 402, 404 representing target gates. The first is a target area 402, which includes a buffer around the estimated extent of the target size. The second is a background estimate area 404, which is an annulus around the target area 402 used to estimate the background. The background estimation is determined during the low speed track path 302 of FIG.3. Tracker gates 402, 404 are one way in which the tracking system 200 effectively isolates the area where the target 406 is contained in the image 400 to maximize detection, provide an efficient location to search in the image space, and reject background contributions. The tracker gates 402, 404 continuously update size and location in both the high-speed and low speed interleaved tracker paths 300, 302. When no cue information is initially present, the tracker gates 402, 404 can initially be sized to include the entire image 400. Alternatively, the tracker gates 402, 404 can initially be sized from target location and/or size information, if present, from the ATS tracker 202 or other system functions.

The FTS tracker 204 also uses feature state estimators for tracking and centering of the HEL on image feature points. To that end, the FTS tracker 204 includes a number of state estimators. In the truest sense, 'state' implies velocity, acceleration, and position of the feature tracked. Since these features are in image coordinates, these feature state estimators provide target feature states in image coordinates and not in inertial coordinates. The target feature state estimators include the following, as will be discussed in more detail below: a correlation tracker module 308, a centroid tracker 324, an aimpoint tracker 312, and a recursive reference image tracker 326.

Referring again to FIG.3, the low speed tracker 302 is illustrated in the lower path, while the high speed tracker 300 is illustrated in the upper path, but both can be part of the FTS tracker 204. After a target is initially acquired, subsequent tracking can be carried out with both the high and low speed trackers 300, 302. Generally, the components of path 300 serve the role of the correlation tracker, while the components of path 302 serve the role of the centroid tracker. An image interleave 330 can be used between the two trackers 300, 302. The image interleave is a mode where the low speed tracker 302 updates its track position, centroid, and target reference every Nth frame, where N is a variable set by the tracker 302 prior to operation. The output target position is normally the high speed tracker 300 correlation output for the subsequent N-1 frames, and on the Nth frame it is either the centroid tracker location, the correlation tracker estimated target location, or a blended combination of the two tracker target locations.
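Returning briefly to the tracker gates of FIG.4, a small sketch of the two concentric areas they describe: a target gate with a buffer around the estimated target extent and a surrounding annulus used for background statistics. The buffer and annulus widths are illustrative assumptions.

```python
import numpy as np

def make_gates(image_shape, center, target_extent, buffer_px=4, annulus_px=8):
    """Build boolean masks for the target gate (402) and background annulus (404)."""
    rows, cols = np.ogrid[:image_shape[0], :image_shape[1]]
    r = np.hypot(rows - center[0], cols - center[1])
    inner = target_extent + buffer_px
    target_gate = r <= inner                                       # area 402
    background_annulus = (r > inner) & (r <= inner + annulus_px)   # area 404
    return target_gate, background_annulus

def background_statistics(frame, background_annulus):
    """Estimate background mean and standard deviation from the annulus."""
    samples = frame[background_annulus]
    return float(samples.mean()), float(samples.std())
```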
In the low speed tracker 302, the image is spatially filtered at block 316. The filter image block 316 removes background and noise and enhances the target signal for further processing. The filter image block 316 operates on each image frame and represents the first step in the processing chain. During this step, background statistics in the area around the target, not including the target, are determined to set pixel detection thresholds. Next, the signal is processed by an estimate background function 318. The estimate background function 318 computes the statistics in an annulus around the target location in the case of track, or over a larger boundary in the case of acquisition (i.e. while targets are still being identified). Next, the tracker executes a threshold and segment target function 320. The threshold and segment target function 320 takes as input each of the images processed in the low speed centroid tracker 324, applies a threshold based on the estimate background function 318, and then detects pixels that exceed the threshold (both positive and negative excursions). The pixels exceeding the threshold are grouped using 8-way connectivity and then transformed into objects along with their statistics, such as size, area, and centroid, among others. Next, a find and select target function 322 is performed. The find and select target function 322 has two modes of operation: a target acquisition mode as targets are identified, and a track mode to track identified targets. In target acquisition, the function 322 takes as input the segmented objects from the threshold and segment target function 320 and selects the segmented object that most closely matches the acquisition criteria. In track, the find and select target function 322 selects the segmented object closest to the estimated track position. The find and select target function 322 also computes the extent of the target location and its position for estimating the target and background (or constant false alarm rate, CFAR) gates (e.g. 402, 404).

Further processing is then carried out by the centroid target tracker 324. The centroid target tracker 324 implements a function which computes the centroid for each target selected by the find and select target function 322, and filters the position to develop rate and position estimates of the current position and of the predicted position at the next time update. Track filtering has a variety of implementation options, including Kalman filters, alpha-beta filters, and other known filters.

An update recursive reference function 326 is then applied to shift the segmented and selected target object into a reference image coordinate frame used for the trackpoint and for closed loop line-of-sight pointing. Each segmented object is integrated into the reference coordinate frame to enhance the signal to noise ratio. This recursive reference image supports two functions. First, the recursive reference image is the reference image used to estimate each input image translation in the correlation tracker module 308 in the high-speed tracker (i.e. path 300) for closed loop tracking about the track reference point. Second, a target aimpoint 312 is estimated. Input frames are temporally weighted and designed to match target shape changes due to motion and dynamics, to ensure close alignment between the reference and the current input frame while maximizing the reference image signal to noise ratio. The 'reference image track state estimator' embedded in block 326 keeps track of the target-centric center location for pointing in the line of sight pointing manager 314.
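By way of non-limiting illustration, the following sketch (in Python, using NumPy and SciPy) outlines the low speed chain just described: thresholding against the background estimate, grouping exceedances with 8-way connectivity, computing per-object centroids, and integrating the selected object into a temporally weighted recursive reference image. The threshold multiplier and the integration weight are illustrative assumptions, not values from the disclosure.

    import numpy as np
    from scipy import ndimage

    def segment_objects(filtered_image, bg_mean, bg_std, k=3.0):
        """Threshold against the background estimate and group detections
        with 8-way connectivity; return centroid and area per object."""
        # Detect pixels whose deviation from the background exceeds k sigma
        # (covering both positive and negative excursions, as described above).
        exceed = np.abs(filtered_image - bg_mean) > k * bg_std
        # A 3x3 structuring element of ones gives 8-way connectivity.
        labels, count = ndimage.label(exceed, structure=np.ones((3, 3), dtype=int))
        objects = []
        for idx in range(1, count + 1):
            obj_mask = labels == idx
            objects.append({
                "centroid": ndimage.center_of_mass(obj_mask),
                "area": int(obj_mask.sum()),
            })
        return objects

    def update_recursive_reference(reference, aligned_frame, weight=0.1):
        """Temporally weighted integration of the shifted (aligned) target frame
        into the recursive reference image to build up signal to noise ratio."""
        if reference is None:
            return aligned_frame.astype(float)
        return (1.0 - weight) * reference + weight * aligned_frame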
A propagate gates function 328 then updates the gated area from which image pixels are grabbed to compute track points and target features on the next frame, based on the current segmented target size and the centroid tracker's predicted target position on the next frame. The gates define the target area (e.g. gate 402) and a concentric area around the target (e.g. gate 404) for background estimation in the estimate background function 318 on the next frame, when in track.

An aimpoint estimate function 312 then uses the integrated target reference image from the update recursive reference function 326 to compute the target aimpoint, using computer vision based techniques to estimate shape, rotation axis, and orientation, or any other computer vision or machine learned technique. The aimpoint is determined about the rotation axis using a predefined location on the rotation axis, a user adjustable location, or a user adjustable offset added to the predefined location. The aimpoint estimate 312 provides the LOS pointing manager 314 with an offset pointing location, or lase location, to the aimpoint, and with a shift to the center of the recursive reference for offset track. The aimpoint location is filtered and tracked similarly to the centroid target tracker 324, as discussed previously.

The high-speed tracker 300 functions simultaneously with the low speed tracker 302. The high-speed tracker 300 is the main tracking function used for instantaneous pointing of the LOS of the FTS system 204. In the high-speed tracker 300 path, the input image is initially filtered at block 304, which functions similarly to the filter image block 316 described above (e.g. operating on each frame and removing background and noise to enhance the target signal). A correlate against reference function 306 is applied to compute the current image shift relative to the reference image estimated in the update recursive reference function 326 of the low speed tracker 302 path. The correlated area is adapted to the size of the target gates computed in the find and select target function 322. The correlate against reference function 306 determines the cross-correlation between the reference image and the current input image using an adaptively sized 2D FFT and 2D inverse FFT, sized to the current gated target area. This process of adaptively sizing the FFTs is shown in FIG.5 and discussed in more detail below. The current image shift relative to the recursive image is the detected peak location of the correlation output, interpolated to a subpixel location with a 2D peak interpolation method. The correlate against reference function 306 includes a pixel threshold that clips the gated image intensities and the reference image intensities to create a semi-binary image. The semi-binary image protects against intensity variations across the target shape while retaining the key target shape attributes for cross correlation. Pre-determined pixels from the current image are searched.

A shift is then carried out by a correlation tracker function 308. The correlation tracker function 308 is one of the target state estimators discussed previously. The correlation tracker function 308 uses the output of the image shift between the current frame and the reference frame. When correlation mode is operational, the size and position of the correlation gated area are updated at the update track gates block 310. The size of the correlation area updates with the interleaved centroid tracker find and select target function 322, discussed previously.
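By way of non-limiting illustration, the following sketch (in Python) shows one way the correlate against reference step can be realized: clip the gated and reference images to form semi-binary images, cross-correlate them with 2D FFTs sized to the gate, and refine the correlation peak to a subpixel shift with a three-point parabolic interpolation. The function name, the clip level, the use of simple intensity clipping for the semi-binary step, and the choice of parabolic interpolation are illustrative assumptions.

    import numpy as np

    def correlate_shift(reference, frame, clip_level):
        """Estimate the (row, col) shift of `frame` relative to `reference` via
        FFT cross-correlation of clipped (semi-binary) images, with parabolic
        subpixel peak interpolation. clip_level is an illustrative parameter."""
        ref = np.minimum(reference, clip_level)   # clip so shape, not brightness,
        img = np.minimum(frame, clip_level)       # drives the correlation
        # Circular cross-correlation via the frequency domain.
        xcorr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
        rows, cols = xcorr.shape
        r, c = np.unravel_index(np.argmax(xcorr), xcorr.shape)

        def parabolic_offset(prev, mid, nxt):
            # Vertex of a parabola fit through three samples around the peak.
            denom = prev - 2.0 * mid + nxt
            return 0.0 if denom == 0 else 0.5 * (prev - nxt) / denom

        dr = parabolic_offset(xcorr[(r - 1) % rows, c], xcorr[r, c], xcorr[(r + 1) % rows, c])
        dc = parabolic_offset(xcorr[r, (c - 1) % cols], xcorr[r, c], xcorr[r, (c + 1) % cols])
        # Wrap shifts larger than half the gate size into negative offsets.
        shift_r = r + dr if r + dr <= rows / 2 else r + dr - rows
        shift_c = c + dc if c + dc <= cols / 2 else c + dc - cols
        return shift_r, shift_c

In the disclosed system the FFT dimensions themselves are chosen adaptively, as discussed with reference to FIG.5 below.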
The LOS pointing manager 314 is responsible for providing the closed loop line of sight target track location to the angle controller of the system for pointing. The FSM controller 246 can respond to the pointing location. When the tracking system 200 is in acquisition mode, the centroid location of the target detected in the find and select target function 322 points the line of sight. In track mode, when the correlator is operational, the correlator image shift relative to the reference-image-centered location points the line of sight. When the interleaved centroid mode is operational, the centroid location (e.g. from the centroid target tracker 324) points the line of sight, either on its own, blended with the correlation tracker output, or using the correlation tracker output only. The LOS pointing manager 314 also checks for irregular motion of the pointing, limits spurious pointing commands, and adds the aimpoint estimate computed in block 312 as a pointing offset.

Referring again to FIG.4, the image 400 of the target 406 includes a correlation track 408, the track state estimate smoothed target position 410, and the automatically generated aimpoint 412.

Referring now to FIG.5, the system 200 utilizes an algorithm 500 that exhaustively searches prime power factors (PPF) and powers of 2 (PO2), mathematically yielding an optimal FFT size. The algorithm builds an optimal selector that chooses the FFT sizes most appropriate for the dimensions of the input signal. It does this outside of the main execution loop, preparing all the necessary data structures for real time FFT and inverse FFT (IFFT) transforms. While the system 200 utilizes 2D FFTs, in some instances the algorithm can also be carried out for 3D FFTs. At step 502, the algorithm starts by defining a coverage range for FFT sizes. The coverage range can be user input based on the expected size of signals in both dimensions (minS, maxS). The algorithm 500 then generates a solution space of optimal sizes within a plan creation engine 504. This starts with module 506 sweeping each characteristic dimension of the FFT (scalable from 1 to N dimensions) and searching all prime power factors (PPF) and powers of two (PO2) factors of the given dimension range MxN. Within module 508, for each combination of factors, a 2D FFT plan is created which will be execution optimal and memory optimal for those sizes. A look-up algorithm converts each set of incoming dimensions into the minimum power factors greater than or equal to the dimensions. A look-up table is created, within module 510, that maps every combination of factors into an index for the storage spaces of the FFT and inverse FFT (IFFT) for that combination, which is guaranteed optimal for the algorithms used in the FFT calculation. This generates an exhaustive space for every combination of PPF from each dimension and allows the libraries to utilize deep benchmarking to select optimal instructions and memory spaces for that particular combination. A set of lookup tables indexes the location of these data structures, which allows for immediate selection during the fast loop (i.e. while the FFT sizes are dynamically changing) of the application. Any incoming size covered in the original range will be assigned an optimal FFT solution by the selection process in the algorithm 500.
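By way of non-limiting illustration, the following sketch (in Python) captures the size-selection idea behind the plan creation engine: outside the real time loop, enumerate every candidate dimension in the expected coverage range whose factorization uses only small primes, then build a table mapping any incoming dimension to the smallest such size that covers it, so that selection in the fast loop reduces to a table lookup. The function names and the choice of primes (2, 3, 5, 7) are illustrative assumptions; the disclosed algorithm also pre-creates and benchmarks the corresponding FFT plans, which is omitted here.

    def fft_friendly_sizes(min_s, max_s, primes=(2, 3, 5, 7)):
        """Return every length in [min_s, max_s] whose prime factors are all in `primes`."""
        sizes = []
        for n in range(min_s, max_s + 1):
            m = n
            for p in primes:
                while m % p == 0:
                    m //= p
            if m == 1:
                sizes.append(n)
        return sizes

    def build_size_lookup(min_s, max_s):
        """Map each dimension in [min_s, max_s] to the smallest FFT-friendly size
        covering it. Built once, outside the high rate loop."""
        # Search up to 2*max_s so a covering power of two always exists.
        friendly = fft_friendly_sizes(min_s, 2 * max_s)
        lookup, j = {}, 0
        for n in range(min_s, max_s + 1):
            while friendly[j] < n:
                j += 1
            lookup[n] = friendly[j]
        return lookup

    # Example (illustrative): a gated target area of 301 x 188 pixels would map to
    # FFT dimensions of 315 x 189 with the assumed prime set.
    sizes = build_size_lookup(32, 512)
    assert sizes[301] == 315 and sizes[188] == 189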
The plan created in the plan creation engine 504 can then be utilized by the correlator 229 to apply the FFT. In particular, input 512 from the tracking system 200 is provided to the algorithm 500. This includes input from a sensor subsystem 518 of the system 200, incoming 2D sizes from the image acquisition system 516, and correlator dynamic signal sizes 514. The FFT for the correlator 229 can then be executed within the high rate execution loop 520. During the high rate execution loop 520, the pre-generation of mapping tables and functions guarantees that the selection of the FFT plan will meet real-time requirements, without incurring penalties for in-the-loop plan creation. A real time optimal plan selection module 522 then selects the optimal plan from the plan creation engine 504. An FFT correlator algorithm 524 then optimally executes a 2D FFT (optimal execution module 526) to determine the cross-correlation between the reference image and the current input image using the adaptively sized FFT. The results 528 are utilized by the correlator 229 to perform the cross correlation. A plan creation/destruction engine 530 provides out-of-the-loop plan deconstruction, ensuring that memory deallocation execution penalties are not incurred inside the high rate algorithm loop.

Overall, by tracking targets using interleaved and parallel centroid, target reference, and high speed correlation tracking, the tracking systems described herein can prevent track drift, provide robustness to track loss from artifacts (clutter and noise), and can default to either track mode, tracking small, large, and overfilled targets in the field of view.

All orientations and arrangements of the components shown herein are used by way of example only. Further, it will be appreciated by those of ordinary skill in the pertinent art that the functions of several elements may, in alternative embodiments, be carried out by fewer elements or a single element. Similarly, in some embodiments, any functional element may perform fewer, or different, operations than those described with respect to the illustrated embodiment. Also, functional elements shown as distinct for purposes of illustration may be incorporated within other functional elements in a particular implementation.

While the subject technology has been described with respect to preferred embodiments, those skilled in the art will readily appreciate that various changes and/or modifications can be made to the subject technology without departing from the spirit or scope of the subject technology. For example, each claim may depend from any or all claims in a multiple dependent manner even though such has not been originally claimed.