Title:
SYSTEM AND METHOD FOR TARGET TRACK MANAGEMENT OF AN AUTONOMOUS VEHICLE
Document Type and Number:
WIPO Patent Application WO/2018/154367
Kind Code:
A1
Abstract:
The present disclosure relates to a system and method that improves track management of high velocity manoeuvring objects/targets while eliminating the probability of external clutter and false positives. An aspect of the present disclosure provides a system (also referred to as "a track management system") for enabling management of track for an autonomous vehicle, wherein the system includes a data stream receive module 902 configured to enable data streams from a plurality of sensors to be received, wherein the data streams are of the same or different time intervals, an integrated sequential synchronization module 904 configured to enable the received data streams to be synchronized, and a sensor prioritization module 906 configured to enable prioritization of a first sensor over a second sensor, wherein the first and second sensors are selected from the plurality of sensors for track management.

Inventors:
DAS SOUMYO (IN)
VORA PRASHANTKUMAR BIPINCHANDRA (IN)
KUMAR KISHAN (IN)
Application Number:
PCT/IB2017/054530
Publication Date:
August 30, 2018
Filing Date:
July 26, 2017
Assignee:
KPIT TECH LIMITED (IN)
International Classes:
G05D1/02; B60W30/09; G01S13/72; G01S13/86; G01S13/931; G06V10/32; G08G1/16
Foreign References:
US20140032012A12014-01-30
Other References:
CHHETRI A S ET AL: "Scheduling multiple sensors using particle filters in target tracking", STATISTICAL SIGNAL PROCESSING, 2003 IEEE WORKSHOP ON ST. LOUIS, MO, USA SEPT. 28, - OCT. 1, 2003, PISCATAWAY, NJ, USA,IEEE, 28 September 2003 (2003-09-28), pages 549 - 552, XP010699960, ISBN: 978-0-7803-7997-8, DOI: 10.1109/SSP.2003.1289522
Attorney, Agent or Firm:
KHURANA & KHURANA, ADVOCATES & IP ATTORNEYS (IN)
Claims:
We Claim:

1. A system comprising:

a non-transitory storage device having embodied therein one or more routines operable for track management of an autonomous and/or a semi-autonomous vehicle; and

one or more processors coupled to the non-transitory storage device and operable to execute the one or more routines, wherein the one or more routines include:

a data stream receive module, which when executed by the one or more processors, receives data streams from a plurality of sensors, wherein the data streams are of different or same time interval;

an integrated sequential synchronization module, which when executed by the one or more processors, synchronizes the received data streams; and

a sensor prioritization module, which when executed by the one or more processors, prioritizes, in real-time, a first sensor over a second sensor based on the synchronized data streams to enable at least any or a combination of an accurate prediction of a target/object, an accurate updation of a state associated with target/object, or an accurate association of data from the received data streams utilizing the first sensor for track management, wherein the first and second sensors are selected from the plurality of sensors.

2. The system as claimed in claim 1, wherein the first sensor is prioritized over the second sensor based on one or more measurements associated with or computed by the two sensors, and wherein the one or more measurements are selected from any or combination of estimated position(s) of target object, measurement of actual position(s) of the target object sensed by each of the first and the second sensors, constant associated with the measurement computed as a factor of normalization based on sensors static noise characteristic, and cost function of least square errors associated with the first and the second sensors.

3. The system as claimed in claim 1, wherein the first sensor is prioritized over the second sensor based at least on least squares error in estimating position of a target object computed by both the first and second sensors.

4. The system as claimed in claim 1, further comprising: a weighted track initialization module, which when executed by the one or more processors:

identifies an object or a state thereof, wherein the object is captured on the track of the autonomous or the semi-autonomous vehicle by the first sensor; and

classifies the object identified based on one or more pre-defined clusters of objects, wherein the pre-defined clusters of objects are created based on the objects captured in past on the track of said autonomous or the semi-autonomous vehicle by the first sensor.

5. The system as claimed in claim 4, wherein the weighted track initialization module, which when executed by the one or more processors, initializes tracking of the identified object based on one or more weightages associated with the object in a grid and/or based on the weightages of the grid itself, wherein the grid is a pre-defined area within a range of each sensor of the plurality of sensors, and wherein the one or more weightages associated with the object are defined based on any or combination of a quality of signals from at least one sensor of the plurality of sensors, size of the object, relative velocity of the object, and relative yaw rate of the object.

6. The system as claimed in claim 1, further comprising a target prediction and data association module, which when executed by the one or more processors:

predicts an object on the track of the autonomous or the semi-autonomous vehicle by the first prioritized sensor, wherein the prediction of the object is performed based on one or more pre-stored objects; and

associates the object predicted/captured on the track of the autonomous or the semi-autonomous vehicle by the first sensor with the track.

7. The system as claimed in claim 1, further comprising a validation gate determination module, which when executed by the one or more processors, determines validation gates to track association of the autonomous or the semi-autonomous vehicle with an object captured on the track of the autonomous or the semi-autonomous vehicle by the first sensor based on a track history of positional co-ordinates and/or one or more attributes associated with the object, wherein the attributes associated with the object are selected from any or combination of a position (Z) of the object, velocity (V) of the object, orientation (Θ) of the object, and features of the object including dimension.

8. The system as claimed in claim 7, wherein the validation gate determination module, which when executed by the one or more processors, updates a region for validation gates to track association of the autonomous or the semi-autonomous vehicle with the object while comparing a velocity and an orientation associated with a position of the autonomous or the semi-autonomous vehicle with a prediction of a velocity and an orientation associated with a position of the object.

9. A method for track management of an autonomous and/or a semi-autonomous vehicle, the method comprising the steps of:

receiving, at one or more processors of a computing device, data streams from a plurality of sensors, wherein the data streams are of different or same time interval; synchronizing, at the one or more processors, the received data streams; and prioritizing, at the one or more processors, in real-time, a first sensor over a second sensor, based on the synchronized data streams, to enable at least any or a combination of an accurate prediction of a target/object, an accurate updation of a state associated with target/object, or an accurate association of data from the received data streams utilizing the first sensor for track management, wherein the first and second sensors are selected from the plurality of sensors.

10. The method as claimed in claim 9, wherein the first sensor is prioritized over the second sensor based on one or more measurements associated with or computed by the two sensors, and wherein the one or more measurements are selected from any or combination of estimated position(s) of target object, measurement of actual position(s) of the target object sensed by each of the first and the second sensors, constant associated with the measurement computed as a factor of normalization based on sensors static noise characteristic, and cost function of least square errors associated with the first and the second sensors.

11. The method as claimed in claim 9, wherein the first sensor is prioritized over the second sensor based at least on least squares error in estimating position of a target object computed by both the first and second sensors.

12. The method as claimed in claim 9 further comprising:

identifying, at the one or more processors, an object or a state thereof, wherein the object is captured on the track of the autonomous or the semi-autonomous vehicle by the first sensor; and

classifying, at the one or more processors, the object identified based on one or more pre-defined clusters of objects, wherein the pre-defined clusters of objects are created based on the objects captured in past on the track of said autonomous or the semi-autonomous vehicle by the first sensor.

13. The method as claimed in claim 12 further comprising: initializing, at the one or more processors, tracking of the identified object based on one or more weightages associated with the object in a grid and/or based on the weightages of the grid itself, wherein:

the grid is a pre-defined area within a range of each sensor of the plurality of sensors, and wherein the one or more weightages associated with the object are defined based on any or combination of a quality of signals from at least one sensor of the plurality of sensors, size of the object, relative velocity of the object, and relative yaw rate of the object.

14. The method as claimed in claim 9 further comprising:

predicting, at the one or more processors, an object on the track of the autonomous or the semi-autonomous vehicle by the first prioritized sensor, wherein the prediction of the object is performed based on one or more pre-stored objects; and associating, at the one or more processors, the object predicted/captured on the track of the autonomous or the semi-autonomous vehicle by the first sensor with the track.

15. The method as claimed in claim 9 further comprising: determining, at the one or more processors, validation gates to track association of the autonomous or the semi-autonomous vehicle with an object, captured on the track of the autonomous or the semi-autonomous vehicle by the first sensor, based on a track history of positional co-ordinates and/or one or more attributes associated with the object, wherein the attributes are selected from any or combination of a measured position (Z) of the object, velocity (V) of the object, orientation (Θ) of the object, and features of the object including dimension.

16. The method as claimed in claim 15 further comprising: updating, at the one or more processors, a region for validation gates to track association of the autonomous or the semi- autonomous vehicle with the object, while comparing a velocity and an orientation associated with a position of the autonomous or the semi-autonomous vehicle with a prediction of a velocity and an orientation associated with a position of the object.

Description:
SYSTEM AND METHOD FOR TARGET TRACK MANAGEMENT OF AN

AUTONOMOUS VEHICLE

TECHNICAL FIELD

[0001] Embodiments of the present disclosure relate to autonomous and/or semi- autonomous vehicles, and in general to systems and methods for acquiring situational awareness information, and providing operational intelligence to autonomous and/or semi- autonomous vehicles for safe, efficient, and automated land navigation, and more particularly, to systems and methods for track management of an autonomous vehicle.

BACKGROUND

[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.

[0003] Conventionally, vehicles have been operated by humans. More recent advances have focused on making vehicles operationally autonomous (also sometimes referred to as self-driving) or at least semi-autonomous. The expectation is that safety can be increased by removing the element of human error, which is the primary cause of accidents, and by tracking targets accurately. Autonomous and semi-autonomous vehicles have the potential to be used in a number of applications. For example, use of autonomous vehicles may be desirable in military or civilian applications that would otherwise expose human operators or passengers to dangerous environments. Autonomous and semi-autonomous vehicles can also be used for sensing the environment and navigating without any human intervention.

[0004] Target tracking and/or track management enables autonomous/self-driving or semi-autonomous vehicles to identify nearby vehicles/objects, upcoming objects/hurdles, surrounding terrain, upcoming obstacles, a particular path, etc., thereby removing the element of human error. In current approaches, environment sensing to achieve target tracking and/or track management, and navigation, is performed by a plurality of sensors, such as but not limited to multiple radars, cameras, lasers, Lidar sensors, and ultrasonic sensors installed in the autonomous or semi-autonomous vehicles, and thus such vehicles have a wide spectrum of applications. The sensors in such vehicles can be configured on board to transmit and receive a variety of information, such as track information, nearby vehicles/objects, upcoming objects/hurdles, surrounding terrain, upcoming obstacles, a particular path, etc., and to take appropriate action(s) such as avoiding an accident with an upcoming obstacle.

[0005] In recent decades, computer vision technology has been researched for driver assistance and safety systems in the intelligent vehicle field. Many researchers have worked on obstacle detection and object recognition methods. However, existing obstacle detection and object recognition methods are restricted by their slow speed of obstacle detection and the limited information available about the vehicle's surroundings. For example, obstacles on the road may create false impressions of an object, say mistaking a pedestrian on the road for a stationary object, giving rise to missed or false detections of objects.

[0006] Further, track management of high velocity objects such as cars, pedestrians, and other targets is currently difficult and inaccurate. Also, the conventional track initialization mechanisms that are utilized in multiple-object tracking problems are primarily based on velocity and acceleration models of the target vehicles, and involve false track initializations and computational complexity for track association. Furthermore, data streams sensed by the plurality of sensors are out of sequence (not synchronized), which leads to inaccuracy while interpolating data and to a lack of robustness in complex track management scenarios.

[0007] In view of the above technical problems and/or technical drawbacks of conventional autonomous or semi-autonomous vehicles with regard to track management, there exists a need to provide a technically advanced system and method that can efficiently improve track management of high velocity manoeuvring objects/targets using dynamic information while eliminating the probability of external clutter and false positives, so as to enhance the reliability of track management. Furthermore, there is also a need in conventional autonomous or semi-autonomous vehicles to provide validation gate evaluation in target tracking and integration of out-of-sequence data streams sensed by the sensors installed in the vehicle.

[0008] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application were specifically and individually indicated to be incorporated by reference. Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.

[0009] In some embodiments, the numbers expressing quantities of ingredients, properties such as concentration, reaction conditions, and so forth, used to describe and claim certain embodiments of the invention are to be understood as being modified in some instances by the term "about." Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.

[0010] As used in the description herein and throughout the claims that follow, the meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.

[0011] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. "such as") provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.

OBJECTS OF THE INVENTION

[0012] It is an object of the present disclosure to provide a technically advanced system and method that efficiently improves track management of high velocity manoeuvring objects/targets while eliminating the probability of external clutter and false positives.

[0013] It is another object of the present disclosure to provide a technically advanced system and method that enables validation gate evaluation in target tracking, and integration of out-of-sequence data streams that are sensed by a plurality of sensors such as RADAR sensors, camera sensors, microwave sensors, Lidar sensors, etc. that are installed in the vehicles.

[0014] It is another object of the present disclosure to provide a system and method to improve track maintenance in complex scenarios such as in crowded cities and/or where there is un-predictable movement of vehicles and pedestrians.

[0015] It is another object of the present disclosure to provide a system and method that adapts and/or enables dynamic changeover of selection of a sensor so as to render more accurate estimation from the sensor (such as from RADAR sensor and/or vision sensor) through sensor prioritization.

[0016] It is another object of the present disclosure to provide a system and method to synchronize data streams that are sensed by multiple sensors for validation purpose, wherein the data streams are received by the sensors at different intervals.

[0017] It is yet another object of the present disclosure to provide a system and method to identify and/or classify target feature(s) of, say a high velocity manoeuvring object/target, based on signals sensed from RADAR sensor and/or vision sensor.

[0018] It is still another object of the present disclosure to provide a system and method that enables prediction of a target, updation of state, and association of data that is utilized for track validation.

SUMMARY

[0019] Aspects of the present disclosure relate to the field of autonomous and/or semi- autonomous vehicles, and in general to systems and methods for acquiring situational awareness information and providing operational intelligence to autonomous and/or semi- autonomous vehicles for safe, efficient, and automated land navigation. More particularly, the present disclosure relates to systems and methods for track management of an autonomous vehicle.

[0020] In an embodiment, the present disclosure provides a technically advanced system and method that efficiently improves track management of high velocity manoeuvring objects/targets using dynamic information while eliminating the probability of external clutter and false positives to enhance reliability of track management. The system of the present disclosure is adapted to receive data streams from a plurality of sensors, pre-process the data streams to obtain processed data streams, thereby synchronize out-of-sequence data streams, and, based on the synchronization, prioritize at least one sensor of the plurality of sensors as a primary sensor and the remaining sensor or sensors as secondary sensor or sensors. The primary sensor is then utilized to provide track initialization, or target prediction and data association, or track validation and gating, or any combination thereof.

[0021] An aspect of the present disclosure provides a system (also interchangeably referred to as "a track management system") for enabling management of track for an autonomous vehicle in a service area of the automobile industry, wherein the system includes a non-transitory storage device having embodied therein one or more routines operable for track management of an autonomous and/or a semi-autonomous vehicle, and one or more processors coupled to the non-transitory storage device and operable to execute the one or more routines, wherein the one or more routines include a data stream receive module, an integrated sequential synchronization module, and a sensor prioritization module.

[0022] In an aspect, data stream receive module, which when executed by the one or more processors, enables data streams from a plurality of sensors to be received, wherein the data streams are of different or same time interval. Integrated sequential synchronization module, which when executed by the one or more processors, enables the received data streams to be synchronized, based on which sensor prioritization module, which when executed by the one or more processors, enables prioritization of a first sensor over a second sensor to enable at least any or a combination of an accurate prediction of a target/object, an accurate updation of a state associated with target/object, or an accurate association of data from the received data streams utilizing the first sensor for track management, wherein the first and second sensors are selected from the plurality of sensors for track management.

[0023] In an aspect, the plurality of sensors include but are not limited to multiple radars, cameras, lasers, Lidar sensors, ultrasonic sensors, installed in/configured at the autonomous vehicles or semi-autonomous vehicle.

[0024] In an aspect, first sensor is prioritized over second sensor based at least on one or more measurements associated with or computed by the two sensors, wherein the one or more measurements are selected from any or combination of estimated position(s) of target object, measurement of actual position(s) of the target object sensed by each of the first and the second sensors, constant(s) associated with the measurement computed as a factor of normalization based on sensors static noise characteristic, and cost function of least square errors associated with the first and the second sensors. In another aspect, the first sensor is prioritized over the second sensor based at least on least squares error in estimating position of a target object computed by both the first and second sensors.

[0025] In an aspect, the system of the present disclosure further includes a weighted track initialization module, which when executed by the one or more processors, enables identification of an object or a state thereof, wherein the object is captured on the track of the autonomous or the semi-autonomous vehicle by the first prioritized sensor. Such a captured object may further be classified based on one or more pre-defined clusters of objects that have previously been captured on the track of said autonomous or the semi-autonomous vehicle by the first prioritized sensor.

[0026] In another aspect, the weighted track initialization module is further configured to initialize tracking of an identified object based on one or more weightages associated with the object in a grid and/or based on weightages of the grid itself. In an aspect, the grid is a predefined area within a range of each sensor of the plurality of sensors. In an aspect, the one or more weightages associated with the object are defined based on any or a combination of quality of signals from each sensor of the plurality of sensors (or from at least the first and second sensors), size of the object, relative velocity of the object, and relative yaw rate of the object.

[0027] In an aspect, the system of the present disclosure further includes a target prediction and data association module, which when executed by the one or more processors, enables prediction of an object on the track of the autonomous or the semi-autonomous vehicle by the first prioritized sensor, wherein the prediction of the object is performed based on one or more pre-stored objects. Such a predicted object that has previously been captured on the track of said autonomous or the semi-autonomous vehicle by the first prioritized sensor may further be associated with the track.

[0028] In an aspect, system of the present disclosure further includes a validation gate determination module, which when executed by the one or more processors, enables determination of validation gates to track association of the autonomous or the semi- autonomous vehicle with the object captured on the track of the autonomous or the semi- autonomous vehicle by the first prioritized sensor based on track history of positional coordinates and/or one or more attributes associated with the object. In an aspect, attributes associated with the object is selected from any or combination of position (Z) of the object, velocity (V) of the object, orientation (Θ) of the object, and features of the object including dimension.

[0029] In another aspect, the validation gate determination module is further configured to update a region for validation gates to track association of the autonomous or the semi-autonomous vehicle with the object while comparing a velocity and an orientation associated with a position of the autonomous or the semi-autonomous vehicle with a prediction of a velocity and an orientation associated with a position of the object.

[0030] An aspect of the present disclosure relates to a method for track management of an autonomous or a semi-autonomous vehicle. The method receives data streams from a plurality of sensors, wherein the data streams are of the same or different time intervals. Upon receipt of the data streams, the method synchronizes the data streams to thereby prioritize, in real-time, a first sensor over a second sensor based on the synchronized data streams to enable at least any or a combination of an accurate prediction of a target/object, an accurate updation of a state associated with the target/object, or an accurate association of data from the received data streams utilizing the first sensor for track management, wherein the first and second sensors are selected from the plurality of sensors.

[0031] It would be appreciated that although aspects of the present disclosure have been explained with respect to management of an autonomous or a semi-autonomous vehicle in the automobile industry, the present disclosure is not limited to the same in any manner whatsoever, and any other form of vehicle is completely covered within the scope of the present disclosure.

[0032] This technical solution of being able to, in real-time, receive data from a plurality of sensors, then dynamically prioritize the sensors for accurate track management, and utilize such sensor prioritization data for weighted track initialization, target prediction, data association, and track validation and gating, helps in reducing the overall time and effort required to choose and manage tracks in autonomous and/or semi-autonomous vehicles and also reduces the computing resources required for track management. As the proposed system runs dynamically and in real-time and looks out for track related data for accurate and efficient track management, it significantly imparts a technical impact through reduction in any processor time that is otherwise required for the vehicle/user to undertake computations for track management.

[0033] Other features of embodiments of the present disclosure will be apparent from accompanying drawings and from detailed description that follows.

BRIEF DESCRIPTION OF THE DRAWINGS

[0034] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.

[0035] The diagrams are for illustration only, which thus is not a limitation of the present disclosure, and wherein:

[0036] FIG. 1 illustrates exemplary functional modules of a system for target track management in accordance with an embodiment of the present invention.

[0037] FIG. 2 illustrates integration of sensed data streams and post processing thereof in accordance with an embodiment of the present invention.

[0038] FIG. 3 illustrates integrated sequential synchronization strategy to synchronize out of sequence data stream of sensors in accordance with an embodiment of the present invention.

[0039] FIG. 4 illustrates sensor prioritization flow diagram in accordance with an embodiment of the present invention.

[0040] FIG. 5 illustrates grid management of weighted track initialization in accordance with an embodiment of the present invention.

[0041] FIG. 6 illustrates weighted track initialization in accordance with an embodiment of the present invention.

[0042] FIG. 7 illustrates grid coverage and time update flow diagram in accordance with an embodiment of the present invention.

[0043] FIGs. 8A and 8B illustrate RADAR and camera sensor data fusion in accordance with an embodiment of the present invention.

[0044] FIG. 9 illustrates validation gating in accordance with an embodiment of the present invention.

[0045] FIG. 10 illustrates an exemplary representation of a method for target track management in accordance with embodiments of the present disclosure.

[0046] FIG. 11 illustrates an exemplary computer system in which or with which embodiments of the present invention may be utilized.

DETAILED DESCRIPTION

[0047] Embodiments of the present disclosure include various steps, which will be described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, and firmware or by human operators.

[0048] Embodiments of the present disclosure may be provided as a computer program product, which may include a machine-readable storage medium tangibly embodying thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, fixed (hard) drives, magnetic tape, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), and magneto-optical disks, semiconductor memories, such as ROMs, PROMs, random access memories (RAMs), programmable read-only memories (PROMs), erasable PROMs (EPROMs), electrically erasable PROMs (EEPROMs), flash memory, magnetic or optical cards, or other type of media/machine-readable medium suitable for storing electronic instructions (e.g., computer programming code, such as software or firmware).

[0049] Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present disclosure with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present disclosure may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the disclosure could be accomplished by modules, routines, subroutines, or subparts of a computer program product.

[0050] If the specification states a component or feature "may", "can", "could", or "might" be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.

[0051] Although the present disclosure has been described with the purpose of providing a system and method for improvement in track management of high velocity manoeuvring objects/targets while removing the probability of external clutter and false positives, it should be appreciated that the same has been done merely to illustrate the disclosure in an exemplary manner, and any other purpose or function for which the explained structure or configuration can be used is covered within the scope of the present disclosure.

[0052] Exemplary embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments are shown. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. These embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the disclosure to those of ordinary skill in the art. Moreover, all statements herein reciting embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure).

[0053] Thus, for example, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this disclosure. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any electronic code generator shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this disclosure. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named.

[0054] Aspects of the present disclosure relate to the field of autonomous and/or semi- autonomous vehicles, and in general to systems and methods for acquiring situational awareness information and providing operational intelligence to autonomous and/or semi- autonomous vehicles for safe, efficient, and automated land navigation. More particularly, the present disclosure relates to systems and methods for track management of an autonomous vehicle.

[0055] In an embodiment, the present disclosure provides a technically advanced system and method that efficiently improves track management of high velocity manoeuvring objects/targets using dynamic information while eliminating the probability of external clutter and false positives to enhance reliability of track management. The system of the present disclosure is adapted to receive data streams from a plurality of sensors, pre-process the data streams to obtain processed data streams, thereby synchronize out-of-sequence data streams, and, based on the synchronization, prioritize at least one sensor of the plurality of sensors as a primary sensor and the remaining sensor or sensors as secondary sensor or sensors. The primary sensor is then utilized to provide track initialization, or target prediction and data association, or track validation and gating, or any combination thereof.

[0056] An aspect of the present disclosure provides a system (also interchangeably referred to as "a track management system") for enabling management of track for an autonomous vehicle in a service area of the automobile industry, wherein the system includes a non-transitory storage device having embodied therein one or more routines operable for track management of an autonomous and/or a semi-autonomous vehicle, and one or more processors coupled to the non-transitory storage device and operable to execute the one or more routines, wherein the one or more routines include a data stream receive module, an integrated sequential synchronization module, and a sensor prioritization module.

[0057] In an aspect, data stream receive module, which when executed by the one or more processors, enables data streams from a plurality of sensors to be received, wherein the data streams are of different or same time interval. Integrated sequential synchronization module, which when executed by the one or more processors, enables the received data streams to be synchronized, based on which sensor prioritization module, which when executed by the one or more processors, enables prioritization of a first sensor over a second sensor to enable at least any or a combination of an accurate prediction of a target/object, an accurate updation of a state associated with target/object, or an accurate association of data from the received data streams utilizing the first sensor as primary sensor for track management, wherein the first and second sensors are selected from the plurality of sensors for track management.

[0058] In an aspect, the plurality of sensors include but are not limited to multiple radars, cameras, lasers, Lidar sensors, ultrasonic sensors, installed in/configured at the autonomous vehicles or semi-autonomous vehicle.

[0059] In an aspect, first sensor is prioritized over second sensor based at least on one or more measurements associated with or computed by the two sensors, wherein the one or more measurements are selected from any or combination of estimated position(s) of target object, measurement of actual position(s) of the target object sensed by each of the first and the second sensors, constant(s) associated with the measurement computed as a factor of normalization based on sensors static noise characteristic, and cost function of least square errors associated with the first and the second sensors. In another aspect, the first sensor is prioritized over the second sensor based at least on least squares error in estimating position of a target object computed by both the first and second sensors.

[0060] In an aspect, system of the present disclosure further includes a weighted track initialization module, which when executed by the one or more processors, enables identification of an object or a state thereof, wherein the object is captured on the track of the autonomous or the semi-autonomous vehicle by the first prioritized sensor. Such a captured object may further be classified based on one or more pre-defined clusters of objects that have previously been captured on the track of said autonomous or the semi-autonomous vehicle by the first prioritized sensor.

[0061] In another aspect, the weighted track initialization module is further configured to initialize tracking of an identified object based on one or more weightages associated with the object in a grid and/or based on weightages of the grid itself. In an aspect, the grid is a predefined area within a range of each sensor of the plurality of sensors. In an aspect, the one or more weightages associated with the object are defined based on any or a combination of quality of signals from each sensor of the plurality of sensors (or from at least the first and second sensors), size of the object, relative velocity of the object, and relative yaw rate of the object.

[0062] In an aspect, system of the present disclosure further includes a target prediction and data association module, which when executed by the one or more processors, enables prediction of an object on the track of the autonomous or the semi-autonomous vehicle by the first prioritized sensor as primary, wherein the prediction of the object is performed based on one or more pre-stored objects. Such predicted object that has previously been captured on the track of said autonomous or the semi-autonomous vehicle by the first prioritized sensor may further be associated with the track.

[0063] In an aspect, system of the present disclosure further includes a validation gate determination module, which when executed by the one or more processors, enables determination of validation gates to track association of the autonomous or the semi- autonomous vehicle with the object captured on the track of the autonomous or the semi- autonomous vehicle by the first prioritized sensor based on track history of positional coordinates and/or one or more attributes associated with the object. In an aspect, attributes associated with the object is selected from any or combination of position (Z) of the object, velocity (V) of the object, orientation (Θ) of the object, and features of the object including dimension.

[0064] In another aspect, the validation gate determination module is further configured to update a region for validation gates to track association of the autonomous or the semi- autonomous vehicle with the object while comparing a velocity and an orientation associated with a position of the autonomous or the semi-autonomous vehicle with a prediction of a velocity and an orientation associated with a position of the object.

[0065] An aspect of the present disclosure relates to a method for track management of an autonomous or a semi-autonomous vehicle. The method receives data streams from a plurality of sensors, wherein the data streams are of the same or different time intervals. Upon receipt of the data streams, the method synchronizes the data streams to thereby prioritize, in real-time, a first sensor over a second sensor based on the synchronized data streams to enable at least any or a combination of an accurate prediction of a target/object, an accurate updation of a state associated with the target/object, or an accurate association of data from the received data streams utilizing the first sensor for track management, wherein the first and second sensors are selected from the plurality of sensors.

[0066] It would be appreciated that although aspects of the present disclosure have been explained with respect to management of an autonomous or a semi-autonomous vehicle in the automobile industry, the present disclosure is not limited to the same in any manner whatsoever, and any other form of vehicle is completely covered within the scope of the present disclosure.

[0067] This technical solution of being able to, in real-time, receive data from a plurality of sensors, then dynamically prioritize the sensors for accurate track management, and utilize such sensor prioritization data for weighted track initialization, target prediction, data association, and track validation and gating, helps in reducing the overall time and effort required to choose and manage tracks in autonomous and/or semi-autonomous vehicles and also reduces the computing resources required for track management. As the proposed system runs dynamically and in real-time and looks out for track related data for accurate and efficient track management, it significantly imparts a technical impact through reduction in any processor time that is otherwise required for the vehicle/user to undertake computations for track management.

[0068] FIG. 1 illustrates exemplary functional modules 100 of a system for track management in accordance with an embodiment of the present invention. In an exemplary aspect, the proposed system 100 includes a data stream receive module 102 configured to enable data streams from a plurality of sensors to be received, wherein the data streams are of different or same time interval, an integrated sequential synchronization module 104 configured to enable the received data streams to be synchronized, a sensor prioritization module 106 configured to enable prioritization of a first sensor over a second sensor, wherein the first and second sensors are selected from the plurality of sensors for track management, a weighted track initialization module 108 configured to perform grid based track initialization, a target prediction and data association module 110 configured to predict target vehicle, and a validation gate determination module 112 configured to perform validation gating.

[0069] In an aspect, the plurality of sensors include but are not limited to multiple radars, cameras, lasers, Lidar sensors, ultrasonic sensors, installed in/configured at the autonomous vehicles or semi-autonomous vehicle.

[0070] In an aspect, the integrated sequential synchronization module 104 is configured to synchronize data streams sensed by multiple sensors and/or received by the data stream receive module 102, and to update sensor fusion with the data streams/signals that are of varying time intervals. In an aspect, data streams received by sensors are synchronized for data fusion and validation.

[0071] In an aspect, a sensor prioritization module 106 is configured to prioritize one of the sensors over the other for track management as the primary sensor; the lower-priority sensor, which alone would lead to less accurate track management, is configured as the secondary sensor. The combination of primary and secondary sensors performs accurate track management.

[0072] In an aspect, the weighted track initialization module 108 is configured to execute grid based track management and initialization in order to initialize the tracks of identified objects based on the weightage of the objects in the grid.

[0073] In an aspect, target prediction and data association module 110 is configured to perform prediction and data association against re-evaluation of the state estimation and prediction of dominant sensor.

[0074] In an aspect, validation gate determination module 112 is configured to perform validation gating that is determined by one or more target vehicle attributes, said attributes being any or a combination of measured position of targets (Z), velocity of targets (V), orientation of the targets (Θ), and classified extracted features of the targets.
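
By way of a non-limiting illustration only, the following sketch (hypothetical Python; the thresholds, the simple Euclidean position gate and the attribute layout are assumptions of this example rather than the claimed gating rule) shows how a candidate measurement could be checked against a predicted track state using the position (Z), velocity (V) and orientation (Θ) attributes described above.

```python
import math

def inside_validation_gate(pred, meas,
                           pos_gate=4.0,                     # metres (illustrative)
                           vel_gate=3.0,                     # m/s (illustrative)
                           theta_gate=math.radians(20.0)):   # rad (illustrative)
    """Return True if a measurement falls inside the validation gate of a
    predicted track state.  `pred` and `meas` are dicts with keys
    'z' (x, y position in m), 'v' (speed in m/s) and 'theta' (orientation, rad).
    Thresholds are illustrative calibration values, not taken from the patent."""
    dx = meas["z"][0] - pred["z"][0]
    dy = meas["z"][1] - pred["z"][1]
    pos_ok = math.hypot(dx, dy) <= pos_gate
    vel_ok = abs(meas["v"] - pred["v"]) <= vel_gate
    # Wrap the orientation difference into [-pi, pi] before comparing.
    dtheta = (meas["theta"] - pred["theta"] + math.pi) % (2 * math.pi) - math.pi
    theta_ok = abs(dtheta) <= theta_gate
    return pos_ok and vel_ok and theta_ok

# Example: a detection close to the predicted state passes the gate.
pred = {"z": (12.0, 1.5), "v": 8.0, "theta": 0.05}
meas = {"z": (12.8, 1.2), "v": 7.4, "theta": 0.10}
print(inside_validation_gate(pred, meas))  # True
```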

[0075] In an embodiment, RADAR and camera are only exemplary sensors, and any other suitable sensor that enables aspects of the invention is well within the scope of the invention.

[0076] In another embodiment, the validation time for target track ID initialization is calculated as a function of the weightage factors assigned to each tracked object of a grid.

[0077] In an aspect, state estimation and prediction along with dynamic changeover based upon more accurate sensor estimation among RADAR and camera is performed.

[0078] In an aspect, target track management is performed to enhance track maintenance in complex scenarios such as crowded cities and un-predictable movement of vehicles and pedestrians. In an aspect, the track is also managed for non-linear and highly manoeuvring relative movement of targets with respect to the host vehicle.

[0079] FIG. 2 illustrates a proposed system 200 for integration of sensed data streams and post processing thereof in accordance with an embodiment of the present invention.

[0080] In an aspect, the proposed system of the present disclosure measures raw data from the RADAR 202 and/or from the camera 204, wherein the raw data includes a plurality of parameter values including but not limited to velocity and position (of autonomous vehicle(s)), and such measured information/parameter values/data streams are input at 206 for pre-processing. Synchronization between the pre-processed raw data stream(s) is then evaluated; if the data streams are out of sequence, the integrated sequential synchronization strategy 208 synchronizes the out-of-sequence data stream(s) of the sensors (comprising RADAR and camera). In an embodiment, dynamic changeover for the selection/prioritization of the sensors is adopted to achieve more accurate estimation from the sensors, say among RADAR and camera. The sensors are prioritized by utilizing the sensor prioritization 210 mechanism. Sensor prioritization 210 data is further utilized for weighted track initialization 212, target prediction and data association 214, and track validation and gating 216.
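
By way of a non-limiting illustration only, the data flow of FIG. 2 may be pictured as a small processing loop; in the hypothetical Python sketch below, the class, method and field names are placeholders standing in for stages 206 to 216, and the stage internals are deliberately stubbed out.

```python
from dataclasses import dataclass, field

@dataclass
class Measurement:
    sensor: str        # "radar" or "camera"
    timestamp: float   # seconds
    position: tuple    # (x, y) of the detected object, metres
    velocity: float    # relative velocity, m/s

@dataclass
class TrackManager:
    """Illustrative wiring of the FIG. 2 pipeline: pre-processing (206),
    integrated sequential synchronization (208), sensor prioritization (210),
    weighted track initialization (212), prediction/association (214) and
    validation gating (216).  Stage internals are stubbed out."""
    tracks: list = field(default_factory=list)

    def step(self, raw_radar, raw_camera):
        measurements = self.preprocess(raw_radar + raw_camera)   # 206
        synced = self.synchronize(measurements)                   # 208
        primary = self.prioritize(synced)                         # 210
        self.initialize_tracks(synced, primary)                   # 212
        self.predict_and_associate(synced, primary)               # 214
        self.validate_and_gate(synced)                            # 216
        return self.tracks

    # Stubs: each stage would hold the logic described in the corresponding
    # section of the description.
    def preprocess(self, m): return sorted(m, key=lambda x: x.timestamp)
    def synchronize(self, m): return m
    def prioritize(self, m): return "radar"
    def initialize_tracks(self, m, primary): pass
    def predict_and_associate(self, m, primary): pass
    def validate_and_gate(self, m): pass

tm = TrackManager()
radar = [Measurement("radar", 0.0, (12.0, 1.5), -3.2)]
camera = [Measurement("camera", 0.0, (12.3, 1.4), -3.0)]
print(tm.step(radar, camera))  # [] - no tracks confirmed yet in this stub
```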

Integrated Sequential Synchronization Strategy 208

[0081] FIG. 3 illustrates an integrated sequential synchronization strategy 208 configured to synchronize out-of-sequence data stream(s) of sensors in accordance with an embodiment of the present invention. According to an aspect, the integrated sequential synchronization mechanism synchronizes data streams sensed by multiple sensors, wherein the sensor data streams are of varying time intervals. In an aspect, the discretion for sensor fusion or dependence on an individual sensor is decided, and therefore the state and covariance update is performed at specific instances. In an aspect, data streams that two different sensors receive at different intervals are synchronized for data fusion and validation. In an aspect, integration of both the sensed signals at their own sampling rate reduces latency. FIG. 3 further illustrates resolving the discrepancy of multiple signals with varied sampling rates by using a sequential approach to sensor fusion. In an aspect, the state and covariance update by sensor 1 (depicted as 302) and the state and covariance update by sensor 2 (depicted as 304) are of varying sampling rates/time intervals. In an aspect, fusion of both the state and covariance update by sensor 1 and the state and covariance update by sensor 2 is performed. In case sensors 1 and 2 receive data at the same instant, the mechanism enables multiple-sensor based fusion, thereby performing state updation. In case sensors 1 and 2 receive data at different instances, the mechanism performs state updation based on the available sensor and updates the state when data from the other sensor is available.
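
By way of a non-limiting illustration only, one reading of the sequential strategy of FIG. 3 is a measurement-driven update loop: samples that share a time instant are fused in that instant, while lone samples update the state immediately and the estimate is corrected again when the other sensor reports. The sketch below (hypothetical Python; the scalar Kalman-style update, process noise and measurement noise values are assumptions of this example) illustrates that reading.

```python
def sequential_update(state, cov, samples, process_var=0.5,
                      meas_var={"radar": 0.2, "camera": 0.8}):
    """Sequentially update a scalar state with time-stamped samples from two
    sensors.  `samples` is a list of (timestamp, sensor, value) sorted by time.
    Samples sharing a timestamp are fused within one instant; otherwise the
    state is updated with whichever sensor is available (illustrative values)."""
    t_prev = None
    for t, sensor, z in samples:
        if t_prev is not None and t != t_prev:
            cov += process_var * (t - t_prev)   # simple time (prediction) update
        # Scalar Kalman-style measurement update for this sensor.
        gain = cov / (cov + meas_var[sensor])
        state = state + gain * (z - state)
        cov = (1.0 - gain) * cov
        t_prev = t
    return state, cov

# Two sensors reporting at different rates; fused whenever timestamps coincide.
samples = [(0.00, "radar", 10.1), (0.00, "camera", 10.4),
           (0.05, "radar", 10.6), (0.10, "radar", 11.0), (0.10, "camera", 11.3)]
print(sequential_update(10.0, 1.0, sorted(samples)))
```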

Sensor Prioritization 210

[0082] In an aspect, track prediction is performed by utilizing the prediction of the more accurate sensor as primary/dominant. The secondary sensor is re-evaluated for track management by using the validation gate of the prediction model. In an aspect, the validation gate provides a probabilistic mechanism to remove the uncertainty of state estimation and re-evaluation of the validation gate. In another aspect, the more accurate sensor among the camera and the RADAR, with minimal estimation errors over a period, is selected as the primary one so as to achieve better accuracy. In an aspect, a dynamic changeover of the sensor selection strategy that overrides the dominance of one sensor over the other is adopted, which is primarily based on the least square errors of estimation with respect to sensor 1 and sensor 2 measurements.

[0083] FIG. 4 illustrates a sensor prioritization flow diagram 400 in accordance with an embodiment of the present disclosure. According to an aspect, the proposed sensor prioritization flow can include the steps of, at step 402, receiving RADAR sensed data for the present time frame, at step 404, receiving camera sensed data for the present time frame, at step 406, initializing RADAR once as the primary/dominant sensor at the start of the time frame, at step 408, continuous parallel computing assuming RADAR as the primary sensor and taking into consideration the least square errors of estimation with respect to the RADAR sensed data and the camera sensed data for n seconds, and at step 410, computing assuming the camera as the primary sensor and taking into consideration the least square errors of estimation with respect to the RADAR sensed data and the camera sensed data for n seconds. The method further includes the steps of, at step 412, performing a check to evaluate whether K_cam * J_cam < K_rad * J_rad, wherein if K_cam * J_cam is not lesser than K_rad * J_rad, at step 414, selecting RADAR as the primary sensor, and if K_cam * J_cam < K_rad * J_rad, at step 416, selecting the camera as the primary sensor for further processing. In an aspect, the compared quantities are defined as given below:

Where,

K_rad = constant associated with the RADAR measurement computed as a factor of normalization based on the sensor's static noise characteristic,

K_cam = constant associated with the camera measurement computed as a factor of normalization based on the sensor's static noise characteristic,

J_rad = cost function for least square errors with RADAR as the primary sensor, and

J_cam = cost function for least square errors with the camera as the primary sensor.
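
By way of a non-limiting illustration only, under the definitions above the comparison of step 412 can be read as weighing the normalized least square cost of each sensor over the last n seconds. The sketch below (hypothetical Python; the window handling, the squared-error cost and the normalization constants are assumptions of this example) implements that reading.

```python
def least_square_cost(estimates, measurements):
    """J: sum of squared errors between estimated and measured target
    positions over the evaluation window (illustrative cost function)."""
    return sum((e - m) ** 2 for e, m in zip(estimates, measurements))

def select_primary_sensor(radar_est, radar_meas, cam_est, cam_meas,
                          k_rad=1.0, k_cam=1.2):
    """Compare the normalized costs K_rad*J_rad and K_cam*J_cam accumulated
    over the last n seconds and return the sensor to be used as primary.
    k_rad / k_cam stand in for the normalization constants derived from the
    sensors' static noise characteristics (values here are invented)."""
    j_rad = least_square_cost(radar_est, radar_meas)
    j_cam = least_square_cost(cam_est, cam_meas)
    # Camera becomes primary only if its normalized cost is strictly lower.
    return "camera" if k_cam * j_cam < k_rad * j_rad else "radar"

# Example: over the window the camera estimate deviates more, so RADAR stays primary.
print(select_primary_sensor(radar_est=[10.0, 10.5, 11.1], radar_meas=[10.1, 10.6, 11.0],
                            cam_est=[10.4, 11.2, 11.9], cam_meas=[10.1, 10.6, 11.0]))
```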

Weighted Track Initialization 212

[0084] It would be appreciated that track initialization is the process of finding out what objects to track, wherein an exemplary objective of object tracking is to identify and track relevant moving objects in a scene/captured images and generate exactly one track per object, which involves detecting the moving objects, tracking them while they are visible, and reacquiring them once they emerge from an occlusion so as to maintain identity. This is an extremely difficult problem, made even more difficult when the sensor is moving. In many real-world problems, it refers to figuring out which motion signals correspond to real moving objects. For example, background subtraction provides a signal on which pixels are "moving", but sometimes it produces false positives.

[0085] In an aspect, once the sensor prioritization 210 is executed, weighted track initialization 212 is performed. In an aspect, based on the weighted track initialization 212 information, target features of, say, high velocity manoeuvring objects/targets are identified and/or classified from clusters of data points derived based on sensed signals from the RADAR 202 and the camera 204.

[0086] FIG. 5 illustrates grid management of weighted track initialization 500 in accordance with an embodiment of the present invention. According to an aspect, a grid based track management and initialization mechanism is employed to initialize tracking of identified objects based on the weightage of the objects in the grid, which eliminates background clutter and false positives in order to manage the track with better accuracy. Such mechanisms also eliminate the impact of one or more objects that are not in the scope of interest.

[0087] In an aspect, weightage factors of the tracked object sensed by the multiple sensors of the host vehicle are defined by one or more attributes, including but not limited to, quality of signals from RADAR (Q_rad), quality of signals from camera (Q_cam), identified size (S_obj), relative velocity (Vel_obj), and relative yaw rate of the identified object (Yawrate_obj), respectively. In an aspect, the validation time for track ID (T_ID) initialization is based on the weightage factors, cell identification, and the sensor data availability factor (SDAF) of the grid map.

[0088] In an aspect, reliability of sensed data at longer longitudinal distances is reduced for the camera sensor, and feature extraction is complex for the RADAR sensor; therefore, the mechanism of grid based weighted discretion of the validation for track ID initialization accounts for such hindrances and enables tracks only after validating the reliability of the sensed track. In an aspect, the grid validation time for target track initialization TID is computed as a function of the grid weightage factor Wijk and SDAF, wherein, in an exemplary aspect, Wijk = grid weightage factor for the grid cell identifier (i indicates x-axis, j indicates y-axis, k indicates left or right side of sensor), and SDAF = sensor data availability factor.

[0089] According to one embodiment, the grid weightage factor is based on various parameters, including but not limited to, Qrad, Qcam, Sobj, Velobj, and Yawrateobj of the tracked object.

According to an embodiment, the quality of signal weightage parameters are computed as a function of one or more calibration parameters and the RADAR cross section, wherein Ka denotes a calibration parameter and RCS is the RADAR cross section. In an embodiment, track initialization is performed when all the weightage factors (Qcam, Qrad, Sobj, Velobj, Yawrateobj) are at their optimal values.
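Since the exact quality and weightage formulas are not reproduced above, the following Python sketch only illustrates the general idea, under stated assumptions, of normalizing the RADAR cross section with a calibration parameter and combining the weightage factors of paragraph [0087]; the function names, the clipping, and the particular weights are illustrative assumptions, not the disclosed equations.

def radar_quality(rcs, k_a=0.05):
    # Assumed form: RADAR cross section scaled by calibration parameter k_a and
    # clipped to the range [0, 1].
    return max(0.0, min(1.0, k_a * rcs))

def grid_weightage(q_rad, q_cam, size_obj, vel_obj, yawrate_obj,
                   weights=(0.3, 0.3, 0.2, 0.1, 0.1)):
    # Assumed weighted sum of the (normalized) factors listed in paragraph [0087].
    factors = (q_rad, q_cam, size_obj, vel_obj, yawrate_obj)
    return sum(w * f for w, f in zip(weights, factors))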

[0090] FIG. 6 illustrates weighted track initialization 600 in accordance with an embodiment of the present invention. As can be seen, the outputs from the RADAR 202 and the camera 204 are used to form respective grids at 602-1 and 602-2, wherein weightages are then associated with each grid (at 604-1 and 604-2) based on the above-mentioned parameters. In an aspect, at block 606, grid coverage and time update is performed for each sensor based grid, the output of which is provided to fused track initialization at block 608. In an aspect, both grids (602-1 and 602-2) are in communication with block 608, wherein the communication is bidirectional. As can further be seen, the grid weightages (604-1 and 604-2) bi-directionally communicate with block 606, wherein the bidirectional communication between the RADAR based grid weightage 604-1 and block 606 is performed to exchange the track weightage factor Wrad,ijk, and the bidirectional communication between the camera based grid weightage 604-2 and block 606 is performed to exchange the track weightage factor Wcam,ijk. In an aspect, grid track association at 602-1 and 602-2 is utilized to associate tracks in the respective grids, wherein the grids include historical grid based tracks.

[0091] FIG. 7 illustrates a grid coverage and time update flow diagram 700 in accordance with an embodiment of the present invention. According to an aspect, the proposed method for grid coverage and time update includes the steps of, at step 702, receiving sensor data for the present time frame, at step 704, performing a check to evaluate whether signals from both sensors (camera and RADAR) are available in the grid, wherein if both sensor signals are available, at step 706, setting SDAF as high, and, at step 708, computing TID. In case, at step 704, both sensor signals are not available, at step 710, another check is performed to evaluate whether the primary sensor is available in the grid, wherein if the primary sensor is available, at step 712, setting SDAF as medium, and if the primary sensor is not available, at step 716, setting SDAF as low, wherein the SDAF set as medium or low is then utilized for computing TID at step 708.
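A minimal Python sketch of the grid coverage and time update flow of FIG. 7 is given below; the mapping from the SDAF level and the grid weightage to the validation time TID is not specified above, so the scaling used in validation_time is an assumption, and all names are illustrative only.

def sensor_data_availability_factor(radar_in_grid, camera_in_grid, primary='RADAR'):
    # Steps 704-716 of FIG. 7: both sensors present -> high; only the primary
    # sensor present -> medium; otherwise -> low.
    if radar_in_grid and camera_in_grid:
        return 'high'
    primary_available = radar_in_grid if primary == 'RADAR' else camera_in_grid
    return 'medium' if primary_available else 'low'

def validation_time(grid_weightage_ijk, sdaf, base_time=0.5):
    # Assumed form of TID = f(Wijk, SDAF): higher availability and higher grid
    # weightage shorten the time needed to validate a new track.
    scale = {'high': 1.0, 'medium': 2.0, 'low': 4.0}[sdaf]
    return base_time * scale / max(grid_weightage_ijk, 1e-6)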

[0092] In an embodiment, the track history of the grid and the computed grid validation time for target track initialization are utilized by fused track initialization 708 to assign a track ID (TID) to a target.

Target Prediction and Data Association 214

[0093] In an exemplary aspect, with reference to FIG. 2, sensor prioritization 210 in conjunction with target prediction and association 214 involves prediction of the target object and association of the prediction with one or more tracks each time a prediction is performed. In an aspect, target prediction, state update and data association are utilized for track validation.

[0094] In an aspect, a prediction is performed if a tracked vehicle is available, wherein such prediction is updated to include other updated tracked vehicles/objects.
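As a non-limiting illustration of target prediction and data association, the Python sketch below uses a standard constant-velocity prediction and a nearest-neighbour association inside a fixed gate; the disclosure does not fix the motion model or the association rule, so both, as well as the names predict and associate, are assumptions made only for illustration.

def predict(track, dt):
    # Constant-velocity prediction of a track given as (x, y, vx, vy).
    x, y, vx, vy = track
    return (x + vx * dt, y + vy * dt, vx, vy)

def associate(predicted_tracks, measurements, gate=3.0):
    # Pair each predicted track with the closest measurement inside the gate;
    # tracks with no measurement inside the gate are paired with None.
    pairs = {}
    for track_id, (px, py, _vx, _vy) in predicted_tracks.items():
        best, best_dist = None, gate
        for meas_id, (mx, my) in measurements.items():
            dist = ((px - mx) ** 2 + (py - my) ** 2) ** 0.5
            if dist < best_dist:
                best, best_dist = meas_id, dist
        pairs[track_id] = best
    return pairs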

[0095] FIGs. 8A and 8B illustrate RADAR and camera sensor data fusion 800 in accordance with an embodiment of the present invention. In an aspect, the sensor fusion and track validation mechanism is configured to perform track management, wherein the RADAR and camera sensor fusion data, derived from the data streams sensed by the camera and the RADAR, are utilized to initiate track management.

[0096] As can be seen in FIG. 8A, raw outputs from both the RADAR 202 and the camera 204 are taken and then pre-processed at blocks 802-1 and 802-2 respectively. Sensor prioritization is performed at blocks 804-1 and 804-2 on the pre-processed outputs of the two sensors, based on which weighted track initialization 806-1 and 806-2 is conducted so as to update track initialization values at blocks 808-1 and 808-2. At blocks 810-1 and 810-2, track management is implemented for both sensor outputs, and the output streams are utilized for state estimation and prediction at blocks 812-1 and 812-2, wherein the outputs of blocks 812-1 and 812-2 are input at 814-1 and 814-2 for sensor fusion, filter gain determination and state update, and the outputs of 814-1 and 814-2 are input for validation gating at 816-1 and 816-2. In an embodiment, the output of 814-1 and 814-2 is sent to block 818 for the integrated sequential synchronization strategy, and the output of validation gating 816-1 and 816-2 is sent to blocks 808-1 and 808-2. In an aspect, sensor fusion and state update at blocks 814-1 and 814-2 is in bidirectional communication with block 818. In an aspect, the implementation of FIG. 8A is utilized when the data streams received from the two sensors are not synchronized/out of sequence.

[0097] In an aspect, data association 808-1, track management 810-1, state estimation and prediction 812-1, and sensor data fusion, filter gain determination and state update 814-1 are part of the target prediction and data association itself.

[0098] FIG. 8B shows the same steps for processing of the data streams from the two sensors, wherein this implementation is utilized when the data streams received from both sensors are synchronized.

Validation and Gating 216

[0099] It would be appreciated that validation gates are used in target tracking to cull unlikely measurement-to-track associations (associations of detections and tracks) before remaining association ambiguities are handled by a more comprehensive (and expensive) data association scheme. An essential property of a gate is to accept a high percentage of correct associations, thus maximizing track accuracy, while providing a sufficiently tight bound to minimize the number of ambiguous associations.

[00100] FIG. 9 illustrates validation gating 900 in accordance with an embodiment of the present invention. Validation and gating 216 depends on the track history of positional coordinates and is a function of velocity, orientation and extracted features.

[00101] In an aspect, gating 216 is determined based on target vehicle attributes, wherein the attributes are any or a combination of the measured position of targets (Z), velocity of targets (V), orientation of the targets (Θ), and classified extracted features of the targets. Furthermore, the region of validation gating is updated by comparing the velocity and orientation of the measurements with the predicted velocity and orientation of the target. The classified extracted features include, but are not limited to, the dimension of the target, which is further used to update the region of validation gating.
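A minimal Python sketch of such a validation gate is given below, assuming simple per-attribute thresholds around the predicted position (Z), velocity (V), orientation (Θ) and target dimension; the threshold values, the dictionary layout and the function name are illustrative assumptions only.

import math

def in_validation_gate(meas, pred, pos_gate=2.0, vel_gate=1.5,
                       yaw_gate=0.3, dim_gate=0.5):
    # meas, pred: dicts with keys 'z' (x, y position), 'v' (speed),
    # 'theta' (orientation in radians) and 'dim' (target dimension).
    dx = meas['z'][0] - pred['z'][0]
    dy = meas['z'][1] - pred['z'][1]
    pos_ok = math.hypot(dx, dy) <= pos_gate
    vel_ok = abs(meas['v'] - pred['v']) <= vel_gate
    # Wrap the orientation difference into [-pi, pi] before gating.
    dtheta = abs((meas['theta'] - pred['theta'] + math.pi) % (2 * math.pi) - math.pi)
    yaw_ok = dtheta <= yaw_gate
    dim_ok = abs(meas['dim'] - pred['dim']) <= dim_gate
    return pos_ok and vel_ok and yaw_ok and dim_ok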

[00102] In an exemplary embodiment, varying signal receiving timings from the sensors (RADAR and/or camera) are managed in multiple ways; in one exemplary way, the two data streams are managed separately if the data recipient does not receive the signals at the same instance, whereas in another way, innovation and validation for fusion, filter gain update, state and co-variance updates, and track validation are performed simultaneously when the signals from both sensors are available at the same instance. In an aspect, track validation is discretized based on a single sensor between the sampling intervals of the sensors with different sampling rates.
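The two timing cases described above can be sketched in Python as follows, assuming a simple convex-combination update in place of the actual filter gain and state/co-variance update; blend and process_frame are hypothetical names used only for illustration.

def blend(state, meas, weight=0.5):
    # Placeholder update: convex combination of the current state and a
    # measurement (stands in for the actual filter gain and state update).
    return tuple((1 - weight) * s + weight * m for s, m in zip(state, meas))

def process_frame(state, radar_meas=None, camera_meas=None):
    if radar_meas is not None and camera_meas is not None:
        # Signals from both sensors available at the same instance:
        # update with both measurements in one processing step.
        state = blend(state, radar_meas)
        state = blend(state, camera_meas)
    elif radar_meas is not None:
        # Only the RADAR stream arrived: manage it separately.
        state = blend(state, radar_meas)
    elif camera_meas is not None:
        # Only the camera stream arrived: manage it separately.
        state = blend(state, camera_meas)
    return state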

[00103] FIG. 10 illustrates an exemplary representation of a method 1000 for track management in accordance with embodiments of the present disclosure. In an aspect, the method includes the steps of, at step 1002, synchronizing out of sequence data streams of a target vehicle, wherein the data streams are sensed by two different sensors; at step 1004, prioritizing a sensor; and, at step 1006, executing weighted track initialization to track identified objects based on the weightage of the objects in a grid formed based on the data streams. The method further includes the steps of, at step 1008, predicting the target and performing data association based on the data streams; and, at step 1010, validation gating the sensor fusion data based on the history of positional coordinates, velocity, orientation and features of the target vehicle.

[00104] FIG. 11 illustrates an exemplary computer system 1100 in which or with which embodiments of the present invention may be utilized.

[00105] Embodiments of the present disclosure include various steps, which have been described above. A variety of these steps may be performed by hardware components or may be tangibly embodied on a computer-readable storage medium in the form of machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with instructions to perform these steps. Alternatively, the steps may be performed by a combination of hardware, software, and/or firmware. As shown in the figure, computer system 1100 includes an external storage device 1110, a bus 1120, a main memory 1130, a read only memory 1140, a mass storage device 1150, a communication port 1160, and a processor 1170. A person skilled in the art will appreciate that computer system 1100 may include more than one processor and communication ports. Processor 1170 may include various modules associated with embodiments of the present invention. Communication port 1160 can be any of an RS-232 port for use with a modem based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. Communication port 1160 may be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which computer system 1100 connects. Memory 1130 can be Random Access Memory (RAM), or any other dynamic storage device commonly known in the art. Read only memory 1140 can be any static storage device(s), e.g., but not limited to, Programmable Read Only Memory (PROM) chips for storing static information, e.g., start-up or BIOS instructions for processor 1170. Mass storage 1150 may be any current or future mass storage solution, which can be used to store information and/or instructions. Bus 1120 communicatively couples processor(s) 1170 with the other memory, storage and communication blocks. Optionally, operator and administrative interfaces, e.g., a display, keyboard, and a cursor control device, may also be coupled to bus 1120 to support direct operator interaction with computer system 1100. Other operator and administrative interfaces can be provided through network connections connected through communication port 1160. External storage device 1110 can be any kind of external hard-drive, floppy drive, IOMEGA® Zip Drive, Compact Disc - Read Only Memory (CD-ROM), Compact Disc - Re-Writable (CD-RW), or Digital Video Disk - Read Only Memory (DVD-ROM). The components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system limit the scope of the present disclosure.

[00106] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C ... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc. The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt such specific embodiments for various applications without departing from the generic concept, and, therefore, such adaptations and modifications should be and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the appended claims.

[00107] Although the present disclosure has been described with reference to a system and method for improved track management in automotive applications, it should be appreciated that the same has been done merely to illustrate the disclosure in an exemplary manner, and any other purpose, function or application, i.e., tracking in any transportation system such as, but not limited to, aerospace, water transport, rail, off-road vehicles, space vehicles, etc., for which the explained structure or configuration can be used, is covered within the scope of the present disclosure.

[00108] While embodiments of the present disclosure have been illustrated and described, it will be clear that the disclosure is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the disclosure, as described in the claims.

ADVANTAGES OF THE INVENTION

[00109] The present disclosure provides a system and method for track management of high velocity objects/cars/targets while eliminating the probability of external clutter and false positives, in order to manage the track with better accuracy.

[00110] The present disclosure provides a system and method for track management of high velocity objects/cars/targets while eliminating the impact of one or more objects that are not in the scope of interest.

[00111] The present disclosure provides a system and method for target track management that can be performed to enhance track maintenance in complex scenarios such as crowded cities and unpredictable movement of vehicles and pedestrians.

[00112] The present disclosure provides a system and method for track management that can also be used for non-linear and highly manoeuvring relative movement of targets with respect to the host vehicle.