

Title:
AUTONOMOUS VEHICLE MOTION PLANNING
Document Type and Number:
WIPO Patent Application WO/2019/108993
Kind Code:
A1
Abstract:
A system and method for generating simulated vehicles with configured behaviors for analyzing autonomous vehicle motion planners are disclosed. To generate the simulated vehicles, perception data from a plurality of perception data sensors is received. Configuration instructions and data including pre-defined parameters and executables defining a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles are obtained. A target position and target speed are generated for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data. A plurality of trajectories and acceleration profiles to transition each of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed is generated.

Inventors:
LI XINGDONG (US)
SUN XING (US)
LIN WUTU (US)
LIU LIU (US)
Application Number:
PCT/US2018/063405
Publication Date:
June 06, 2019
Filing Date:
November 30, 2018
Assignee:
TUSIMPLE (US)
International Classes:
G05D1/00
Foreign References:
US20170123428A12017-05-04
US20090276111A12009-11-05
US20150081156A12015-03-19
US20170287335A12017-10-05
US20140236414A12014-08-21
Attorney, Agent or Firm:
SATHE, Vinay et al. (US)
Claims:
CLAIMS

What is claimed is:

1. A system comprising:

a processor;

a perception data collection module, executable by the processor, configured to receive perception data from a plurality of perception data sensors;

a dynamic vehicle configuration module, executable by the processor, configured to obtain configuration instructions and data including pre-defined parameters and executables that define a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles;

a dynamic vehicle simulation module configured to generate a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data; and

a trajectory generator configured to generate a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed as generated by the dynamic vehicle simulation module.

2. The system of claim 1 wherein the perception data sensors are capable of collecting information related to at least one of image generation, light amplification by stimulated emission of radiation (laser), light detection and ranging (LIDAR), global positioning, sound navigation and ranging (sonar), radio detection and ranging (radar), and distance measuring.

3. The system of claim 1 wherein the perception data representing real-world traffic environment information relates to locations, routings, scenarios, and driver behaviors.

4. The system of claim 1 wherein the configuration instructions and data represent at least one simulated dynamic vehicle with an aggressive driving behavior and at least one simulated dynamic vehicle with a conservative driving behavior.

5. The system of claim 1 wherein the dynamic vehicle simulation module uses a rule-based process and corresponding data structures for generating the target position and target speed corresponding to the specific behavior of each simulated dynamic vehicle.

6. The system of claim 1 wherein the dynamic vehicle simulation module is further configured to generate a target heading for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles.

7. The system of claim 1 wherein the trajectory generator is further configured to generate a plurality of waypoints for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the waypoints representing movement and behavior of each simulated dynamic vehicle in a simulation environment.

8. The system of claim 1 wherein the plurality of trajectories and acceleration profiles for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles is used for analyzing a control system of an autonomous vehicle.

9. A method comprising:

receiving perception data from a plurality of perception data sensors;

obtaining configuration instructions and data including pre-defined parameters and executables that define a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles;

generating a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data; and

generating a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed.

10. The method of claim 9 wherein the perception data sensors are capable of collecting information related to at least one of image generation, light amplification by stimulated emission of radiation (laser), light detection and ranging (LIDAR), global positioning, sound navigation and ranging (sonar), radio detection and ranging (radar), and distance measuring.

11. The method of claim 9 wherein the perception data representing real-world traffic environment information relates to locations, routings, scenarios, and driver behaviors.

12. The method of claim 9 wherein the configuration instructions and data represent at least one simulated dynamic vehicle with an aggressive driving behavior and at least one simulated dynamic vehicle with a conservative driving behavior.

13. The method of claim 9, further comprising:

generating the target position and target speed corresponding to the specific behavior of each simulated dynamic vehicle using a rule-based process and corresponding data structures.

14. The method of claim 9, further comprising:

generating a target heading for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles.

15. The method of claim 9, further comprising:

generating a plurality of waypoints for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the waypoints representing movement and behavior of each simulated dynamic vehicle in a simulation environment.

16. The method of claim 9, further comprising:

analyzing a control system of an autonomous vehicle using the plurality of trajectories and acceleration profiles for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles.

17. A non-transitory machine-useable storage medium embodying instructions which, when executed by a machine, cause the machine to:

receive perception data from a plurality of perception data sensors;

obtain configuration instructions and data including pre-defined parameters and executables that define a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles;

generate a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data; and

generate a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed.

18. The non-transitory machine-useable storage medium of claim 17 wherein the perception data representing real-world traffic environment information relates to locations, routings, scenarios, and driver behaviors being monitored.

19. The non-transitory machine-useable storage medium of claim 17 wherein the configuration instructions and data represent at least one simulated dynamic vehicle with an aggressive driving behavior and at least one simulated dynamic vehicle with a conservative driving behavior.

20. The non-transitory machine-useable storage medium of claim 17, the instructions further causing the machine to:

generate the target position and target speed corresponding to the specific behavior of each simulated dynamic vehicle using a rule-based process and corresponding data structures.

21. The non-transitory machine-useable storage medium of claim 17, the instructions further causing the machine to: generate a target heading for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles.

22. The non-transitory machine-useable storage medium of claim 17, wherein the trajectory generator is further configured to generate a plurality of waypoints for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the waypoints representing movement and behavior of each simulated dynamic vehicle in a simulation environment.

23. The non-transitory machine-useable storage medium of claim 17, wherein the plurality of trajectories and acceleration profiles for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles is used for analyzing a control system of an autonomous vehicle.

Description:

PRIORITY CLAIM

[0001] This present document claims priority to U.S. Patent Application 15/827,583 entitled “SYSTEM AND METHOD FOR GENERATING SIMULATED VEHICLES WITH CONFIGURED BEHAVIORS FOR ANALYZING AUTONOMOUS VEHICLE MOTION PLANNERS” filed on November 30, 2017, which is incorporated herein by reference in its entirety.

TECHNICAL FIELD

[0002] The disclosed subject matter pertains generally to tools (systems, apparatuses, methodologies, computer program products, etc.) for autonomous driving simulation systems, trajectory planning, vehicle control systems, and autonomous driving systems, and more particularly, but not by way of limitation, to a system and method for generating simulated vehicles with configured behaviors for analyzing autonomous vehicle motion planners.

BACKGROUND

[0003] An autonomous vehicle is often configured to follow a trajectory based on a computed driving path generated by a motion planner. However, when variables such as obstacles (e.g., other dynamic vehicles) are present on the driving path, the autonomous vehicle must use its motion planner to modify the computed driving path and perform corresponding control operations so that the vehicle can be driven safely around the obstacles. Motion planners for autonomous vehicles can be very difficult to build and configure. The logic in the motion planner must be able to anticipate, detect, and react to a variety of different driving scenarios, such as the actions of the dynamic vehicles in proximity to the autonomous vehicle.

[0004] In most cases, it is dangerous or infeasible to test autonomous vehicle motion planners in real-world driving environments. As such, simulators can be used to test autonomous vehicle motion planners. However, to be effective in testing autonomous vehicle motion planners, these simulators must be able to realistically model the behaviors of the simulated dynamic vehicles in proximity to the autonomous vehicle in a variety of different scenarios. Conventional simulators have been unable to overcome the challenges of modeling driving behaviors of the simulated proximate dynamic vehicles to make the behaviors of the simulated dynamic vehicles as similar to real driver behaviors as possible. Moreover, conventional simulators have been unable to achieve a level of efficiency and capacity necessary to provide an acceptable test tool for autonomous vehicle motion planners.

[0005] Thus, a more efficient simulator system for generating simulated vehicles with configured behaviors for analyzing autonomous vehicle motion planners is necessary.

SUMMARY

[0006] The present document discloses techniques for generating simulated vehicles with configured behaviors for analyzing autonomous vehicle motion planners. Disclosed techniques may be implemented in a dynamic vehicle simulation system to generate simulated dynamic vehicles with various driving behaviors to test, evaluate, or otherwise analyze autonomous vehicle motion planning systems, which will be used in real autonomous vehicles in actual driving environments.

[0007] In one example aspect, a system comprises a perception data collection module, executable by a processor, configured to receive perception data from a plurality of perception data sensors. A dynamic vehicle configuration module is configured to obtain configuration instructions and data including pre-defined parameters and executables that define a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles. A dynamic vehicle simulation module is configured to generate a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data. A trajectory generator is configured to generate a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed as generated by the dynamic vehicle simulation module.

[0008] In another aspect, a method is disclosed for receiving perception data from a plurality of perception data sensors; obtaining configuration instructions and data including pre-defined parameters and executables that define a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles; generating a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data; and generating a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed.

[0009] In another aspect, a non-transitory machine-useable storage medium embodying instructions which, when executed by a machine, cause the machine to: receive perception data from a plurality of perception data sensors; obtain configuration instructions and data including pre-defined parameters and executables defining a specific driving behavior for each of a plurality of simulated dynamic vehicles; generate a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data; and generate a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed.

[0010] These, and other aspects, are disclosed in the present document.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] The various embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings. More particular descriptions and equally effective implementations are included in this disclosure.

[0012] Fig. 1 illustrates the components of the dynamic vehicle simulation system according to one example embodiment.

[0013] Fig. 2 illustrates a process flow diagram according to an example embodiment of a system and method for generating simulated vehicles with configured behaviors for analyzing autonomous vehicle motion planners.

[0014] Fig. 3 illustrates a diagrammatic representation of a machine in an exemplary form of a computer system within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies discussed herein.

[0015] Identical reference numerals have been used, where possible, to designate identical elements that are common to the figures. It is contemplated that elements disclosed in one implementation may be beneficially utilized in other implementations without specific recitation.

DETAILED DESCRIPTION

[0016] In the following detailed description, reference is made to the accompanying drawings that form specific embodiments by way of illustration in which the disclosed subject matter can be practiced. However, it should be understood that other embodiments may be utilized, and structural changes may be made without departing from the scope of the disclosed subject matter. Any combination of the following features and elements is contemplated to implement and practice the disclosure.

[0017] In one example aspect, a system and method for generating simulated vehicles with configured behaviors for analyzing autonomous vehicle motion planners are disclosed. To generate the simulated vehicles, perception data from a plurality of perception data sensors is received. Configuration instructions and data including pre-defined parameters and executables defining a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles are obtained. A target position and target speed are generated for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data. A plurality of trajectories and acceleration profiles to transition each of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed is generated.

[0018] In the description, common or similar features may be designated by common reference numbers. As used herein, “exemplary” may indicate an example, an implementation, or an aspect, and should not be construed as limiting or as indicating a preference or a preferred implementation.

[0019] Autonomous vehicles currently face several technical limitations hindering their interaction with and adaptability to the real world.

[0020] Current autonomous vehicle technology is often reactive; that is, decisions are based on a current condition or status. For instance, autonomous vehicles may be programmed to make an emergency stop upon detecting an object in the middle of the road. However, current autonomous vehicle technology has a limited capacity to determine the likelihood of being hit from behind or the probability of causing a highway pileup due to quick braking.

[0021] Furthermore, current technology does not know how to make real-world judgment calls. Various objects on the roadway require different judgments based on the context and current conditions. For instance, swerving to avoid a cardboard box causes unnecessary danger to the autonomous vehicle and other drivers. On the other hand, swerving is necessary to avoid hitting persons in the middle of the roadway. The judgment calls depend on the road conditions, the trajectory of other vehicles, the speed of the autonomous vehicle, and the speed of other vehicles.

[0022] Additionally, current technology is not suitable for an environment with other human drivers. Autonomous vehicles must be able to predict the behaviors of other drivers or pedestrians when reacting to changes in traffic patterns. One goal for real-world acceptance of autonomous vehicles is for them to behave in a manner that allows proper interaction with other human drivers and vehicles. Human drivers often make decisions in traffic based on predictable human responses that are not necessarily conducive to machine rules. In other words, there is a technical problem with autonomous vehicles in that current autonomous driving vehicles behave too much like a machine. This behavior potentially causes accidents because other drivers do not anticipate certain acts performed by the autonomous vehicle.

[0023] The present document provides technical solutions to the above problems, among other solutions. For example, a simulation environment is used to simulate various road conditions and driver behaviors based on sensory perception data. The simulation environment helps the ultimate trajectory decision and other autonomous vehicle judgments become more human-like, thereby reducing problems for surrounding vehicles and pedestrians. The simulation environment is also used to simulate behavior based on fluctuating driver behaviors, so that the autonomous vehicle has fewer difficulties interacting with these driving styles. Thus, the disclosure provides a simulation environment as a solution to the above problems, among other solutions.

[0024] In another example aspect, a system and method for generating simulated vehicles with configured behaviors for analyzing autonomous vehicle motion planners are disclosed herein. Specifically, the present disclosure describes a dynamic vehicle simulation system to generate simulated dynamic vehicles with various driving behaviors to test, evaluate, or otherwise analyze autonomous vehicle motion planning systems, which will be used in real autonomous vehicles in actual driving environments. The simulated dynamic vehicles (also denoted herein as non-player characters or NPC vehicles) generated by the simulation system of various example embodiments described herein can model the vehicle behaviors that would be performed by actual vehicles in the real world, including lane change, overtaking, acceleration behaviors, and the like.

[0025] Fig. 1 illustrates the components of the dynamic vehicle simulation system 102, and the dynamic vehicle simulation module 200 therein, according to one example embodiment. In particular, the dynamic vehicle simulation system 102 can include a perception data collection module 204. The perception data collection module 204 can be executed by a data processor 171 of the dynamic vehicle simulation system 102. The perception data collection module 204 can include an array of interfaces for receiving perception data from a plurality of perception information sensing devices or perception data sensors 110.

[0026] The perception data sensors 110 may include image generating devices (e.g., cameras), light amplification by stimulated emission of radiation (laser) devices, light detection and ranging (LIDAR) devices, global positioning system (GPS) devices, sound navigation and ranging (sonar) devices, radio detection and ranging (radar) devices, other distance measuring systems, vehicular transducers, and the like. The perception data gathered by the perception data sensors 110 at various traffic locations can include traffic or vehicle image data, roadway data, environmental data, distance data from LIDAR or radar devices, and other sensor information received from the perception data sensors 110 positioned adjacent to particular roadways (e.g., monitored locations) or installed on stationary test vehicles. Additionally, the perception data sensors 110 can include perception data gathering devices installed in or on moving test vehicles being navigated through pre-defined routings in an environment or location of interest. The perception data can include data from which a presence, position, and velocity of neighboring vehicles in the vicinity of or proximate to a host vehicle, autonomous vehicle, or simulated vehicle can be obtained or calculated.
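
By way of illustration only, and not as part of the claimed subject matter, the following Python sketch shows one hypothetical way a single perceived neighboring vehicle could be represented within the perception data described above; the class name, fields, units, and example values are assumptions chosen for this sketch.

from dataclasses import dataclass
from typing import List

@dataclass
class PerceivedVehicle:
    """One neighboring vehicle observed by the perception data sensors."""
    vehicle_id: int
    x: float        # longitudinal position along the roadway, in meters
    y: float        # lateral position or lane offset, in meters
    speed: float    # meters per second
    heading: float  # radians
    source: str     # e.g., "camera", "lidar", "radar"

# A perception frame for one time step is simply a collection of such records.
frame: List[PerceivedVehicle] = [
    PerceivedVehicle(1, 120.0, 3.5, 27.0, 0.0, "lidar"),
    PerceivedVehicle(2, 85.0, 0.0, 31.0, 0.0, "radar"),
]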

[0027] The perception data collection module 204 can collect actual trajectories of vehicles under different scenarios and different driver behaviors. The different scenarios can correspond to different locations, different traffic patterns, different environmental conditions, and the like. The scenarios can be represented, for example, by an occupancy grid, a collection of vehicle states on a map, or a graphical representation, such as a top-down image of one or more areas of interest. The driver behaviors can correspond to a driver’s short-term driving activity, such as changing lanes to the left or right, overtaking other vehicles, accelerating/decelerating, merging to/from a ramp, making a left or right turn at an intersection, making a U-turn, and the like. The driver behaviors can also correspond to a set of driver or vehicle control actions to accomplish the particular short-term driving activity.
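
For illustration only, the short-term driving activities listed above might be encoded in a simulation as a simple enumeration such as the following sketch; the enumeration name and members are assumptions and do not limit the behaviors the system can represent.

from enum import Enum, auto

class DrivingActivity(Enum):
    """Short-term driving activities a simulated driver may perform."""
    CHANGE_LANE_LEFT = auto()
    CHANGE_LANE_RIGHT = auto()
    OVERTAKE = auto()
    ACCELERATE = auto()
    DECELERATE = auto()
    MERGE_TO_RAMP = auto()
    MERGE_FROM_RAMP = auto()
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    U_TURN = auto()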

[0028] The image data and other perception data collected by the perception data collection module 204 reflects truly realistic, real-world traffic environment information related to the locations or routings, the scenarios, and the driver behaviors being monitored. Using the standard capabilities of well-known data collection devices, the gathered traffic and vehicle image data and other perception or sensor data can be wirelessly transferred (or otherwise transferred) to a data processor of a standard computing system, upon which the perception data collection module 204 can be executed. Alternatively, the gathered traffic and vehicle image data and other perception or sensor data can be stored in a memory device at the monitored location or in the test vehicle and transferred later to the data processor of the standard computing system. The traffic and vehicle image data, other perception or sensor data, and the driver behavior data gathered or calculated by the perception data collection module 204 can be used to generate simulated proximate dynamic vehicles for a simulation environment as described in more detail below.

[0029] As described above, the dynamic vehicle simulation system 102 can gather the perception data collected by the perception data collection module 204. This perception data can be used in a simulation environment, produced by the dynamic vehicle simulation system 102, to create corresponding simulations of proximate dynamic vehicles or object trajectories in the simulation environment. As a result, the example embodiments use the perception data collection module 204 to collect perception data that can be used to infer corresponding human driving behaviors. Then, the example embodiments can use the dynamic vehicle simulation system 102 in the simulation environment to simulate proximate dynamic vehicles with configurable human driving behaviors based in part on the collected perception data.

[0030] Referring again to Fig. 1, the dynamic vehicle simulation system 102 can include a dynamic vehicle configuration module 206, a set of dynamic vehicle configuration data 208, and a dynamic vehicle simulation module 210. The dynamic vehicle configuration module 206 and the dynamic vehicle simulation module 210 can be executed by a data processor 171 of the dynamic vehicle simulation system 102. The dynamic vehicle configuration data 208 can be stored in a memory device or system 172 of the dynamic vehicle simulation system 102. The dynamic vehicle configuration module 206 can be configured to read portions of the pre-defined data retained as the dynamic vehicle configuration data 208 to obtain pre-defined parameters and executables for each of a plurality of dynamic vehicles being simulated by the dynamic vehicle simulation module 210, described in more detail below. The pre-defined parameters and executables for each simulated dynamic vehicle constitute configuration instructions and data defining a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles. The configuration instructions and data enable the dynamic vehicle simulation module 210 to generate a simulation of a particular dynamic vehicle with a specific driving behavior. For example, the configuration instructions and data for a particular dynamic vehicle can cause the dynamic vehicle simulation module 210 to generate a simulation of the particular dynamic vehicle with an aggressive driving behavior. In at least one implementation, the aggressive driving behavior can correspond to a simulated dynamic vehicle that frequently changes lanes, exhibits steep acceleration and deceleration rates, and travels close to other neighboring vehicles.

[0031] In another example, the configuration instructions and data for a particular dynamic vehicle can cause the dynamic vehicle simulation module 210 to generate a simulation of the particular dynamic vehicle with a conservative driving behavior. In at least one implementation, the conservative driving behavior can correspond to a simulated dynamic vehicle that infrequently changes lanes, exhibits moderate acceleration and deceleration rates, and maintains a greater distance from other neighboring vehicles. Other specific driving behaviors can be simulated using the configuration instructions and data defined in the dynamic vehicle configuration data 208 and processed by the dynamic vehicle configuration module 206.
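
Purely as an illustrative sketch, the dynamic vehicle configuration data 208 for an aggressive and a conservative driving behavior might be captured by a small parameter set such as the following; the parameter names and numeric values are assumptions chosen for this example and are not values prescribed by the disclosure.

from dataclasses import dataclass

@dataclass
class BehaviorConfig:
    """Hypothetical pre-defined parameters for one simulated driving behavior."""
    lane_change_rate: float   # expected lane changes per minute
    max_acceleration: float   # meters per second squared
    max_deceleration: float   # meters per second squared
    min_following_gap: float  # desired headway to the vehicle ahead, in seconds

AGGRESSIVE = BehaviorConfig(lane_change_rate=2.0, max_acceleration=3.5,
                            max_deceleration=6.0, min_following_gap=0.8)
CONSERVATIVE = BehaviorConfig(lane_change_rate=0.2, max_acceleration=1.5,
                              max_deceleration=3.0, min_following_gap=2.5)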

[0032] The dynamic vehicle simulation module 210, of an exemplary embodiment as shown in Fig. 1, can receive the perception data from the perception data collection module 204 and the configuration instructions and data for each simulated dynamic vehicle from the dynamic vehicle configuration module 206. The received perception data can inform the dynamic vehicle simulation module 210 of the environment surrounding the particular dynamic vehicle being simulated. For example, the perception data can include information indicative of the presence, position, and velocity of neighboring vehicles in the vicinity of or proximate to a host vehicle, an autonomous vehicle, or simulated dynamic vehicle. The perception data can also include information indicative of the presence and position of obstacles, the location of the available roadways, traffic patterns, and other environmental information. The configuration instructions and data for each dynamic vehicle to be simulated can inform the dynamic vehicle simulation module 210 of the specific configurable driving behaviors to be modeled for the particular dynamic vehicle being simulated. Given the perception data and the configuration instructions and data for each dynamic vehicle being simulated, the dynamic vehicle simulation module 210 can generate a proposed or target position, speed, and heading for each particular dynamic vehicle being simulated at specific points in time. The proposed or target position, speed, and heading for each simulated dynamic vehicle can be generated based on the received perception data and the specific configuration instructions and data.

[0033] In at least one embodiment, the dynamic vehicle simulation module 210 can use a rule-based process and corresponding data structures to determine and generate the target position, speed, and heading corresponding to the specific behavior of each simulated dynamic vehicle based on the configuration instructions and data corresponding to each simulated dynamic vehicle. In the example embodiment, the specific behavior of a simulated dynamic vehicle, as represented in the rule-based process and corresponding data structures, can be modeled using the target position and direction of the simulated dynamic vehicle along with the target speed of the simulated dynamic vehicle. The target position/direction and target speed of the simulated dynamic vehicle correspond to the position and speed of a vehicle that would be expected given the pre-defined dynamic vehicle configuration data 208 as described above. Thus, the target position/direction and target speed of the simulated dynamic vehicle correspond to the pre-configured behavior for a specific simulated dynamic vehicle. The target position/direction and target speed are therefore likely to differ from one simulated dynamic vehicle to another, because each corresponds to the pre-configured behavior of that specific simulated dynamic vehicle.

[0034] In at least one embodiment, a particular simulated dynamic vehicle having a pre-configured behavior corresponding to an aggressive driver may be more likely (higher probability) to have a determined target position/direction and target speed associated with a lane change, passing maneuver, sharp turn, or sudden stop. A different simulated dynamic vehicle having a pre-configured behavior corresponding to a conservative driver may be less likely (lower probability) to have a determined target position/direction and target speed associated with a lane change, passing maneuver, sharp turn, or sudden stop.
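
As a non-limiting sketch of one such rule-based step, the function below maps a behavior configuration (such as the BehaviorConfig sketched above) and a simple view of the surrounding traffic to a target position and target speed, with the lane-change probability weighted by the configured behavior. The specific rules, thresholds, and probabilities are assumptions for illustration only and are not the rule set used by the disclosed system.

import random

def generate_target(current_x, current_speed, lead_gap_s, config,
                    speed_limit=30.0, horizon_s=2.0):
    """Return (target_x, target_speed, lane_change) for one simulation step.

    lead_gap_s is the time headway to the vehicle ahead in seconds, or None if
    no vehicle is ahead; config carries the pre-defined behavior parameters.
    """
    # Rule 1: slow down when closer than the configured headway, otherwise
    # speed up toward the speed limit at the configured acceleration.
    if lead_gap_s is not None and lead_gap_s < config.min_following_gap:
        target_speed = max(0.0, current_speed - config.max_deceleration * horizon_s)
    else:
        target_speed = min(speed_limit,
                           current_speed + config.max_acceleration * horizon_s)
    # Rule 2: aggressive configurations attempt lane changes more often.
    lane_change = random.random() < (config.lane_change_rate * horizon_s / 60.0)
    # Target position assumes roughly constant acceleration over the horizon.
    target_x = current_x + 0.5 * (current_speed + target_speed) * horizon_s
    return target_x, target_speed, lane_change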

[0035] As a result, a target position/direction and target speed of each simulated dynamic vehicle conforming to the configured driving behavior for each simulated dynamic vehicle can be generated by the dynamic vehicle simulation module 210. The target position/direction and target speed can be passed to a trajectory generator 212 as shown in Fig. 1.

[0036] As illustrated in Fig. 1, the trajectory generator 212 can receive the target position/direction and target speed generated by the dynamic vehicle simulation module 210 for each simulated dynamic vehicle as described above. The trajectory generator 212 can generate a trajectory to transition a particular simulated dynamic vehicle from its current position/direction and speed to the target position/direction and target speed as generated by the dynamic vehicle simulation module 210.

[0037] In an example embodiment, the trajectory generator 212 can include a path sampler module 214 and a speed sampler module 216. The path sampler module 214 can generate multiple paths or trajectories from the particular simulated dynamic vehicle’s current position to the target position. The multiple trajectories enable a selection of a particular trajectory based on the presence of obstacles or the accommodation of other simulation goals, such as safety, fuel-efficiency, and the like. The speed sampler module 216 can generate multiple acceleration profiles to transition the particular simulated dynamic vehicle from its current speed to the target speed. Again, the multiple acceleration profiles enable a selection of a particular acceleration profile based on the presence of obstacles or the accommodation of other simulation goals, such as safety, fuel-efficiency, and the like. The multiple trajectories and multiple acceleration profiles for each simulated dynamic vehicle can be represented as waypoints each having a corresponding position, speed, acceleration, and time. The waypoints generated by the trajectory generator 212 can represent the movements and behaviors of each simulated dynamic vehicle in the simulation environment.
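
The following sketch illustrates, under simplifying assumptions (a straight path and piecewise-constant acceleration), how a speed sampler might generate several candidate acceleration profiles and express each as time-stamped waypoints. The names, the candidate accelerations, and the integration scheme are assumptions for this example and are not the only way to implement the trajectory generator 212.

from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    """One sample of a candidate trajectory for a simulated dynamic vehicle."""
    t: float      # seconds from the current time
    x: float      # position along the path, in meters
    speed: float  # meters per second
    accel: float  # meters per second squared

def sample_trajectories(current_x, current_speed, target_speed,
                        horizon_s=4.0, dt=0.5,
                        accel_candidates=(0.5, 1.0, 2.0)) -> List[List[Waypoint]]:
    """Return several candidate trajectories, each a list of waypoints."""
    trajectories = []
    for a_mag in accel_candidates:
        a = a_mag if target_speed >= current_speed else -a_mag
        x, v, t, waypoints = current_x, current_speed, 0.0, []
        while t < horizon_s - 1e-9:
            # Hold the speed once the target speed has been reached.
            if (a > 0 and v >= target_speed) or (a < 0 and v <= target_speed):
                a, v = 0.0, target_speed
            x += v * dt + 0.5 * a * dt * dt
            v += a * dt
            t += dt
            waypoints.append(Waypoint(t, x, v, a))
        trajectories.append(waypoints)
    return trajectories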

[0038] As shown in Fig. 1, the trajectory corresponding to each of a plurality of simulated dynamic vehicles can be provided as an output from the dynamic vehicle simulation system 102 and an input to an autonomous vehicle controller 182. The autonomous vehicle controller 182 can include a motion planner module used to generate a trajectory for an autonomous vehicle based on the environment around the autonomous vehicle and the destination or goals of the autonomous vehicle. The environment around the autonomous vehicle can include the presence, position, heading, and speed of proximate vehicles or other objects near the autonomous vehicle. Given the trajectories corresponding to a plurality of simulated dynamic vehicles as provided by the dynamic vehicle simulation system 102, the motion planner module in the autonomous vehicle controller 182 can be stimulated to react to the presence and behavior of the simulated dynamic vehicles just as the motion planner would react to the presence and behavior of real vehicles in a real-world driving environment. In this manner, the dynamic vehicle simulation system 102 can be used to produce trajectories corresponding to a plurality of simulated dynamic vehicles, which can be used to stimulate the motion planner of an autonomous vehicle.

[0039] The trajectories produced by the motion planner in response to the plurality of simulated dynamic vehicles can be analyzed to determine if the motion planner is producing acceptable output. As described above, the behaviors of the simulated dynamic vehicles generated by the dynamic vehicle simulation system 102 can be configured, modified, and specifically tuned to produce a wide range of driving behaviors, environments, scenarios, and tests to exercise the full capabilities of the autonomous vehicle motion planner. As a result of the processing performed by the dynamic vehicle simulation system 102 as described above, data corresponding to simulated drivers and vehicle behaviors and corresponding simulated dynamic vehicle trajectories can be produced. Ultimately, the dynamic vehicle simulation system 102 can be used to provide highly configurable simulated traffic trajectory information to a user or for configuration or analysis of a control system of an autonomous vehicle. In particular, the simulated traffic trajectory information can be used to create a virtual world where a control system for an autonomous vehicle can be analyzed, modified, and improved. The virtual world is configured to be as identical as possible to the real world where vehicles are operated by human drivers. In other words, the simulated traffic trajectory information generated by the dynamic vehicle simulation system 102 is highly useful for configuring and analyzing the control systems of an autonomous vehicle. The dynamic vehicle simulation system 102 and the simulated traffic trajectory information described and claimed herein can be implemented, configured, processed, and used in a variety of other applications and systems as well.

[0040] Referring again to Fig. 1, the dynamic vehicle simulation system 102 can be configured to include executable modules developed for execution by a data processor 171 in a computing environment of the dynamic vehicle simulation system 102 and the dynamic vehicle simulation module 200 therein. In the example embodiment, the dynamic vehicle simulation module 200 can be configured to include the plurality of executable modules as described above.

[0041] In various example embodiments, the set of dynamic vehicle configuration data 208 can be configured to simulate more than the typical driving behaviors. To simulate an environment that is as identical to the real world as possible, the dynamic vehicle configuration data 208 can represent typical driving behaviors, which represent average drivers. Additionally, the dynamic vehicle configuration data 208 can also represent atypical driving behaviors. In most cases, the trajectories corresponding to the plurality of simulated dynamic vehicles include typical and atypical driving behaviors. As a result, autonomous vehicle motion planners can be stimulated by the dynamic vehicle simulation system 102 using trajectories related to the driving behaviors of polite and impolite drivers as well as patient and impatient drivers in the virtual world. In all, the simulated dynamic vehicles can be configured with data representing driving behaviors that are as varied as possible.

[0042] A data storage device or memory 172 can also be provided in the dynamic vehicle simulation system 102 of an example embodiment. The memory 172 can be implemented with standard data storage devices (e.g., flash memory, DRAM, SIM cards, or the like) or as cloud storage in a networked server. In an example embodiment, the memory 172 can be used to store the set of dynamic vehicle configuration data 208 as described above.

[0043] Fig. 2 illustrates a process flow diagram according to an example embodiment of a system and method 1000 for dynamic vehicle simulation. In step 1010, perception data is received from a plurality of perception data sensors. In step 1020, configuration instructions and data including pre-defined parameters and executables defining a specific driving behavior for each of a plurality of simulated dynamic vehicles are obtained. In step 1030, a target position and target speed are generated for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data. In step 1040, a plurality of trajectories and acceleration profiles to transition each of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed are generated. Although the steps are sequentially listed here, the steps may be processed or performed in any order according to other implementations.
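
Purely as an illustrative sketch of the Fig. 2 flow, the function below performs steps 1010 through 1040 for one simulation tick; the callable parameters are hypothetical stand-ins for the perception data collection module, the dynamic vehicle configuration module, the dynamic vehicle simulation module, and the trajectory generator described above, and their names are assumptions for this example.

def simulation_tick(read_sensors, load_config, plan_target, sample_trajectories,
                    vehicle_ids):
    """One pass through the Fig. 2 flow for all simulated dynamic vehicles."""
    frame = read_sensors()                                         # step 1010
    configs = {vid: load_config(vid) for vid in vehicle_ids}       # step 1020
    targets = {vid: plan_target(vid, frame, configs[vid])
               for vid in vehicle_ids}                             # step 1030
    return {vid: sample_trajectories(vid, targets[vid])
            for vid in vehicle_ids}                                # step 1040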

[0044] Fig. 3 shows a diagrammatic representation of a machine in an exemplary form of a computing system 700 within which a set of instructions when executed and/or processing logic when activated may cause the machine to perform any one or more of the methodologies described and/or claimed herein. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a laptop computer, a tablet computing system, a Personal Digital Assistant (PDA), a cellular telephone, a smartphone, a web appliance, a set-top box (STB), a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) or activating processing logic that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” can also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions or processing logic to perform any one or more of the methodologies described and/or claimed herein.

[0045] The example computing system 700 can include a data processor 702 (e.g., a System-on-a-Chip (SoC), general processing core, graphics core, and optionally other processing logic) and a memory 704, which can communicate with each other via a bus or other data transfer system 706. The mobile computing and/or communication system 700 may further include various input/output (I/O) devices and/or interfaces 710, such as a touchscreen display, an audio jack, a voice interface, and optionally a network interface 712. In an example embodiment, the network interface 712 can include one or more radio transceivers configured for compatibility with any one or more standard wireless and/or cellular protocols or access technologies (e.g., 2nd (2G), 2.5, 3rd (3G), 4th (4G) generation, and future generation radio access for cellular systems, Global System for Mobile communication (GSM), General Packet Radio Services (GPRS), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), LTE, CDMA2000, WLAN, Wireless Router (WR) mesh, and the like). Network interface 712 may also be configured for use with various other wired and/or wireless communication protocols, including TCP/IP, UDP, SIP, SMS, RTP, WAP, CDMA, TDMA, UMTS, UWB, WiFi, WiMax, Bluetooth™, IEEE 802.11x, and the like. In essence, network interface 712 may include or support virtually any wired and/or wireless communication and data processing mechanisms by which information/data may travel between a computing system 700 and another computing or communication system via network 714.

[0046] The memory 704 can represent a machine-readable medium on which is stored one or more sets of instructions, software, firmware, or other processing logic (e.g., logic 708) embodying any one or more of the methodologies or functions described and/or claimed herein. The logic 708, or a portion thereof, may also reside, completely or at least partially within the processor 702 during execution thereof by the mobile computing and/or communication system 700. As such, the memory 704 and the processor 702 may also constitute machine-readable media. The logic 708, or a portion thereof, may also be configured as processing logic or logic, at least a portion of which is partially implemented in hardware. The logic 708, or a portion thereof, may further be transmitted or received over a network 714 via the network interface 712. While the machine-readable medium of an exemplary embodiment can be a single medium, the term "machine-readable medium" should be taken to include a single non-transitory medium or multiple non-transitory media (e.g., a centralized or distributed database, and/or associated caches and computing systems) that store the one or more sets of instructions. The term "machine-readable medium" can also be taken to include any non-transitory medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the various embodiments, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term "machine-readable medium" can accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.

[0047] Some embodiments described herein may be captured using the following clause-based description.

[0048] A system comprising:

a processor;

a perception data collection module, executable by the processor, configured to receive perception data from a plurality of perception data sensors;

a dynamic vehicle configuration module, executable by the data processor, configured to obtain configuration instructions and data including pre-defined parameters and executables that define a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles;

a dynamic vehicle simulation module configured to generate a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data; and

a trajectory generator configured to generate a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed as generated by the dynamic vehicle simulation module.

[0049] The system of clause 1 wherein the perception data sensors are capable of collecting information related to at least one of image generation, light amplification by stimulated emission of radiation (laser), light detection and ranging (LIDAR), global positioning, sound navigation and ranging (sonar), radio detection and ranging (radar), and distance measuring.

[0050] The system of clause 1 wherein the perception data representing real-world traffic environment information relates to locations, routings, scenarios, and driver behaviors.

[0051] The system of clause 1 wherein the configuration instructions and data represent at least one simulated dynamic vehicle with an aggressive driving behavior and at least one simulated dynamic vehicle with a conservative driving behavior.

[0052] The system of clause 1 wherein the dynamic vehicle simulation module uses a rule- based process and corresponding data structures for generating the target position and target speed corresponding to the specific behavior of each simulated dynamic vehicle.

[0053] The system of clause 1 wherein the dynamic vehicle simulation module is further configured to generate a target heading for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles.

[0054] The system of clause 1 wherein the trajectory generator is further configured to generate a plurality of waypoints for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the waypoints representing movement and behavior of each simulated dynamic vehicle in a simulation environment.

[0055] The system of clause 1 wherein the plurality of trajectories and acceleration profiles for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles is used for analyzing a control system of an autonomous vehicle.

[0056] A method comprising:

receiving perception data from a plurality of perception data sensors;

obtaining configuration instructions and data including pre-defined parameters and executables that define a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles;

generating a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data; and

generating a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed.

[0057] The method of clause 9 wherein the perception data sensors are capable of collecting information related to at least one of image generation, light amplification by stimulated emission of radiation (laser), light detection and ranging (LIDAR), global positioning, sound navigation and ranging (sonar), radio detection and ranging (radar), and distance measuring.

[0058] The method of clause 9 wherein the perception data representing real-world traffic environment information relates to locations, routings, scenarios, and driver behaviors.

[0059] The method of clause 9 wherein the configuration instructions and data represent at least one simulated dynamic vehicle with an aggressive driving behavior and at least one simulated dynamic vehicle with a conservative driving behavior.

[0060] The method of clause 9 including using a rule-based process and corresponding data structures for generating the target position and target speed corresponding to the specific behavior of each simulated dynamic vehicle.

[0061] The method of clause 9, further comprising:

generating a target heading for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles.

[0062] The method of clause 9, further comprising:

generating a plurality of waypoints for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the waypoints representing movement and behavior of each simulated dynamic vehicle in a simulation environment.

[0063] The method of clause 9, further comprising:

analyzing a control system of an autonomous vehicle using the plurality of trajectories and acceleration profiles for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles.

[0064] A non-transitory machine-useable storage medium embodying instructions which, when executed by a machine, cause the machine to:

receive perception data from a plurality of perception data sensors;

obtain configuration instructions and data including pre-defined parameters and executables that define a specific driving behavior for each simulated dynamic vehicle of a plurality of simulated dynamic vehicles;

generate a target position and target speed for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the generated target positions and target speeds being based on the perception data and the configuration instructions and data; and

generate a plurality of trajectories and acceleration profiles to transition each simulated dynamic vehicle of the plurality of simulated dynamic vehicles from a current position and speed to the corresponding target position and target speed.

[0065] The non-transitory machine-useable storage medium of clause 17 wherein the perception data representing real-world traffic environment information relates to locations, routings, scenarios, and driver behaviors being monitored.

[0066] The non-transitory machine-useable storage medium of clause 17 wherein the configuration instructions and data represent at least one simulated dynamic vehicle with an aggressive driving behavior and at least one simulated dynamic vehicle with a conservative driving behavior.

[0067] The non-transitory machine-useable storage medium of clause 17, wherein the dynamic vehicle simulation module uses a rule-based process and corresponding data structures for generating the target position and target speed corresponding to the specific behavior of each simulated dynamic vehicle.

[0068] The non-transitory machine-useable storage medium of clause 17, wherein the dynamic vehicle simulation module is further configured to generate a target heading for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles.

[0069] The non-transitory machine-useable storage medium of clause 17, wherein the trajectory generator is further configured to generate a plurality of waypoints for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles, the waypoints representing movement and behavior of each simulated dynamic vehicle in a simulation environment.

[0070] The non-transitory machine-useable storage medium of clause 17, wherein the plurality of trajectories and acceleration profiles for each simulated dynamic vehicle of the plurality of simulated dynamic vehicles is used for analyzing a control system of an autonomous vehicle.

[0071] The disclosed and other embodiments, modules and the functional operations described in this document can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this document and their structural equivalents, or in combinations of one or more of them. The disclosed and other embodiments can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term “data processing apparatus” encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them. A propagated signal is an artificially generated signal, e.g., a machine-generated electrical, optical, or electromagnetic signal, that is generated to encode information for transmission to suitable receiver apparatus.

[0072] A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, subprograms, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.

[0073] The processes and logic flows described in this document can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).

[0074] Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random-access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.

[0075] While this patent document contains many specifics, these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features that are described in this patent document in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.

[0076] Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Moreover, the separation of various system components in the embodiments described in this patent document should not be understood as requiring such separation in all embodiments.

[0077] Only a few implementations and examples are described. Other implementations, enhancements, and variations can be made based on what is described and illustrated in this patent document.

[0078] The illustrations of embodiments described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of components and systems that might make use of the structures described herein. Many other embodiments will be apparent to those of ordinary skill in the art upon reviewing the description provided herein. Other embodiments may be utilized and derived, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The figures herein are merely representational and may not be drawn to scale. Certain proportions thereof may be exaggerated, while others may be minimized. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

[0079] Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit. Thus, the example system is applicable to software, firmware, and hardware implementations.

[0080] The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

[0081] While the foregoing is directed to implementations of the present disclosure, other and further implementations of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.