


Title:
SYSTEM AND METHOD OF DYNAMICALLY CONTROLLING PARAMETERS FOR PROCESSING SENSOR OUTPUT DATA
Document Type and Number:
WIPO Patent Application WO/2019/022910
Kind Code:
A2
Abstract:
Various embodiments include dynamically controlling one or more parameters for obtaining and/or processing sensor data received from a sensor on a vehicle based on the speed of the vehicle. In some embodiments, parameters for obtaining and/or processing sensor data may be individually tuned (e.g., decreased, increased, or maintained) by leveraging differences in the level of quality, accuracy, confidence, and/or other criteria in sensor data associated with particular missions/tasks performed using the sensor data. For example, the sensor data resolution required for collision avoidance may be less than the sensor data resolution required for inspection tasks, while the update rate required for inspection tasks may be less than the update rate required for collision avoidance. Parameters for obtaining and/or processing sensor data may be individually tuned based on the speed of the vehicle and/or the task or mission to reduce consumption of power and/or other resources.

Inventors:
TURPIN MATTHEW (US)
CHAVES STEPHEN (US)
MELLINGER III DANIEL (US)
DOUGHERTY JOHN (US)
SHOMIN MICHAEL (US)
SWEET III CHARLES (US)
SWART HUGO (US)
Application Number:
PCT/US2018/039951
Publication Date:
January 31, 2019
Filing Date:
June 28, 2018
Assignee:
QUALCOMM INC (US)
International Classes:
G06F1/32
Other References:
None
Attorney, Agent or Firm:
HANSEN, Robert, M. et al. (US)
Claims:
CLAIMS

What is claimed is:

1. A method of dynamically controlling a sensor on a vehicle, comprising:

determining, by a processor of the vehicle, a speed of the vehicle; and controlling, by the processor, one or more parameters for obtaining or processing sensor data output from the sensor based on at least the speed of the vehicle.

2. The method of claim 1 wherein controlling the one or more parameters for obtaining or processing the sensor data output from the sensor based on at least the speed of the vehicle comprises:

determining, by the processor, whether the speed of the vehicle exceeds a speed threshold; and

decreasing, by the processor, one or more of (i) a sampling or frame rate of the sensor and (ii) a data processing rate of the sensor data in response to determining that the speed of the vehicle does not exceed the speed threshold.

3. The method of claim 1, further comprising:

determining, by the processor, a task or mission performed by the vehicle using the sensor data; and

controlling, by the processor, the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data.

4. The method of claim 3, wherein controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data comprises: decreasing, by the processor, at least one of a resolution of the sensor data, a sampling or frame rate of the sensor, a data processing rate of the sensor data, a range of depths or depth-related information searched for in the sensor data, or any combination thereof.

5. The method of claim 3, wherein the task or mission performed using the sensor data comprises one or more of mapping, object inspection, collision avoidance, localization, or any combination thereof.

6. The method of claim 1, further comprising:

determining, by the processor, a distance to an object closest to the vehicle; and controlling, by the processor, the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle.

7. The method of claim 6, wherein controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle comprises:

determining, by the processor, whether the distance to the object closest to the vehicle is within a threshold distance; and

decreasing, by the processor, a resolution of the sensor data in response to determining that the distance to the object closest to the vehicle is within the threshold distance.

8. The method of claim 7, further comprising:

decreasing, by the processor, a range of pixel disparities searched in stereoscopic sensor data received from a stereoscopic camera in response to determining that the distance to the object closest to the vehicle is not within the threshold distance.

9. The method of claim 1, wherein the vehicle is an unmanned vehicle.

10. The method of claim 1, wherein the sensor is a camera, a stereoscopic camera, an image sensor, a radar sensor, a sonar sensor, an ultrasound sensor, a depth sensor, a time-of-flight sensor, a lidar sensor, an active sensor, a passive sensor, or any combination thereof.

11. A computing device for a vehicle, comprising:

a processor coupled to a sensor and configured with processor-executable instructions to:

determine a speed of the vehicle; and

control one or more parameters for obtaining or processing sensor data output from the sensor based on at least the speed of the vehicle.

12. The computing device of claim 11, wherein the processor is further configured to control the one or more parameters for obtaining or processing the sensor data output from the sensor based on at least the speed of the vehicle by:

determining whether the speed of the vehicle exceeds a speed threshold; and decreasing one or more of (i) a sampling or frame rate of the sensor and (ii) a data processing rate of the sensor data in response to determining that the speed of the vehicle does not exceed the speed threshold.

13. The computing device of claim 11, wherein the processor is further configured with processor-executable instructions to:

determine a task or mission performed by the vehicle using the sensor data; and control the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data.

14. The computing device of claim 13, wherein the processor is further configured with processor-executable instructions to control the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data by:

decreasing at least one of a resolution of the sensor data, a sampling or frame rate of the sensor, a data processing rate of the sensor data, a range of depths or depth- related information searched for in the sensor data, or any combination thereof.

15. The computing device of claim 11, wherein the processor is further configured with processor-executable instructions to:

determine a distance to an object closest to the vehicle; and

control the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle.

16. The computing device of claim 15, wherein the processor is further configured with processor-executable instructions to control the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle by:

determining whether the distance to the object closest to the vehicle is within a threshold distance; and

decreasing a resolution of the sensor data in response to determining that the distance to the object closest to the vehicle is within the threshold distance.

17. The computing device of claim 16, wherein the processor is further configured with processor-executable instructions to:

decrease a range of pixel disparities searched in stereoscopic sensor data received from a stereoscopic camera in response to determining that the distance to the object closest to the vehicle is not within the threshold distance.

18. A non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a computing device for a vehicle to perform operations comprising:

determining a speed of the vehicle; and

controlling one or more parameters for obtaining or processing sensor data output from a sensor based on at least the speed of the vehicle.

19. The non-transitory processor-readable storage medium of claim 18 wherein the stored processor-executable instructions are configured to cause the processor to perform operations such that controlling the one or more parameters for obtaining or processing the sensor data output from the sensor based on at least the speed of the vehicle comprises:

determining whether the speed of the vehicle exceeds a speed threshold; and decreasing one or more of (i) a sampling or frame rate of the sensor and (ii) a data processing rate of the sensor data in response to determining that the speed of the vehicle does not exceed the speed threshold.

20. The non-transitory processor-readable storage medium of claim 18 wherein the stored processor-executable instructions are configured to cause the processor to perform operations further comprising:

determining a task or mission performed by the vehicle using the sensor data; and

controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data.

21. The non-transitory processor-readable storage medium of claim 20 wherein the stored processor-executable instructions are configured to cause the processor to perform operations such that controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data comprises:

decreasing at least one of a resolution of the sensor data, a sampling or frame rate of the sensor, a data processing rate of the sensor data, a range of depths or depth- related information searched for in the sensor data, or any combination thereof.

22. The non-transitory processor-readable storage medium of claim 18 wherein the stored processor-executable instructions are configured to cause the processor to perform operations further comprising:

determining a distance to an object closest to the vehicle; and

controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle.

23. The non-transitory processor-readable storage medium of claim 22 wherein the stored processor-executable instructions are configured to cause the processor to perform operations such that controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle comprises:

determining whether the distance to the object closest to the vehicle is within a threshold distance; and

decreasing a resolution of the sensor data in response to determining that the distance to the object closest to the vehicle is within the threshold distance.

24. The non-transitory processor-readable storage medium of claim 23 wherein the stored processor-executable instructions are configured to cause the processor to perform operations further comprising: decreasing a range of pixel disparities searched in stereoscopic sensor data received from a stereoscopic camera in response to determining that the distance to the object closest to the vehicle is not within the threshold distance.

25. A vehicle, comprising:

means for determining a speed of the vehicle; and

means for controlling one or more parameters for obtaining or processing sensor data output from a sensor based on at least the speed of the vehicle.

26. The vehicle of claim 25 wherein means for controlling the one or more parameters for obtaining or processing the sensor data output from the sensor based on at least the speed of the vehicle comprises:

means for determining whether the speed of the vehicle exceeds a speed threshold; and

means for decreasing one or more of (i) a sampling or frame rate of the sensor and (ii) a data processing rate of the sensor data in response to determining that the speed of the vehicle does not exceed the speed threshold.

27. The vehicle of claim 25, further comprising:

means for determining a task or mission performed by the vehicle using the sensor data; and

means for controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data.

28. The vehicle of claim 27, wherein means for controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data comprises: means for decreasing at least one of a resolution of the sensor data, a sampling or frame rate of the sensor, a data processing rate of the sensor data, a range of depths or depth-related information searched for in the sensor data, or any combination thereof.

29. The vehicle of claim 25, further comprising:

means for determining a distance to an object closest to the vehicle; and means for controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle.

30. The vehicle of claim 29, wherein means for controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle comprises:

means for determining whether the distance to the object closest to the vehicle is within a threshold distance; and

means for decreasing a resolution of the sensor data in response to determining that the distance to the object closest to the vehicle is within the threshold distance.

Description:
TITLE

System And Method Of Dynamically Controlling Parameters For Processing Sensor Output Data

RELATED APPLICATION(S)

[0001] This application is a continuation-in-part of U.S. Patent Application No.

15/224,904, filed on August 1, 2016, entitled "System And Method Of Dynamically Controlling Parameters For Processing Sensor Output Data For Collision Avoidance And Path Planning," the entire contents of which are incorporated herein by reference.

BACKGROUND

[0002] Unmanned vehicles, such as an unmanned aerial vehicle (UAV), are typically configured with view sensors (e.g., cameras, radar, etc.) capable of perceiving an environment within a field of view (FOV) in a direction that the sensor is facing. Data from view sensors may be used by an autonomous vehicle to navigate through the environment, including detecting obstacles, determining how to avoid obstacles, path mapping, and/or path finding. For example, a stereoscopic camera on an autonomous vehicle can capture stereoscopic image pairs of the environment in the direction that the stereoscopic camera is facing. A processor (e.g., central processing unit (CPU), system-on-chip (SOC), etc.) processes the stereoscopic image pairs to generate three-dimensional (3D) depth maps of the environment within the field of view of the camera. To enable depth measurements of the environment all around the autonomous vehicle, multiple stereoscopic cameras may be situated so that a combination of the respective fields of view may encompass 360 degrees around the vehicle. However, the use of multiple stereo cameras or other view sensors (e.g., radar, sonar, etc.) increases the processing demands on the processor. The faster the vehicle is moving, the faster sensor data (e.g., images) need to be processed to detect obstacles in time to avoid them. However, the vehicle's processor has a limited processing bandwidth (available millions of instructions per second (MIPS)).

SUMMARY

[0003] Various embodiments are disclosed for dynamically controlling one or more parameters for obtaining and/or processing sensor data received from a sensor, particularly a stereoscopic sensor, on a vehicle based on the speed of the vehicle and/or a particular mission or task performed using the sensor output data ("sensor data"). For example, in some embodiments, when a vehicle is hovering or slowly moving, it is likely that the surrounding environment perceived by the sensor will also be changing slowly, if at all. Thus, in some embodiments the update rate at which sensor output data is obtained (e.g., frame rate) and/or processed may be decreased or throttled when the vehicle speed does not exceed a threshold. Although some parameters (e.g., the update rate) may be increased when the vehicle's speed exceeds the threshold, other parameters for processing sensor data may be decreased based on the particular mission or task performed using the sensor data.
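
A minimal sketch of this speed-threshold logic, with an assumed threshold, assumed rates, and a hypothetical helper name (none of these values are specified by the disclosure):

```python
# Minimal sketch of speed-based frame-rate throttling (values are assumptions).
SPEED_THRESHOLD_M_S = 2.0   # assumed speed threshold
LOW_RATE_FPS = 5            # assumed throttled rate
HIGH_RATE_FPS = 30          # assumed full rate

def select_frame_rate(vehicle_speed_m_s: float) -> int:
    """Throttle the sensor frame rate when the vehicle does not exceed the
    speed threshold (hovering or moving slowly), since the perceived
    environment is then changing slowly; otherwise use the full rate."""
    if vehicle_speed_m_s <= SPEED_THRESHOLD_M_S:
        return LOW_RATE_FPS
    return HIGH_RATE_FPS

print(select_frame_rate(0.5))   # 5  (hovering/slow -> throttled)
print(select_frame_rate(8.0))   # 30 (fast -> full rate)
```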

[0004] Some embodiments may include controlling parameters for obtaining and/or processing sensor data by leveraging differences in the level of quality, accuracy, confidence, and/or other criteria in sensor data associated with particular missions or tasks (e.g., mapping, inspection, localization, collision avoidance). For example, the resolution of the sensor data required to perform collision avoidance may be less than the resolution of the sensor data required to inspect a product for defects, while the update rate required for an inspection task may be less than the update rate required for collision avoidance. Thus, in some embodiments, one or more parameters for obtaining and/or processing sensor data may be decreased, while other parameters may be maintained or increased depending on the particular task. In this way, parameters for obtaining and/or processing sensor output may be individually tuned (e.g., decreased, increased, or maintained) based on the vehicle's speed and the task or mission performed using the sensor data. In some embodiments, such parameter control may reduce consumption of various resources, such as power, memory, and/or processing time, for example.

[0005] Various embodiments for dynamically controlling a sensor on a vehicle may include a processor of the vehicle (e.g., a UAV) determining a speed of the vehicle, and controlling one or more parameters for obtaining or processing sensor data output from the sensor based on at least the speed of the vehicle. In some embodiments, controlling the one or more parameters for obtaining or processing the sensor data output from the sensor based on at least the speed of the vehicle may include determining whether the speed of the vehicle exceeds a speed threshold and decreasing one or more of (i) a sampling or frame rate of the sensor and (ii) a data processing rate of the sensor data in response to determining that the speed of the vehicle does not exceed the speed threshold. In some embodiments, the sensor may be a camera, a stereoscopic camera, an image sensor, a radar sensor, a sonar sensor, an ultrasound sensor, a depth sensor, a time-of-flight sensor, a lidar sensor, an active sensor, a passive sensor, and/or any combination thereof.

[0006] Some embodiments may further include determining a task or mission performed by the vehicle using the sensor data and controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the task or mission performed using the sensor data. In some embodiments, controlling the one or more parameters may include decreasing a resolution of the sensor data, a sampling or frame rate of the sensor, a data processing rate of the sensor data, a range of depths or depth-related information searched for in the sensor data, and/or any combination thereof. In some embodiments, the task or mission performed using the sensor data may include mapping, object inspection, collision avoidance, localization, and/or any combination thereof.
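
For illustration only, one way such per-task tuning could be expressed, using hypothetical task profiles and assumed values (none of which are specified by the disclosure):

```python
# Hypothetical per-task parameter profiles (all values are illustrative assumptions).
TASK_PROFILES = {
    # task/mission:        (resolution scale, update rate in fps)
    "collision_avoidance": (0.5, 30),   # lower resolution, higher update rate
    "inspection":          (1.0, 5),    # full resolution, lower update rate
    "mapping":             (0.75, 10),
    "localization":        (0.5, 15),
}

def tune_parameters(task: str, speed_m_s: float, speed_threshold: float = 2.0):
    """Individually tune resolution and update rate for the active task,
    further throttling the update rate when the vehicle is slow."""
    resolution_scale, update_rate = TASK_PROFILES[task]
    if speed_m_s <= speed_threshold:
        update_rate = max(1, update_rate // 2)   # throttle when slow
    return {"resolution_scale": resolution_scale, "update_rate_fps": update_rate}

print(tune_parameters("inspection", speed_m_s=0.5))
print(tune_parameters("collision_avoidance", speed_m_s=8.0))
```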

[0007] Some embodiments may further include determining a distance to an object closest to the vehicle, and controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle. In some embodiments, controlling the one or more parameters for obtaining or processing the sensor data based on the speed of the vehicle and the distance to the object closest to the vehicle may include determining whether the distance to the object closest to the vehicle is within a threshold distance and decreasing a resolution of the sensor data in response to determining that the distance to the object closest to the vehicle is within the threshold distance. Some embodiments may further include decreasing a range of pixel disparities searched in stereoscopic sensor data received from a stereoscopic camera in response to determining that the distance to the object closest to the vehicle is not within the threshold distance.
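
A sketch of the distance-based control described above, with an assumed threshold distance and hypothetical parameter names:

```python
# Hypothetical sketch of distance-based control of stereo-processing parameters.
DISTANCE_THRESHOLD_M = 10.0   # assumed threshold; not specified in the source

def control_stereo_parameters(closest_object_m: float):
    """Adjust stereo-processing parameters based on the distance to the
    closest object.

    A nearby object produces large pixel disparities, so depth can be
    estimated at reduced image resolution; a distant object produces small
    disparities, so the disparity search range can be narrowed instead.
    """
    params = {"resolution_scale": 1.0, "max_disparity_px": 128}
    if closest_object_m <= DISTANCE_THRESHOLD_M:
        params["resolution_scale"] = 0.5       # within threshold: decrease resolution
    else:
        params["max_disparity_px"] = 32        # beyond threshold: narrow disparity search
    return params

print(control_stereo_parameters(4.0))    # near object: halve resolution
print(control_stereo_parameters(50.0))   # far object: smaller disparity range
```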

[0008] Further embodiments include a vehicle and/or a computing device within a vehicle including a processor configured with processor-executable instructions to perform operations of the embodiment methods summarized above. In some embodiments, the vehicle may be an unmanned vehicle. Further embodiments include a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor to perform operations of the embodiment methods summarized above. Further embodiments include a vehicle and/or a computing device within a vehicle including means for performing functions of the embodiment methods summarized above.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary embodiments, and together with the general description given above and the detailed description given below, serve to explain the features of the various embodiments.

[0010] FIG. 1 is a schematic perspective view of an unmanned aerial vehicle (UAV) navigating through an environment in which various embodiments may be applied.

[0011] FIGS. 2A and 2B illustrate front elevation and plan views, respectively, of a UAV including multiple view sensors according to some embodiments.

[0012] FIG. 3 illustrates components of a control unit for a vehicle that may be configured to implement methods of dynamically controlling parameters for processing output data from multiple view sensors on a vehicle for collision avoidance and/or path planning according to some embodiments.

[0013] FIGS. 4A and 4B illustrate a method of dynamically controlling parameters for processing output data from multiple view sensors on a UAV for collision avoidance and/or path planning according to some embodiments.

[0014] FIGS. 5A, 5B and 5C are schematic diagrams that illustrate a processor controlling parameters for processing output data from multiple stereoscopic cameras according to some embodiments.

[0015] FIG. 6 illustrates another method of dynamically controlling parameters for processing output data from multiple view sensors on a vehicle for collision avoidance and/or path planning according to some embodiments.

[0016] FIG. 7 illustrates another method of dynamically controlling parameters for processing output data from multiple view sensors on a vehicle for collision avoidance and/or path planning according to some embodiments.

[0017] FIG. 8 illustrates a method of dynamically controlling parameters for processing sensor data according to some embodiments.

[0018] FIG. 9 is a schematic diagram that illustrates the concept of controlling the range of disparities searched between stereoscopic images according to some embodiments.

[0019] FIG. 10 illustrates a method of dynamically controlling parameters for processing sensor data based on the speed of a vehicle and the task or mission performed using the output data received from the sensor according to some embodiments.

DETAILED DESCRIPTION

[0020] Various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the claims.

[0021] As used herein, the term "vehicle" refers to one of various types of unmanned or manned vehicles. Unmanned vehicles may be remotely controlled, autonomous, or semi-autonomous. Autonomous (or semi-autonomous) vehicles are capable of sensing their environment and navigating on their own with minimal inputs from a user. Manned vehicles and autonomous vehicles may be periodically controlled by an operator, and thus semi-autonomous. Examples of vehicles suitable for implementing various embodiments include unmanned aerial vehicles (UAVs), including robots or drones; terrestrial vehicles, including automobiles; space-based vehicles, including spacecraft or space probes; and aquatic vehicles, including surface-based or undersea watercraft. Unmanned vehicles are becoming more commonplace in a number of military and commercial applications.

[0022] The term "computing device" is used herein to refer to an electronic device equipped with at least a processor. Examples of computing devices may include UAV flight control and/or mission management computers that are onboard the UAV, as well as remote computing devices communicating with the UAV configured to perform operations of the various embodiments. Remote computing devices may include wireless communication devices (e.g., cellular telephones, wearable devices, smart-phones, web-pads, tablet computers, Internet enabled cellular telephones, Wi-Fi® enabled electronic devices, personal data assistants (PDAs), laptop computers, etc.), personal computers, and servers. In various embodiments, computing devices may be configured with memory and/or storage as well as wireless communication capabilities, such as network transceiver(s) and antenna(s) configured to establish a wide area network (WAN) connection (e.g., a cellular network connection, etc.) and/or a local area network (LAN) connection (e.g., a wireless connection to the Internet via a Wi-Fi® router, etc.).

[0023] Various embodiments are disclosed for dynamically controlling one or more parameters for processing sensor data received from various view sensors on a vehicle, including, for example, the rate at which sensor data from various view sensors on the vehicle are received and/or processed, based on the direction of travel, orientation, and speed of the vehicle. Various embodiments may be particularly useful for managing the processing of sensor data used by a navigation or collision avoidance system of an autonomous vehicle, such as a UAV. For example, in some embodiments, the rate (or frequency) at which data from a particular view sensor is processed may depend on the current direction and speed of travel and the view direction in which the sensor perceives the environment (i.e., field of view). Processing demands may be reduced by focusing processing on sensor data from view sensors with a field of view encompassing the direction of travel, while reducing the rate or frequency at which data from view sensors with fields of view in directions other than the direction of travel are processed. In some embodiments, the rate of processing data from a given view sensor on the vehicle may be based on a collision risk probability that a vehicle processor may determine as a function of the speed and direction of the vehicle and one or more risk factors, such as the speed of potential threats (e.g., other autonomous vehicles, missiles, birds, etc.).

[0024] In some embodiments, the processor may adjust the sampling or frame rate of view sensors in order to reduce the amount of information (bandwidth) carried over internal data buses, and enable data buses with a fixed bandwidth to carry more data from view sensors having a field of view encompassing the direction of travel. In some embodiments, the processor may not control the sampling or frame rate of view sensors, and instead adjust or throttle the rate at which sensor data from each view sensor is analyzed or processed, thus focusing processing resources on data from sensors having a field of view encompassing the direction of travel. In some embodiments, the processor may do both, adjusting the sampling or frame rate of view sensors and adjusting or throttling the rate at which sensor data from each view sensor is analyzed or processed.
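
One possible sketch of such direction- and speed-dependent rate control; the cosine alignment rule, speed normalizer, and rates below are assumptions for illustration, not the disclosed method:

```python
import math

# Sketch of direction- and speed-dependent processing-rate control
# (the scaling rule and all numeric values are assumptions).
def processing_rate_fps(speed_m_s, travel_dir_deg, view_dir_deg,
                        base_fps=30, min_fps=5, speed_norm_m_s=5.0):
    """Scale the rate at which one sensor's data is processed by how closely
    its view direction aligns with the direction of travel, weighted by speed.

    At low speed the weighting vanishes, so all sensors are processed at
    nearly the same (reduced) rate; at high speed, sensors facing the
    direction of travel are processed fastest.
    """
    angle = math.radians(travel_dir_deg - view_dir_deg)
    alignment = max(0.0, math.cos(angle))             # 1 when facing the travel direction
    speed_weight = min(1.0, speed_m_s / speed_norm_m_s)
    return round(min_fps + (base_fps - min_fps) * alignment * speed_weight)

print(processing_rate_fps(8.0, travel_dir_deg=0, view_dir_deg=0))     # high rate
print(processing_rate_fps(8.0, travel_dir_deg=0, view_dir_deg=180))   # throttled
print(processing_rate_fps(0.5, travel_dir_deg=0, view_dir_deg=0))     # nearly even
```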

[0025] In some embodiments, the processor may dynamically control the transmit power of the various view sensors on a vehicle based on the direction of travel, orientation, and speed of the vehicle. For example, the extent to which some view sensors perceive the environment (e.g., distance away from the sensor) may depend on the transmit power of the view sensor (e.g., radar sensors, sonar sensors, etc.). Power demands may be reduced by increasing the transmit power of view sensors with a field of view encompassing the direction of travel, while reducing the transmit power of sensors with fields of view in directions not encompassing the direction of travel.
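
A sketch of the transmit-power control described above, with assumed power levels and a hypothetical helper; whether each field of view encompasses the direction of travel is assumed to be determined elsewhere:

```python
# Sketch of transmit-power control for active view sensors (e.g., radar,
# sonar); the power levels and boolean input are assumptions.
def set_transmit_power(sensors, full_mw=100.0, reduced_mw=20.0):
    """Use full transmit power for sensors whose field of view encompasses
    the direction of travel and reduced power for the others."""
    return {name: (full_mw if encompasses else reduced_mw)
            for name, encompasses in sensors.items()}

# True/False marks whether each sensor's field of view encompasses the
# current direction of travel (determined elsewhere, e.g. geometrically).
print(set_transmit_power({"front_radar": True, "rear_radar": False}))
```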

[0026] FIG. 1 is a schematic perspective view of a UAV 110 navigating through an environment 100 in which various embodiments may be applied. With autonomous navigation, there is generally a risk that the unmanned vehicle 110 will collide with structures or objects in the environment that are positioned along the navigational route. For example, the UAV 110 may need to avoid colliding with various obstacles along its flight path including, but are not limited to, trees 120, buildings 130, power/telephone lines 140, and supporting poles 150. The UAV 110 may also need to avoid moving objects, such as people, birds, and other moving vehicles. To counter such risk, the UAV 110 may be configured with a computerized collision avoidance system that senses the environment 100 and causes the vehicle 110 to perform defensive maneuvers in order to avoid collisions with obstacles within the vicinity of the vehicle 110. Such maneuvers may include emergency braking, hovering, reducing speed, changing direction, orientation, or any combination thereof.

[0027] FIGS. 2A and 2B illustrate front elevation and plan views, respectively, of a UAV 200 (which may correspond to the UAV 110 in FIG. 1) including multiple view sensors 220a, 220b, 220c, 220d (collectively 220) according to some embodiments. With reference to FIGS. 1-2B, in some embodiments, the UAV 200 may be equipped with four view sensors 220a, 220b, 220c, 220d for use in a collision avoidance system. In some embodiments, the UAV 200 may include more or fewer than four view sensors 220a, 220b, 220c, 220d. In some embodiments, the view sensors 220 may include any type of view sensor that is capable of perceiving an environment (e.g., 100) within a limited field of view. For example, the view sensors 220 may include one or more of cameras (e.g., stereoscopic cameras), image sensors, radar sensors, sonar sensors, ultrasound sensors, depth sensors, time-of-flight sensors, laser radar sensors (known as "lidar sensors"), active sensors, passive sensors, or any combination thereof. View sensors may include combinations of different view sensors, such as radar plus machine vision sensors, binocular or trinocular camera systems, multispectral camera systems, etc. Different types of view sensors (i.e., view sensors using different technologies) typically have different fields of view in terms of viewing angle and/or range sensitivities.

[0028] In some embodiments, the view sensors 220 may be attached to a main housing 210 of the UAV 200. In some embodiments, the view sensors 220 may be integrated into the main housing 210 of the UAV 200, such that the view sensors 220 are exposed through openings in the main housing 210. In some embodiments, the view sensors 220a, 220b, 220c, 220d may be offset from one another (e.g., horizontally, vertically, or both horizontally and vertically), such that the view sensors may face different view directions to perceive (or sense) the environment surrounding the UAV 200.

[0029] The view sensors 220a, 220b, 220c, 220d may be characterized by the direction in which each view sensor faces (referred to herein as the view direction 230) and/or the field of view 232 of each view sensor. The view direction 230 may be a centerline of the field of view 232 of the sensor. Some view sensors may have a narrow field of view 232, such as laser radars (known as "lidar"), in which case the characteristic evaluated in the various embodiments may be only the view direction 230. Some view sensors may have a wide field of view 232, such as cameras equipped with a fish eye lens, and radars with near-omnidirectional antennas.

[0030] View sensors with a wide field of view 232 (e.g., 90 degrees as illustrated in FIG. 2B) may encompass the direction of travel of the UAV 200 even when the view direction 230 is not aligned with the direction of travel. For example, a view sensor 220a, 220b, 220c, 220d with a 90 degree field of view (as illustrated in FIG. 2B) may encompass the direction of travel of the UAV 200 when the view direction 230 is within 45 degrees of the direction of travel.
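
The field-of-view test described in this paragraph can be sketched as follows (hypothetical helper name; angles in degrees):

```python
# Sketch of the field-of-view test described above (hypothetical helper).
def fov_encompasses(travel_dir_deg: float, view_dir_deg: float,
                    fov_deg: float) -> bool:
    """Return True when the direction of travel falls inside a sensor's
    field of view, i.e. the view direction (FOV centerline) is within
    half the FOV angle of the direction of travel."""
    diff = abs((travel_dir_deg - view_dir_deg + 180.0) % 360.0 - 180.0)
    return diff <= fov_deg / 2.0

# A 90-degree FOV encompasses the travel direction when the view direction
# is within 45 degrees of it, as in the example above.
print(fov_encompasses(travel_dir_deg=30, view_dir_deg=0, fov_deg=90))   # True
print(fov_encompasses(travel_dir_deg=50, view_dir_deg=0, fov_deg=90))   # False
```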

[0031] In some embodiments, the respective fields of view 232 of the view sensors 220 may overlap to some extent, such as to provide a complete 360 degree view of the environment. For example, if the four view sensors 220a, 220b, 220c, 220d illustrated in FIG. 2B have a field of view 232 of greater than 90 degrees, the fields of view of adjacent sensors would overlap in the illustrated configuration. In some embodiments, the view sensors 220 may be tilted away from the rotors 215 (e.g., upward or downward) in order to prevent the rotors 215 from entering into the respective fields of view of the sensors 220.

[0032] The UAV 200 may include an onboard computing device within the main housing 210 that is configured to fly and/or operate the UAV 200 without remote operating instructions (i.e., autonomously), and/or with some remote operating instructions or updates to instructions stored in a memory, such as from a human operator or remote computing device (i.e., semi-autonomously).

[0033] The UAV 200 may be propelled for flight in any of a number of known ways. For example, two or more propulsion units, each including one or more rotors 215, may provide propulsion or lifting forces for the UAV 200 and any payload carried by the UAV 200. In some embodiments, the UAV 200 may include wheels, tank-treads, or other non-aerial movement mechanisms to enable movement on the ground, on or in water, and combinations thereof. The UAV 200 may be powered by one or more types of power source, such as electrical, chemical, electro-chemical, or other power reserve, which may power the propulsion units, the onboard computing device, and/or other onboard components. For ease of description and illustration, some detailed aspects of the UAV 200 are omitted, such as wiring, frame structure, power source, landing columns/gear, or other features that would be known to one of skill in the art.

[0034] Although the UAV 200 is illustrated as a quad copter with four rotors, some embodiments of the UAV 200 may include more or fewer than four rotors 215. In addition, although the view sensors 220a, 220b, 220c, 220d are illustrated as being attached to UAV 200, the view sensors 220a, 220b, 220c, 220d may, in some embodiments, be attached to other types of vehicles, including both manned and unmanned vehicles.

[0035] FIG. 3 illustrates components of a control unit 300 for a vehicle (e.g., the UAV 110, 200 in FIGS. 1-2B) that may be configured to implement methods of dynamically controlling one or more parameters for processing output data from multiple view sensors on a vehicle based on speed and direction of travel according to some embodiments. With reference to FIGS. 1-3, the control unit 300 may include various circuits and devices used to power and control the operation of the vehicle. The control unit 300 may include a processor 310, memory 312, a view sensor input/output (I/O) processor 320, one or more navigation sensors 322, a navigation processor 324, a radio frequency (RF) processor 330 coupled to an antenna 332, and a power supply 340. The view sensor input/output (I/O) processor 320 may be coupled to multiple view sensors 220.

[0036] In some embodiments, the processor 310 may be dedicated hardware specifically adapted to implement a method of dynamically controlling one or more parameters for processing sensor data, such as controlling data processing rates of output data, from multiple view sensors 220 on the vehicle for collision avoidance and/or path planning according to some embodiments. In some embodiments, the processor 310 may also control other operations of the vehicle (e.g., flight of the UAV 200). In some embodiments, the processor 310 may be or include a programmable processing unit 311 that may be programmed with processor-executable instructions to perform operations of the various embodiments. In some embodiments, the processor 310 may be a programmable microprocessor, microcomputer or multiple processor chip or chips that can be configured by software instructions to perform a variety of functions of the vehicle. In some embodiments, the processor 310 may be a combination of dedicated hardware and a programmable processing unit 311.

[0037] In some embodiments, the memory 312 may store processor-executable instructions and/or outputs from the view sensor I/O processor 320, the one or more navigation sensors 322, the navigation processor 324, or a combination thereof. In some embodiments, the memory 312 may be volatile memory, non-volatile memory (e.g., flash memory), or a combination thereof. In some embodiments, the memory 312 may include internal memory included in the processor 310, memory external to the processor 310, or a combination thereof.

[0038] The processor 310, the memory 312, the view sensor I/O processor 320, the one or more navigation sensors 322, the navigation processor 324, the RF processor 330, and any other electronic components of the control unit 300 may be powered by the power supply 340. In some embodiments, the power supply 340 may be a battery, a solar cell, or other type of energy harvesting power supply.

[0039] In some embodiments, the processor 310 may be coupled to the view sensor I/O processor 320, the one or more navigation sensors 322, the navigation processor 324, or a combination thereof. In some embodiments, the processor 310 may be further configured to receive and process the respective outputs of the view sensor I/O processor 320, the one or more navigation sensors 322, the navigation processor 324, or a combination thereof.

[0040] The processor 310 may be configured to receive output data from the view sensors 220 mounted on the vehicle. In some embodiments, the processor 310 may receive the output data directly from the view sensor I/O processor 320, which may be coupled to the view sensors 220. In some embodiments, the processor 310 may access the output data from the view sensors 220 via the memory 312.

[0041] The processor 310 may be configured to receive navigational data from the one or more navigation sensors 322 and/or the navigation processor 324. The processor 310 may be configured to use such data in order to determine the vehicle's present position, orientation, speed, velocity, direction of travel, or any combination thereof, as well as the appropriate course towards a desired destination. The one or more navigation sensors 322 may include one or more gyroscopes (typically at least three), a gyrocompass, one or more accelerometers, location sensors, or other types of sensors useful in detecting and controlling the attitude and movements of the vehicle. Location sensors coupled to the navigation processor 324 may include a global navigation satellite system (GNSS) receiver (e.g., one or more Global Positioning System (GPS) receivers) enabling the vehicle (e.g., 200) to determine the vehicle's coordinates, altitude, direction of travel, and speed using GNSS signals. Alternatively or in addition, the navigation processor 324 may be equipped with radio navigation receivers for receiving navigation beacons or other signals from radio nodes, such as navigation beacons (e.g., very high frequency (VHF) Omni Directional Radio Range (VOR) beacons), Wi-Fi access points, cellular network base stations, radio stations, remote computing devices, other UAVs, etc. In some embodiments in which the vehicle is a UAV (e.g., 200), the one or more navigation sensors 322 may provide attitude information including vehicle pitch, roll, and yaw values.

[0042] In some embodiments, the processor 310 may be coupled to the RF processor 330 in order to communicate with a remote computing device 350. For example, in some embodiments, the RF processor 330 may be configured to receive signals 334 via the antenna 332, such as signals from navigation facilities, etc., and provide such signals to the processor 310 and/or the navigation processor 324 to assist in operation of the vehicle (e.g., 200). The RF processor 330 may be a transmit-only or a two-way transceiver processor. For example, the RF processor 330 may include a single transceiver chip or a combination of multiple transceiver chips for transmitting and/or receiving signals. The RF processor 330 may operate in one or more of a number of radio frequency bands depending on the supported type of communications.

[0043] The remote computing device 350 may be any of a variety of computing devices, including but not limited to a processor in cellular telephones, smart-phones, web-pads, tablet computers, Internet enabled cellular telephones, wireless local area network (WLAN) enabled electronic devices, laptop computers, personal computers, and similar electronic devices equipped with at least a processor and a communication resource to communicate with the RF processor 330. Information may be transmitted from one or more components of the control unit 300 (e.g., the processor 310) to the remote computing device 350 over a wireless link 334 using Bluetooth®, Wi-Fi® or other wireless communication protocol.

[0044] While the various components of the control unit 300 are illustrated in FIG. 3 as separate components, some or all of the components may be integrated together in a single device or module, such as a system-on-chip module.

[0045] FIG. 4A illustrates a method 400 of dynamically controlling one or more parameters for processing output data from multiple view sensors (e.g., 220a, 220b, 220c) on a vehicle (e.g., UAV 200) based in part upon the vehicle's speed and direction of travel according to some embodiments. With reference to FIGS. 1-4A, operations of the method 400 may be performed by the vehicle's control unit (e.g., 300).

[0046] In block 410, a processor (e.g., the processor 310 in the control unit 300) may determine a speed and a direction of travel of the vehicle in any suitable manner. In some embodiments, the processor may obtain the vehicle's current speed or direction of travel from one or more of the navigation sensors (e.g., 322), the navigation processor 324, or both. In some embodiments, the processor may calculate the vehicle's speed or direction of travel based on navigational data (e.g., position, orientation, time, etc.) provided by one or more of the navigation sensors (e.g., 322), the navigation processor 324, or both. In some embodiments, the direction of travel may be represented as a two-dimensional (2D) vector (e.g., left, right, forward, backwards or North, South, East, West, North-East, etc.). In some embodiments, the direction of travel may be represented as a three-dimensional (3D) vector.

[0047] In block 420, the processor may determine a view direction (e.g., 230) and/or field of view (e.g., 232) of each of the view sensors (e.g., 220a, 220b, 220c, 220d). In some embodiments, where the view direction and/or field of view of each view sensor is pre-configured (i.e., fixed), the processor may access information regarding the view direction (e.g., 230) and/or field of view (e.g., 232) for each view sensor stored in the memory 312. In some embodiments in which the view direction of each view sensor (i.e., centerline of the field of view) is controlled by the processor (e.g., 310) or remotely controlled by a remote computing device (e.g., 350), the processor (e.g., 310) may access information regarding the current view direction of each view sensor by requesting the view direction information directly from each sensor (e.g., via the view sensor I/O processor 320) or by accessing the view direction information of each view sensor from the memory 312. In some embodiments, the view direction of each view sensor may be represented as a two-dimensional (2D) vector (e.g., left, right, forward, backwards or North, South, East, West, North-East, etc.). In some embodiments, the view direction of each view sensor may be represented as a 3D vector. In some embodiments, the field of view of each view sensor may be represented as a 2D or 3D vector of a centerline (i.e., sensor view direction) and an angle about the 2D or 3D vector defining the expanse of the field of view.

[0048] In block 430, the processor (e.g., 310) may control one or more parameters for processing output data from each of the view sensors (e.g., 220a, 220b, and 220c) based on the speed and the direction of travel of the vehicle and the view direction (e.g., 230) and/or field of view (e.g., 232) of the view sensor. In various embodiments, the one or more parameters for processing output data from view sensors that may be controlled by the processor may include one or more of a data sampling rate, a sensor frame rate, a processing rate (i.e., a rate at which sensor data is processed), and/or a transmit power for view sensors that transmit (e.g., radar, sonar, etc.).

[0049] In some embodiments, the processor may throttle (or reduce) the data sampling and/or processing rate of the output data received from one or more view sensors with a view direction that is directed away from or a field of view that does not encompass the direction of travel of the moving vehicle. In some embodiments, the processor may control the view sensors to reduce the sampling or frame rate of those sensors with a view direction that is directed away from or with a field of view that does not encompass the direction of travel of the vehicle. In some embodiments, the processor may both control the sampling or frame rate of view sensors and adjust the rate at which sensor data is processed based on the field of view of each sensor and the direction and speed of travel of the vehicle. In some embodiments, the processor may maintain or increase the data sampling and/or processing rate of the output data received from one or more view sensors having a field of view that encompasses the direction of travel of the moving vehicle. Thus, processing demands may be reduced by focusing processing on sensor data from view sensors with fields of view in or encompassing the direction of travel, where the probability or likelihood of collision is greater, while reducing the sampling rate and/or processing frequency for data from view sensors with fields of view that do not encompass the direction of travel, where the probability/likelihood of collision is less.

[0050] In optional block 440, the processor (e.g., 310) may control the transmit power of each of the view sensors (e.g., 220a, 220b, and 220c) based on the speed and the direction of travel of the vehicle and the view direction (e.g., 230) and/or field of view (e.g., 232) of the view sensor. View sensors using greater transmit power (e.g., radar sensors, sonar sensors, etc.) may be capable of perceiving the environment at greater distances from the sensor as compared to view sensors using less transmit power. In some embodiments, the processor may reduce the transmit power of one or more view sensors having a view direction that is directed away from or having a field of view that does not encompass the direction of travel of the moving vehicle. In some embodiments, the processor may maintain or increase the transmit power of one or more view sensors having a view direction aligned with or having a field of view that encompasses the direction of travel of the moving vehicle. Thus, power demands may be reduced by focusing transmit power to view sensors oriented towards the direction of travel where the probability or likelihood of collision is greater, while reducing the transmit power to view sensors oriented in directions other than the direction of travel where the probability/likelihood of collision is less.

[0051] FIG. 4B is a flow diagram that illustrates a method 4300 of controlling one or more parameters for processing output data (e.g., a data processing rate, a sensor frame rate, etc.) received from each view sensor based on the speed and the direction of travel (i.e., block 430 of FIG. 4A) according to some embodiments. In some embodiments, for example, if the vehicle is travelling fast in a particular direction, there may be a high probability or likelihood that the vehicle will continue to travel in the same direction and that a collision with other vehicles or obstacles may occur in that direction. Thus, in some embodiments, the processor may throttle or reduce processing of output data from view sensors that perceive the environment in directions that do not encompass the current direction of travel. Conversely, if the vehicle is travelling slowly in a particular direction, there may be a high probability or likelihood that the vehicle will change direction, and a collision with other vehicles or obstacles may then occur in any direction. Thus, in some embodiments, the processor may throttle or reduce the data processing rate equally across all view sensors.

[0052] With reference to FIGS. 1-4B, in determination block 4320, the processor (e.g., 310) may determine whether the speed of the vehicle (e.g., UAV 200) exceeds a speed threshold (i.e., the vehicle is travelling fast).

[0053] In response to determining that the speed of the vehicle does not exceed the speed threshold (i.e., determination block 4320 = "No"), the processor may adjust one or more parameters for processing output data (e.g., the data processing rate, sensor frame rate, etc.) received from one or more of the view sensors in block 4340. For example, in some embodiments, the processor may set the data processing rate to be the same for data from all view sensors.

[0054] In response to determining that the speed of the vehicle exceeds the speed threshold (i.e., determination block 4320 = "Yes"), the processor may determine for each view sensor whether the view direction (e.g., 230) of the sensor is directed away from or the field of view (e.g., 232) does not encompass the direction of travel of the vehicle in determination block 4360.

[0055] In response to determining that one or more view sensors are directed away from or do not encompass the direction of travel of the vehicle (i.e., determination block 4360 = "Yes"), the processor may throttle the sensor sampling or frame rate and/or the data processing rate of the output data received from the one or more view sensors in block 4380. In some instances, the processor may place a view sensor directed away from the direction of travel in a low power mode.

[0056] In response to determining that the view direction or field of view of one or more view sensors is aligned with or encompasses the direction of travel of the vehicle (i.e., determination block 4360 = "No"), the processor may maintain or increase the sensor sampling or frame rate and/or the data processing rate of the output data received from the one or more view sensors that are directed towards the direction of travel of the vehicle in block 4400.
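
A sketch of the decision flow of blocks 4320-4400, using hypothetical data structures and assumed rates (not the actual implementation):

```python
# Sketch of the decision flow of blocks 4320-4400 (hypothetical data
# structures and assumed rates).
def control_sensor_rates(speed_m_s, sensors, speed_threshold=2.0):
    """Throttle, maintain, or increase each sensor's rate per the flow above.

    Each sensor is a dict with 'encompasses_travel_dir' and 'rate_fps' keys.
    """
    for sensor in sensors:
        if speed_m_s <= speed_threshold:
            sensor["rate_fps"] = 10                              # block 4340: adjust uniformly
        elif not sensor["encompasses_travel_dir"]:
            sensor["rate_fps"] = 5                               # block 4380: throttle
        else:
            sensor["rate_fps"] = max(sensor["rate_fps"], 30)     # block 4400: maintain/increase
    return sensors

sensors = [
    {"name": "front", "encompasses_travel_dir": True,  "rate_fps": 30},
    {"name": "rear",  "encompasses_travel_dir": False, "rate_fps": 30},
]
print(control_sensor_rates(8.0, sensors))   # rear throttled, front maintained
```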

[0057] FIGS. 5A, 5B, and 5C are schematic diagrams that illustrate a processor (e.g., 310) controlling one or more parameters (e.g., sensor sampling or frame rates and/or data processing rates) for processing output data from multiple stereoscopic cameras 520a, 520b, 520c (which may correspond to the view sensors 220 in FIGS. 2A and 3 and view sensors 220a, 220b, 220c, 220d in FIG. 2B) based on the speed and the direction of travel of a vehicle (e.g., UAV 200) according to some embodiments. With reference to FIGS. 1-5C, a vehicle (e.g., robot, car, drone, etc.) may be equipped with a stereoscopic camera 520a facing forward, a stereoscopic camera 520b facing left, and a stereoscopic camera 520c facing right. The stereoscopic cameras 520a, 520b, 520c may be coupled directly or indirectly (e.g., via a view sensor I/O processor 320 of FIG. 3) to a processor (e.g., 310) that performs obstacle detection by processing the camera output data. As the vehicle moves, the processor processes image frames captured by each of the stereoscopic cameras 520a, 520b, and 520c to generate information (e.g., 3D depth maps) used in collision avoidance and/or path planning.

[0058] Images captured of the environment in the direction of travel may have a higher probability or likelihood of containing information useful for avoiding collisions. In particular, images captured in the direction of travel will reveal stationary objects that may be potential collision threats.

[0059] Thus, when the vehicle is moving in a forward direction (e.g., as shown in FIG. 5A), the processor (e.g., 310) may set the camera frame rate and/or process image frames captured by the left-facing stereoscopic camera 520b and the right-facing stereoscopic camera 520c at a lower rate (i.e., fewer frames are received and/or processed per second) than the forward-facing stereoscopic camera 520a. For example, the processor may set the camera frame rate and/or process image frames from the left-facing stereoscopic camera 520b and the right-facing stereoscopic camera 520c at a lower rate of five frames per second (fps) and set the camera frame rate and/or process image frames from the forward-facing stereoscopic camera 520a at a standard or increased rate of thirty frames per second (fps).

[0060] When the vehicle is moving in a lateral direction (e.g., to the right, such as in FIG. 5B), the processor may set the camera frame rate and/or process the image frames captured by the forward-facing stereoscopic camera 520a and the left-facing stereoscopic camera 520b at a lower rate than the image frames captured by the right-facing stereoscopic camera 520c.

[0061] When the vehicle is moving in a direction that is not perfectly aligned with the orientation of any of the view sensors (e.g., moving in the North-West direction, such as in FIG. 5C), the processor may set the camera frame rate and/or process the image frames captured by view sensors with a field of view that encompasses the direction of motion at a higher rate than for sensors with a field of view that does not encompass the direction of travel. In the example illustrated in FIG. 5C, the forward-facing stereoscopic camera 520a and the left-facing stereoscopic camera 520b have fields of view that overlap and encompass the direction of travel. Therefore, the camera frame rate and/or the rate at which image frames captured by those view sensors are processed may be set at a rate that is greater (e.g., proportionally more) than the rate of capture and/or processing of image frames from the right-facing stereoscopic camera 520c.
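
For the three-camera example of FIGS. 5A-5C, a sketch of selecting per-camera rates from the direction of travel; the view-direction vectors, rates, and dot-product test are assumptions for illustration:

```python
import math

# Sketch of per-camera frame-rate selection for the FIG. 5 example
# (view directions, rates, and the dot-product test are assumptions).
CAMERAS = {                       # unit view-direction vectors (x: forward, y: left)
    "front_520a": (1.0, 0.0),
    "left_520b":  (0.0, 1.0),
    "right_520c": (0.0, -1.0),
}
HALF_FOV_DEG = 45.0               # assumes a 90-degree field of view per camera

def per_camera_rates(travel_dir, high_fps=30, low_fps=5):
    """Set a high rate for cameras whose field of view encompasses the
    direction of travel (a unit vector) and a low rate for the others."""
    rates = {}
    for name, (vx, vy) in CAMERAS.items():
        cos_angle = vx * travel_dir[0] + vy * travel_dir[1]
        encompassed = cos_angle >= math.cos(math.radians(HALF_FOV_DEG))
        rates[name] = high_fps if encompassed else low_fps
    return rates

print(per_camera_rates((1.0, 0.0)))                    # forward motion (FIG. 5A)
diag = math.sqrt(2) / 2
print(per_camera_rates((diag, diag)))                  # forward-left motion (FIG. 5C)
```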

[0062] Referring to FIGS. 1-5C, in some implementations, such as when the vehicle is an aircraft or a waterborne vessel, the processor (e.g., 310) may receive radio signals broadcast from other vessels that indicate the other vessel's location, speed, and direction. For example, commercial aircraft transmit Automatic Dependent Surveillance-Broadcast (ADS-B) signals that inform other aircraft of their respective location, altitude, direction of travel, and speed. Similarly, ships and other waterborne vessels broadcast Automatic Identification System (AIS) signals that inform other vessels of their respective location, direction of travel, speed, and turning rate. In such systems, each vessel broadcasts its location, speed, and direction, and each vessel processes signals received from other vessels to calculate a probability of collision and/or a closest point of approach (CPA). Thus, in some embodiments, in addition to adjusting the sampling and/or processing rate of other view sensors (e.g., radar), the processor (e.g., 310) may prioritize the processing of AIS or ADS-B signals from vessels that present the greatest risk of collision. For example, if the vehicle is moving fast, the processor may throttle the data processing rate of AIS or ADS-B signals received from other vessels (e.g., via one or more view sensors 220) that are not in the direction of travel, while increasing the data processing rate of signals received from other vessels in the direction of travel. Conversely, if the vehicle is moving slowly compared to other vessels, the signals received from all other vessels (e.g., via the view sensors 220) may be processed equally, as the threat of collision may come from any direction (i.e., there is little or no preferential processing of signals based on the direction of travel).
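
A sketch of prioritizing received ADS-B/AIS reports by nearer-term collision risk, here approximated by time to closest point of approach; the flat 2-D kinematics and example tracks are assumptions:

```python
# Hypothetical sketch of prioritizing ADS-B/AIS reports by a simple
# closest-point-of-approach (CPA) time estimate.
def time_to_cpa(own_pos, own_vel, other_pos, other_vel):
    """Return the time (s) at which two constant-velocity tracks are closest.

    Positions are (x, y) in meters, velocities (vx, vy) in m/s. A result of
    0 means the tracks are already at their closest point or diverging.
    """
    rx, ry = other_pos[0] - own_pos[0], other_pos[1] - own_pos[1]
    vx, vy = other_vel[0] - own_vel[0], other_vel[1] - own_vel[1]
    speed_sq = vx * vx + vy * vy
    if speed_sq == 0.0:
        return 0.0
    return max(0.0, -(rx * vx + ry * vy) / speed_sq)

# Process reports from the vessel with the soonest CPA first.
own_pos, own_vel = (0.0, 0.0), (10.0, 0.0)
tracks = {"vesselA": ((500.0, 0.0), (-10.0, 0.0)),      # head-on, closing fast
          "vesselB": ((2000.0, 2000.0), (0.0, 5.0))}    # distant, crossing
priority = sorted(tracks, key=lambda k: time_to_cpa(own_pos, own_vel, *tracks[k]))
print(priority)   # vesselA first: it presents the nearer-term collision risk
```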

[0063] FIG. 6 illustrates a method 600 of dynamically controlling one or more parameters (e.g., sensor sampling or frame rate and/or data processing rates) for processing output data from multiple view sensors on a vehicle for collision avoidance and/or path planning according to some embodiments. With reference to FIGS. 1-6, operations of the method 600 may be performed by the vehicle's control unit (e.g., 300 in FIG. 3). The method 600 may include operations in block 420 (e.g., as described with reference to FIG. 4A).

[0064] In block 610, the processor (e.g., the processor 310 in the control unit 300) may determine a speed and an anticipated next direction of travel of the vehicle. For example, the processor may obtain information regarding an anticipated course change or a preconfigured navigation path to determine the speed and the next direction of travel in advance of a change in the direction of travel of the vehicle. In some embodiments, such information or knowledge may be obtained from a navigation processor (e.g., 324 in the control unit 300).

[0065] In block 620, the processor (e.g., 310) may determine a next parameter or parameters (e.g., sensor sampling or frame rate and/or data processing rate) for processing the output data received from each of the view sensors (e.g., 220a, 220b, 220c or 520a, 520b, 520c) based on the speed and the next direction of travel of the vehicle and the view direction (e.g., 230) and/or the field of view (e.g., 232) of the view sensor. For example, in some embodiments, the processor may select or calculate a throttled (or reduced) sensor sampling or frame rate and/or data processing rate for processing the output data received from one or more view sensors (e.g., 220) having a view direction that is directed away from and/or a field of view not encompassing the next direction of travel of the moving vehicle. In some embodiments, the processor may maintain the current sensor sampling or frame rate and/or data processing rate for processing the output data received from one or more view sensors with a view direction that is directed towards and/or a field of view encompassing the next direction of travel of the moving vehicle. In some embodiments, the processor may select or calculate an increased sensor sampling or frame rate and/or data processing rate for processing the output data received from one or more view sensors with a view direction that is directed towards and/or a field of view encompassing the next direction of travel of the moving vehicle.

[0066] In block 630, the processor (e.g., 310) may detect whether the vehicle is moving in the next direction of travel. For example, in some embodiments, the processor may detect whether the vehicle is moving in the next direction of travel based on information obtained from one or more of the navigation sensors (e.g., 322), the navigation processor (e.g., 324), or both.

[0067] In block 640, the processor (e.g., 310) may process the output data received from each view sensor according to the next parameter(s) for processing sensor data (e.g., the next sensor sampling or frame rate and/or data processing rate) determined for the view sensor in response to detecting that the vehicle is moving in the next direction of travel. In this way, the processor may schedule the rate at which to receive and/or process sensor data from view sensors that have a view direction and/or field of view in one or more anticipated directions of travel or aligned with the pre-configured path.
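A minimal, runnable sketch of the scheduling flow of blocks 610-640 follows; the ViewSensor class, headings, and rate values are hypothetical and are used only to illustrate pre-computing rates for an anticipated direction of travel and applying them once the change of direction is detected (speed-based scaling of the rates is omitted for brevity).

    from dataclasses import dataclass

    @dataclass
    class ViewSensor:
        name: str
        view_heading_deg: float
        half_fov_deg: float
        rate_hz: float = 30.0

    def ang_diff(a_deg, b_deg):
        d = abs(a_deg - b_deg) % 360
        return min(d, 360 - d)

    def plan_next_rates(next_heading_deg, sensors, high_hz=30.0, low_hz=5.0):
        # Block 620: throttled rate for sensors whose field of view does not
        # encompass the anticipated next direction of travel.
        return {s.name: (high_hz if ang_diff(next_heading_deg, s.view_heading_deg) <= s.half_fov_deg
                         else low_hz) for s in sensors}

    def apply_if_turn_detected(turn_detected, next_rates, sensors):
        # Blocks 630-640: apply the pre-computed rates once the vehicle is
        # detected to be moving in the next direction of travel.
        if turn_detected:
            for s in sensors:
                s.rate_hz = next_rates[s.name]

    sensors = [ViewSensor("520a_forward", 0, 45), ViewSensor("520b_left", 270, 45), ViewSensor("520c_right", 90, 45)]
    next_rates = plan_next_rates(next_heading_deg=90.0, sensors=sensors)  # block 610: anticipated turn to the right
    apply_if_turn_detected(True, next_rates, sensors)
    print({s.name: s.rate_hz for s in sensors})  # right-facing sensor kept at 30 Hz, the others throttled to 5 Hz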

[0068] FIG. 7 illustrates a method 700 of dynamically controlling one or more parameters (e.g., sensor sampling or frame rate and/or data processing rates) for processing output data from multiple view sensors (e.g., 220, 520 in FIGS. 1B and 5A-5C) on a vehicle (e.g., UAV 200) for collision avoidance and/or path planning according to some embodiments. For example, in some embodiments, the rates at which various view sensors around the vehicle (including different types of view sensors) are sampled and processed may be set based upon the risk of collision in each of the different view directions of the view sensors. In some embodiments, the probability or likelihood of collision in a particular direction may take into account one or more different collision risk factors in addition to the vehicle's speed and direction of travel.

[0069] With reference to FIGS. 1-7, operations of the method 700 may be performed by the vehicle's control unit (e.g., 300). The method 700 may include operations in blocks 410 and 420 (e.g., as described with reference to FIG. 4A).

[0070] In block 710, the processor (e.g., 310) may determine one or more collision risk factors in the view direction and/or the field of view of each sensor. In some embodiments, the one or more collision risk factors may include detection of an obstacle in the view direction, a speed of the detected obstacle in the view direction (e.g., the speed of other UAVs, missiles, animals, etc.), at least one operational characteristic of the sensor, one or more vehicle handling parameters (e.g., stopping distance, turning radius, etc. as a function of speed), a processing characteristic of the processor (e.g., bandwidth, available memory, etc.), or any combination thereof.

[0071] In some embodiments, for example, an operational characteristic of a sensor may include the detection range of the sensor, the frame rate or scan rate of the sensor, the amount of output data generated by the sensor that must be processed (e.g., output data from radar sensors may require less processing than 3D image data from stereoscopic cameras, which requires significant processing), the effectiveness of the sensor in current conditions (e.g., radar sensors typically operate better at night and in fog, while cameras work better during daylight on clear days), and the reliability of the sensor for detecting collision threats (e.g., radar sensors are typically unreliable for detecting birds and vehicles).

[0072] In block 720, the processor may control one or more parameters (e.g., a sensor sampling or frame rate and/or data processing rate) for processing output data received from each of the sensors based on the vehicle's speed and the direction of travel and the one or more collision risk factors in the view direction and/or field of view of the sensor. In some embodiments, the processor may calculate a probability or likelihood of collision based on the vehicle's speed and the direction of travel and the one or more collision risk factors in the view direction and/or field of view of each sensor and then use the calculated probability of collision in deciding whether to throttle, increase or maintain the sensor sampling or frame rate and/or data processing rate of output data from a particular sensor.

[0073] For example, if the processor (e.g., 310) determines that the vehicle is traveling North at a low speed and a moving obstacle is traveling at high speed toward the vehicle from the West, the processor (e.g., 310) may throttle the sensor sampling or frame rate and/or data processing rate of output data received from the sensors (e.g., 220) that are directed away from the moving obstacle because the threat of collision is higher to the West. In some embodiments in which one or more view sensors (e.g., 220) face or perceive the environment in the direction of the collision threat, the processor may throttle the sensor sampling or frame rate and/or processing of data from one or more sensors that are not as effective, reliable, or fast in detecting obstacles in current conditions (e.g., night, day, fog, etc.).
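A non-limiting Python sketch of the risk-weighted rate control of blocks 710-720 follows; the choice of risk factors, the weights, and the rate mapping are illustrative assumptions rather than elements of the embodiments.

    def collision_risk(own_speed_mps, closing_speed_mps, obstacle_detected, sensor_effectiveness,
                       w_speed=0.3, w_closing=0.5, w_obstacle=0.2):
        """Combine a few of the collision risk factors of block 710 into a 0..1 score
        for one view direction; sensor_effectiveness (0..1) discounts data from sensors
        that are less effective in current conditions (e.g., a camera at night)."""
        score = (w_speed * min(own_speed_mps / 20.0, 1.0)
                 + w_closing * min(max(closing_speed_mps, 0.0) / 20.0, 1.0)
                 + w_obstacle * (1.0 if obstacle_detected else 0.0))
        return score * sensor_effectiveness

    def rate_for_risk(risk, min_hz=2.0, max_hz=30.0):
        # Block 720: throttle, maintain, or increase the sampling/processing rate
        # in proportion to the estimated likelihood of collision in that direction.
        return min_hz + risk * (max_hz - min_hz)

    # Vehicle moving slowly North while a fast obstacle approaches from the West.
    west = rate_for_risk(collision_risk(1.0, 15.0, True, 0.9))
    east = rate_for_risk(collision_risk(1.0, 0.0, False, 0.9))
    print(round(west, 1), round(east, 1))  # the West-facing sensor is sampled/processed at a higher rate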

[0074] Various embodiments also include dynamically controlling one or more parameters for obtaining and/or processing sensor data received from a sensor, particularly a stereoscopic sensor (e.g., the view sensors 220a, 220b, 220c), on a vehicle based on the speed of the vehicle and/or a particular mission or task performed using the sensor output data (generally referred to herein as "sensor data"). For example, in some embodiments, when a vehicle (e.g., the UAV 100, 200) is hovering or slowly moving, it is likely that the surrounding environment perceived by the sensor will also be changing slowly, if at all. Thus, the update rate at which sensor data is obtained (e.g., frame rate) and/or processed may be decreased or throttled. Although some parameters (e.g., the update rate) may be increased when the vehicle's speed exceeds a threshold, other parameters for processing sensor data may be decreased based on the particular mission or task performed using the sensor data.

[0075] In some embodiments, parameters for obtaining and/or processing sensor output may be controlled by leveraging differences in the level of quality, accuracy, confidence and/or other criteria in sensor data associated with particular missions or tasks (e.g., mapping, inspection, localization, collision avoidance). For example, the resolution of the sensor data required to perform collision avoidance may be less than the resolution of the sensor data required to inspect a product for defects, while the update rate required for an inspection task may be less than the update rate required for collision avoidance. Thus, in some embodiments, one or more parameters for obtaining and/or processing sensor data may be decreased, while other parameters may be maintained or increased depending on the particular task. In this way, parameters for obtaining and/or processing sensor output may be individually tuned (e.g., decreased, increased, or maintained) based on the vehicle's speed and the task or mission performed using the sensor data. In some embodiments, such parameter control may improve consumption of various resources, such as power, memory, and/or processing time, for example.

[0076] FIG. 8 illustrates a method 800 of dynamically controlling parameters for obtaining and/or processing sensor data according to some embodiments. With reference to FIGS. 1-8, operations of the method 800 may be performed by a processor (e.g., 310) of a control unit (e.g., 300) of a vehicle (e.g., the UAV 100, 200) having a sensor (e.g., the view sensor 220a, 220b, 220c). For ease of reference, the term "processor" is used generally to refer to the processor or processors implementing operations of the method 800.

[0077] In block 810, the processor may determine a speed of the vehicle in any suitable manner. In some embodiments, the processor may obtain the vehicle's current speed from one or more of the navigation sensors (e.g., 322), the navigation processor 324, a speedometer, an airspeed indicator (e.g., a pitot tube), a GNSS receiver, or any combination thereof. In some embodiments, the processor may calculate the vehicle's speed based on navigational data (e.g., position, orientation, time, etc.) provided by one or more of the navigation sensors (e.g., 322), the navigation processor 324, or both.

[0078] In block 820, the processor (e.g., 310) may determine a task or mission that may be performed using the data output from the sensor. Some examples of tasks or missions that may use the data output by a sensor include generating two-dimensional (2D) and/or three-dimensional (3D) maps of the environment for navigation or collision avoidance, inspecting a product, structure or other object for defects (e.g., cracks in a pipeline or other structure), localizing the vehicle in 3D space (e.g., determining a position and/or orientation for the vehicle), gathering data on a target of surveillance, and detecting objects and structures while navigating through the environment for collision avoidance. In some embodiments, the task or mission may be identified or described in a task or mission profile stored in a memory (e.g., the memory 312) of the vehicle (e.g., the UAV 100, 200). In some embodiments, the task or mission may be inferred or determined by the processor based upon the operations being performed by the vehicle.

[0079] In some embodiments, each task or mission may be associated with a different level of quality, accuracy, confidence and/or other sensor data criteria associated with the mission or task. For example, in some embodiments, while a collision avoidance routine may require frequent updates of sensor data, lower resolutions of sensor data may be acceptable, such as for detecting nearby obstacles. Other tasks, such as inspection tasks, may require less frequent updates of sensor data but greater resolution to observe fine details or generate detailed models of the objects under inspection. In some embodiments, the particular sensor data requirements or criteria associated with each task or mission may be identified in a task or mission profile stored in memory or inferred by the processor based on the determined task or mission.

[0080] In block 830, the processor (e.g., 310) may control one or more parameters for obtaining and/or processing sensor data based on the speed of the vehicle and/or the task or mission performed using the output data received from the sensor. In various embodiments, the parameters for processing output data received from the sensor may include one or more of a data capture or sampling rate, a frame rate (i.e., the rate at which an imaging sensor captures or outputs image frames), a processing rate (i.e., a rate at which sensor data is processed), a resolution of the sensor data, and a range of depths or depth-related information searched in the sensor data.
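The following table-driven sketch illustrates one way blocks 820 and 830 could map a determined task or mission to sensor-data parameters; the profile values and the slow-speed throttling rule are assumptions for illustration, not requirements of the embodiments.

    TASK_PROFILES = {
        # task/mission:        (frame_rate_hz, resolution_px,  disparity_range_px)
        "collision_avoidance": (30,            (640, 480),     64),   # frequent updates, lower resolution acceptable
        "inspection":          (5,             (1920, 1080),   128),  # fine detail, less frequent updates
        "mapping":             (10,            (1280, 720),    96),
        "localization":        (15,            (640, 480),     64),
    }

    def parameters_for(task, speed_mps, slow_speed_mps=2.0):
        """Block 830: start from the task profile and throttle the update rate
        when the vehicle is hovering or moving slowly."""
        frame_rate_hz, resolution_px, disparity_range_px = TASK_PROFILES[task]
        if speed_mps < slow_speed_mps:
            frame_rate_hz = max(1, frame_rate_hz // 2)  # surrounding scene is changing slowly, if at all
        return {"frame_rate_hz": frame_rate_hz,
                "resolution_px": resolution_px,
                "disparity_range_px": disparity_range_px}

    print(parameters_for("collision_avoidance", speed_mps=0.5))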

[0081] In some embodiments, the sensor may be a stereoscopic camera that outputs stereoscopic digital images (e.g., left and right images) of scenes within the camera's field of view. In such embodiments, the processor may control one or more of an image capture rate, the rate at which the stereoscopic images are output by the camera, the rate at which depth-from-stereo (DFS) processing is performed on the stereoscopic images, and/or the resolution of the stereoscopic images (e.g., a total number of pixels in each image). In some embodiments, using existing DFS techniques, the processor may also control the range of disparities searched between stereoscopic images to extract depth information.

[0082] FIG. 9 is a schematic diagram that illustrates the concept of controlling the range of disparities searched between stereoscopic images according to some embodiments. With reference to FIGS. 1-9, a processor (e.g., 310) may perform a DFS technique using a pair of stereoscopic images (e.g., left and right stereo images 900-L and 900-R) that involves identifying one or more target pixels (e.g., 912-L) in one of the stereoscopic images (e.g., 900-L) and searching for one or more matching pixels (e.g., 912-R) in the other stereoscopic image (e.g., 900-R). The relative difference, or disparity, between the pixel location of the target pixel 912-L in a row (or column) 910-L and the pixel location of the matching pixel 912-R in a row (or column) 910-R may be used to determine depth information (e.g., distance from the stereoscopic camera). For example, the closer an object or object feature is to the camera, the greater the disparity between the target pixel 912-L and the matching pixel 912-R.
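Under the standard rectified-stereo pinhole model (an assumption introduced here only for illustration and not recited in the paragraphs above), depth is inversely proportional to pixel disparity, which is the relationship the following short sketch computes; the focal length and camera baseline values are hypothetical.

    def depth_from_disparity(disparity_px, focal_length_px=700.0, baseline_m=0.1):
        """depth = focal_length (pixels) * baseline (meters) / disparity (pixels)."""
        if disparity_px <= 0:
            return float("inf")  # no measurable disparity: the feature is effectively at infinity
        return focal_length_px * baseline_m / disparity_px

    print(depth_from_disparity(70))  # 1.0 m: a nearby object produces a large disparity
    print(depth_from_disparity(7))   # 10.0 m: a distant object produces a small disparity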

[0083] To identify pixels in a second image (e.g., the right stereo image 900-R) that match objects or object features in a first image (e.g., the left stereo image 900-L), the processor may evaluate the values (e.g., color and/or luminosity) of pixels in the second image that lie a number of pixels away from a pixel coordinate in the first image to determine whether there is a match (e.g., within a threshold difference). When a pixel in the second image is identified as matching a given pixel in the first image, the distance or number of pixels between the pixel coordinate in the first image and the matching pixel in the second image is referred to as the "pixel disparity." The number of pixels away from a pixel coordinate in the first image that are evaluated for matching is referred to as the "disparity range." Modern digital cameras capture a large number of pixels, and the comparison of pixel values requires finite time and processing power. Thus, the greater the disparity range used in DFS processing, the greater the image processing demands on the processor performing this analysis.

[0084] In some embodiments, the processor (e.g., 310) may be configured to control the disparity range 920 of pixels that are searched based on the proximity of the objects on which a particular task or mission is focused. As described, the pixel disparity of objects close to the image sensor will be much greater than the pixel disparity of distant objects. Thus, limiting the disparity range 920 of pixels that are searched in DFS processing will enable distant objects to be localized while saving processing power, but will limit the ability to localize nearby objects. For example, if the task or mission involves identifying and localizing objects for navigation and collision avoidance, the disparity range 920 may be reduced, thereby saving processing power. Thus, the range of disparities 920 to search may be reduced to a minimum number of pixels N_MIN when the task or mission is focused on objects distant from the camera. As another example, the range of disparities 920 to search may be extended to a maximum number of pixels N_MAX when the task or mission is focused on objects in close proximity to the camera (e.g., for inspections). Another case is collision avoidance when nearby objects are detected. Nearby objects may dominate the collision avoidance problem, while faraway obstacles may be ignored. Thus, the minimum number of pixels N_MIN may be increased, thereby saving processing power.
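The following sketch illustrates one way the searched disparity range 920 could be tuned to the depths relevant to a task, using the same hypothetical focal length and baseline as the earlier sketch; the pixel bounds and depth values are illustrative assumptions only.

    FOCAL_LENGTH_PX, BASELINE_M = 700.0, 0.1
    FULL_DISPARITY_RANGE = (1, 128)   # hypothetical N_MIN..N_MAX supported by the sensor

    def disparity_for_depth(depth_m):
        return FOCAL_LENGTH_PX * BASELINE_M / depth_m

    def disparity_search_range(nearest_relevant_m, farthest_relevant_m):
        """Search only the disparities corresponding to the depths the task cares about."""
        n_min = max(FULL_DISPARITY_RANGE[0], int(disparity_for_depth(farthest_relevant_m)))
        n_max = min(FULL_DISPARITY_RANGE[1], int(disparity_for_depth(nearest_relevant_m)) + 1)
        return n_min, n_max

    # Collision avoidance focused on distant obstacles: a small N_MAX saves processing power.
    print(disparity_search_range(nearest_relevant_m=5.0, farthest_relevant_m=50.0))   # (1, 15)
    # Inspection of a nearby structure: larger disparities must be searched.
    print(disparity_search_range(nearest_relevant_m=0.6, farthest_relevant_m=3.0))    # (23, 117)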

[0085] Controlling one or more parameters associated with obtaining and/or processing sensor data based on the vehicle's speed and/or the task or mission performed using the sensor data in block 830 may enable reductions in the processing demands on the vehicle's control unit (e.g., 300) and/or may facilitate increases in other parameters associated with processing sensor data. In some embodiments, the processor (e.g., 310) may be configured to decrease one or more parameters for obtaining and/or processing output data received from a sensor (e.g., the view sensor 220a, 220b, and/or 220c) based on the speed of the vehicle and/or the task or mission performed using the output data received from the sensor. For example, when a vehicle (e.g., the UAV 100, 200) is hovering or slowly moving, it is likely that the surrounding environment perceived by a sensor (e.g., 220a, 220b, 220c) will be changing slowly, if at all. Thus, in such situations, the processor (e.g., 310) may decrease one or more parameters for obtaining and/or processing sensor data in order to reduce the likelihood of obtaining redundant output data from the sensor and/or performing redundant data processing. For example, the rate at which sensor data is obtained or received (e.g., a data sampling rate or an output frame rate of the sensor) and/or the rate at which the sensor data is processed (e.g., a data processing rate) may be decreased or throttled. As another example, the amount of sensor data that is obtained or received (e.g., pixel density, pixel information, etc.) and/or the amount of processing performed on sensor data (e.g., pixel disparity range, color vs. brightness processing, etc.) may be decreased or throttled.

[0086] In contrast, when a vehicle is moving fast, changes in the surrounding environment perceived by the sensor (e.g., 220a, 220b, 220c) will occur faster. Thus, the processor (e.g., 310) may maintain or increase one or more parameters for obtaining or processing sensor data (e.g., a data sampling rate, an output frame rate, and/or a data processing rate) in order to avoid missing or failing to detect objects or changes in the surrounding environment. Although one or more parameters may be increased in response to the vehicle's speed exceeding specific thresholds, the processor (e.g., 310) may be configured to decrease other parameters based on the particular task or mission performed using the sensor data. For example, when the task or mission involves collision avoidance, the processor (e.g., 310) may increase the rate at which sensor data is obtained/received and/or processed in response to the vehicle's speed exceeding a threshold speed. However, the processor may also decrease the resolution of the sensor data when an obstacle is detected in close proximity.

[0087] In some embodiments, controlling one or more parameters in response to the vehicle's speed exceeding a threshold speed may include comparing the vehicle's speed against one or more threshold speeds and individually controlling (e.g., increasing, decreasing, and/or maintaining) the parameters based on such comparisons. In some embodiments, controlling one or more parameters in response to the vehicle's speed may be implemented using any form of decision criteria for speed-based parameter control, including, without limitation, a lookup table, a proportional parameter controller, and/or another multiple-level checking data structure or scheme. Thus, adjustments to various parameters may be made based on comparing the vehicle's speed to any number of thresholds or decision criteria configured in any of a variety of data structures. In some embodiments, a parameter may be increased or decreased by different amounts and/or the threshold speed(s) may be varied based on the task or mission performed using the sensor data.
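Two simple realizations of the decision criteria mentioned in paragraph [0087], a lookup table of speed bands and a proportional controller, are sketched below; the thresholds and rate values are assumptions chosen for illustration.

    import bisect

    SPEED_THRESHOLDS_MPS = [1.0, 5.0, 10.0]  # boundaries between hypothetical speed bands
    FRAME_RATES_HZ = [5, 10, 20, 30]         # one frame rate per band (one more entry than thresholds)

    def frame_rate_from_lookup(speed_mps):
        band = bisect.bisect_right(SPEED_THRESHOLDS_MPS, speed_mps)
        return FRAME_RATES_HZ[band]

    def frame_rate_proportional(speed_mps, min_hz=5.0, max_hz=30.0, max_speed_mps=10.0):
        # Scale the rate continuously with speed instead of stepping through bands.
        fraction = min(max(speed_mps / max_speed_mps, 0.0), 1.0)
        return min_hz + fraction * (max_hz - min_hz)

    print(frame_rate_from_lookup(0.3), frame_rate_from_lookup(7.0))     # 5 20
    print(frame_rate_proportional(0.3), frame_rate_proportional(7.0))   # 5.75 22.5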

[0088] As described, by leveraging the quality, accuracy, and/or confidence requirements associated with a particular task or mission, parameter(s) for obtaining and/or processing sensor data may be adjusted to obtain gains in performance (e.g., reduced processing demands) at acceptable costs (e.g., lower image capture rate, reduced disparity search ranges, etc.). For example, in the context of collision avoidance, if the vehicle is traveling slowly (i.e., the vehicle's speed is below a certain threshold) and no nearby obstacles are detected, the processor may reduce certain parameters (e.g., image capture rate, pixel disparity ranges, etc.) in order to focus on detecting distant obstacles while reducing processing demands.

[0089] In some situations, unexpected changes in the surrounding environment (e.g., a rapidly approaching object) may be missed while the parameters for obtaining and/or processing sensor data are set to less than maximum values. To avoid failing to detect such unexpected changes, the processor may be configured to occasionally reset such parameters to default or maximum values for a period of time to enable more detailed or complete surveillance or analysis of sensor data. For example, to avoid failing to detect unexpected nearby obstacles, the processor may occasionally reset the image capture rate and/or pixel disparity range to maximum or near-maximum values in order to scan images output from a stereoscopic camera for obstacles at greater pixel disparity ranges. Thus, in some embodiments, in optional block 840, the processor may occasionally reset one or more of the controlled parameters for processing the output data received from the sensor. For example, a parameter for obtaining and/or processing data that is increased or decreased in block 830 may be temporarily reset to a default or maximum value. In some embodiments, the processor may be configured to reset one or more of the controlled parameters periodically (e.g., once a second or at another rate), semi-periodically (e.g., in response to the expiration of a timer), and/or upon activation of a time- or event-based trigger.
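The following sketch illustrates one possible timer-based implementation of the occasional reset of optional block 840; the interval, reset window, and parameter names are assumptions and are not limiting.

    import time

    class ParameterResetter:
        def __init__(self, default_params, reset_interval_s=1.0, reset_window_s=0.2):
            self.default_params = default_params          # e.g., maximum capture rate and disparity range
            self.reset_interval_s = reset_interval_s
            self.reset_window_s = reset_window_s
            self._last_reset = time.monotonic()

        def effective_parameters(self, tuned_params):
            """Return the tuned (throttled) parameters, except during brief periodic
            windows in which the defaults are restored so that unexpected nearby
            obstacles are not missed."""
            now = time.monotonic()
            if now - self._last_reset >= self.reset_interval_s:
                self._last_reset = now
            if now - self._last_reset < self.reset_window_s:
                return dict(self.default_params)
            return dict(tuned_params)

    resetter = ParameterResetter({"frame_rate_hz": 30, "disparity_range_px": 128})
    params = resetter.effective_parameters({"frame_rate_hz": 5, "disparity_range_px": 32})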

[0090] FIG. 10 illustrates a method 1000 of dynamically controlling parameters for obtaining and/or processing sensor data based on the speed of a vehicle and the task or mission performed using the sensor data according to some embodiments. With reference to FIGS. 1-10, operations of the method 1000 may be performed by a processor (e.g., 310) of a control unit (e.g., 300) of a vehicle (e.g., the UAV 100, 200). For ease of reference, the term "processor" is used generally to refer to the processor or processors implementing operations of the method 1000. In some embodiments, the method 1000 may be particularly useful when the task or mission performed using the sensor data is or involves collision avoidance.

[0091] In blocks 810, 820, and 840, the processor may perform operations of the like-numbered blocks of the method 800 as described.

[0092] In determination block 1010, the processor may determine whether the speed of the vehicle exceeds a speed threshold. For example, in some embodiments, the processor may compare the vehicle's speed to a threshold speed that is stored in memory (e.g., 312). In some embodiments, the threshold speed may be selected or calculated based on the particular task to be performed using the sensor data. For example, in some embodiments, the threshold speed may be different for each of the different types of tasks or missions (e.g., mapping, inspection, localization, collision avoidance, etc.) that may be performed by the vehicle. For example, greater speed thresholds may be associated with tasks that are less sensitive to the speed of the vehicle, while lower speed thresholds may be associated with tasks that are more sensitive to the vehicle's speed. In some embodiments, the vehicle's speed may be compared to one or more threshold speeds using any form of decision criteria for speed-based parameter control, including, without limitation, a lookup table, a proportional parameter controller, and/or another multiple-level checking scheme. Thus, a parameter may be increased or decreased by different amounts based on any number of speed thresholds and on which threshold speed is exceeded.

[0093] In response to determining that the speed of the vehicle does not exceed the speed threshold (i.e., determination block 1010 = "No"), the processor may decrease (e.g., throttle, reduce, etc.) one or more parameters for obtaining and/or processing sensor data in block 1020. For example, the processor may decrease one or more of a sampling or frame rate of the sensor and a data processing rate of the output data received from the sensor. For example, if the sensor is a stereoscopic camera, the processor may decrease one or more of the image capture rate, the rate at which the stereoscopic images are output by the camera, and/or the rate at which the DFS processing is performed on the stereoscopic images.

[0094] In response to determining that the speed of the vehicle exceeds the speed threshold (i.e., determination block 1010 = "Yes") or after completing the operations in block 1020, the processor may determine whether an imaged object closest to the vehicle is within a threshold distance in determination block 1030. For example, when the task or mission performed using the sensor data is a collision avoidance task, the processor may determine whether the distance to the object closest to the vehicle is within a threshold distance. In embodiments in which the sensor is a stereoscopic camera, the processor may process stereoscopic images received from the camera using DFS techniques to detect object(s) that are within a threshold distance of the vehicle (e.g., a distance at which the object poses a potential collision risk for the vehicle). In some embodiments, the threshold distance may be a fixed distance relative to the vehicle. In some embodiments, the threshold distance may be a variable distance that is inversely related to the vehicle's speed. For example, the threshold distance may be longer at slower speeds and shorter at higher speeds. In some embodiments, the distance of the closest object may be compared to one or more threshold distances using any form of decision criteria for distance-based parameter control, including, without limitation, a lookup table, a proportional parameter controller, and/or another multiple-level checking scheme. Thus, a parameter may be increased or decreased by different amounts based on any number of distance thresholds and on which threshold distance is exceeded.

[0095] In response to determining that the object closest to the vehicle is within the threshold distance (i.e., determination block 1030 = "Yes"), the processor may decrease a resolution of the output data received from the sensor in block 1040. In some embodiments, the processor may specify a reduced camera resolution in terms of the number of megapixels or the number of pixel rows by columns used to generate the captured image. For example, when the sensor is a stereoscopic camera, the processor may configure the camera to reduce the pixel resolution of the captured stereoscopic digital images. In some embodiments, the processor may also increase the pixel disparity range to facilitate localization of the nearby object. In some embodiments, when the task is collision avoidance, a lower resolution of the images may be sufficient to detect an object as a potential collision risk, as opposed to other tasks that may require higher resolution output to generate more accurate 3D representations of the object. In some embodiments, the processor may specify a type of reduced camera resolution (e.g., from SVGA to VGA resolution).

[0096] In response to determining that the object closest to the vehicle is not within the threshold distance (i.e., determination block 1030 = "No"), the processor may decrease the pixel disparity range searched between stereoscopic images in block 1050. As described, the distance between matching pixels of an object in a pair of stereoscopic images (i.e., the pixel disparity) is inversely related to the distance to the object. Therefore, distant objects (e.g., objects detected outside the threshold distance) may be detected and localized by the processor by searching a smaller range of pixels. Thus, the processor may reduce the pixel disparity range (e.g., to N_MIN) used in DFS processing of stereoscopic images, thereby reducing the processing demands on the processor performing the DFS techniques.
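A compact sketch of the decision flow of blocks 1010-1050 follows, for the collision-avoidance case; the thresholds, parameter names, and values are illustrative assumptions rather than the claimed method.

    def method_1000_parameters(speed_mps, closest_object_m,
                               speed_threshold_mps=5.0, base_distance_threshold_m=10.0):
        params = {"frame_rate_hz": 30, "resolution_px": (1280, 720), "disparity_range_px": 128}

        # Determination block 1010: throttle capture/processing at low speed (block 1020).
        if speed_mps <= speed_threshold_mps:
            params["frame_rate_hz"] = 10

        # A variable threshold distance inversely related to speed, per paragraph [0094].
        distance_threshold_m = base_distance_threshold_m / max(speed_mps, 1.0)

        # Determination block 1030: is the closest imaged object within the threshold distance?
        if closest_object_m is not None and closest_object_m <= distance_threshold_m:
            params["resolution_px"] = (640, 360)      # block 1040: lower resolution still detects the risk
        else:
            params["disparity_range_px"] = 16         # block 1050: distant objects need a small disparity search

        return params

    print(method_1000_parameters(speed_mps=1.0, closest_object_m=4.0))
    print(method_1000_parameters(speed_mps=8.0, closest_object_m=30.0))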

[0097] In conjunction with decreasing one or more parameters in blocks 1020, 1040, and/or 1050, the processor may optionally increase other parameters in order to meet other criteria for quality and/or accuracy of the sensor data. For example, in some embodiments, in order to detect objects that are in close proximity to the vehicle (e.g., less than a threshold distance), the processor may optionally increase the pixel disparity range (e.g., 920) searched in the stereoscopic image processing. In some embodiments, in order to avoid missing or failing to detect an object moving towards the vehicle, the processor may increase an image capture rate of a stereoscopic camera, the rate at which the stereoscopic images are output by the camera, the rate at which DFS processing is performed on the stereoscopic images, etc. In some embodiments, in order to detect objects that are distant from the vehicle (e.g., greater than the threshold distance), the processor may increase a resolution of the output data received from the sensor.

[0098] The various embodiments illustrated and described are provided merely as examples to illustrate various features of the claims. However, features shown and described with respect to any given embodiment are not necessarily limited to the associated embodiment and may be used or combined with other embodiments that are shown and described. In particular, various embodiments are not limited to use on aerial UAVs and may be implemented on any form of unmanned or autonomous vehicle, including land vehicles, waterborne vehicles, and space vehicles in addition to aerial vehicles. Further, the claims are not intended to be limited by any one example embodiment. For example, one or more of the operations of the methods 400, 430, 600, and 700 may be substituted for or combined with one or more operations of the methods 400, 430, 600, and 700, and vice versa.

[0099] The foregoing method descriptions and the process flow diagrams are provided merely as illustrative examples and are not intended to require or imply that the steps of the various embodiments must be performed in the order presented. As will be appreciated by one of skill in the art, the operations in the foregoing embodiments may be performed in any order. Words such as "thereafter," "then," "next," etc. are not intended to limit the order of the operations; these words are used to guide the reader through the description of the methods. Further, any reference to claim elements in the singular, for example, using the articles "a," "an," or "the" is not to be construed as limiting the element to the singular.

[0100] The various illustrative logical blocks, modules, circuits, and algorithm operations described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and operations have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the claims.

[0101] The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, two or more microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some operations or methods may be performed by circuitry that is specific to a given function.

[0102] In one or more aspects, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable storage medium or non-transitory processor-readable storage medium. The operations of a method or algorithm disclosed herein may be embodied in a processor-executable software module or processor-executable instructions, which may reside on a non-transitory computer-readable or processor-readable storage medium. Non-transitory computer-readable or processor-readable storage media may be any storage media that may be accessed by a computer or a processor. By way of example but not limitation, such non-transitory computer-readable or processor-readable storage media may include RAM, ROM, EEPROM, FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of non-transitory computer-readable and processor-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory processor-readable storage medium and/or computer-readable storage medium, which may be incorporated into a computer program product.

[0103] The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the claims. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the scope of the claims. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.