

Title:
CONTROL SYSTEMS FOR UNMANNED AERIAL VEHICLES
Document Type and Number:
WIPO Patent Application WO/2018/156991
Kind Code:
A1
Abstract:
A method for controlling an unmanned aerial vehicle within a flight operating space. The unmanned aerial vehicle includes one or more sensor arrays on each spar. The method includes determining, using a plurality of sensor arrays, a flight path for the unmanned aerial vehicle. The method also includes receiving, by at least one sensor array of the plurality of sensor arrays, sensor data identifying at least one object in the operating space. The sensor data is transmitted over a communications bus connecting components of the UAV. The method further includes determining, by one or more processors onboard the unmanned aerial vehicle, a flight path around the at least one object. The method also includes generating, by the one or more onboard processors, a first signal to cause the unmanned aerial vehicle to navigate within the operating space around the at least one object.

Inventors:
TSUTSUMI ERIKA (US)
THOMSON CHAD (US)
ALEMAN JOHN (US)
JOHNSON SAMUEL (US)
MISTRY SAMIR (US)
Application Number:
PCT/US2018/019570
Publication Date:
August 30, 2018
Filing Date:
February 23, 2018
Assignee:
CYPHY WORKS INC (US)
TSUTSUMI ERIKA (US)
THOMSON CHAD (US)
ALEMAN JOHN (US)
International Classes:
G05D1/10; B64C39/02; G05D1/08
Domestic Patent References:
WO2015175379A1 (2015-11-19)
WO2016115155A1 (2016-07-21)
WO2015187836A1 (2015-12-10)
Foreign References:
US20160291136A1 (2016-10-06)
US20130233964A1 (2013-09-12)
US201615041211A (2016-02-11)
US7631834B1 (2009-12-15)
US201414581027A (2014-12-23)
US20170024152W (2017-03-24)
US201615316011A (2016-12-02)
US20150033992W (2015-06-03)
Other References:
MUNGUIA, R; URZUA, S; BOLEA, Y: "Vision-based SLAM system for unmanned aerial vehicles", SENSORS, vol. 16, 2016, pages 372
Attorney, Agent or Firm:
HOOVER, Thomas, O. et al. (US)
Claims:
CLAIMS

1. A system for controlling an unmanned aerial vehicle within a flight operating space, the system comprising:

an unmanned aerial vehicle having a tether connector;

a plurality of sensors coupled to the unmanned aerial vehicle and positioned to provide a field of view in at least three directions, the three directions being separated by at least 90 degrees, the plurality of sensors positioned to detect at least one object located in the flight operating space; and

one or more processors onboard the unmanned aerial vehicle that:

receive, from the plurality of sensors, sensor data identifying a location of the unmanned aerial vehicle;

receive, from the plurality of sensors, sensor data identifying at least one object in a flight operating space of the aerial vehicle;

determine a spatial relationship between the location of the unmanned aerial vehicle and the at least one object in the flight operating space; and

generate, based on the sensor data, a first signal to cause the unmanned aerial vehicle to navigate around the at least one object within the flight operating space.

2. The system of claim 1 further comprising a tether connected to the aerial vehicle.

3. The system of claim 1 wherein the unmanned aerial vehicle includes a plurality of at least four rotors; and

wherein the plurality of sensors provide at least a 180-degree visual field in a plane of the aerial vehicle, and wherein the plurality of sensors generate sensor data, including inertial measurement unit data, that is processed to generate a first signal to cause the plurality of rotors to navigate the unmanned aerial vehicle around the at least one object within the flight operating space.

4. The system of claim 1 further comprising:

a global positioning system (GPS) sensor on the vehicle;

a tether dispensing unit on the vehicle; and

a base station coupled to the unmanned aerial vehicle by a tether, wherein the tether transmits flight control signals and electric power to the unmanned aerial vehicle.

5. The system of claim 4, further comprising an operator control unit communicatively coupled to the base station, the operator control unit configured to: transmit flight controls to the base station; and

receive and display sensor data from at least one sensor of the plurality of sensors.

6. The system of claim 5, wherein the first signal is generated based on instructions received from the operator control unit in communication with the one or more processors onboard the unmanned aerial vehicle.

7. The system of claim 1, wherein the first signal is generated autonomously by the unmanned aerial vehicle.

8. The system of claim 4, wherein at least one sensor of the plurality of sensors includes a camera, and wherein the operator control unit includes a display, the operator control unit configured to display video taken by the camera.

9. The system of claim 1 wherein the unmanned aerial vehicle comprises:

a flight control system;

a bumper device mounted to the aerial vehicle; and

a tether connecting the aerial vehicle to a power source.

10. The system of claim 9 wherein the bumper device comprises a plurality of spring elements coupled to one or more bumper elements extending around the aerial vehicle.

11. The system of claim 9 wherein the bumper device comprises a cage extending around the vehicle, the cage having a tether port with an opening through which the tether extends between the aerial vehicle and a base station.

12. The system of claim 9 wherein the power source comprises a base station mounted on a vehicle.

13. The system of claim 12 wherein the vehicle comprises a ground vehicle, a water vehicle or an autonomous vehicle.

14. The system of claim 11 wherein the cage comprises a frame mounted to the aerial vehicle, the plurality of sensors mounted to the frame.

15. The system of claim 1 wherein the aerial vehicle comprises a spooler that rotates relative to the aerial vehicle to dispense a tether.

16. The system of claim 15 wherein the spooler is connected to a spooler motor, the motor being actuated by a flight controller mounted to the vehicle.

17. The system of claim 16 wherein the flight controller receives inertial measurement data to control dispensing of the tether from the aerial vehicle.

18. The system of any of claims 1-17 further comprising a sensor to detect a position of a dispensed tether from the aerial vehicle.

19. The system of any of claims 1-18 wherein the aerial vehicle comprises a plurality of motors that can be oriented at different angles relative to an axis of the aerial vehicle.

20. The system of claim 19 wherein each motor is connected to the aerial vehicle with a strut having a twist angle and a dihedral angle.

21. The system of any of claims 1-20 wherein the aerial vehicle comprises a plurality of struts, each strut having a motor mounted thereon and at least two of the plurality of sensors mounted on an end of each strut.

22. The system of claim 21 wherein the at least two sensors mounted on each strut includes a first camera and a second camera.

23. The system of claim 21 wherein the at least two sensors comprise a sensor array having a LiDAR sensor.

24. The system of any of claims 1-19 wherein the aerial vehicle comprises at least six rotors, each rotor mounted on a strut and driven by a motor connected to a flight control processor.

25. The system of claim 1 wherein one of the processors comprises an integrated circuit that processes data from the plurality of sensors.

26. The system of claim 25 wherein the integrated circuit processes data simultaneously from each sensor.

27. The system of claim 26 wherein the integrated circuit comprises an FPGA, the FPGA including a bus connected to the plurality of sensors.

28. The system of any of claims 1-27 wherein the sensors are directed over at least 360 degrees in a plane of the vehicle.

29. The system of claim 28 further comprising an acoustic transducer to sense distance from an object.

30. The system of claim 26 wherein the integrated circuit includes a memory and a memory controller.

31. The system of any of claims 1-30 further comprising a radio transceiver.

32. The system of claim 1 further comprising a base station having a battery and a transceiver circuit to transmit Ethernet data with a tether to the aerial vehicle.

33. The system of claim 1 wherein the plurality of sensors comprises a plurality of sensor circuit boards, each circuit board having at least two sensors connected to a data bus.

34. The system of claim 33 wherein each sensor circuit board has a proximity sensor and a camera.

35. The system of claim 1 further comprising a base station with a high voltage convertor.

36. The system of claim 1 further comprising an altimeter, a laser scanner, an inertial measurement unit, a GPS sensor and a camera.

37. The system of claim 1 wherein the aerial vehicle further comprises a power management circuit.

38. The system of claim 1 wherein a processor generates a panoramic image extending 360 degrees around an axis of the aerial vehicle, and wherein the system further includes sensors viewing in both directions along the axis.

39. The system of claim 1 further comprising an ultra-wideband (UWB) transceiver on the vehicle to detect UWB signals from beacons.

40. A system for controlling an unmanned aerial vehicle within a flight operating space, the system comprising:

a plurality of sensors coupled to the unmanned aerial vehicle and positioned to provide a field of view in at least three directions, the three directions being separated by at least 90 degrees, the plurality of sensors deployed so as to obtain sensor data indicative of at least one object located in the flight operating space; and an embedded control system for autonomous flight operations, the embedded control system including a system on a chip (SoC), the SoC including:

a memory,

a first application specific integrated circuit (ASIC) that:

identifies a first location of the unmanned aerial vehicle using the sensor data received from one or more sensors of the plurality of sensors,

identifies at least one object in the flight operating space using sensor data received from one or more sensors of the plurality of sensors,

determines a spatial relationship between the first location of the unmanned aerial vehicle and the at least one object in the flight operating space,

a second ASIC that:

receives the determined spatial relationship from the first ASIC, and generates, based on the determined spatial relationship, a first signal to cause the unmanned aerial vehicle to alter a flight operation in the flight operating space.

41. The system of claim 40 wherein the memory stores an initial mission path through the flight operating space and the first signal alters the mission path.

42. The system of claim 40 wherein the unmanned aerial vehicle includes a connector for a detachable tether that connects the aerial vehicle to a power source.

43. The system of claim 42 wherein the unmanned aerial vehicle continually monitors a location of the tether with respect to the aerial vehicle, the location stored in the memory.

44. The system of claim 43 wherein the location of the tether is a constraint considered by the second ASIC in generating the first signal.

45. The system of claim 43 wherein the location of the tether includes a location of the tether following a detaching of the tether from the aerial vehicle.

46. The system of claim 40 wherein the first signal alters at least one of an amount or angle of thrust generated by the aerial vehicle.

47. The system of claim 40 wherein the first signal alters at least one of an altitude or navigational path of the aerial vehicle.

48. The system of claim 40 wherein the plurality of sensors comprises a plurality of sensor arrays, each array comprising at least one camera and at least one LiDAR sensor.

49. The system of claim 40 wherein a tether attached to the aerial vehicle comprises a twisted pair of wires that transmits a direct current power signal and a communication signal from a ground station to the aerial vehicle, the ground station optionally comprising a networked computer mounted on a vehicle.

50. A system for controlling an unmanned aerial vehicle within a flight operating space, the system comprising:

a plurality of sensors coupled to the unmanned aerial vehicle and positioned to provide a field of view in at least three directions, the three directions being separated by at least 90 degrees, the plurality of sensors deployed so as to obtain sensor data indicative of at least one object located in the flight operating space;

an embedded control system for autonomous flight operations, the embedded control system including a system on a chip (SoC), the SoC including;

a memory,

at least one application specific integrated circuit (ASIC) that:

identifies a first location of the unmanned aerial vehicle using the sensor data received from the plurality of sensor arrays,

identifies at least one object in the flight operating space using sensor data received from the plurality of sensor arrays, determines a spatial relationship between the first location of the unmanned aerial vehicle and the at least one object in the flight operating space, and

generates, based on the determined spatial relationship, a first signal to cause the unmanned aerial vehicle to alter a flight operation in the flight operating space.

51. The system of claim 50 wherein the memory stores an initial mission path through the flight operating space and the first signal alters the mission path.

52. The system of claim 50 wherein the unmanned aerial vehicle includes a connector for a detachable tether that connects the aerial vehicle to a power source.

53. The system of claim 52 wherein the unmanned aerial vehicle continually monitors a location of the tether with respect to the aerial vehicle, the location stored in the memory.

54. The system of claim 53 wherein the location of the tether is a constraint used by a second ASIC connected to the at least one ASIC in generating the first signal.

55. The system of claim 53 wherein the location of the tether includes a location of the tether following a detaching of the tether from the aerial vehicle.

56. The system of claim 50 wherein the first signal alters at least one of an amount or angle of thrust generated by the aerial vehicle.

57. The system of claim 50 wherein the first signal alters at least one of an altitude or navigational path of the aerial vehicle.

58. The system of claim 50 wherein the plurality of sensors comprises a plurality of sensor arrays, each array comprising at least one camera and at least one LiDAR sensor.

59. The system of claim 50 wherein a tether attached to the aerial vehicle comprises a twisted pair of wires that transmits a direct current power signal and a communication signal from a base station to the aerial vehicle, the base station optionally comprising a networked computer mounted on a vehicle.

60. A method for controlling an unmanned aerial vehicle within a flight operating space, the method comprising:

obtaining sensor data indicative of at least one object located in the flight operating space using a plurality of sensor arrays coupled to the unmanned aerial vehicle and positioned to provide a field of view in at least three directions, the three directions being separated by at least 90 degrees; and

processing the sensor data with an embedded control system for autonomous flight operations, the embedded control system including a system on a chip (SoC), the SoC including:

a memory,

a first application specific integrated circuit (ASIC) that:

identifies a first location of the unmanned aerial vehicle using the sensor data received from the plurality of sensor arrays, identifies at least one object in the flight operating space using sensor data received from the plurality of sensor arrays, determines a spatial relationship between the first location of the unmanned aerial vehicle and the at least one object in the flight operating space, and

a second ASIC that:

receives the determined spatial relationship from the first ASIC, and generates, based on the determined spatial relationship, a first signal to cause the unmanned aerial vehicle to alter a flight operation in the flight operating space.

61. The method of claim 60 wherein the memory holds an initial mission path through the flight operating space and the first signal alters the mission path.

62. The method of claim 61 wherein the unmanned aerial vehicle includes a connection for a detachable tether connecting the aerial vehicle to a power source.

63. The method of claim 62 wherein the unmanned aerial vehicle continually monitors a current location and orientation of the tether with respect to the aerial vehicle, the current location stored in the memory.

64. The method of claim 63 wherein the current location of the tether is a constraint considered by the second ASIC in generating the first signal.

65. The method of claim 63 wherein the current location of the tether includes a location of the tether following a detaching of the tether from the aerial vehicle to allow for later retrieval.

66. The method of claim 60 wherein the first signal alters at least one of an amount or angle of thrust generated by the aerial vehicle.

67. The method of claim 60 wherein the first signal alters at least one of an altitude or navigational path of the aerial vehicle.

68. The method of claim 60 further comprising transmitting sensor data to a data processor with a tether connected to the aerial vehicle.

69. The method of claim 68 further comprising processing the sensor data with the data processor using a simultaneous localization and mapping (SLAM) process to generate flight control data that is transmitted to the aerial vehicle with the tether.

70. The method of claim 68 wherein the sensor data transmitted to the data processor with the tether comprises position data identifying a location of the aerial vehicle.

Description:
CONTROL SYSTEMS FOR UNMANNED AERIAL VEHICLES

Background

[0001] This application claims priority to U.S. Application No. 62/463,539, filed February 24, 2017, and U.S. Application No. 62/541,637, filed August 4, 2017, the entire contents of which are incorporated herein by reference.

[0002] This invention relates to control systems for unmanned aerial vehicles (UAVs).

[0003] In recent years the use of UAVs has become widespread, particularly in military and recreational applications. Until recently, commercial use of UAVs was limited due to regulatory and technological constraints of UAVs (e.g., safety, limited range, poor reliability, etc.) as well as the relatively high cost of UAVs.

[0004] Due to advances in technology and an increased prevalence of UAVs, UAVs are becoming cost effective and sufficiently reliable for use in commercial applications.

[0005] At the same time, there is a need for a cost effective, efficient means of controlling and maneuvering UAVs within confined airspace, such as gaps between buildings and/or trees or within buildings or enclosures, and for delivering items to customers at their locations. It remains unclear how UAVs will accurately maneuver within constrained areas or accurately navigate in proximity to objects above ground.

Summary

[0006] The present invention relates to systems and methods for controlling an unmanned aerial vehicle within a flight operating space. The method includes determining, using a plurality of sensor arrays mounted to the unmanned aerial vehicle, a flight path for the unmanned aerial vehicle, wherein the plurality of sensor arrays are positioned on the unmanned aerial vehicle to provide a field of view on different sides of the vehicle. For example, the sensor arrays perceive objects in at least three directions that are at least 90 degrees apart. Each sensor array includes an imaging device and a distance measurement device. The method also includes receiving, by at least one sensor array of the plurality of sensor arrays, sensor data identifying at least one object in the operating space. The method further includes determining, by one or more processors, a spatial relationship between a first location of the unmanned aerial vehicle and the at least one object in the operating space. The method also includes determining, by one or more processors onboard the unmanned aerial vehicle, a flight path around the at least one object based on the sensor data and the spatial relationship. The method further includes generating, by the one or more onboard processors, a first signal to cause the unmanned aerial vehicle to navigate within the operating space around the at least one object.
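
The geometric core of these steps can be sketched in a few lines of Python. All names and the clearance value below are illustrative assumptions, not taken from the application: the fragment checks whether a detected object lies near the straight-line path to a goal and, if so, offsets the next waypoint to pass around it.

```python
import math

def detour_waypoint(vehicle, obstacle, goal, clearance=2.0):
    """Return the next 2D waypoint: the goal itself, or a point offset
    sideways when the detected obstacle blocks the direct path."""
    dx, dy = goal[0] - vehicle[0], goal[1] - vehicle[1]
    dist = math.hypot(dx, dy) or 1e-9
    ux, uy = dx / dist, dy / dist                 # unit flight direction
    ox, oy = obstacle[0] - vehicle[0], obstacle[1] - vehicle[1]
    along = ox * ux + oy * uy                     # distance along the path
    perp = -ox * uy + oy * ux                     # signed lateral offset
    if 0.0 < along < dist and abs(perp) < clearance:
        side = -1.0 if perp > 0 else 1.0          # detour away from object
        mx, my = vehicle[0] + ux * along, vehicle[1] + uy * along
        return (mx - side * uy * clearance, my + side * ux * clearance)
    return goal
```

An object half a meter off the flight line, for example, produces a waypoint displaced to the opposite side by the clearance distance; an object well clear of the line leaves the goal unchanged.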

[0007] In an exemplary embodiment, the unmanned vehicle is a multi-rotor vehicle (e.g., a quadcopter, hexacopter, or octocopter). Multi-rotor vehicles generally have motors rigidly mounted to the airframe and control vehicle motion by adjusting the speed of rotation of individual motors based on a selected model of all rotors generating thrust in one or more directions. This makes for a system which can be controlled in roll, pitch, yaw, and net thrust. Such a multi-rotor vehicle can move in space by holding a particular roll or pitch angle and varying the net thrust. Under certain circumstances, this approach can lead to system instability as the vehicle hovers. Hover quality can be improved by controlling each axis independently of the vehicle's roll and pitch.
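
The hold-an-angle-and-vary-thrust scheme follows simple statics: horizontal acceleration a requires a tilt of theta = atan(a/g), and the net thrust must grow as 1/cos(theta) to keep supporting the vehicle's weight. A minimal sketch (the function name is hypothetical):

```python
import math

def tilt_for_accel(a_horiz, g=9.81):
    """Tilt angle (rad) and hover-relative net thrust required to
    accelerate horizontally at a_horiz while holding altitude."""
    theta = math.atan2(a_horiz, g)        # from a = g * tan(theta)
    return theta, 1.0 / math.cos(theta)   # thrust grows to preserve lift
```

For instance, accelerating at 1 g implies a 45-degree tilt and roughly 41% more thrust than hover, which illustrates why aggressive translation stresses a fixed-rotor airframe.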

[0008] Approaches described herein can employ thrusters which include rotors mounted to a multi-rotor helicopter frame in which spars have dihedral and twist. That is, the thrust directions can be fixed, and not all parallel. Each thruster generates an individual thrust line which is generally not aligned with the thrust lines of other thrusters. Free-body analysis yields the forces and moments acting on the body from each thruster. The forces and moments are summed together to produce a unique mapping from motor thrust to net body forces and moments. A desired input, including roll, pitch, and yaw moments and forward, lateral, and vertical thrusts, can be received and used to calculate the change in motor speeds necessary to achieve the desired thrust.
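
The mapping described above reduces to a small linear-algebra problem. In the sketch below, the hexacopter geometry, tilt angle, and wrench values are invented for illustration and are not values from the application: each unit thrust contributes its direction as a force and r x d as a moment, and stacking these columns yields a mixing matrix that is inverted to recover per-motor thrusts.

```python
import numpy as np

# Hypothetical hexacopter geometry: arm length R, fixed rotor tilt,
# six spars 60 degrees apart, twist direction alternating per spar.
R, TILT = 0.3, np.radians(10.0)
ang = np.radians(60.0 * np.arange(6))
sgn = (-1.0) ** np.arange(6)

pos = np.column_stack([R * np.cos(ang), R * np.sin(ang), np.zeros(6)])
dirs = np.column_stack([-sgn * np.sin(TILT) * np.sin(ang),
                        sgn * np.sin(TILT) * np.cos(ang),
                        np.cos(TILT) * np.ones(6)])   # unit thrust lines

# Free-body analysis: each unit thrust contributes a force (its direction)
# and a moment (r x d); stacking both gives the thrust-to-wrench map.
M = np.vstack([dirs.T, np.cross(pos, dirs).T])        # 6 x 6 mixing matrix

# Desired wrench [Fx, Fy, Fz, Mx, My, Mz]: climb with a slight yaw moment.
wrench = np.array([0.0, 0.0, 15.0, 0.0, 0.0, 0.05])
thrusts = np.linalg.solve(M, wrench)                  # per-motor thrust (N)
```

Because the tilted thrust lines are not all parallel, the matrix is full rank, so lateral forces and yaw moments can be commanded without changing the vehicle's attitude, which is the point of the dihedral-and-twist arrangement.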

[0009] In some aspects, the aerial vehicle is configured to maintain a desired spatial orientation while at the same time generating a net thrust that varies in magnitude and/or direction. In some aspects, a sensor such as a still or video camera is statically coupled to the multi-rotor vehicle and an orientation of the vehicle is maintained such that the camera remains pointed in a given direction while the net thrust vector generated by the vehicle causes the vehicle to move in space. Note that the sensor can comprise a camera, an IMU, a GPS sensor, a pressure sensor, or other sensors that control flight operation. The control system can include a processor that computes updated motor speeds and twist angles, for example.

[0010] Preferred embodiments employ an onboard control system that communicates with an external flight management system to control flight operations of the unmanned aerial vehicle. The control system on the unmanned aerial vehicle can comprise a system-on-chip (SoC) that includes one or more integrated circuits having one or more memories to process sensor data and uses stored instructions regarding flight path, payload operation and numerous flight control operations as described herein. The one or more integrated circuits can include field programmable gate arrays (FPGAs) and/or application specific integrated circuits (ASICs), for example. The integrated circuit can comprise a communications bus that receives data from the sensor arrays and routes the sensor data for processing to provide flight control data and communicate sensor data to the external flight management system.

[0011] The flight control system on the vehicle can be connected to one or more sensor arrays to control flight operations of the vehicle including collision avoidance, navigation, and to compensate for collision events encountered during flight. A bumper can be used to maintain flight operation of the vehicle in the event of a collision. The bumper can comprise one or more bumper guards attached to the vehicle. A preferred embodiment of a bumper device can further comprise one or more bumper elements or guards that extend around the vehicle. The flight control system can utilize thrust vectoring as described herein to control rotations of the vehicle after a collision and thereby assist in maintaining flight.

[0012] Tether communications enables processing data on a ground station where computational resources are essentially unconstrained. Inputs that are not required at very high rates to maintain flight stability and collision avoidance are good candidates for ground processing. For example, video processing to build maps or identify objects is a good candidate. Where an unmanned aerial vehicle operates at low velocity, this further eases the need for onboard processing (as the unmanned aerial vehicle can tolerate more latency). This is in contrast to a high-speed unmanned aerial vehicle that must do all or most processing locally. The processing architecture may use RF communications between the vehicle and the ground station, while the tether enables continued operation when RF is not possible.

[0013] Embodiments can include both tethered and untethered unmanned aerial vehicles. Tethered embodiments can include a tether management system that is on the ground, a building, or another vehicle, or alternatively, is on the aerial vehicle. For embodiments with the tether management system on the vehicle, a rotating spool can be used wherein the tether comprises a magnet wire filament.

[0014] Other features and advantages of the invention are apparent from the following description, and from the claims.

Brief Description of the Drawings

[0015] FIG. 1 illustrates an aerial vehicle system with a tethered unmanned aerial vehicle in accordance with embodiments of the invention.

[0016] FIG. 2 illustrates an unmanned aerial vehicle.

[0017] FIG. 3 illustrates a system for the unmanned aerial vehicle shown in FIG. 2.

[0018] FIGs. 4A and 4B show an exemplary unmanned aerial vehicle.

[0019] FIG. 5 shows spooler components of the unmanned aerial vehicle.

[0020] FIG. 6 shows a base station for the unmanned aerial vehicle.

[0021] FIG. 7 is an operator control unit (OCU).

[0022] FIG 8 is a diagram of the unmanned aerial vehicle component connections.

[0023] FIGs. 9A, 9B, and 9C illustrate a propeller, motor, and bump guard of an unmanned aerial vehicle.

[0024] FIG. 10 shows a depiction of a specialized spool.

[0025] FIG. 11 shows a horizontal view of the spool shown in FIG. 10 attached to an unmanned aerial vehicle.

[0026] FIGs. 12A and 12B show an unmanned aerial vehicle with four rotors.

[0027] FIG. 13 shows a top view of the unmanned aerial vehicle with six rotors.

[0028] FIG. 14 illustrates a UAV tether comprising magnet wire.

[0029] FIGs. 15A-E show exemplary diagrams for an embedded control system for the unmanned aerial vehicle based on FPGA architecture.

[0030] FIG. 15F depicts an SoC used to perform autonomous flight in an exemplary embodiment.

[0031] FIG. 15G depicts an alternative SoC used to perform autonomous flight in an exemplary embodiment.

[0032] FIG. 16 is a second embodiment of a printed circuit board onboard an unmanned aerial vehicle.

[0033] FIG. 17 illustrates a diagram of a global bus in an embedded control system for the unmanned aerial vehicle.

[0034] FIG. 18 illustrates an example diagram of a base station electrical architecture.

[0035] FIG. 19 illustrates an example diagram of an unmanned aerial vehicle electrical architecture.

[0036] FIG. 20 illustrates an example method for creating a flight path for an unmanned aerial vehicle.

[0037] FIG. 21 illustrates wires of a collapsed advanced bump guard.

[0038] FIGs. 22A-B illustrate a collapsed advanced bump guard system that includes a frame component for the unmanned aerial vehicle.

[0039] FIG. 23 illustrates a tethered unmanned aerial vehicle that includes the advanced bump guard, in accordance with embodiments of the invention.

[0040] FIG. 24 illustrates a view of the tethered unmanned aerial vehicle that includes the advanced bump guard shown in FIG. 23.

[0041] FIGs. 25A-D show various views of the frame component of the advanced bump guard.

[0042] FIGs. 26A-C show embodiments of the tethered unmanned aerial vehicle that includes the advanced bump guard.

[0043] FIG. 27 shows a depiction of a spooler component of the advanced bump guard.

[0044] FIG. 28 shows the hub of the spooler component shown in FIG. 27.

[0045] FIG. 29 shows an inner component of the spooler component shown in FIG. 27.

[0046] FIG. 30 shows an inner component of the spooler component shown in FIG. 27.

[0047] FIGs. 31A-C are embodiments of a printed circuit board onboard an unmanned aerial vehicle.

[0048] FIG. 32 shows an exemplary sensor array.

[0049] FIG. 33 illustrates an aerial vehicle employing thrust vectoring to control flight.

[0050] FIG. 34 illustrates the angling of rotors on the vehicle to provide thrust vectoring.

[0051] FIG. 35 illustrates a selected thruster or rotor on the vehicle.

[0052] FIG. 36 shows a portion of the flight control system used to control thrust vectoring.

[0053] FIG. 37 illustrates the control system including a plurality of sensors.

[0054] FIG. 38 illustrates a system to perform industrial inspections using an unmanned aerial vehicle.

[0055] FIG. 39 is a process flow sequence for generating flight control data.

[0056] FIG. 40 is a process flow sequence for processing sensor data using a SLAM-based approach.

Description

[0057] FIG. 1 illustrates an aerial vehicle system 100 with a tethered unmanned aerial vehicle 102. System 100 includes the air vehicle 102, which houses all flight systems and sensors. The air vehicle 102 is connected to a base station 104 through a microfilament tether 106. The microfilament tether 106 enables power and data transfer between the air vehicle 102 and the base station 104. The base station 104 is the power and communications hub of system 100, and houses the power conversion and microfilament communication circuitry, as well as the battery and high voltage safety systems. An operator controls and interacts with the system 100 using a control/joystick 110 connected through a USB cord 112 to an Operator Control Unit (OCU) 108. An OCU application, which runs on the OCU 108, displays video and telemetry data and communicates flight commands to the air vehicle 102. The base station 104 and OCU 108 are connected to chargers 114, such as via a wall socket. System 100 may also include an interchangeable spooler cartridge system and a vision-based teleoperation assistance system. The spooler and an interchangeable microfilament dispenser may be installed on the air vehicle 102. For example, exemplary embodiments of the present disclosure can utilize the tether/spooler system disclosed in U.S. patent application Ser. No. 15/041,211 entitled "Spooler for unmanned aerial vehicle system," filed on Feb. 11, 2016, the disclosure of which is incorporated by reference herein.

[0058] FIG. 2 illustrates an unmanned aerial vehicle 202 with a small reconnaissance platform. The unmanned aerial vehicle 202 is a semi-autonomous, high-endurance nano unmanned air system (nano-UAS) meant for operation in close-quarters indoor environments. Key components include a microfilament tether system, an interchangeable spooler cartridge system, and a vision-based teleoperation assistance system.

[0059] The unmanned aerial vehicle 202 includes all flight systems and sensors. The microfilament tether system includes a spooler 204 and an interchangeable microfilament dispenser installed on the unmanned aerial vehicle 202. The microfilament tether system enables power and data transfer between the unmanned aerial vehicle 202 and a base station. Exemplary embodiments of the present disclosure may utilize the aerial robots and/or filament disclosed in U.S. Patent No. 7,631,834 entitled "Aerial robot with dispensable conductive filament" filed on April 19, 2007, the disclosure of which is incorporated by reference herein.

[0060] The unmanned aerial vehicle 202 provides a sensor platform, visual position, velocity, and attitude stabilization, multiple camera sensor modules, and closed loop flight. The nano class airframe is designed for constrained environment operation. Other embodiments can include larger airframes with additional spars and rotors, depending on the size and weight of the payload and the system applications. In alternative embodiments, for example, the unmanned aerial vehicle 202 can be configured for deliveries. Exemplary embodiments of the present disclosure may utilize the delivery methods and systems disclosed in U.S. Patent Application No. 14/581,027 entitled "Unmanned delivery" filed on December 23, 2014, the disclosure of which is incorporated by reference herein.

[0061] In additional embodiments, the aerial vehicle system (e.g., aerial vehicle system 100) and/or unmanned aerial vehicle 202 may be utilized in data analytics. Data collected by the unmanned aerial vehicle 202 may be used for analytics that are processed on a ground station, such as base station 104. The system may use one or more classes of analytics to enhance the performance and/or usefulness of data obtained by sensors by reacting to observed conditions during flight. These include, for example, object/feature identification, anomaly detection, and change detection. Object/feature identification involves determining that sensor input (for example, an image or other sensor modality) matches a pattern stored in a library database. This may be used to identify a particular object, defect, or class of defect. Anomaly detection involves determining that sensor input within a time or space window is statistically different from what was seen during a flight or from a library of past flights. For example, this can compare a current section of pipe to other sections of pipe or pipes that have been travelled during the flight. This function does not have pre-defined signatures that are being searched. Change detection involves determining that sensor input from one physical location has changed. This can be used to monitor conditions that may be unstable, such as an unstable collapsed building.
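
The anomaly-detection class described above can be illustrated with a minimal sketch. The z-score test, window representation, and threshold below are illustrative assumptions, not part of the disclosure:

```python
from statistics import mean, stdev

def is_anomalous(window, history, z_threshold=3.0):
    """Flag a window of sensor readings as anomalous when its mean lies
    more than z_threshold standard deviations from the history's mean.
    `history` holds readings from this flight or a library of past
    flights; no pre-defined signatures are searched for."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return mean(window) != mu
    return abs(mean(window) - mu) / sigma > z_threshold
```

Comparing a current section of pipe against previously traversed sections then reduces to calling the test with that section's readings as the window.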

[0062] The persistence provided by the tether enables these analytics to evaluate conditions over a longer period of time. Any of these events can trigger an autonomous behavior macro to alter a flight plan and/or to collect more information during the flight. These kinds of real time re-plans and diversions may not be relevant to the short missions of battery-powered UAVs. Each of these algorithms ensures that the tether itself is not interpreted as a relevant object, change, or anomaly.

[0063] FIG. 3 illustrates a system for the unmanned aerial vehicle 202 shown in FIG. 2. The system includes four major system components: an unmanned aerial vehicle 202, a base station 304, a microfilament spooler 306, and an operator control unit (OCU) 308. The system further includes a number of peripherals, such as a controller 310 to simplify flight operations, and a USB hub connector 312 to connect the base station 304 to the controller 310 and the OCU 308.

[0064] The OCU 308 controls and interacts with the unmanned aerial vehicle system. In an exemplary embodiment, the OCU 308 is a computer tablet displaying system software for commands, such as providing high level commands to the unmanned aerial vehicle 202. The OCU 308 further displays video returned from cameras on the unmanned aerial vehicle 202 to provide video feedback to a user.

[0065] In some embodiments, the computer tablet can be a FZ-M1 Tablet, for example. An OCU application, which runs on the OCU 308, serves to display video and telemetry data to a user, and communicate flight commands to the unmanned aerial vehicle 202.

[0066] Base station 304 is the power and communications hub of the system and houses the power conversion and microfilament communication circuitry, as well as the battery and high voltage safety systems. The microfilament spooler 306 provides microfilament deployment and a disposable cartridge. In an exemplary embodiment, the spooler 306 deploys a tether that is 250 feet in length or longer, and the tether microfilament is a 38AWG twisted pair and carries communications and power.

[0067] Those skilled in the art will realize that the OCU 308, shown as a computer tablet, could instead be a desktop computer, laptop computer, or handheld mobile computing device.

[0068] Additional embodiments for controlling a tethered aerial vehicle are disclosed in PCT patent application no. PCT/US2017/024152 entitled "Persistent Aerial Reconnaissance and Communication System," filed on 24 March 2017, the entire disclosure of which is incorporated by reference herein.

[0069] FIG. 4A shows an exemplary top view of the unmanned aerial vehicle 400. A main body 402 of the unmanned aerial vehicle 400 houses a main electronics stack, sensors, and cameras. Each strut 404 includes an electronic speed controller (ESC) and a motor/propeller mount. Each propeller 406 attaches to a motor 408 and creates thrust when spun. The motor 408 spins the propeller 406 when actuated to create thrust and stabilize the system. Each propeller 406 includes a bump guard 410 that protects the unmanned aerial vehicle from bumps against obstacles. A camera module 412 generates video data to be sent to a user.

[0070] FIG. 4B shows an exemplary bottom view of the unmanned aerial vehicle 400 without a spooler. Sonar or acoustic transducer 414 provides downward and/or upward facing ranging data, for example, utilized for flight behaviors. A camera module 412 generates video data to be sent to a user. A spooler receiver 418 is a receptacle for a microfilament spooler cartridge. The microfilament (tether) is connected, via the spooler, to the unmanned aerial vehicle 400 using a tether connector 420. The tether connector 420 electronically connects the tether to the control electronics on the unmanned aerial vehicle 400.

[0071] FIG. 5 shows spooler components of the unmanned aerial vehicle. A base station connector 502 is a power and communications connector to a base station. A leader 504 is a larger wire for system setup and pre-flight operation. Unmanned aerial vehicle interface 506 is a mechanical and electrical interface to the unmanned aerial vehicle. A spooler 508 holds the microfilament spool. The microfilament spooler is a disposable microfilament cartridge that both houses and deploys microfilament, while cooling and protecting the microfilament. The spooler provides snag-free, low-tension deployment of microfilament using a winding and deployment technique in which the filament feeds from the outside of a bobbin and is directed back through the spool's center. In an exemplary embodiment, each spool can contain 250 feet of microfilament.

[0072] FIG. 6 shows a base station 600 for the unmanned aerial vehicle. The base station 600 is the power and communications nexus for the unmanned aerial vehicle. The base station 600 provides the power conversion and communications functionality for the system, and feeds power to, and command/data signals to and from, the unmanned aerial vehicle via the microfilament. In an exemplary embodiment, the base station's internal battery provides one hour of continuous operation, which can be extended by connecting a standard 120V external power source to the system using the AC adapter. The base station 600 employs a double-isolated power supply system with built-in system checks to prevent accidental high voltage application.

[0073] The base station 600 includes a BATT-ON switch 602 to connect/disconnect the battery to/from the rest of the system. A DC IN connector 604 is a connector for external power input (9-36 VDC). A CSTOP switch 606 is a controlled-stop switch to immediately shut down the entire system, including a fast shutdown of the high voltage (HV) circuitry. An HV OUT connector 608 is an output power and communications port, which connects to a microfilament spooler. An LV ON button 610 provides power to the base station 600. An HV PRIMER button 612 interacts with the high voltage (HV) turn on/off sequence. An HV good indicator light 614 indicates when high voltage (HV) is active. An HV fault indicator light 616 indicates a high voltage (HV) fault. A BATT status button 618, when pushed, displays the battery charge status. A BATT charge status indicator 620 indicates the charge state when the BATT status button is pushed, and indicates active charging when external power is applied. An OCU USB connector 622 is a USB connection to an OCU tablet.

[0074] During the startup procedure, the HV PRIMER button 612 is depressed for 3 seconds. During this interval, the base station 600 engages a "spool connection check", which performs a test to confirm the connection validity of the high voltage chain from the base station 600 to the spooler, and on to the unmanned aerial vehicle. The user may proceed to the next stage only if the test is successful, indicating full and proper connectivity.

[0075] The base station 600 constantly monitors system performance during operation. If it detects that the microfilament connection is lost, high voltage is immediately shut down as a safety precaution, to decrease exposure to other objects. It is able to sense this condition by assessing the current in the microfilament power system. If the current drops below a normal threshold, an open circuit is indicated, identifying that a break in the microfilament power chain has occurred.
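
A minimal sketch of this open-circuit check follows; the threshold value and loop structure are illustrative assumptions, not parameters from the disclosure:

```python
OPEN_CIRCUIT_THRESHOLD_A = 0.02  # assumed; normal operation draws about 0.1 A

def hv_should_remain_on(current_a, threshold_a=OPEN_CIRCUIT_THRESHOLD_A):
    """Current below the normal threshold indicates an open circuit,
    i.e. a break somewhere in the microfilament power chain."""
    return current_a >= threshold_a

def monitor_step(current_a, hv_enabled):
    """One iteration of the base station's monitoring loop: shut down
    high voltage immediately when the filament connection is lost."""
    if hv_enabled and not hv_should_remain_on(current_a):
        hv_enabled = False  # fast HV shutdown as a safety precaution
    return hv_enabled
```

Note that once tripped, high voltage stays off; re-enabling would go back through the HV PRIMER startup sequence.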

[0076] The base station 600 uses a "smart battery" system. This system includes built- in battery protection for overvoltage, undervoltage, overcurrent, and over-temperature conditions. The battery is charged using a built in smart charger, which adjusts charging characteristics according to temperature or other environmental factors, and according to battery- specific conditions.

[0077] FIG. 7 is an operator control unit (OCU) 700. The OCU 700 is the user's interface with the system. In an exemplary embodiment, the OCU 700 resides on a touchpad, but is portable to any Windows or Linux-based platform. The OCU 700 displays the video streams from the unmanned aerial vehicle, and accepts high level operator commands for unmanned aerial vehicle control.

[0078] The OCU 700 includes a main screen 702 to display an OCU application and returned video. A power button 704 turns the OCU on and off. A USB port 706 is a data connector to the system, typically the base station. An OCU power port 708 is a connector for a power supply.

[0079] FIG. 8 is a diagram of the unmanned aerial vehicle component connections. An OCU 802 is connected to a base station 804 via a USB connector 805. The base station 804 is connected to an air vehicle 808 via a microfilament 810. The microfilament 810 is housed in and deployed through a microfilament spooler 806 coupled to the air vehicle 808.

[0080] FIGs. 9A, 9B, and 9C illustrate a propeller 902, motor 904, and bump guard 906 of the unmanned aerial vehicle 900. FIGs. 9A and 9B show the bump guards 906 extended. FIG. 9C shows the bump guards 906 retracted. In the depicted embodiment, the unmanned aerial vehicle 900 is a multicopter in a quadrotor configuration. A larger number of rotors can also be used. The depicted embodiment uses two pairs of counter-rotating propellers to provide lift and stability.

[0081] The unmanned aerial vehicle 900 is the flying component, in effect, a mobile sensor platform. The unmanned aerial vehicle 900 houses microfilament interface circuitry for power and communications, and a multi-core processor and microcontroller for flight computation and video processing.

[0082] In an exemplary embodiment, the unmanned aerial vehicle 900 hosts three camera sensor modules, providing three VGA video streams displayed to the user on the system's operator control unit (OCU). Position and attitude stability are maintained using SONAR, a magnetometer, a gyroscope, a barometer, and an accelerometer.

[0083] The unmanned aerial vehicle 900 can maintain position stabilization for a period of time in certain conditions. The unmanned aerial vehicle 900 uses camera(s) to compute vehicle motion, and uses this data to make navigational corrections. The camera(s) may be downward-facing, forward-facing, or in any other configuration. The flight area should include visible features for the algorithm to track (i.e., a plain white surface will not provide accurate positional data, but a patterned and textured surface such as a carpet or wooden floor yields more robust position determinations).
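
The disclosure does not give the visual stabilization math. As an illustrative sketch, assuming a downward-facing camera and a pinhole model, a tracked-feature shift of p pixels at altitude h maps to a ground drift of roughly h·p/f meters (f being the focal length in pixels):

```python
def ground_drift_m(pixel_shift, altitude_m, focal_length_px):
    """Ground-plane drift implied by a tracked-feature shift seen by a
    downward-facing camera (pinhole approximation)."""
    return altitude_m * pixel_shift / focal_length_px

def position_correction(pixel_shift, altitude_m, focal_length_px, gain=0.8):
    """Proportional navigational correction opposing the measured drift;
    the gain is an illustrative tuning value, not a disclosed parameter."""
    return -gain * ground_drift_m(pixel_shift, altitude_m, focal_length_px)
```

This also shows why a featureless surface fails: with no trackable features, no reliable pixel_shift measurement exists to drive the correction.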

[0084] In an exemplary embodiment, the UAV body 908 includes nacelles 910 that extend beyond the normal boundary of the body 908. Each nacelle 910 includes sensor arrays that may include, but are not limited to, one or more of cameras, distance sensors, sonar, Lidar, and LEDs. Each sensor array has a field of view of the environment. The nacelle 910 increases the field of view for the sensor array(s) attached to the end of the nacelle 910, and decreases interference caused by the propeller 902. In alternative embodiments, the sensor arrays may be mounted on bumpers.

[0085] The tether is connected to the unmanned aerial vehicle 900 via a tether connector 912. The tether connector 912 electronically connects the tether to the control electronics on the unmanned aerial vehicle 900.

[0086] FIG. 10 shows a depiction of a specialized spool 1002. The specialized spool 1002 is a coil of wire/tether/microfilament. The spool transfers power and data, and maintains connectivity through the wire/tether/microfilament. The specialized spool 1002 can be coreless or can be wound around a bobbin or other central core. A microfilament/tether deployment mechanism is used to unspool the microfilament/tether as the unmanned aerial vehicle traverses terrain.

[0087] In an exemplary embodiment, the tethered unmanned aerial vehicle is meant for lateral, indoor travel. Being tethered, the unmanned aerial vehicle needs a way to deploy the tether as it moves along. Reliable tether filament management is the keystone to allowing lateral movement of a tethered flying vehicle. Conventional methods unreel thread or filament from the center of a coreless spool of wire, or wind it off from the outside of a traditional spool. However, both methods have problems, such as requiring a slip ring, unreliability due to catching and snagging, and keeping the line under too much tension, which could break the tether.

[0088] The specialized spool 1002 includes a modified exit path 1004 of the filament such that the microfilament/tether first travels upward, over the top, and then down through the center of the spooler. This provides a tunable, snag-free, and robust solution for deploying the microfilament/tether. The specialized spool 1002 consists of two mechanical pieces: a center bobbin 1006 and an outer shroud 1008. The microfilament/tether is wound around the bobbin, routed over rounded low-friction routing surfaces, then downwards, emerging through the center of the bobbin 1006.

[0089] The shroud 1008 has three functions. First, it provides mechanical protection for the wound microfilament/tether on the bobbin. Second, it functions to mechanically attach the bobbin to the unmanned aerial vehicle. The bobbin cannot be directly attached to the unmanned aerial vehicle because there needs to be full clearance above the bobbin to allow for microfilament/tether unspooling. Third, it prevents the filament from springing outwards, becoming too loose, looping around the top and bottom of the spooler and pulling tight.

[0090] FIG. 11 shows a horizontal view of the spool 1002 shown in FIG. 10 attached to an unmanned aerial vehicle 1102.

[0091] FIG. 12A shows a bottom view of an unmanned aerial vehicle 1202 with four rotors. A spool 1201, such as spool 1002 shown in FIG. 10, is attached to the bottom of unmanned aerial vehicle 1202. The unmanned aerial vehicle 1202 also includes sensor arrays 1204 mounted next to each spar. In an alternative embodiment, the sensor arrays 1204 are mounted on each spar as well as at locations on the main body.

[0092] In the depicted embodiment, the unmanned aerial vehicle 1202 is a multi-rotor helicopter that includes a central body 1206 from which a number (i.e., n) of rigid spars 1208 radially extend. The end of each rigid spar 1208 includes a rotor 1210. In the depicted embodiment, the rotors 1210 are in a horizontal configuration. In some examples, each of the rotors 1210 includes an electric motor 1212 which drives the rotor 1210 to generate thrust. In some examples, the motor 1212 may be used to tilt the associated rotor 1210. The motors 1212 may include mount servos used for fine tuning and control of the tilting of the rotor 1210. The motor mount servos control the tilting and/or pitch of the rotor by linkage, which tilts the rotor 1210 in various degrees of freedom.
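
The effect of tilting a rotor can be sketched with basic trigonometry (an illustration, not taken from the disclosure): tilting by an angle θ from vertical trades lift T·cos θ for a lateral force component T·sin θ, which permits translation without pitching the whole airframe.

```python
import math

def tilted_thrust_components(thrust_n, tilt_rad):
    """Decompose a rotor's thrust when a mount servo tilts it tilt_rad
    radians from vertical; returns (vertical lift, lateral force)."""
    return thrust_n * math.cos(tilt_rad), thrust_n * math.sin(tilt_rad)
```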

[0093] Unmanned aerial vehicle 1202 includes a plurality of sensor arrays 1204 to detect surroundings. Sensor arrays 1204 are located on each spar 1208 and may include, but are not limited to, cameras and/or range finders, such as sonar, Lidar, or proximity sensors. In one embodiment, multiple sensors are placed on the unmanned aerial vehicle 1202 in order to get a full 360° image. In another embodiment, the sensor arrays are positioned on the unmanned aerial vehicle to provide a field of view in at least 3 directions that are at least 90 degrees apart, each sensor array including an imaging device and a distance measurement device. In some embodiments, the unmanned aerial vehicle 1202 may not include sensor arrays on all sides to provide overlapping coverage; rather, a panoramic image is stitched together using ground processing.

[0094] In one embodiment, at least one sensor array 1204 is a Lidar sensor integrated on a chip (for example, a silicon photonic chip with steerable transmitting and receiving phased arrays and on-chip germanium photodetectors). The detection method in this Lidar is based on a coherent method instead of direct time-of-flight measurement, in which the system only reacts to light that was originally transmitted by the device. This reduces the effect of sunlight, which can be a large noise factor in Lidar systems, and allows for modest photodetectors instead of expensive avalanche photodetectors or photo-multiplier tubes that are challenging and expensive to integrate in a silicon photonics platform. In addition, the on-chip Lidar is smaller, lighter, and cheaper than traditional Lidar. The on-chip Lidar may also be much more robust because of the lack of moving parts. For example, the non-mechanical beam steering in the on-chip Lidar is 1,000 times faster than what is currently achieved in mechanical Lidar systems, and may allow for an even faster image scan rate. This can be useful for accurately tracking small high-speed objects that are only in the Lidar's field of view for a short amount of time, which could be important for obstacle avoidance for high-speed UAVs.

[0095] FIG. 12B shows a top view of the unmanned aerial vehicle 1202 with four rotors. The unmanned aerial vehicle 1202 includes a propeller (or rotor) 1214 located on each spar. Each propeller is attached to an electric motor 1216 which drives the propeller 1214, and an electronic speed controller 1218. The unmanned aerial vehicle 1202 further includes a DC/DC converter 1220 and sensor arrays 1222. The sensor arrays 1222 may include, but are not limited to, cameras, distance sensors, sonar, Lidar, and LEDs.

[0096] Each sensor array 1222 has a field of view 1226 of the environment. The fields of view 1226 of at least two adjacent sensor arrays 1222 overlap (for example, one meter from the sensor arrays 1222) to create a full panoramic view of the environment. As shown, multiple sensor arrays 1222 are placed around the unmanned aerial vehicle 1202 in order to get a full 360° image. However, in alternative embodiments, the sensor arrays 1222 may be placed around the unmanned aerial vehicle 1202 in order to obtain less than a full 360° image, and a panoramic image may be composed using ground processing based on the images received.
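
Whether a given arrangement of sensor arrays yields full panoramic coverage can be checked with a simple angular-interval test; the function below is an illustrative sketch assuming each array has the same horizontal field of view:

```python
def covers_full_circle(headings_deg, fov_deg):
    """True when sensor arrays aimed at headings_deg, each spanning
    fov_deg degrees, jointly cover a full 360° view (adjacent fields
    of view meet or overlap)."""
    starts = sorted((h - fov_deg / 2) % 360 for h in headings_deg)
    for i, start in enumerate(starts):
        gap = (start - starts[i - 1]) % 360  # wraps around for i == 0
        if gap > fov_deg:  # a hole between adjacent sensors
            return False
    return True
```

For example, four arrays 90° apart need a per-array field of view of at least 90°; anything narrower leaves gaps, which is the case where a partial panorama would instead be composed by ground processing.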

[0097] FIG. 13 shows a top view of an unmanned aerial vehicle 1302 with six rotors. The unmanned aerial vehicle 1302 includes a propeller (or rotor) 1304 located on each spar. Each propeller is attached to an electric motor 1306 which drives the propeller 1304, and an electronic speed controller 1308. The unmanned aerial vehicle 1302 further includes sensor arrays 1310. The sensor arrays 1310 may include, but are not limited to, cameras, distance sensors, sonar, Lidar (including on-chip Lidar), and LEDs.

[0098] FIG. 14 illustrates a UAV tether 1400 comprising magnet wire. Magnet wire is a copper or aluminum wire 1402 coated with a very thin layer of insulation 1404. The insulation 1404 may, for example, include the application and evaporation of extremely thin single or multiple layers of lacquer, polymer, or epoxy materials. Prior tethers for electrical transmission of power and/or data to and from drones have used traditional thick polymeric insulation systems to electrically separate the conductors. Use of magnet wire is advantageous in that it allows for light weight, good flexibility, and low cost for single use missions.

[0099] In an exemplary embodiment, the UAV uses only 25-40 watts of power, weighs approximately 120-160 grams, and employs 400 VDC at 0.1 amps. Therefore, the wire diameter 1406 may be small. The UAV as described herein can operate with low currents, and has a feature to carry a single disposable microfilament spool that deploys and unspools as the vehicle traverses indoor and outdoor obstacles. Thereafter, the deployed copper 1402 can be gathered up and recycled, with a new filled spool attached to the vehicle for another single-use mission.
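
The current figure follows directly from the stated power envelope, since I = P / V:

```python
def tether_current_a(power_w, volts=400.0):
    """Operating current for power delivered at the tether voltage.
    The stated 25-40 W envelope at 400 VDC needs only about
    0.06-0.1 A, which is what permits such a fine conductor."""
    return power_w / volts
```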

[0100] The tether 1400 is a single pair of impedance controlled wires that can deliver power and data from a fixed base station to the UAV. In an exemplary embodiment, the tether 1400 is an unshielded twisted pair (UTP) 1408 of magnet wires, composed of small diameter, flexible copper conductors 1402, each having a durable, relatively thick polymer elastomeric dielectric coating 1404 to prevent DC and AC shorting, and to provide impedance control for the high frequencies used for analog/data transmissions. The UTP magnet wires 1408 (also known as a microfilament) utilize a single or a small number of individual strands. Each strand has the application and evaporation of extremely thin single or multiple layers of lacquer, polymer, or epoxy materials. This is advantageous for single use missions, in that it allows for very light weight, good flexibility, and low cost. The UTP magnet wires 1408 are lighter and smaller in diameter than traditional approaches for providing power to drones/UAVs, since operating at high voltage requires only low currents. The UTP magnet wires 1408 can be stowed in reels of comparatively small volume.

[0101] In an exemplary embodiment, the UTP wire diameter 1406 is 38 gauge. The wire diameter 1406 is 0.0034 inches (approximately 0.1 mm), with a resistance of 659 mOhm/ft. This equates to a DC resistance of 460 ohms for a 400 foot spooler microfilament length. The UTP magnet wires 1408 can easily be impedance controlled by knowing the conductor diameter, the insulation thickness and dielectric, and the number of twists per inch, according to the following:

Z0 = (276 / √εr) × log10(2s / d)

where Z0 is the characteristic impedance, εr is the effective dielectric constant of the insulation, s is the center-to-center spacing of the conductors, and d is the conductor diameter.

[0102] The variable Zo for the UTP magnet wires 1408 is approximately 200 ohms, to interface efficiently with the data transmission and delivery circuits. The use of high resistance/small diameter magnet wire does not require the tight impedance control required by larger diameter/lower resistance wires, since an impedance mismatch (and thus interfering reflections) is largely attenuated by the high UTP resistance.
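
As an illustration, the standard twisted-pair approximation Z0 = (276/√εr)·log10(2s/d) can be evaluated directly; this textbook formula and any sample values are assumptions for illustration, not measured parameters from the disclosure:

```python
import math

def utp_impedance_ohms(spacing, diameter, dielectric_constant):
    """Approximate characteristic impedance of a twisted pair, with
    spacing s the center-to-center conductor spacing and diameter d
    the conductor diameter (any consistent length unit)."""
    return (276.0 / math.sqrt(dielectric_constant)
            * math.log10(2.0 * spacing / diameter))
```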

[0103] The UTP magnet wires 1408 are extremely flexible since the small diameter 1406 of annealed copper has a coating thickness of <.001 inches. In an exemplary embodiment, a particularly effective coating system was found to be quad (4 applications) of polyimide insulation (per NEMA standard MW1000-1997). The dielectric breakdown was found to be in excess of 3000 volts, and the wire tolerated a large degree of rough handling/kinking without a single shorting or breakdown event.

[0104] As a means of comparison, a standard insulated 38-gauge wire has a jacket insulation thickness of .0035 in. This translates to a wire diameter of .012 inches, versus the diameter of the magnet wire approach of .0051 inches. This is over a 4x increase in volume and about a 50% weight increase per unit length. The implication for a flying vehicle design would be six times less filament carried per equivalent volume/weight of spooled microfilament.
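
The volume comparison can be reproduced directly, since spooled volume per unit length scales with the square of the overall diameter:

```python
def spooled_volume_ratio(jacketed_dia_in=0.012, magnet_dia_in=0.0051):
    """Ratio of spooled volume per unit length between a conventionally
    jacketed 38-gauge wire and the magnet-wire approach."""
    return (jacketed_dia_in / magnet_dia_in) ** 2
```

Here (0.012/0.0051)² is about 5.5, consistent with an "over 4x" increase in volume per unit length.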

[0105] FIGS. 15-15E show exemplary diagrams for an embedded control system for the unmanned aerial vehicle based on an FPGA architecture. It should be appreciated that the embedded control system utilizing an FPGA illustrated in FIGS. 15-15E is presented for explanatory purposes, and the present invention is not limited to the use of one or more FPGAs. For example, the operations performed by the FPGA in FIGS. 15-15E could also be handled on a microprocessor, single board computer, or system on a chip. The embedded control system provides an autopilot function to automatically control the aircraft trajectory of the unmanned aerial vehicle. The control system generally handles data input via a microprocessor as the CPU, while the processing result is output to the respective peripheral interfaces, driving the peripherals for autonomous flight.

[0106] In certain configurations, the UAV control system (for example, FPGA 1502) is fabricated as one or more integrated circuits that reside on a printed circuit board assembly. The system can operate based on wireless communications under battery power or can operate using power transmitted through a tether. The system can include flight stabilization, camera stabilization systems, and an inertial measurement unit (IMU).

[0107] The unmanned aerial vehicle navigates enclosed spaces using an image processing and communication integrated circuit board. The image processing and communication integrated circuit board may comprise an FPGA as an exemplary processor, although these operations could also be handled on a microprocessor or a separate single board computer. The control system includes the FPGA onboard the unmanned aerial vehicle, a base station, sensor arrays, and visual processing equipment. The ground station is coupled to an OCU.

[0108] The onboard processor 1504 and SOM 1506 can generate a set of fiducial points describing the flight path, and can maneuver the flight path by altering rotor speed, or altering rotor tilt angle, or a combination of both. For example, processor 1504 transmits signals to one or more motors of the one or more rotors to adjust a configuration of the one or more rotors. For example, exemplary embodiments of the present disclosure can utilize the thrust vectoring system and method disclosed in International Pub. No. WO 2015/187836 Al, entitled "Fixed rotor thrust vectoring," filed on June 3, 2015, the disclosure of which is incorporated by reference herein.

[0109] In an exemplary embodiment, processor 1504 transmits instructions over a communications bus to instruct the one or more motors to adjust one or more of the rotors into a tilted, vertical, or horizontal configuration. Such command signals can be sent in response to on-board sensor data indicating that a flight path change is needed to avoid a collision. Alternatively, the system can receive wireless commands from an external flight controller that determines a collision avoidance path must be implemented to avoid a collision with other objects such as buildings or other vehicles. Alternatively, a remote pilot can assume control of the vehicle to avoid a collision. For example, the FPGA 1502 may receive instructions from an operator using an operator control unit (OCU) 1514.

[0110] The sensor modules 1510 transmit sensor data to the FPGA 1502, which filters the sensor data to create the set of fiducial points that define viable flight paths. The set of fiducial points is based on the distance of one or more objects detected by the sensor modules 1510, wherein the fiducial points are designed to avoid the one or more objects in the operating space.
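
The filtering step can be illustrated with a minimal sketch; the clearance radius and 2D point representation are assumptions for illustration only:

```python
import math

def viable_fiducials(candidates, obstacles, clearance_m=0.5):
    """Keep only candidate waypoints (x, y) lying at least clearance_m
    from every object reported by the sensor modules, yielding the
    fiducial points of a viable flight path around those objects."""
    return [p for p in candidates
            if all(math.dist(p, o) >= clearance_m for o in obstacles)]
```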

[0111] The embedded control system includes, but is not limited to, processor modules 1504, FLASH modules 1505 used to store a control program, RAM modules 1524, FPGAs 1502, 1508, sensor modules 1510, camera modules 1512, a digital radio module 1513, and a GPS receiver module. The processor module is based on ARM and FPGA architectures, including ARM processors and a coprocessor FPGA.

[0112] In an exemplary embodiment, a UAV includes ten (10) sensor modules 1520. Each module 1520 weighs approximately 1 gram and includes two OVM 4622 low light global shutter cameras and an OVM 7695 VGA color imager. The module 1520 further includes an OVM 7695 VGA image analysis engine and a one watt white LED. The module 1520 also includes two laser distance proximity sensors.

[0113] The FPGAs 1502, 1508 are responsible for the control of computation and data communications on the UAV. The UAV and base station each include a system on module (SOM) i-MX6 processor. The processor includes an i-MX6 quad-core 2 GHz mobile applications processor, 4 GB of LPDDR2 mobile memory, Wi-Fi capability, and an H.264 parallel encoding/decoding engine. The SOM i-MX6 processor transmits packets from various processes by writing into buffers. The packets have a destination global address and a local address to identify the specific hardware to write to or read from. The SOM i-MX6 processor can, for example, turn a motor on and off over the global bus using the address (e.g., address E: 0-5).
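
The two-level addressing can be sketched as a simple packet structure; the field encodings and payload bytes below are hypothetical illustrations, not the disclosed wire format:

```python
from dataclasses import dataclass

@dataclass
class GbusPacket:
    """A packet on the global bus: a destination global address selects
    the subsystem, and a local address selects the specific hardware
    to write to or read from."""
    global_addr: str  # e.g. "E" addressing the motor bank (assumed encoding)
    local_addr: int   # e.g. 0-5 selecting an individual motor
    payload: bytes

def motor_power_packet(index, on):
    """Build a hypothetical packet turning motor `index` on or off."""
    return GbusPacket("E", index, b"\x01" if on else b"\x00")
```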

[0114] In some embodiments, GPS receiver modules are connected via a serial communication protocol with the processor 1504 and/or i-MX6 SOM 1506; the modules receive GPS information in real time, which is transferred to the autopilot.

[0115] The FPGA 1502 receives the output from the plurality of sensors and delivers the organized sensor data directly onto the filament transceiver 1518 or a wireless RF connection. The FPGA 1502 has a global bus connecting the chip components, and communicates with the IMU system and controls the motor speed controllers. Thus, the FPGA 1502 determines the navigational path based on the IMU data and the plurality of sensors viewing in multiple directions.

[0116] The user can interact with the UAV through a visual display device, such as an OCU 1514. The visual display device may also display other aspects, elements and/or information or data associated with exemplary embodiments. The UAV may include other I/O devices for receiving input from a user, for example, a keyboard and a pointing device coupled to the visual display device. The UAV may include other suitable conventional I/O peripherals. The OCU 1514 can run an operating system, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, any operating system for mobile computing devices, or any other operating system capable of running on the computing device and performing the operations described herein.

[0117] The sensor module 1520 includes OVM 7695 Mobile Industry Processor Interface (MIPI) cameras 1512. Video data from the cameras 1512 is transmitted to MIPI camera receivers. The MIPI camera receivers/decimators transmit the video data via a video bus (VBUS) to FPGA 1502, which resides on a PCB on the UAV. The video data is routed through the dedicated VBUS. Video data is received by image processing modules in FPGA 1502. Each image processing module processes data for one camera, scaling and locating that camera's video stream. The image processing modules' outputs are combined into one output (scaled and lower bandwidth) and transmitted to a DRAM controller 1522. The DRAM controller 1522 stores the video data into memory 1524. The DRAM controller 1522 transmits the video data to a MIPI video driver 1526. The MIPI video driver 1526 transmits the video data to the SOM 1506. The SOM 1506 transmits the video data over the microfilament 1518 to the FPGA 1508 on the base station. The base station houses the power conversion and microfilament communication circuitry, as well as the battery and high voltage safety systems. The video data is received by the SOM 1528 in the base station. The SOM 1528 transmits the video data over USB to the operator control unit (OCU) tablet 1514. The OCU 1514 displays the video streams from the unmanned aerial vehicle, and accepts high level operator commands for unmanned aerial vehicle control.

[0118] At least one of the sensors 1510 includes a VL6180 proximity sensor (other embodiments may include a different proximity sensor) that provides time-of-flight sensor data, ambient light sensor data, and illumination with IR emitters. Sensor data is packetized for transmission on the global bus (GBUS), for example. All data besides video data is routed through the GBUS. Consumers of continuous sensor information do not need to request the information; they simply retrieve data from the global bus. Sensor data is transmitted via the global bus to the UAV PCB. Sensor data is received by the processor 1504, which in some embodiments is an STM32 subsystem processor (PX4). The processor 1504 performs autopiloting and high-voltage monitoring, and controls the motor controllers based on the sensor data, which can include the IMU data. The processor 1504 receives data from the IMUs. The processor 1504 transmits instructions to an FPGA 1530 (e.g., a Lattice X03) for the spindles. The FPGA 1530 controls the motor circuits for the spindles. Sensor data is transmitted through the global bus on the vehicle PCB and into the base station.
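
How sensor data might be packetized for transmission on the global bus can be sketched as follows. The header layout (sensor id, type, payload length) and the additive checksum are assumptions for illustration, not the actual GBUS frame format:

```python
import struct

def packetize(sensor_id, sensor_type, payload):
    """Frame sensor data for the bus: 1-byte id, 1-byte type,
    2-byte big-endian length, payload, 1-byte additive checksum."""
    header = struct.pack(">BBH", sensor_id, sensor_type, len(payload))
    checksum = sum(header + payload) & 0xFF
    return header + payload + bytes([checksum])

def parse(packet):
    """Unframe a packet, verifying the checksum before returning the fields."""
    sensor_id, sensor_type, length = struct.unpack(">BBH", packet[:4])
    payload = packet[4:4 + length]
    if (sum(packet[:-1]) & 0xFF) != packet[-1]:
        raise ValueError("checksum mismatch")
    return sensor_id, sensor_type, payload
```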

[0119] In the exemplary embodiment, at least one sensor module 1520 is a proximity and ambient light sensing module and includes a VL6180X proximity sensor. The VL6180X provides fast, accurate distance ranging, and measures absolute range from 0 to above 10 cm (ranging beyond 10 cm is dependent on conditions). This allows near-field absolute distance to be measured independently of target reflectance. Instead of estimating the distance by measuring the amount of light reflected back from the object (which is significantly influenced by color and surface), the VL6180X measures the time the light takes to travel to the nearest object and reflect back to the sensor (Time-of-Flight). The VL6180X combines an IR emitter, a range sensor, and an ambient light sensor in a three-in-one optical module. The module is designed for low power operation. Ranging and ALS measurements can be automatically performed at user-defined intervals. Multiple threshold and interrupt schemes are supported to minimize host operations. Host control and result reading are performed over an I2C interface. Optional additional functions, such as measurement-ready and threshold interrupts, are provided by two programmable GPIO pins.
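
The time-of-flight principle the sensor relies on reduces to one formula: range is the round-trip travel time of the emitted light multiplied by the speed of light, divided by two (the light travels to the target and back). A minimal sketch, where the nanosecond units are an illustrative choice:

```python
SPEED_OF_LIGHT_MM_PER_NS = 299.792458  # speed of light in mm per nanosecond

def tof_distance_mm(round_trip_ns):
    """Range from a time-of-flight measurement: half the round-trip path."""
    return SPEED_OF_LIGHT_MM_PER_NS * round_trip_ns / 2.0
```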

[0120] In alternative embodiments, the sensors 1510 may include one or more of a radar sensor, an inertial measurement unit (IMU) sensor 1516 that includes a gyroscope and an accelerometer, a GPS sensor, a Lidar sensor, and a pressure sensor. The sensors have 5 lanes, and all 5 lanes have to fail to lose communications (video bandwidth will degrade, but not stop, as lanes fail).

[0121] The control system can include a tether 1518 capable of secure transmission and receipt of data placed on the tether or transmitted and received wirelessly. The tether 1518 may be configured to transmit and receive data via one or more network devices with one or more networks, for example, using a tethered connection or a wireless connection, or some combination of the above. The tether 1518 may include a built-in network adapter, network interface card, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the UAV to any type of network capable of communication and performing the operations described herein. In one embodiment, the tether 1518 is a serial link, which may include a commercially available twisted pair transceiver integrated circuit.

[0122] The UAV can utilize an IMU 1516 to provide respective absolute estimates of the UAV position, orientation, and speed at each clock interval. The scheme utilizes an IMU 1516 and a GPS sensor. In additional embodiments, the UAV can utilize a GPS sensor, an IMU, and/or optional additional relative motion sensors. The IMU 1516 can provide an estimation of the UAV acceleration and/or angular velocity. The GPS sensor can provide an estimation of the absolute position and/or velocity of the UAV. The relative position and/or velocity sensor (e.g., vision sensor, lidar, ultrasonic sensor, time-of-flight or depth camera, or any other sensor that provides relative position and/or velocity information) can be used to obtain an estimation of the UAV velocity and relative UAV position. Further, the GPS can facilitate continuous monitoring of the position of the UAV. The processor may act on the positional data provided by the GPS to allow the UAV to traverse and record particular flight paths. The GPS module may also report back an actual GPS position of the UAV to a base station. The clocked IMU data and other relative sensors can be used to estimate vehicle position between GPS points or if the GPS signal is temporarily unavailable.
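
The estimation scheme above can be sketched in one dimension: integrate the clocked IMU accelerations between GPS fixes, then blend the prediction toward the next absolute fix. The Euler integration and the fixed blending gain are simplifying assumptions, not the vehicle's actual estimator:

```python
def dead_reckon(pos, vel, accel_samples, dt):
    """Propagate position and velocity from clocked IMU acceleration samples
    between GPS fixes (simple Euler integration)."""
    for a in accel_samples:
        vel = vel + a * dt
        pos = pos + vel * dt
    return pos, vel

def fuse_gps(pred_pos, gps_pos, gain=0.8):
    """Blend the dead-reckoned position toward an absolute GPS fix."""
    return pred_pos + gain * (gps_pos - pred_pos)
```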

[0123] The processor 1504 and other control components may communicate via wireless signals when the tether 1518 is disconnected. Accordingly, in one embodiment, the processor and/or other control components or sensors described herein may comprise wireless controls or sensors, or a wireless data receiver or transceiver interface.

[0124] The Lidar sensor 4508, which measures distance to a target by illuminating that target with laser light, provides detection of objects, reliable distance measurement, collision avoidance, and driver assistance. Data provided by the Lidar sensor 4502 is used by the control system 4400 to compute updated rotor speeds to avoid collisions.

[0125] In one embodiment, in autonomous flight modes, the UAV uses onboard sensing information to calculate absolute and relative state information, compare the current state with the planned mission path, and generate commands to move the vehicle close to the planned mission path. In addition to a pure waypoint plan, autonomous behavior macros enable specification of a plan in terms of higher-level actions, e.g., "travel 40 ft down the pipe" (waypoint driven); "when waypoint n is reached, conduct a detailed scan of all structures within a 5 ft sphere at a sensor standoff distance of 1 ft" (behavior macro). Mission planning software calculates a nominal trajectory, and onboard autonomy monitors the execution of the trajectory and updates the execution plan at preplanned or sensed events. An event can be a combination of a location, distance travelled, time, sensed environmental condition, sensed vehicle state, or received user command. Autonomous behaviors are preferably calculated using the tether direction as a constraint to prevent tether impingement or tangling on environmental structures. In an embodiment, an autonomous behavior may be calculated using visual observations of the tether to identify a "go home" direction.
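
The interplay of waypoints and behavior macros might look like the following sketch, where reaching a waypoint that has an associated event triggers its macro. The exact-match position test and the data structures are illustrative assumptions:

```python
def run_mission(waypoints, events, position_stream):
    """Walk a nominal waypoint trajectory; when a waypoint with an
    associated event (behavior macro) is reached, record that macro.
    Returns the number of waypoints reached and the macros executed."""
    executed = []
    wp_index = 0
    for pos in position_stream:
        if wp_index < len(waypoints) and pos == waypoints[wp_index]:
            macro = events.get(wp_index)
            if macro:
                executed.append(macro)
            wp_index += 1
    return wp_index, executed
```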

[0126] In order to provide autonomous flight modes, some embodiments utilize one or more SoCs with dedicated ICs to perform real-time navigation of the UAV instead of FPGAs. FIG. 15F depicts an SoC used to perform autonomous flight in an exemplary embodiment. An SoC 1550 on a printed circuit board includes a microcontroller 1551 and memory 1552. Memory 1552 may include one or more of ROM, RAM, EEPROM, and Flash memory. Memory 1552 may hold the planned mission path. SoC 1550 may also include one or more processors 1553. Processor 1553 may have one or more cores. SoC 1550 may include one or more timers, peripherals, voltage regulators, power management circuits, external interfaces, analog interfaces including DACs and ADCs, and co-processors (not shown). SoC 1550 may also include one or more communication interfaces including a wireless interface. SoC 1550 also includes one or more ICs including one or more ASICs 1554. For example, in one embodiment, ASIC 1554A may be used to perform centralized processing of sensor data in real-time that is received over the global bus from the onboard sensors. ASIC 1554A may process the on-board sensor data to detect navigation hazards in the upcoming mission path. The processed data results may be provided to ASIC 1554B. ASIC 1554B may be used to perform real-time control of the UAV based on navigation-related objects detected in the processed data to alter the stored mission plan as necessary. For example, ASIC 1554B may alter the amount of thrust to change the UAV's heading to avoid an object detected in the sensor data.
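
The two-stage division of labor between ASIC 1554A (sensor processing) and ASIC 1554B (real-time control) can be sketched as two functions. The range-threshold hazard test and the nearest-clear-bearing steering rule are simplified stand-ins for the actual processing, introduced only for illustration:

```python
def detect_hazards(ranges, threshold):
    """Stage 1 (ASIC 1554A role): flag bearing indices whose measured
    range falls inside the safety threshold."""
    return [i for i, r in enumerate(ranges) if r < threshold]

def adjust_plan(heading, hazards, n_bearings):
    """Stage 2 (ASIC 1554B role): if the current heading points at a
    hazard, steer to the angularly nearest clear bearing."""
    if heading not in hazards:
        return heading
    clear = [b for b in range(n_bearings) if b not in hazards]
    return min(clear, key=lambda b: min(abs(b - heading),
                                        n_bearings - abs(b - heading)))
```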

[0127] In another embodiment, as depicted in FIG. 15G, the SoC may use a single ASIC for performing both processing of sensor data and real-time control of the UAV.

[0128] To process the sensor data and detect navigation hazards or other environmental conditions of interest, a number of different algorithms may be utilized, including but not limited to the following algorithms for performing: visual odometry as described at http://www.ros.org/news/2014/06/new-package-svo-semi-direct-monocular-visual-odometry.html (the contents of which are incorporated herein by reference in their entirety); SLAM as discussed at https://dspace.mit.edu/bitstream/handle/1721.1/98678/920682960-MIT.pdf?sequence=1 (the contents of which are incorporated herein by reference in their entirety); path planning as discussed in http://acl.mit.edu/papers/Luders10_GNC.pdf (the contents of which are incorporated herein by reference in their entirety); and object recognition, Fisher Vector as described at http://lear.inrialpes.fr/src/inria_fisher/ (the contents of which are incorporated herein by reference in their entirety). Further algorithms for anomaly detection and change detection may also be utilized.

[0129] The real-time control of the UAV based on the processed sensor data may also consider additional constraints to dynamically account for the current position of the tether extending from the UAV. For example, in one embodiment, the current location and orientation of the tether with respect to the body of the UAV is constantly monitored. Real-time navigation operations necessitated by identified objects or conditions detected in the mission path are limited based on the tether location.
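
Limiting real-time maneuvers by the tether location can be sketched as a keep-out filter on candidate headings. The 30-degree keep-out angle and the function name are illustrative assumptions:

```python
def tether_safe_headings(candidates_deg, tether_bearing_deg, keepout_deg=30.0):
    """Discard candidate maneuver headings that would sweep the UAV across
    the tether's current bearing, to avoid impingement or tangling."""
    def angular_dist(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return [h for h in candidates_deg
            if angular_dist(h, tether_bearing_deg) > keepout_deg]
```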

[0130] In one embodiment, the monitoring of the tether location and orientation extends to identifying a location at which the tether is disconnected from the UAV so that a retrieval operation can be later performed.

[0131] FIG. 16 is an exemplary embodiment of a printed circuit board 1600 onboard an unmanned aerial vehicle. The printed circuit board 1600 includes a 32-bit microprocessor 1702. The printed circuit board includes an FPGA 1704. The printed circuit board 1600 includes a TCP SSM circuit 1706 and an SSM 1708. The printed circuit board 1600 includes a micro SD card 1710, a motor ESC connector 1712, and motor mounting holes 1714. The printed circuit board includes a nano SOM HDI interconnect 1716. In some embodiments, the printed circuit board 1600 is connected to a battery for untethered flight.

[0132] FIG. 17 illustrates a diagram of a global bus in an embedded control system for the unmanned aerial vehicle. A main FPGA 1702 receives data transmitted over a global bus 1703 from a communications module 1704, a sonar altimeter and/or barometric altimeter module 1706, an HD camera module 1708, a laser scanner module 1710, and at least one system software manager (SSM) module 1712. The SSM module 1712 includes a global shutter camera IR, a color camera, a global shutter camera VIS, an illumination LED, a far proximity sensor, a near proximity sensor, and an FPGA. The laser scanner module 1710 includes a laser LED emitter, a time-of-flight (TOF) receiver, a global shutter camera, and an FPGA. The communications module 1704, the sonar altimeter module 1706, and the HD camera module 1708 each include an HD camera and an FPGA.

[0133] The main FPGA 1702 is further in communication with motor drivers 1714, a DDR memory 1716, a mobile processor 1718, and an inertial measurement unit (IMU) 1720.

[0134] FIG. 18 illustrates an example diagram of a base station electrical architecture. The electrical architecture of a base station includes a power module 1802, a battery module 1808, a main module 1812, and an interface module 1814. The power module 1802 includes a bus power conversion and a power-on control. The power module 1802 receives power (VDC) from an external power connector 1804. The power module 1802 is also coupled to a power button 1806 used to power on/off the base station. The power module 1802 communicates through a bus with the battery module 1808, including transmitting video data. The battery module 1808 includes charge protection, a charger/balance, and a gas gauge. The battery module 1808 provides video data to a system-on-chip (SoC) display 1810.

[0135] The power module 1802 provides power to the main module 1812. The main module 1812 includes a communications FPGA and a nano-SOM. The main module 1812 provides power and communication data to the interface module 1814. The interface module 1814 includes high-voltage (HV) conversion, HV safety circuits, and a communication analog interface. The interface module 1814 transmits voltage and communication data to a spooler connector 1816.

[0136] FIG. 19 illustrates an example diagram of an unmanned aerial vehicle electrical architecture. The electrical architecture of an unmanned aerial vehicle includes a main module 1902, an interface module 1904, a laser module 1908, and at least one SSM 1912. The main module 1902 receives video data (via a video bus) and communications data from the interface module 1904. The main module 1902 includes video processing, data processing, and a nano-SOM carrier. The interface module 1904 receives voltage and communications from a spooler connection 1906.

[0137] The main module 1902 transmits laser power and 3-phase electric power to a laser module 1908. The main module 1902 further communicates with, and transmits power to, an electronic speed controller (ESC) 1910. The ESC 1910 transmits 3-phase electric power to at least one motor 1914. The ESC 1910 communicates with, and transmits power to, the at least one SSM 1912. Each SSM 1912 includes a color camera, two grayscale global shutter cameras, two laser rangers, and a 1-watt LED.

[0138] FIG. 20 illustrates an example method for creating a flight path for an unmanned aerial vehicle. At step 2002, the UAV receives sensor data from multiple directions within a flight operating space, the detected sensor data including near field sensor data, mid-range sensor data, and far field sensor data. Each sensor array detects objects, events, and/or changes in the sensor array's field of view. For example, sensor arrays may be attached to each spar of the UAV and may include, but are not limited to, cameras and/or range finders, such as sonar, Lidar, or proximity sensors. Adjacent sensor arrays have overlapping fields of view, providing a panoramic view that enables the UAV to get a full 360° image of an environment.
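
The overlapping-field-of-view condition can be checked with a short sketch: for the panorama to close, adjacent boresight angles must be closer together than the (assumed common) field of view. This is an illustrative geometric check, not the vehicle's firmware:

```python
def full_panoramic_coverage(centers_deg, fov_deg):
    """Check that sensor arrays with the given boresight angles and a shared
    field of view overlap into a full 360-degree panorama: each adjacent
    pair's angular gap must be smaller than the field of view."""
    centers = sorted(c % 360.0 for c in centers_deg)
    gaps = [(centers[(i + 1) % len(centers)] - centers[i]) % 360.0
            for i in range(len(centers))]
    return all(g < fov_deg for g in gaps)
```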

[0139] At step 2004, the UAV (e.g., an onboard processor and/or SOM) filters noise from the sensor data to generate a set of fiducial points in the operating space describing a flight path for the UAV. In one embodiment, the set of fiducial points is based on a distance of one or more objects detected by the sensor arrays, wherein the set of fiducial points is designed to avoid the one or more objects in the operating space.
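
A step of this kind might be sketched as a moving-average noise filter followed by standoff-based fiducial placement. Both the filter and the one-range-per-obstacle representation are simplifying assumptions for illustration:

```python
def moving_average(samples, window=3):
    """Simple noise filter over a range-sensor stream (edges use a
    truncated window)."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def fiducial_points(obstacle_ranges, standoff):
    """Place a fiducial point short of each detected obstacle so the path
    keeps a standoff distance from it."""
    return [max(0.0, r - standoff) for r in obstacle_ranges]
```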

[0140] At step 2006, the UAV is configured to follow the set of fiducial points that define the flight path. The detected sensor data is further processed to adjust the speed, direction, and altitude of the UAV using the set of fiducial points that define a selected flight path. More particularly, instructions are transmitted over a communications bus on the UAV to configure the UAV to follow the set of fiducial points. For example, in one embodiment, a processor embedded on the UAV transmits signals over a communications bus to one or more motors of the one or more rotors to adjust a configuration of the one or more rotors. Such command signals can be sent to follow the set of fiducial points or in response to on-board sensor data indicating that a flight path change is needed to avoid a collision. The UAV can maneuver through the flight path by altering rotor speed, altering rotor tilt angle, or a combination of both.
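
Deriving commands from the next fiducial point can be sketched as follows; the proportional speed law and its gains are illustrative assumptions rather than the actual control loop:

```python
import math

def rotor_commands(pos, target, base_speed=0.5, gain=0.1):
    """Derive a heading (degrees) and a normalized speed command toward the
    next fiducial point; a processor would translate these into per-motor
    signals transmitted over the communications bus."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    speed = min(1.0, base_speed + gain * math.hypot(dx, dy))
    return heading, speed
```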

[0141] At step 2008, the UAV records flight path data that is clocked in relation to the vehicle position at each time interval in combination with the detected ranging data from each direction at each interval and IMU data at each interval.

[0142] FIG. 21 illustrates hoops 2102 of a bumper 2104 in a collapsed position. The hoops 2102 of the bumper 2104 can be made into a spherical shape, as shown in FIG. 3. The bumper 2104 can be configured to be bi-stable, so the bumper 2104 can keep its shape in either a fully deployed spherical shape or a collapsed position. The hoops 2102 create a physical mesh to prevent ingestion of microfilament, fingers, and other foreign objects, and to prevent the vehicle from becoming ensnared in obstacles when flying.

[0143] In one embodiment, each hoop 2102 is comprised of shape memory alloy (SMA), tempered and annealed to a superelastic condition. For example, each hoop 2102 may comprise Nitinol, which can undergo 8% strain (vs. 0.1-3% for highly tempered SS music wire or carbon fiber). The SMA hoops 2102 can be tempered to thermally deploy to a spherical shape from a collapsed state upon activation of a heating current. Thus, the bumper 2104 can act as a zero-maintenance "parachute" should the unmanned aerial vehicle suffer a malfunction and need to crash. In other embodiments, the hoops 2102 may be composed of a different alloy or a different metal.

[0144] FIGs. 22A-B illustrate a collapsed bumper 2104 that includes a frame component 2202 for an unmanned aerial vehicle. The frame component 2202 includes a circular bumper 2204 connected to an end of each rotor spar. The frame component 2202 further includes a circular bumper ring 2208 that encircles the unmanned aerial vehicle. Each circular bumper 2204 is connected to an inside of the circular bumper ring 2208. Each hoop 2102 is connected to an outside of the bumper ring 2208, as shown in FIG. 22B.

[0145] The collapsed bumper 2104 can be returned to a spherical shape, as shown in FIG. 23, without damage. The collapsed state of the bumper 2104 enables the unmanned aerial vehicle to be kept in a stowed, minimally shaped position.

[0146] FIG. 23 shows an unmanned aerial vehicle 2302 with six rotors and a bumper 2104 that protects the unmanned aerial vehicle 2302 from bumps against obstacles. A microfilament tether 2304 is attached to the bottom of the unmanned aerial vehicle 2302 and extends through a tether port opening 2303 in the bumper 2104. The unmanned aerial vehicle 2302 is connected to a base station through the tether 2304. The tether 2304 enables power and data transfer between the unmanned aerial vehicle 2302 and the base station. The base station is the power and communications hub, and houses the power conversion and microfilament communication circuitry, as well as the battery and high voltage safety systems. The spooler and an interchangeable microfilament dispenser may be installed on the unmanned aerial vehicle 2302. For example, exemplary embodiments of the present disclosure can utilize the tether/spooler system disclosed in U.S. patent application Ser. No. 15/041,211, entitled "Spooler for unmanned aerial vehicle system," filed on Feb. 11, 2016, the entire disclosure of which is incorporated by reference herein.

[0147] In the depicted embodiment, the unmanned aerial vehicle 2302 is a multi-rotor helicopter that includes a central body 2306 from which a number (i.e., n) of rigid spars 2308 radially extend. The end of each rigid spar 2308 includes a rotor 2310. In the depicted embodiment, the rotors 2310 are in a horizontal configuration.

[0148] The bumper 2104 acts as a rigid cage about the unmanned aerial vehicle 2302. The bumper 2104 absorbs crash energies by acting as springs and energy absorbers. The bumper 2104 includes multiple hoops 2102 generally arranged in a spherical enclosure around the unmanned aerial vehicle 2302. In alternative embodiments, the hoop diameter and/or helix wrap angles may be changed to create a variety of useful shapes. A toroidal shape can be used to enclose the rotors yet provide a system that can mount a camera above or underneath the system with a clear field of view and minimize interference with tether operation.

[0149] In an exemplary embodiment, the bumper 2104 includes a plurality of sensor arrays mounted to the circular bumper ring 2208 of the frame component 2202, as shown in FIGs. 24-25. Sensor arrays may include, but are not limited to, cameras and/or range finders, such as sonar, Lidar, or proximity sensors. In one embodiment, multiple sensors are placed on the unmanned aerial vehicle 2302 in order to get a full 360° image. The plurality of sensor arrays may be used to detect surroundings. In another embodiment, the sensor arrays are positioned on the unmanned aerial vehicle 2302 to provide a field of view in at least 3 directions that are at least 90 degrees apart to have a clear and unobstructed field of view, each sensor array including an imaging device and a distance measurement device. In alternative embodiments, the sensor arrays may be mounted on each spar 2308 and/or locations on the main body 2306.

[0150] In one embodiment, at least one sensor array is a Lidar sensor integrated on a chip (for example, a silicon photonic chip with steerable transmitting and receiving phased arrays and on-chip germanium photodetectors). The detection method in this Lidar is based on a coherent method instead of direct time-of-flight measurement, where the system only reacts to the light that was originally transmitted by the device. This reduces the effect of sunlight, which can be a large noise factor in Lidar systems, and allows for modest photodetectors instead of expensive avalanche photodetectors or photomultiplier tubes that are challenging and expensive to integrate in a silicon photonics platform. In addition, the on-chip Lidar is smaller, lighter, and cheaper than traditional Lidar. The on-chip Lidar may also be much more robust because of the lack of moving parts. For example, the non-mechanical beam steering in the on-chip Lidar is 1,000 times faster than what is currently achieved in mechanical Lidar systems, and may allow for an even faster image scan rate. This can be useful for accurately tracking small high-speed objects that are only in the Lidar's field of view for a short amount of time, which could be important for obstacle avoidance for high-speed UAVs.
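
The coherent-detection idea, reacting only to light that matches the transmitted signal, is essentially a matched filter: cross-correlating the received signal with the transmitted code peaks at the echo's delay, while uncorrelated sunlight contributes little. A minimal discrete sketch, with the code and signal values chosen purely for illustration:

```python
def matched_filter_peak(received, transmitted):
    """Cross-correlate the received samples with the transmitted code and
    return the delay (sample offset) with the highest correlation; this is
    the coherent echo, while uncorrelated background scores low."""
    n = len(received) - len(transmitted) + 1
    scores = [sum(r * t for r, t in zip(received[i:], transmitted))
              for i in range(n)]
    return max(range(n), key=lambda i: scores[i])
```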

[0151] In an exemplary embodiment, each of the rotors 2310 includes an electric motor 2312 that drives the rotor 2310 to generate thrust. In some examples, the motor 2312 may be used to tilt the associated rotor 2310. The motors 2312 may include mount servos used for fine tuning and control of the tilting of the rotor 2310. The motor mount servos control the tilting and/or pitch of the rotor by linkage, which tilts the rotor 2310 in various directions or degrees of freedom. When the bumper 2104 bumps against obstacles, the rotors 2310 create thrust and stabilize the unmanned aerial vehicle 2302. Position and attitude stability may be maintained using, for example, LIDAR, SONAR, a magnetometer, a gyroscope, a barometer, and/or an accelerometer. A collision sensor or sensor array can be used to detect a point or region of impact on the case or bumper device.

[0152] In one embodiment, the unmanned aerial vehicle 2302 is a semi-autonomous high-endurance nano unmanned air system (nano-UAS) meant for operation in close-quarters indoor environments. Key components include a microfilament tether system, an interchangeable spooler cartridge system, and a vision-based teleoperation assistance or autonomous system. The unmanned aerial vehicle 2302 includes all flight systems and sensors. The microfilament tether system includes a spooler, as shown in FIGs. 26-27, and an interchangeable microfilament dispenser installed on the unmanned aerial vehicle 2302. The microfilament tether system enables power and data transfer between the unmanned aerial vehicle 2302 and a base station. Exemplary embodiments of the present disclosure may utilize the aerial robots and/or filament disclosed in U.S. Patent No. 7,631,834 entitled "Aerial robot with dispensable conductive filament" filed on April 19, 2007, the disclosure of which is incorporated by reference herein.

[0153] The unmanned aerial vehicle 2302 provides a sensor platform, visual position stabilization, multiple camera sensor modules, and closed loop flight. The nano class airframe is designed for constrained environment operation. Other embodiments can include larger airframes with additional spars and rotors depending on the size and weight of the payload and the system applications. For example, in alternative embodiments, the unmanned aerial vehicle 2302 can be configured for deliveries. For example, exemplary embodiments of the present disclosure may utilize the delivery methods and systems disclosed in U.S. Patent Application No. 14/581,027 entitled "Unmanned delivery" filed on December 23, 2014, the disclosure of which is incorporated by reference herein.

[0154] A main body 2306 of the unmanned aerial vehicle 2302 houses a main electronics stack, sensors, and cameras. Each rotor spar 2308 includes an electronic speed controller (ESC) and a motor/propeller mount. Each propeller attaches to a motor 2312 and creates thrust when spun. The motor 2312 spins the propeller 2314 when actuated to create thrust and stabilize the unmanned aerial vehicle 2302.

[0155] In some embodiments, the spherical shape of the bumper 2104 enables the unmanned aerial vehicle 2302 to roll when on the ground. By use of reversible motor speed controllers, thrust vectoring can be used to upright the unmanned aerial vehicle 2302 and drive it with precision.

[0156] FIG. 24 illustrates a side view of the unmanned aerial vehicle 2302 and the advanced bump guard shown in FIG. 23. Sensor arrays 2404 are illustrated along the frame component.

[0157] FIGs. 25A - 25D shows various views of an embodiment of the frame component 2502 for an unmanned aerial vehicle. The frame component 2502 includes a circular bumper 2204 connected to an end of each rotor spar 2308 and a circular bumper ring 2208 that encircles the unmanned aerial vehicle. Each circular bumper 2204 is connected to an inside of the circular bumper ring 2208.

[0158] The frame component 2502 includes curved indentations 2504 where sensor arrays can be attached, as shown in FIG. 26. The sensor arrays are positioned around the frame component 2502 to provide a field of view in at least 3 directions that are at least 90 degrees apart to have a clear and unobstructed field of view. In some embodiments, the sensor arrays are flush with the frame component 2502 or recessed within the curved indentations 2504 such that the sensor arrays are not impacted if the advanced bump guard bumps against obstacles.

[0159] FIGs. 26A-26C show various views of an embodiment of the unmanned aerial vehicle 2602 with a bump guard 2604 and the frame component 2502. The advanced bump guard 2604 includes a spooler 2606 located below the unmanned aerial vehicle 2602 and within an opening of the bump guard 2604. Sensor arrays 2608 are located within the curved indentations 2504 of the frame component 2502. Each sensor array 2608 has a field of view of the environment. The fields of view of at least two adjacent sensor arrays 2608 overlap (for example, one meter from the sensor arrays 2608) to create a full panoramic view of the environment. Multiple sensor arrays 2608 are placed around the frame component 2502 in order to get a full 360° image. In alternative embodiments, the propellers can be located on either side of the spar.

[0160] The unmanned aerial vehicle 2602 includes a printed circuit board 2610 onboard, as shown in FIGs. 32A-32C.

[0161] FIG. 27 illustrates the spooler 2606 shown in FIGs. 26A-26C. The spooler 2606 includes a spool or coil of wire/tether/microfilament. The spool transfers power and data, and maintains connectivity through the wire/tether/microfilament. The spool can be coreless or can be wound around a bobbin or other central core, such as an inner component shown in FIG. 29. A microfilament/tether deployment mechanism is used to unspool the microfilament/tether as the unmanned aerial vehicle traverses terrain.

[0162] The spooler 2606 holds the microfilament spool. The spooler 2606 can be a disposable microfilament cartridge that both houses and deploys microfilament while cooling and protecting it. The spooler 2606 provides snag- and tension-free deployment of microfilament using a winding and deployment technique: an unspooling method that feeds from the outside of the bobbin and directs the filament back again through the spool's center.

[0163] In an exemplary embodiment, the tethered unmanned aerial vehicle is meant for lateral indoor travel. Being tethered, the unmanned aerial vehicle needs a way to deploy the tether as it moves along. Reliable tether filament management is the keystone to allowing lateral movement of a tethered flying vehicle. Conventional methods either unreel thread or filament from the center of a coreless spool of wire, or wind it off from the outside of a traditional spool. However, both methods have problems such as requiring a slip ring, unreliability due to catching and snagging, and keeping the line under too much tension, which could break the tether.

[0164] In some embodiments, the spooler 2606 includes a specialized spool with a modified exit path of the filament such that the microfilament/tether first travels upward, over the top, and then down through the center 2708 of the spooler 2606. This provides a tunable, snag-free, and robust solution for deploying the microfilament/tether. The specialized spooler 2606 includes at least three mechanical pieces: an inner component 2702, an outer component 2704, and a hub 2706. The microfilament/tether is wound around the inner component 2702, routed over rounded low-friction routing surfaces, then downwards, emerging through the center of the inner component 2702.

[0165] FIG. 28 shows the hub 2706 of the spooler component shown in FIG. 27. The hub 2706 attaches the spooler 2606 to the bump guard 2604.

[0166] FIG. 29 shows an inner component 2702 of the spooler component shown in FIG. 27. The microfilament/tether is wound around the inner component 2702, and routed over rounded low-friction routing surfaces, then downwards, emerging through the center of the inner component 2702.

[0167] FIG. 30 shows an outer component 2704 of the spooler component shown in FIG. 27. The outer component 2704 has two functions. First, it provides mechanical protection for the wound microfilament/tether on the inner component 2702. Second, it prevents the filament from springing outwards, becoming too loose, looping around the top and bottom of the spooler and pulling tight - a failure mode that was seen in several unshrouded initial prototypes.

[0168] FIGs. 31A-31C show an embodiment of a printed circuit board 3102 onboard the unmanned aerial vehicle 2602. FIG. 31A illustrates a single printed circuit board 3102. FIGs. 31B-31C illustrate a stacked printed circuit board 3104. The printed circuit boards shown in FIGs. 31B-31C may function similarly or equivalently to the printed circuit boards described in FIGs. 15-17.

[0169] FIG. 32 shows an exemplary sensor array 2608.

[0170] FIG. 33 is an exemplary multi-rotor UAV helicopter 3300 that includes a central body 3302 from which a number (i.e., n) of rigid spars 3304 radially extend. The end of each rigid spar 3304 includes a thruster 3306 rigidly mounted thereon. In some examples, each of the thrusters 3306 includes an electric motor 3308 (e.g., a brushless DC motor) which drives a rotor 3310 to generate thrust. Very generally, in operation the central body 3302 includes a power source which provides power to the motors 3308 which in turn cause the rotors 3310 to rotate. While rotating, each of the rotors 3310 forces air above the helicopter 3300 in a generally downward direction to generate a thrust having a magnitude and direction that can be represented as a thrust vector 3312.

[0171] The bumper system or cage as described can employ a thrust vectoring flight control system to provide collision compensation after the vehicle collides with an object during flight.

[0172] FIG. 34 illustrates the angling of rotors on the vehicle to provide thrust vectoring. As described in U.S. Application No. 15/316,011, filed on December 2, 2016, claiming priority to PCT/US2015/033992, filed on June 3, 2015, the entire contents of the above applications being incorporated herein by reference, in contrast to conventional multi-rotor helicopter configurations, the multi-rotor helicopter 3300 has each of its thrusters 3306 rigidly mounted with both a dihedral angle, Θ, and a twist angle, φ. In some examples, both (1) the dihedral angle is the same for each spar 3304, and (2) the magnitude of the twist angle is the same for each spar 3304, with the sign of the twist angle differing for at least some of the spars 3304. To understand the mounting angles of the thrusters 3306, it is helpful to consider the plane defined by the rigid spars 3304 of the multi-rotor helicopter 3300 as a horizontal plane 3314. With this in mind, mounting a thruster 3306 with a dihedral angle includes mounting the thruster 3306 at an angle, Θ, with respect to a line from the center of the rotor 3310 to the center of the central body 3302. Mounting a thruster 3306 with a twist angle at the end of a rigid spar 3304 includes mounting the thruster 3306 at an angle, φ, such that it is rotated about a longitudinal axis of the rigid spar 3304.

[0173] Due to the dihedral and twist mounting angles of the thrusters 3306, the thrust vectors 3312 are not simply perpendicular to the horizontal plane 3314 defined by the rigid spars 3304 of the multi-rotor helicopter 3300. Instead, at least some of the thrust vectors have a direction at an oblique angle to the horizontal plane 3314. The thrust force vectors, F_i, are independent (i.e., no force vector is a multiple of any other force vector), or there are at least k (e.g., k = 3, 6, etc.) independent thrust force vectors.
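The independence condition above can be checked numerically. The sketch below builds the body-frame thrust direction (the w_i axis) for each thruster and verifies that the directions span three-dimensional space. The spar count (n = 6), the elementary-rotation axis conventions, and the alternating twist sign are assumptions for illustration; the patent does not fix these choices.

```python
import numpy as np

def thrust_directions(n=6, dihedral_deg=15.0, twist_deg=15.0):
    """Body-frame thrust direction (w_i axis) for each of n thrusters.

    Assumed conventions (not specified by the text): spar i is placed at
    i*360/n degrees about z, the dihedral is a rotation about the spar's
    transverse axis, and the twist sign alternates between adjacent spars.
    """
    dirs = []
    for i in range(n):
        psi = np.radians(i * 360.0 / n)
        th = np.radians(dihedral_deg)
        ph = np.radians(twist_deg * (-1) ** i)  # alternating twist sign
        Rz = np.array([[np.cos(psi), -np.sin(psi), 0],
                       [np.sin(psi),  np.cos(psi), 0],
                       [0, 0, 1]])
        Ry = np.array([[np.cos(th), 0, np.sin(th)],
                       [0, 1, 0],
                       [-np.sin(th), 0, np.cos(th)]])
        Rx = np.array([[1, 0, 0],
                       [0, np.cos(ph), -np.sin(ph)],
                       [0, np.sin(ph),  np.cos(ph)]])
        R = Rz @ Ry @ Rx
        dirs.append(R @ np.array([0.0, 0.0, 1.0]))  # w_i axis in body frame
    return np.array(dirs)

D = thrust_directions()
print(np.linalg.matrix_rank(D, tol=1e-6))  # 3: the directions span R^3
```

With the twist angle set to zero, all directions would collapse onto a cone about z and lateral forces could not be commanded independently, which is the point of the oblique mounting.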

[0174] FIG. 35 is a detailed view of an i-th thruster 3306 showing two different coordinate systems: an x, y, z coordinate system and a u_i, v_i, w_i coordinate system. The x, y, z coordinate system is fixed relative to the vehicle and has its z axis extending in a direction perpendicular to the horizontal plane defined by the rigid spars 3304 of the multi-rotor helicopter 3300. The x and y axes extend in directions perpendicular to one another and parallel to the horizontal plane defined by the rigid spars 3304. In some examples, the x, y, z coordinate system is referred to as the "vehicle frame of reference." The u_i, v_i, w_i coordinate system has its w_i axis extending in a direction perpendicular to a plane defined by the rotating rotor 3310 of the i-th thruster 3306 and its u_i axis extending in a direction along the i-th spar 3304. The u_i and v_i axes extend in directions perpendicular to one another and parallel to the plane defined by the rotating rotor 3310. In some examples, the u_i, v_i, w_i coordinate system is referred to as the "rotor frame of reference." Note that the x, y, z coordinate system is common to all of the thrusters 3306, while the u_i, v_i, w_i coordinate system is different for each (or at least some) of the thrusters 3306.

[0175] The rotational difference between the x, y, z and the u_i, v_i, w_i coordinate systems for each of the n thrusters 3306 can be expressed as a rotation matrix R_i. In some examples, the rotation matrix R_i can be expressed as the product of three separate rotation matrices as follows:

    R_i = R_i^S R^D R^T

where R_i^S is the rotation matrix that accounts for the rotation of the i-th spar relative to the x, y, z coordinate system, R^D is the rotation matrix that accounts for the dihedral angle, Θ, relative to the x, y, z coordinate system, and R^T is the rotation matrix that accounts for the twist angle, φ, relative to the x, y, z coordinate system.

[0176] Very generally, multiplying an arbitrary vector in the u_i, v_i, w_i coordinate system by the rotation matrix R_i results in a representation of the arbitrary vector in the x, y, z coordinate system. As is noted above, the rotation matrix R_i at the i-th spar depends on the spar number, i, the dihedral angle, Θ, and the twist angle, φ. Since each spar has its own unique spar number, i, dihedral angle, Θ, and twist angle, φ, each spar has a different rotation matrix, R_i. One example of a rotation matrix for a second spar with a dihedral angle of 15 degrees and a twist angle of -15 degrees is

    [  0.4830  -0.8700  -0.0991 ]
    [  0.8365   0.4250   0.3459 ]
    [ -0.2588  -0.2500   0.9330 ]
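The example matrix for the second spar can be reproduced numerically. The sketch below assumes a six-spar vehicle (so the second spar sits at 60 degrees about z), a dihedral rotation about the spar's transverse axis, and a twist rotation about the spar's longitudinal axis; these axis conventions are assumptions consistent with the quoted numbers, not conventions the text states explicitly.

```python
import numpy as np

def spar_rotation_matrix(spar_angle_deg, dihedral_deg, twist_deg):
    """R_i = R_i^S R^D R^T as elementary rotations (axis conventions assumed)."""
    psi, th, ph = np.radians([spar_angle_deg, dihedral_deg, twist_deg])
    Rs = np.array([[np.cos(psi), -np.sin(psi), 0],
                   [np.sin(psi),  np.cos(psi), 0],
                   [0, 0, 1]])                      # spar placement about z
    Rd = np.array([[np.cos(th), 0, np.sin(th)],
                   [0, 1, 0],
                   [-np.sin(th), 0, np.cos(th)]])   # dihedral
    Rt = np.array([[1, 0, 0],
                   [0, np.cos(ph), -np.sin(ph)],
                   [0, np.sin(ph),  np.cos(ph)]])   # twist about spar axis
    return Rs @ Rd @ Rt

# Second spar of a six-spar vehicle (60 deg), dihedral 15 deg, twist -15 deg
R2 = spar_rotation_matrix(60.0, 15.0, -15.0)
print(np.round(R2, 4))
```

The printed matrix agrees with the quoted example to the four decimal places shown, which supports the assumed spar angle and axis ordering.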

[0177] In general, the i-th thrust vector 1012 can be represented as a force vector, f_i. The force vector, f_i, generated by the i-th thruster 1006 extends only along the w_i axis of the u_i, v_i, w_i coordinate system for the i-th thruster 1006. Thus, the i-th force vector 1013 can be expressed as:

    f_i = [ 0  0  f_i ]^T

where f_i represents the magnitude of the i-th force vector 1013 along the w_i axis of the u_i, v_i, w_i coordinate system. In some examples, f_i is expressed as:

    f_i = k_1 ω_i^2

where k_1 is an experimentally determined constant and ω_i^2 is the square of the angular speed of the motor 1008.

[0178] The components of the i-th force vector 1013 in the x, y, z coordinate system can be determined by multiplying the i-th force vector 1013 by the i-th rotation matrix R_i as follows:

    F_i = R_i f_i

where F_i is a vector representation of the i-th force vector 1013 in the x, y, z coordinate system.

[0179] The moment due to the i-th thruster 1006 includes a motor torque component due to the torque generated by the thruster's motor 1008 and a thrust torque component due to the thrust generated by the rotor 1010 of the thruster 1006. For the i-th thruster 1006, the motor rotates about the w_i axis of the u_i, v_i, w_i coordinate system, generating a rotating force in the u_i, v_i plane. By the right hand rule, the motor torque generated by the i-th thruster's motor 1008 is a vector having a direction along the w_i axis. The motor torque vector for the i-th thruster can be expressed as:

    τ_i = [ 0  0  τ_i ]^T

where

    τ_i = k_2 ω_i^2

with k_2 being an experimentally determined constant, and ω_i^2 being the square of the angular speed of the motor 1008.

[0180] To express the motor torque vector in the x, y, z coordinate system, the motor torque vector is multiplied by the rotation matrix R_i as follows:

    M_i^motor = R_i τ_i

[0181] The torque due to the thrust generated by the rotor 1010 of the i-th thruster 1006 is expressed as the cross product of the moment arm, r_i, of the i-th thruster 1006 in the x, y, z coordinate system and the representation of the i-th force vector 1013 in the x, y, z coordinate system, F_i:

    M_i^thrust = r_i × F_i

where the moment arm, r_i, is expressed as the length, l, of the i-th spar 1004 along the u_i axis of the u_i, v_i, w_i coordinate system multiplied by the spar rotation matrix, R_i^S:

    r_i = R_i^S [ l  0  0 ]^T

[0182] The resulting moment due to the i-th thruster 1006 can be expressed as:

    M_i = M_i^motor + M_i^thrust = R_i τ_i + r_i × F_i

[0183] The force vectors in the x, y, z coordinate system, F_i, generated at each thruster 1006 can be summed to determine a net thrust vector:

    F = Σ_{i=1..n} F_i

[0184] By Newton's second law of motion, a net translational acceleration vector for the multi-rotor helicopter 1000 can be expressed as the net force vector in the x, y, z coordinate system divided by the mass, m, of the multi-rotor helicopter 1000. For example, for a multi-rotor helicopter 1000 with n thrusters, the net translational acceleration vector can be expressed as:

    a = (1/m) Σ_{i=1..n} F_i

[0185] The moments in the x, y, z coordinate system, M_i, generated at each thruster 1006 can be summed to determine a net moment:

    M = Σ_{i=1..n} M_i

[0186] By Newton's second law of motion, a net angular acceleration vector for the multi-rotor helicopter 1000 can be expressed as the sum of the moments due to the n thrusters divided by the moment of inertia, I, of the multi-rotor helicopter 1000. For example, for a multi-rotor helicopter 1000 with n thrusters, the net angular acceleration can be expressed as:

    α = (1/I) Σ_{i=1..n} M_i

[0187] Based on the above model of the multi-rotor helicopter 1000, it should be apparent to the reader that the magnitudes and directions of the overall translational acceleration vector, a, and the overall angular acceleration vector, α, can be individually controlled by setting appropriate values for the angular speeds, ω_i, of the motors 1008 of each of the n thrusters 1006.
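The force/moment model of paragraphs [0177] through [0186] can be sketched end to end. The constants k1 and k2 are experimentally determined per the text, so the values below, along with the arm length, mass, inertia, spin signs, and axis conventions, are illustrative placeholders only. With six thrusters at equal speed and alternating twist and spin, the lateral forces and net moments cancel by symmetry, leaving a purely vertical acceleration.

```python
import numpy as np

def rotor_wrench(spar_angle_deg, dihedral_deg, twist_deg, omega,
                 k1=1e-5, k2=1e-7, arm=0.2, spin=1):
    """Body-frame force F_i and moment M_i of one thruster.

    Axis conventions, k1, k2, arm length, and spin sign are illustrative
    placeholders; the text leaves k1 and k2 to experimental determination.
    """
    psi, th, ph = np.radians([spar_angle_deg, dihedral_deg, twist_deg])
    Rs = np.array([[np.cos(psi), -np.sin(psi), 0],
                   [np.sin(psi),  np.cos(psi), 0],
                   [0, 0, 1]])                              # spar placement
    Rd = np.array([[np.cos(th), 0, np.sin(th)],
                   [0, 1, 0],
                   [-np.sin(th), 0, np.cos(th)]])           # dihedral
    Rt = np.array([[1, 0, 0],
                   [0, np.cos(ph), -np.sin(ph)],
                   [0, np.sin(ph),  np.cos(ph)]])           # twist
    R = Rs @ Rd @ Rt                                        # R_i
    F = R @ np.array([0.0, 0.0, k1 * omega**2])             # F_i = R_i [0 0 f_i]^T
    M_motor = R @ np.array([0.0, 0.0, spin * k2 * omega**2])
    r = Rs @ np.array([arm, 0.0, 0.0])                      # r_i = R_i^S [l 0 0]^T
    return F, M_motor + np.cross(r, F)                      # M_i

mass, inertia = 1.5, np.diag([0.02, 0.02, 0.04])            # placeholder values
wrenches = [rotor_wrench(i * 60.0, 15.0, (-1)**i * 15.0, 800.0, spin=(-1)**i)
            for i in range(6)]
F_net = sum(F for F, _ in wrenches)
M_net = sum(M for _, M in wrenches)
a = F_net / mass                                            # a = F / m
alpha = np.linalg.solve(inertia, M_net)                     # alpha = I^-1 M
print(np.round(a, 3), np.round(alpha, 3))
```

Perturbing a single motor speed in this model changes both F_net and M_net, which is what lets the controller of the following paragraphs command translation and rotation by adjusting the ω_i individually.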

[0188] FIG. 36 illustrates an exemplary approach to controlling a vehicle in which a multi-rotor helicopter control system 3600 receives a control signal 3616 including a desired position in the inertial frame of reference (specified in an n, w, h (i.e., North, West, height) coordinate system, where the terms "inertial frame of reference" and n, w, h coordinate system are used interchangeably) and a desired rotational orientation in the inertial frame of reference (specified as a roll (R), pitch (P), and yaw (Y) in the inertial frame of reference), and generates a vector of voltages, V, which are used to drive the motors 3308 of the thrusters 3306 of the multi-rotor helicopter 3300 to move the multi-rotor helicopter 3300 to the desired position in space and the desired rotational orientation.

[0189] The control system 3600 includes a first controller module 3618, a second controller module 3620, an angular speed to voltage mapping function 3622, a plant 3624 (i.e., the multi-rotor helicopter 3300), and an observation module 3626. The control signal 3616, which is specified in the inertial frame of reference, is provided to the first controller 3618, which processes the control signal 3616 to determine a differential thrust force vector, ΔF^xyz, and a differential moment vector, ΔM^xyz, each specified in the frame of reference of the multi-rotor helicopter 3300 (i.e., the x, y, z coordinate system). In some examples, the differential vectors can be viewed as a scaling of a desired thrust vector. For example, the gain values for the control system 3600 may be found using empirical tuning procedures and therefore encapsulate a scaling factor. For this reason, in at least some embodiments, the scaling factor does not need to be explicitly determined by the control system 3600. In some examples, the differential vectors can be used to linearize the multi-rotor helicopter system around a localized operating point.

[0190] In some examples, the first controller 3618 maintains an estimate of the current force vector and uses the estimate to determine the differential force vector in the inertial frame of reference, ΔF^inertial, as a difference in the force vector required to achieve the desired position in the inertial frame of reference. Similarly, the first controller 3618 maintains an estimate of the current moment vector in the inertial frame of reference and uses the estimate to determine the differential moment vector in the inertial frame of reference, ΔM^inertial, as a difference in the moment vector required to achieve the desired rotational orientation in the inertial frame of reference. The first controller 3618 then applies a rotation matrix to the differential force vector in the inertial frame of reference, ΔF^inertial, to determine its representation in the x, y, z coordinate system of the multi-rotor helicopter 3300, ΔF^xyz. Similarly, the first controller 3618 applies the rotation matrix to the differential moment vector in the inertial frame of reference, ΔM^inertial, to determine its representation in the x, y, z coordinate system of the multi-rotor helicopter 3300, ΔM^xyz.

[0191] The representation of the differential force vector in the x, y, z coordinate system, ΔF^xyz, and the representation of the differential moment vector in the x, y, z coordinate system, ΔM^xyz, are provided to the second controller 3620, which determines a vector of differential angular motor speeds:

    Δω = [ Δω_1  Δω_2  ...  Δω_n ]^T

[0192] As can be seen above, the vector of differential angular motor speeds, Δω, includes a single differential angular motor speed for each of the n thrusters 3306 of the multi-rotor helicopter 3300. Taken together, the differential angular motor speeds represent the change in angular speed of the motors 3308 required to achieve the desired position and rotational orientation of the multi-rotor helicopter 3300 in the inertial frame of reference.

[0193] In some examples, the second controller 3620 maintains a vector of the current state of the angular motor speeds and uses the vector of the current state of the angular motor speeds to determine the difference in the angular motor speeds required to achieve the desired position and rotational orientation of the multi-rotor helicopter 3300 in the inertial frame of reference.

[0194] The vector of differential angular motor speeds, Δω, is provided to the angular speed to voltage mapping function 3622, which determines a vector of driving voltages:

    V = [ V_1  V_2  ...  V_n ]^T

[0195] As can be seen above, the vector of driving voltages, V, includes a driving voltage for each motor 3308 of the n thrusters 3306. The driving voltages cause the motors 3308 to rotate at the angular speeds required to achieve the desired position and rotational orientation of the multi-rotor helicopter 3300 in the inertial frame of reference.

[0196] In some examples, the angular speed to voltage mapping function 3622 maintains a vector of present driving voltages, the vector including the present driving voltage for each motor 3308. To determine the vector of driving voltages, V, the angular speed to voltage mapping function 3622 maps the differential angular speed, Δω_i, for each motor 3308 to a differential voltage. The differential voltage for each motor 3308 is applied to the present driving voltage for the motor 3308, resulting in the updated driving voltage for the motor, V_i. The vector of driving voltages, V, includes the updated driving voltages for each motor 3308 of the thrusters 3306.
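The incremental update described above can be sketched in a few lines. A linear back-EMF model, ΔV = Δω / Kv, is assumed purely for illustration; the text only requires some map from differential speed to differential voltage, and Kv here is a hypothetical motor velocity constant.

```python
# Minimal sketch of the angular-speed-to-voltage mapping function (3622):
# each differential speed is mapped to a differential voltage and applied
# to the motor's present driving voltage. The linear model is an assumption.
def update_driving_voltages(present_V, delta_omega, Kv=100.0):
    """Return updated per-motor voltages V_i given differential speeds."""
    return [V + dw / Kv for V, dw in zip(present_V, delta_omega)]

present = [7.4] * 6                     # present driving voltage per motor
dV = update_driving_voltages(present, [50.0, -50.0, 0.0, 0.0, 25.0, -25.0])
print(dV)
```

Keeping the present voltages as state, rather than recomputing absolute voltages each cycle, matches the differential formulation of the controller: only the change needed to reach the new operating point is computed.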

[0197] The vector of driving voltages, V, is provided to the plant 3624, where the voltages are used to drive the motors 3308 of the thrusters 3306, resulting in the multi-rotor helicopter 3300 translating and rotating to a new estimate of position and orientation.

[0198] At least one sensor 3626 includes one or more of an inertial measurement unit (IMU) sensor, gyroscope and accelerometer sensors, a GPS sensor, a Lidar sensor, or a radar sensor, as further described in FIG. 37. The sensor 3626 obtains data and feeds it back to a combination node 3628 as an error signal. The control system 3600 repeats this process, achieving and maintaining the multi-rotor helicopter 3300 as close as possible to the desired position and rotational orientation in the inertial frame of reference.

[0199] FIG. 37 is a schematic block diagram of the at least one sensor 3626 shown in FIG. 36 used in the multi-rotor helicopter control system 3600 for controlling a vehicle. Sensor 3626 includes one or more of a collision sensor 3701, a radar sensor 3702, an inertial measurement unit (IMU) sensor 3704, a GPS sensor 3706, a Lidar sensor 3708, a pressure sensor 3710, a gyroscope sensor 3712, and an accelerometer sensor 3714.

[0200] The data collected from the IMU sensor 3704 enables the control system 3600 to track the UAV's position, for example using dead reckoning, or to adjust for wind.

[0201] The pressure sensor 3710 measures atmospheric pressure. Data provided by the pressure sensor 3710 enables the control system 3600 to adjust other parameters (e.g., rotor speed, tilting angle, etc.) based on the atmospheric pressure.
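One common use of static pressure data in flight control is deriving a pressure altitude. The sketch below applies the standard-atmosphere (international barometric) approximation; this formula is a well-known general-purpose conversion, not something the text specifies for the control system 3600.

```python
# Pressure altitude from static pressure via the standard-atmosphere
# approximation. p0_pa is sea-level standard pressure; the formula and its
# constants are the usual hypsometric approximation, assumed for illustration.
def pressure_altitude_m(p_pa, p0_pa=101325.0):
    """Approximate altitude (metres) from static pressure (Pa)."""
    return 44330.0 * (1.0 - (p_pa / p0_pa) ** (1.0 / 5.255))

print(round(pressure_altitude_m(101325.0), 1))  # 0.0 at sea-level pressure
print(round(pressure_altitude_m(95000.0), 1))
```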

[0202] The radar sensor 3702 provides detection of objects, reliable distance measurement, collision avoidance, and driver assistance. Data provided by the radar sensor 3702 is used by the control system 3600 to compute updated rotor speeds to avoid collisions.

[0203] The GPS sensor 3706 provides accurate position and velocity information. Data provided by the GPS sensor 3706 is used by the control system 3600 to compute updated location and velocity information.

[0204] The Lidar sensor 3708, which measures distance to a target by illuminating that target with a laser light, provides detection of objects, reliable distance measurement, collision avoidance, and driver assistance. Data provided by the Lidar sensor 3708 is used by the control system 3600 to compute updated rotor speeds to avoid collisions.

[0205] The gyroscope sensor 3712 measures angular rotational velocity and assists with orientation and stability in navigation. The accelerometer sensor 3714 measures linear acceleration of movement. Data provided by the gyroscope sensor 3712 and accelerometer sensor 3714 is used by the control system 3600 to compute updated linear and rotational velocity. Cameras report pixels, LiDARs report times of flight, and other sensors report other raw data, as described above. Algorithms are applied to the raw data to extract higher level information. Preferably, all of the raw data from the sensors (e.g., cameras, radar, Lidars, IMU, etc.) is combined using a centralized processing architecture. The centralized processing may be performed on the main flight processor or on a separate processor.

[0206] FIG. 38 illustrates a system 3800 for industrial inspections using a UAV 3802, according to an exemplary embodiment. There are a wide variety of industrial inspections in spaces where it is inconvenient, dangerous, or costly to send in a human inspector. Operating in many of these environments currently requires specialized training, permitting, or support equipment. The system 3800 addresses these shortcomings through the UAV 3802, which includes a suite of onboard sensors for navigation, perception, and inspection. In an exemplary embodiment, system 3800 includes a vehicle 3804 for launching the UAV 3802 and/or acting as a ground station to provide power and data communications. The system 3800 includes supporting ground equipment that manages data and power for a tether 3806 connecting the UAV 3802 to the vehicle 3804 and provides interfaces for the user to plan, execute, and monitor the inspection process.

[0207] The general use case is to plan a mission based on user goals and external data, initiate a mission by sending the UAV 3802 into the area of interest, execute a flight plan to collect sensor data in the area of interest, process sensor data to aid navigation and to inspect and evaluate the environment, identify the location and nature of defects or other environment features, present data to the user, update the flight plan based on automatically detected conditions or based on user command, and save data for future purposes.

[0208] Example applications of the embodiments described herein include: enclosed space inspection for chemical, oil, and gas; inspection of interior and exterior of tanks, pumps, and other process equipment and supporting infrastructure; inspection of pipelines, such as inspecting an interior or exterior of the pipelines; inspection of transportation infrastructure, such as inspecting bridges and subways; inspection of mines, such as underground tunnels, quarries, equipment, and infrastructure; inspection of utilities infrastructure, such as power and cellular towers, underground tunnels or pipes, water pumping stations and towers, and power generation facilities; inspection of marine structures, such as inspecting ship exteriors, mechanical rooms, or cargo holds; inspection of interior and exterior spaces of offshore drilling and support equipment; and inspection of building construction, such as inspecting interiors and exteriors of buildings to provide additional quality oversight during the construction process.

[0209] The key challenges posed by these environments include one or more of GPS denial, limited RF capability, difficult physical access, and long duration. The use of a tethered UAV 3802 provides significant advantages over a human or ground robotics system in overcoming these challenges. Using the UAV 3802 provides mobility for fast transit or observing difficult to reach locations such as ceilings or crevices. The tether 3806 provides continuous power to enable virtually unlimited flight times, compared to battery power systems that typically fly for around 30 minutes. The communications over the tether 3806 also enable the vehicle to operate in environments where RF communications are not possible. The tether communications also enable ground processing of video and sensor data for advanced supplementary perception and real time updates of a mission plan from ground processing or from human supervisory control inputs.

[0210] Other potential applications may include integration with another vehicle (e.g., ground vehicle 3804, marine vehicle, crawling/climbing vehicle) to provide additional mobility. Some applications may require several systems with coordinated operations.

[0211] In some embodiments, mission planning may be provided. Mission planning includes setting up a mission plan for a particular area, defining high level behaviors (e.g., a slow scan of a designated area vs. exploring), fiducial points, designating preferred approach directions, and/or keep-out zones. This can include auxiliary data such as building/structure plans, terrain maps, weather or solar data, and autonomous execution. Autonomous execution includes, but is not limited to, onboard stabilization, collision avoidance, broad exploration, detailed scanning, go home, and land/perch.

[0212] In an exemplary embodiment, the mission planning software resides on the ground station. The software uses user provided inputs and external data to produce a nominal mission plan in the form of flight fiducial points and pointing directions for the flight. The user provided inputs include, but are not limited to: a list of target locations and payload pointing directions in either absolute or relative coordinates (this information may be specified relative to maps, imagery, or other models that may be available, and overlaid onto the same information so that the user can visualize the mission); keep-out zones, preferred directions, or other constraints for the vehicle path; vehicle flight constraints, such as height or speed limits; autonomous behaviors, such as scanning a specified area at a specified distance, exploring and mapping an area, returning to home, or landing/perching for a specified period; and enabling or disabling event driven macro responses, such as rescanning an area with higher resolution if an anomaly is detected or returning home if a vehicle malfunction is detected.
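The user inputs listed above amount to a structured mission plan. The sketch below captures them as a data structure with a basic validation pass; the field names and the schema are hypothetical stand-ins, not the mission planning software's actual format.

```python
# Illustrative mission-plan structure mirroring the user inputs the text
# lists (targets/pointing, keep-out zones, flight constraints, behaviors,
# event-driven macros). All field names are hypothetical.
mission_plan = {
    "fiducial_points": [                      # target locations + pointing
        {"position_m": (10.0, 5.0, 3.0), "pointing": "north"},
        {"position_m": (12.0, 5.0, 3.0), "pointing": "north"},
    ],
    "keep_out_zones": [{"center_m": (8.0, 8.0, 0.0), "radius_m": 2.0}],
    "constraints": {"max_height_m": 20.0, "max_speed_mps": 2.0},
    "behaviors": ["scan_area", "return_home"],
    "event_macros": {"anomaly_detected": "rescan_high_resolution",
                     "vehicle_malfunction": "return_home"},
}

def validate(plan):
    """Basic sanity checks a planner might run before generating a flight plan."""
    assert plan["fiducial_points"], "at least one target location required"
    assert plan["constraints"]["max_speed_mps"] > 0
    return True

print(validate(mission_plan))  # True
```

A plan in this form is what the trajectory optimizer of the following paragraph would consume, with tether length and spooling-rate constraints added as additional terms.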

[0213] The mission planning software calculates a nominal flight plan that describes the trajectory fiducial points and pointing directions. The optimization software includes terms to account for the tether length and spooling rate constraints, as well as planning to avoid or minimize tether entanglement.

[0214] The mission planning software may also use external data to create the mission plan, including static information (maps or models of the physical structure, terrain maps) or dynamic information (sun direction, wind, the location of other equipment or people).

[0215] In some conditions, the UAV 3802 operates in GPS-denied environments, such as a confined area like a pipe 3808 (the depictions in FIG. 38 may not be to scale). In this case, to complete an inspection, the system may (1) initialize absolute position on the outside/exterior of the tank where GPS is available and then send the UAV 3802 into the confined flight space, (2) operate in a relative frame of reference, (3) accept a user input of an initial relative or absolute position, and/or (4) use geo-referenced visual landmarks. For example, the UAV 3802 may use beacons 3810 to navigate in the confined space. In another embodiment, where GPS signals are extremely weak, imprecise, and highly susceptible to interference, the UAV 3802 may use simultaneous localization and mapping (SLAM). SLAM is a technique allowing an autonomous vehicle to simultaneously build a map of a previously unknown environment while also localizing itself within this map. This enables the UAV 3802 to operate semi- or completely autonomously.

[0216] FIG. 39 is a process flow sequence for generating flight control data. At step 3902, sensor arrays on a UAV detect sensor data, including image data, IMU data, and/or LiDAR data. At step 3904, a processor onboard the UAV processes the sensor data. At step 3906, the onboard processor stores the sensor data in an onboard memory. At step 3908, the sensor data is transmitted to a flight control processor. At step 3910, the UAV transmits the sensor data via tether and/or wireless transmission to a base station. At step 3912, a base station processor or a networked processor processes the sensor data. At step 3914, the base station processor or the networked processor generates flight control data. At step 3916, the base station processor or the networked processor transmits the flight control data to the flight control processor. At step 3918, the flight control processor transmits control data to motors on the UAV to control flight.
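The FIG. 39 data flow can be sketched as a single collection-to-command cycle. The function name, the callable "sensor arrays", and the mean-range command are illustrative stand-ins for the sense, process, transmit, and control steps; none of them are the patent's actual interfaces.

```python
# Minimal sketch of one FIG. 39 cycle: detect (3902), process and store
# onboard (3904-3906), hand off to the base station (3908-3910), and derive
# flight control data (3912-3916). All names and structures are hypothetical.
def run_flight_control_cycle(sensor_arrays):
    readings = [sense() for sense in sensor_arrays]    # step 3902: detect
    onboard_log = list(readings)                       # steps 3904-3906: store
    base_station_data = onboard_log                    # steps 3908-3910: transmit
    # steps 3912-3916: ground-side processing produces flight control data
    flight_control = {"mean_range_m": sum(base_station_data) / len(base_station_data)}
    return flight_control

# Hypothetical LiDAR range readings (metres) from three sensor arrays
cmd = run_flight_control_cycle([lambda: 4.0, lambda: 6.0, lambda: 5.0])
print(cmd)  # {'mean_range_m': 5.0}
```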

[0217] In one embodiment, the base station processor or the networked processor processes the sensor data using SLAM, which takes information from multiple different kinds of sensors in order to improve the quality of position and velocity estimation, as further shown in FIG. 40. SLAM is discussed in Sebastian Hening, Corey A. Ippolito, Kalmanje S. Krishnakumar, Vahram Stepanyan, and Mircea Teodorescu, "3D LiDAR SLAM Integration with GPS/INS for UAVs in Urban GPS-Degraded Environments", AIAA Information Systems-AIAA Infotech @ Aerospace, AIAA SciTech Forum (AIAA 2017-0448) (the contents of which are incorporated herein by reference in their entirety), and Munguia, R., Urzua, S., Bolea, Y., "Vision-based SLAM system for unmanned aerial vehicles," Sensors 2016; 16: 372 (the contents of which are incorporated herein by reference in their entirety).

[0218] FIG. 40 is a process flow sequence for processing sensor data using a SLAM-based approach, according to an exemplary embodiment. In the described embodiment, the UAV includes at least an IMU sensor 4002, an inertial navigational system 4004, a LiDAR sensor 4010, a GPS sensor 4008, and a camera sensor 4014. The IMU sensor 4002 generates and transmits data to the inertial navigational system (INS) 4004. The INS 4004 generates and transmits INS data to an adaptive Kalman filter 4006. The GPS sensor 4008 generates and transmits GPS data to the adaptive Kalman filter 4006. The LiDAR sensor 4010 generates and transmits LiDAR data to SLAM processing 4012. The SLAM processing 4012 generates and transmits LiDAR SLAM data to the adaptive Kalman filter 4006. In some embodiments, the camera sensor 4014 generates and transmits camera data to SLAM processing 4012. The SLAM processing 4012 generates and transmits camera SLAM data to the adaptive Kalman filter 4006.
The GPS data, the LiDAR SLAM data, the optional camera SLAM data, and the INS data are fused by the adaptive Kalman filter 4006 to provide accurate state estimates of the position and velocity of the UAV.
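The fusion step can be illustrated in one dimension: an INS-propagated position prediction is corrected by a GPS (or SLAM) position fix, weighted by their uncertainties. The noise values below are placeholders, and a real adaptive Kalman filter is multi-dimensional and adjusts its covariances online, which this sketch does not attempt.

```python
# Scalar Kalman measurement update, a minimal stand-in for the adaptive
# Kalman filter (4006) fusing an INS prediction with a GPS/SLAM fix.
def kalman_update(x_pred, P_pred, z_meas, R_meas):
    """Return fused state estimate and its reduced variance."""
    K = P_pred / (P_pred + R_meas)          # Kalman gain
    x = x_pred + K * (z_meas - x_pred)      # fused position estimate
    P = (1.0 - K) * P_pred                  # reduced uncertainty
    return x, P

x, P = 10.0, 4.0                            # INS prediction and its variance
x, P = kalman_update(x, P, z_meas=12.0, R_meas=1.0)   # GPS fix at 12 m
print(round(x, 2), round(P, 2))  # 11.6 0.8
```

The fused estimate lands between the prediction and the measurement, closer to the GPS fix because its variance (1.0) is smaller than the INS prediction's (4.0), which is the intuition behind weighting the sensor streams in the filter.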

[0219] The above procedure is further described in Sebastian Hening, Corey A. Ippolito, Kalmanje S. Krishnakumar, Vahram Stepanyan, and Mircea Teodorescu. "3D LiDAR SLAM Integration with GPS/INS for UAVs in Urban GPS-Degraded Environments", AIAA Information Systems-AIAA Infotech @ Aerospace, AIAA SciTech Forum, (AIAA 2017-0448) (the contents of which are incorporated herein by reference in their entirety), and Munguia, R, Urzua, S, Bolea, Y "Vision-based SLAM system for unmanned aerial vehicles." Sensors 2016; 16: 372. (the contents of which are incorporated herein by reference in their entirety).

[0220] Since certain changes may be made without departing from the scope of the present invention, it is intended that all matter contained in the above description or shown in the accompanying drawings be interpreted as illustrative and not in a literal sense. Practitioners of the art will realize that the sequence of steps and architectures depicted in the figures may be altered without departing from the scope of the present invention and that the illustrations contained herein are singular examples of a multitude of possible depictions of the present invention.

[0221] The foregoing description of example embodiments of the invention provides illustration and description, but is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. For example, while a series of acts has been described, the order of the acts may be modified in other implementations consistent with the principles of the invention. Further, non-dependent acts may be performed in parallel and embodiments may be combined or separated in a manner not specifically discussed herein.