


Title:
INTERACTIVE TRANSPORT SERVICES PROVIDED BY UNMANNED AERIAL VEHICLES
Document Type and Number:
WIPO Patent Application WO/2018/005379
Kind Code:
A1
Abstract:
Embodiments relate to a client-facing application for interacting with a transport service that transports items via unmanned aerial vehicles (UAVs). An example graphic interface may allow a user to order items to specific delivery areas associated with their larger delivery location, and may dynamically provide status updates and other functionality during the process of fulfilling a UAV transport request.

Inventors:
LESSER JONATHAN (US)
BAUERLY MICHAEL (US)
BURGESS JAMES RYAN (US)
CHENG MAY (US)
SONG RUE (US)
Application Number:
PCT/US2017/039320
Publication Date:
January 04, 2018
Filing Date:
June 26, 2017
Assignee:
X DEV LLC (US)
International Classes:
G06Q10/08; B64C39/02
Foreign References:
US20160068264A12016-03-10
US20150120094A12015-04-30
US20150321758A12015-11-12
US20150170526A12015-06-18
KR20160020454A2016-02-23
Other References:
See also references of EP 3475898A4
Attorney, Agent or Firm:
SCOTT, Brett, W. (US)
Claims:
CLAIMS

What is claimed is:

1. A method comprising:

receiving, by a client computing device, a transport request for transport of one or more items by an unmanned aerial vehicle (UAV);

responsive to receiving the transport request, the client computing device:

(i) determining one or more UAV-accessible sub-areas within a geographic area associated with the client computing device, and

(ii) displaying, on a graphic display, a graphic map interface indicating the one or more UAV-accessible sub-areas;

receiving, via the graphic map interface of the client computing device, a selection of one of the one or more UAV-accessible sub-areas; and

responsive to receiving the selection, the client computing device causing the UAV to transport the one or more items to the selected UAV-accessible sub-area.

2. The method of claim 1, wherein the UAV is adapted to lower the one or more items from the UAV by a tether, and wherein the one or more UAV-accessible sub-areas comprise one or more tether-accessible sub-areas.

3. The method of claim 1, wherein determining one or more UAV-accessible sub-areas comprises:

determining that a sub-area of the geographic area includes an unobstructed vertical path between the ground and the sky; and

based at least in part on the sub-area having the unobstructed vertical path between the ground and the sky, determining that the sub-area is a UAV-accessible sub-area.

4. The method of claim 3, wherein determining one or more UAV-accessible sub-areas further comprises:

determining a weather condition of a sub-area of the geographic area; and

based on the determined weather condition and the sub-area having the unobstructed vertical path between the ground and the sky, determining that the sub-area is a UAV-accessible sub-area.

5. The method of claim 3, wherein determining one or more UAV-accessible sub-areas further comprises:

determining that a sub-area of the geographic area does not include a hazard; and

based on the sub-area not including the hazard and the sub-area having the unobstructed vertical path between the ground and the sky, determining that the sub-area is a UAV-accessible sub-area.

6. The method of claim 3, wherein determining one or more UAV-accessible sub-areas further comprises:

determining that the sub-area within the geographic area includes a surface feature, wherein the surface feature comprises at least one of a water surface, a ground surface, or building surface; and

based on the determined surface feature and the sub-area having the unobstructed vertical path, determining that the sub-area is a UAV-accessible sub-area.

7. The method of claim 3, wherein determining one or more available UAV-accessible sub-areas further comprises:

determining that the sub-area within the geographic area is at least a threshold distance from a property line; and

based on the sub-area being at least the threshold distance from the property line and the sub-area having the unobstructed vertical path, determining that the sub-area is a UAV-accessible sub-area.

8. The method of claim 1, wherein the geographic area associated with the client computing device is based on a location of the client computing device.

9. The method of claim 1, wherein the geographic area associated with the client computing device is based on a location identified by a user through a user interface of the client computing device.

10. The method of claim 1, wherein the geographic area associated with the client computing device includes a location based on an order history of a user.

11. The method of claim 1, wherein the geographic area associated with the client computing device comprises a predicted future location of the client computing device based on one or more movement patterns of the client computing device.

12. The method of claim 1, wherein the geographic area associated with the client computing device comprises a real estate property, and wherein the one or more UAV-accessible sub-areas comprise a sub-area located on at least one of a backyard, a front yard, a driveway, or a patio of the real estate property.

13. The method of claim 1, wherein receiving the selection of one of the one or more UAV-accessible sub-areas comprises:

receiving a selection of one of the backyard, the front yard, the driveway, or the patio of the real estate property; and

receiving a selection of a sub-area within the selected one of the backyard, the front yard, the driveway, or the patio of the real estate property.

14. The method of claim 1, wherein displaying the graphic map interface indicating the one or more UAV-accessible sub-areas comprises:

displaying a graphic map interface of the geographic area; and

displaying on the graphic map interface a boundary of each of the one or more UAV-accessible sub-areas.

15. The method of claim 14, wherein displaying the graphic map interface indicating the one or more UAV-accessible sub-areas further comprises displaying on the graphic map interface a label for each of the one or more UAV-accessible sub-areas.

16. The method of claim 1, wherein causing the UAV to transport the one or more items to the selected UAV-accessible sub-area comprises:

determining that a weight of the one or more items exceeds a threshold weight; and

responsive to determining that the weight of the one or more items exceeds the threshold weight, causing the UAV and at least one other UAV to transport the one or more items to the selected UAV-accessible sub-area.

17. The method of claim 1, wherein the graphic map interface includes an overhead view of the one or more UAV-accessible sub-areas.

18. The method of claim 1, wherein the graphic map interface includes an oblique view of the one or more UAV-accessible sub-areas.

19. A system comprising:

a non-transitory computer-readable medium; and

program instructions stored on the non-transitory computer-readable medium and executable by at least one processor to:

receive a transport request for transport of one or more items by an unmanned aerial vehicle (UAV);

responsive to receiving the transport request:

(i) determine one or more UAV-accessible sub-areas within a geographic area associated with the system, and

(ii) display, on a graphic display, a graphic map interface indicating the one or more UAV-accessible sub-areas;

receive, via the graphic map interface of the client computing device, a selection of one of the one or more UAV-accessible sub-areas; and

responsive to receiving the selection, cause the UAV to transport the one or more items to the selected UAV-accessible sub-area.

20. A non-transitory computer readable medium having stored thereon instructions executable by a computing device to cause the computing device to perform functions comprising:

receiving a transport request for transport of one or more items by an unmanned aerial vehicle (UAV);

responsive to receiving the transport request:

(i) determining one or more UAV-accessible sub-areas within a geographic area associated with the computing device, and

(ii) displaying, on a graphic display, a graphic map interface indicating the one or more UAV-accessible sub-areas;

receiving, via the graphic map interface, a selection of one of the one or more UAV-accessible sub-areas; and

responsive to receiving the selection, causing the UAV to transport the one or more items to the selected UAV-accessible sub-area.

Description:
INTERACTIVE TRANSPORT SERVICES PROVIDED BY UNMANNED AERIAL VEHICLES

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims priority to U.S. Patent Application No. 15/195,607, filed June 28, 2016, which is incorporated herein by reference in its entirety.

BACKGROUND

[0002] Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.

[0003] An unmanned vehicle, which may also be referred to as an autonomous vehicle, is a vehicle capable of travel without a physically-present human operator. An unmanned vehicle may operate in a remote-control mode, in an autonomous mode, or in a partially autonomous mode.

[0004] When an unmanned vehicle operates in a remote-control mode, a pilot or driver that is at a remote location can control the unmanned vehicle via commands that are sent to the unmanned vehicle via a wireless link. When the unmanned vehicle operates in autonomous mode, the unmanned vehicle typically moves based on pre-programmed navigation waypoints, dynamic automation systems, or a combination of these. Further, some unmanned vehicles can operate in both a remote-control mode and an autonomous mode, and in some instances may do so concurrently. For instance, a remote pilot or driver may wish to leave navigation to an autonomous system while manually performing another task, such as operating a mechanical system for picking up objects, as an example.

[0005] Various types of unmanned vehicles exist for various different environments. For instance, unmanned vehicles exist for operation in the air, on the ground, underwater, and in space. Examples include quad-copters and tail-sitter UAVs, among others. Unmanned vehicles also exist for hybrid operations in which multi-environment operation is possible. Examples of hybrid unmanned vehicles include an amphibious craft that is capable of operation on land as well as on water or a floatplane that is capable of landing on water as well as on land. Other examples are also possible.

SUMMARY

[0006] Certain aspects of conventional delivery methods may lead to a poor consumer experience. For instance, a restaurant employee delivering a food order by car may get stuck in traffic, delaying delivery of the food. This delay may inconvenience a hungry consumer not only by causing them to wait longer for their food, but also perhaps by causing the temperature of hot food or cold food to approach room temperature. In another example, a conventional delivery service may only be capable of delivering a package to a limited number of destinations (e.g., a mailbox, post office box, or doorstep associated with a particular address). This may be problematic if a consumer wishes to have an item delivered to a location that does not have a conventional delivery destination. A UAV delivery system may address these or other issues by avoiding delays associated with conventional delivery methods and by allowing a user to select from various custom delivery locations. Examples of such systems and methods are disclosed herein.

[0007] In one aspect, a method includes: receiving, by a client computing device, a transport request for transport of one or more items by a UAV; responsive to receiving the transport request, the client computing device (i) determining one or more UAV-accessible sub-areas within a geographic area associated with the client computing device and (ii) displaying, on a graphic display, a graphic map interface indicating the one or more UAV-accessible sub-areas; receiving, via the graphic map interface of the client computing device, a selection of one of the one or more UAV-accessible sub-areas; and responsive to receiving the selection, the client computing device causing the UAV to transport the one or more items to the selected UAV-accessible sub-area.
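The sequence recited in paragraph [0007] maps onto a simple client-side flow: determine accessible sub-areas, show them on a map, accept a selection, and dispatch the UAV. The following Python sketch is illustrative only and is not part of the application; the sub-area data, the "selection" step, and the dispatch call are hypothetical stand-ins for the map interface and transport service described herein.

```python
from dataclasses import dataclass

@dataclass
class SubArea:
    name: str
    lat: float
    lon: float

def determine_accessible_subareas(area_id: str) -> list[SubArea]:
    # Stand-in for the real determination (obstruction, weather, and hazard checks, etc.).
    return [SubArea("backyard", 37.4220, -122.0841),
            SubArea("driveway", 37.4221, -122.0839)]

def display_map_and_get_selection(subareas: list[SubArea]) -> SubArea:
    # Stand-in for the graphic map interface; here the first sub-area is "selected".
    for s in subareas:
        print(f"UAV-accessible sub-area: {s.name} ({s.lat}, {s.lon})")
    return subareas[0]

def cause_uav_transport(items: list[str], destination: SubArea) -> None:
    print(f"Requesting UAV transport of {items} to {destination.name}")

def handle_transport_request(items: list[str], area_id: str) -> None:
    subareas = determine_accessible_subareas(area_id)     # step (i)
    selected = display_map_and_get_selection(subareas)    # step (ii) plus selection
    cause_uav_transport(items, selected)                  # responsive dispatch

handle_transport_request(["lunch order"], "home")
```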

[0008] In another aspect, a system includes a non-transitory computer-readable medium; and program instructions stored on the non-transitory computer-readable medium and executable by at least one processor to: receive a transport request for transport of one or more items by a UAV; responsive to receiving the transport request, (i) determine one or more UAV-accessible sub-areas within a geographic area associated with the system, and (ii) display, on a graphic display, a graphic map interface indicating the one or more UAV-accessible sub-areas; receive, via the graphic map interface of the client computing device, a selection of one of the one or more UAV-accessible sub-areas; and responsive to receiving the selection, cause the UAV to transport the one or more items to the selected UAV-accessible sub-area.

[0009] In another aspect, a non-transitory computer readable medium has stored thereon instructions executable by a computing device to cause the computing device to perform functions including: receiving a transport request for transport of one or more items by a UAV; responsive to receiving the transport request, (i) determining one or more UAV-accessible sub-areas within a geographic area associated with the computing device and (ii) displaying, on a graphic display, a graphic map interface indicating the one or more UAV-accessible sub-areas; receiving, via the graphic map interface, a selection of one of the one or more UAV-accessible sub-areas; and responsive to receiving the selection, causing the UAV to transport the one or more items to the selected UAV-accessible sub-area.

[0010] These as well as other aspects, advantages, and alternatives will become apparent to those of ordinary skill in the art by reading the following detailed description with reference where appropriate to the accompanying drawings. Further, it should be understood that the description provided in this summary section and elsewhere in this document is intended to illustrate the claimed subject matter by way of example and not by way of limitation.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] Figure 1A is a simplified illustration of an unmanned aerial vehicle, according to an example embodiment.

[0012] Figure 1B is a simplified illustration of an unmanned aerial vehicle, according to an example embodiment.

[0013] Figure 1C is a simplified illustration of an unmanned aerial vehicle, according to an example embodiment.

[0014] Figure 1D is a simplified illustration of an unmanned aerial vehicle, according to an example embodiment.

[0015] Figure 1E is a simplified illustration of an unmanned aerial vehicle, according to an example embodiment.

[0016] Figure 2 is a simplified block diagram illustrating components of an unmanned aerial vehicle, according to an example embodiment.

[0017] Figure 3 is a simplified block diagram illustrating a UAV system, according to an example embodiment.

[0018] Figure 4A is an illustration of an interface for placing a UAV transport request, according to an example embodiment.

[0019] Figure 4B is an illustration of an interface for placing a UAV transport request, according to an example embodiment.

[0020] Figure 5 is a flow chart of a method for placing a UAV transport request, according to an example embodiment.

[0021] Figure 6A is an illustration of an interface for placing a UAV transport request, according to an example embodiment.

[0022] Figure 6B is an illustration of an interface for placing a UAV transport request, according to an example embodiment.

[0023] Figure 6C is an illustration of an interface for placing a UAV transport request, according to an example embodiment.

[0024] Figure 6D is an illustration of an interface for placing a UAV transport request, according to an example embodiment.

[0025] Figure 6E is an illustration of an interface for placing a UAV transport request, according to an example embodiment.

[0026] Figure 7A is a flow chart illustrating a method for providing a client-facing application for a UAV transport service, according to an example embodiment.

[0027] Figure 7B is a flow chart illustrating a method for providing a client-facing application for a UAV transport service, according to an example embodiment.

[0028] Figure 8A is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

[0029] Figure 8B is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

[0030] Figure 8C is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

[0031] Figure 8D is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

[0032] Figure 8E is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

[0033] Figure 8F is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

[0034] Figure 8G is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

[0035] Figure 8H is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

[0036] Figure 9 is an illustrative screen from a client-facing application for a UAV transport service, according to an example embodiment.

DETAILED DESCRIPTION

[0037] The following detailed description describes various features and functions of the disclosure with reference to the accompanying Figures. In the Figures, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative apparatuses described herein are not meant to be limiting. It will be readily understood that certain aspects of the disclosure can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.

I. Overview

[0038] Example embodiments take the form of or relate to a graphical user interface (GUI) for a UAV transport application, and/or to the service-provider systems that interface with such a transport-service application and coordinate deliveries of items requested via such an application. In an example embodiment, the user-facing application may provide access to a UAV food transport service via an application running on a user's device; e.g., via an application running on a mobile phone, wearable device, tablet, or personal computer. However, the examples described herein may apply equally to UAV delivery of other types of items. Further, the UAV delivery service may employ UAVs that carry items from a source location (e.g., a restaurant or store) to a target location indicated by the user. The UAVs may be configured to lower items to the ground at the delivery location via a tether attached to the package containing the items.

II. Illustrative Unmanned Vehicles

[0039] Herein, the terms "unmanned aerial vehicle" and "UAV" refer to any autonomous or semi-autonomous vehicle that is capable of performing some functions without a physically present human pilot.

[0040] A UAV can take various forms. For example, a UAV may take the form of a fixed-wing aircraft, a glider aircraft, a tail-sitter aircraft, a jet aircraft, a ducted fan aircraft, a lighter-than-air dirigible such as a blimp or steerable balloon, a rotorcraft such as a helicopter or multicopter, and/or an ornithopter, among other possibilities. Further, the terms "drone," "unmanned aerial vehicle system" (UAVS), or "unmanned aerial system" (UAS) may also be used to refer to a UAV.

[0041] Figure 1A is a simplified illustration providing a top-down view of a UAV, according to an example embodiment. In particular, Figure 1A shows an example of a fixed-wing UAV 100, which may also be referred to as an airplane, an aeroplane, a biplane, a glider, or a plane, among other possibilities. The fixed-wing UAV 100, as the name implies, has stationary wings 102 that generate lift based on the wing shape and the vehicle's forward airspeed. For instance, the two wings 102 may have an airfoil-shaped cross section to produce an aerodynamic force on the UAV 100.

[0042] As depicted, the fixed-wing UAV 100 may include a wing body 104 rather than a clearly defined fuselage. The wing body 104 may contain, for example, control electronics such as an inertial measurement unit (IMU) and/or an electronic speed controller, batteries, other sensors, and/or a payload, among other possibilities. The illustrative UAV 100 may also include landing gear (not shown) to assist with controlled take-offs and landings. In other embodiments, other types of UAVs without landing gear are also possible.

[0043] The UAV 100 further includes propulsion units 106, which can each include a motor, shaft, and propeller, for propelling the UAV 100. Vertical stabilizers 108 (or fins) may also be attached to the wing body 104 and/or the wings 102 to stabilize the UAV's yaw (turn left or right) during flight. In some embodiments, the UAV 100 may also be configured to function as a glider. To do so, UAV 100 may power off its motor, propulsion units, etc., and glide for a period of time.

[0044] During flight, the UAV 100 may control the direction and/or speed of its movement by controlling its pitch, roll, yaw, and/or altitude. For example, the vertical stabilizers 108 may include one or more rudders for controlling the UAV's yaw, and the wings 102 may include one or more elevators for controlling the UAV's pitch and/or one or more ailerons for controlling the UAV's roll. As another example, increasing or decreasing the speed of all the propellers simultaneously can result in the UAV 100 increasing or decreasing its altitude, respectively.

[0045] Similarly, Figure 1B shows another example of a fixed-wing UAV 120. The fixed-wing UAV 120 includes a fuselage 122, two wings 124 with an airfoil-shaped cross section to provide lift for the UAV 120, a vertical stabilizer 126 (or fin) to stabilize the plane's yaw (turn left or right), a horizontal stabilizer 128 (also referred to as an elevator or tailplane) to stabilize pitch (tilt up or down), landing gear 130, and a propulsion unit 132, which can include a motor, shaft, and propeller.

[0046] Figure 1C shows an example of a UAV 140 with a propeller in a pusher configuration. The term "pusher" refers to the fact that a propulsion unit 142 is mounted at the back of the UAV and "pushes" the vehicle forward, in contrast to the propulsion unit being mounted at the front of the UAV. Similar to the description provided for Figures 1A and 1B, Figure 1C depicts common structures used in a pusher plane, including a fuselage 144, two wings 146, vertical stabilizers 148, and the propulsion unit 142, which can include a motor, shaft, and propeller.

[0047] Figure 1D shows an example of a tail-sitter UAV 160. In the illustrated example, the tail-sitter UAV 160 has fixed wings 162 to provide lift and allow the UAV 160 to glide horizontally (e.g., along the x-axis, in a position that is approximately perpendicular to the position shown in Figure 1D). However, the fixed wings 162 also allow the tail-sitter UAV 160 to take off and land vertically on its own.

[0048] For example, at a launch site, the tail-sitter UAV 160 may be positioned vertically (as shown) with its fins 164 and/or wings 162 resting on the ground and stabilizing the UAV 160 in the vertical position. The tail-sitter UAV 160 may then take off by operating its propellers 166 to generate an upward thrust (e.g., a thrust that is generally along the y-axis). Once at a suitable altitude, the tail-sitter UAV 160 may use its flaps 168 to reorient itself in a horizontal position, such that its fuselage 170 is closer to being aligned with the x-axis than the y-axis. Positioned horizontally, the propellers 166 may provide forward thrust so that the tail-sitter UAV 160 can fly in a similar manner as a typical airplane.

[0049] Many variations on the illustrated fixed-wing UAVs are possible. For instance, fixed-wing UAVs may include more or fewer propellers, and/or may utilize a ducted fan or multiple ducted fans for propulsion. Further, UAVs with more wings (e.g., an "x-wing" configuration with four wings), with fewer wings, or even with no wings, are also possible.

[0050] As noted above, some embodiments may involve other types of UAVs, in addition to or in the alternative to fixed-wing UAVs. For instance, Figure 1E shows an example of a rotorcraft that is commonly referred to as a multicopter 180. The multicopter 180 may also be referred to as a quadcopter, as it includes four rotors 182. It should be understood that example embodiments may involve a rotorcraft with more or fewer rotors than the multicopter 180. For example, a helicopter typically has two rotors. Other examples with three or more rotors are possible as well. Herein, the term "multicopter" refers to any rotorcraft having more than two rotors, and the term "helicopter" refers to rotorcraft having two rotors.

[0051] Referring to the multicopter 180 in greater detail, the four rotors 182 provide propulsion and maneuverability for the multicopter 180. More specifically, each rotor 182 includes blades that are attached to a motor 184. Configured as such, the rotors 182 may allow the multicopter 180 to take off and land vertically, to maneuver in any direction, and/or to hover. Further, the pitch of the blades may be adjusted as a group and/or differentially, and may allow the multicopter 180 to control its pitch, roll, yaw, and/or altitude.

[0052] It should be understood that references herein to an "unmanned" aerial vehicle or UAV can apply equally to autonomous and semi-autonomous aerial vehicles. In an autonomous implementation, all functionality of the aerial vehicle is automated; e.g., pre-programmed or controlled via real-time computer functionality that responds to input from various sensors and/or pre-determined information. In a semi-autonomous implementation, some functions of an aerial vehicle may be controlled by a human operator, while other functions are carried out autonomously. Further, in some embodiments, a UAV may be configured to allow a remote operator to take over functions that can otherwise be controlled autonomously by the UAV. Yet further, a given type of function may be controlled remotely at one level of abstraction and performed autonomously at another level of abstraction. For example, a remote operator could control high level navigation decisions for a UAV, such as by specifying that the UAV should travel from one location to another (e.g., from a warehouse in a suburban area to a delivery address in a nearby city), while the UAV's navigation system autonomously controls more fine-grained navigation decisions, such as the specific route to take between the two locations, specific flight controls to achieve the route and avoid obstacles while navigating the route, and so on.

[0053] More generally, it should be understood that the example UAVs described herein are not intended to be limiting. Example embodiments may relate to, be implemented within, or take the form of any type of unmanned aerial vehicle.

III. Illustrative UAV Components

[0054] Figure 2 is a simplified block diagram illustrating components of a UAV 200, according to an example embodiment. UAV 200 may take the form of, or be similar in form to, one of the UAVs 100, 120, 140, 160, and 180 described in reference to Figures 1A-1E. However, UAV 200 may also take other forms.

[0055] UAV 200 may include various types of sensors, and may include a computing system configured to provide the functionality described herein. In the illustrated embodiment, the sensors of UAV 200 include an inertial measurement unit (IMU) 202, ultrasonic sensor(s) 204, and a GPS 206, among other possible sensors and sensing systems.

[0056] In the illustrated embodiment, UAV 200 also includes one or more processors 208. A processor 208 may be a general-purpose processor or a special purpose processor (e.g., digital signal processors, application specific integrated circuits, etc.). The one or more processors 208 can be configured to execute computer-readable program instructions 212 that are stored in the data storage 210 and are executable to provide the functionality of a UAV described herein.

[0057] The data storage 210 may include or take the form of one or more computer-readable storage media that can be read or accessed by at least one processor 208. The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with at least one of the one or more processors 208. In some embodiments, the data storage 210 can be implemented using a single physical device (e.g., one optical, magnetic, organic or other memory or disc storage unit), while in other embodiments, the data storage 210 can be implemented using two or more physical devices.

[0058] As noted, the data storage 210 can include computer-readable program instructions 212 and perhaps additional data, such as diagnostic data of the UAV 200. As such, the data storage 210 may include program instructions 212 to perform or facilitate some or all of the UAV functionality described herein. For instance, in the illustrated embodiment, program instructions 212 include a navigation module 214.

A. Sensors

[0059] In an illustrative embodiment, IMU 202 may include both an accelerometer and a gyroscope, which may be used together to determine an orientation of the UAV 200. In particular, the accelerometer can measure the orientation of the vehicle with respect to earth, while the gyroscope measures the rate of rotation around an axis. IMUs are commercially available in low-cost, low-power packages. For instance, an IMU 202 may take the form of or include a miniaturized MicroElectroMechanical System (MEMS) or a NanoElectroMechanical System (NEMS). Other types of IMUs may also be utilized.
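The application does not specify how the accelerometer and gyroscope measurements are combined. One common approach, shown here purely as an illustrative assumption rather than as the disclosed method, is a complementary filter that integrates the gyroscope rate and corrects the result with the gravity-based tilt from the accelerometer:

```python
import math

def complementary_pitch(prev_pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Blend integrated gyro rate with accelerometer tilt into one pitch estimate (radians)."""
    gyro_pitch = prev_pitch + gyro_rate * dt      # short-term: integrate angular rate
    accel_pitch = math.atan2(accel_x, accel_z)    # long-term: tilt from the gravity vector
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

pitch = 0.0
for gyro_rate, ax, az in [(0.02, 0.10, 9.80), (0.01, 0.12, 9.79)]:
    pitch = complementary_pitch(pitch, gyro_rate, ax, az, dt=0.01)
print(f"estimated pitch: {math.degrees(pitch):.3f} degrees")
```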

[0060] An IMU 202 may include other sensors, in addition to accelerometers and gyroscopes, which may help to better determine position and/or help to increase autonomy of the UAV 200. Two examples of such sensors are magnetometers and pressure sensors. In some embodiments, a UAV may include a low-power, digital 3-axis magnetometer, which can be used to realize an orientation-independent electronic compass for accurate heading information. However, other types of magnetometers may be utilized as well. Other examples are also possible. Further, note that a UAV could include some or all of the above-described inertial sensors as separate components from an IMU.

[0061] UAV 200 may also include a pressure sensor or barometer, which can be used to determine the altitude of the UAV 200. Alternatively, other sensors, such as sonic altimeters or radar altimeters, can be used to provide an indication of altitude, which may help to improve the accuracy of and/or prevent drift of an IMU.

[0062] In a further aspect, UAV 200 may include one or more sensors that allow the UAV to sense objects in the environment. For instance, in the illustrated embodiment, UAV 200 includes ultrasonic sensor(s) 204. Ultrasonic sensor(s) 204 can determine the distance to an object by generating sound waves and determining the time interval between transmission of the wave and receiving the corresponding echo off an object. A typical application of an ultrasonic sensor for unmanned vehicles or IMUs is low-level altitude control and obstacle avoidance. An ultrasonic sensor can also be used for vehicles that need to hover at a certain height or need to be capable of detecting obstacles. Other systems can be used to determine, sense the presence of, and/or determine the distance to nearby objects, such as a light detection and ranging (LIDAR) system, laser detection and ranging (LADAR) system, and/or an infrared or forward-looking infrared (FLIR) system, among other possibilities.
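For context, the round-trip timing described above reduces to a one-line computation. The sketch below is illustrative only and assumes a speed of sound of 343 m/s; it shows how an echo time maps to a range estimate:

```python
def ultrasonic_range_m(echo_time_s: float, speed_of_sound_m_s: float = 343.0) -> float:
    """Distance to an object from the round-trip time of an ultrasonic pulse."""
    # The pulse travels out to the object and back, so halve the round-trip distance.
    return speed_of_sound_m_s * echo_time_s / 2.0

# A round trip of about 5.8 ms corresponds to roughly one meter of clearance.
print(f"{ultrasonic_range_m(0.0058):.2f} m")
```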

[0063] In some embodiments, UAV 200 may also include one or more imaging system(s). For example, one or more still and/or video cameras may be utilized by UAV 200 to capture image data from the UAV's environment. As a specific example, charge-coupled device (CCD) cameras or complementary metal-oxide-semiconductor (CMOS) cameras can be used with unmanned vehicles. Such imaging sensor(s) have numerous possible applications, such as obstacle avoidance, localization techniques, ground tracking for more accurate navigation (e.g., by applying optical flow techniques to images), video feedback, and/or image recognition and processing, among other possibilities.

[0064] UAV 200 may also include a GPS receiver 206. The GPS receiver 206 may be configured to provide data that is typical of well-known GPS systems, such as the GPS coordinates of the UAV 200. Such GPS data may be utilized by the UAV 200 for various functions. As such, the UAV may use its GPS receiver 206 to help navigate to the caller's location, as indicated, at least in part, by the GPS coordinates provided by their mobile device. Other examples are also possible.
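Paragraph [0064] only states that the UAV navigates toward the coordinates reported by the requester's device; it does not specify the math. A standard way to estimate the remaining distance between two GPS fixes (an assumption for illustration, not a statement of the application's method) is the haversine formula:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

# Distance from the UAV's current GPS fix to the coordinates reported by the caller's phone.
print(f"{haversine_m(37.4220, -122.0841, 37.4275, -122.1697):.0f} m remaining")
```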

B. Navigation and Location Determination

[0065] The navigation module 214 may provide functionality that allows the UAV 200 to, e.g., move about its environment and reach a desired location. To do so, the navigation module 214 may control the altitude and/or direction of flight by controlling the mechanical features of the UAV that affect flight (e.g., its rudder(s), elevator(s), aileron(s), and/or the speed of its propeller(s)).

[0066] In order to navigate the UAV 200 to a target location, the navigation module 214 may implement various navigation techniques, such as map-based navigation and localization-based navigation, for instance. With map-based navigation, the UAV 200 may be provided with a map of its environment, which may then be used to navigate to a particular location on the map. With localization-based navigation, the UAV 200 may be capable of navigating in an unknown environment using localization. Localization-based navigation may involve the UAV 200 building its own map of its environment and calculating its position within the map and/or the position of objects in the environment. For example, as a UAV 200 moves throughout its environment, the UAV 200 may continuously use localization to update its map of the environment. This continuous mapping process may be referred to as simultaneous localization and mapping (SLAM). Other navigation techniques may also be utilized.

[0067] In some embodiments, the navigation module 214 may navigate using a technique that relies on waypoints. In particular, waypoints are sets of coordinates that identify points in physical space. For instance, an air-navigation waypoint may be defined by a certain latitude, longitude, and altitude. Accordingly, navigation module 214 may cause UAV 200 to move from waypoint to waypoint, in order to ultimately travel to a final destination (e.g., a final waypoint in a sequence of waypoints).
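Paragraph [0067] describes waypoints as latitude/longitude/altitude coordinates visited in sequence. The minimal sketch below is illustrative only; the `goto` callback is a hypothetical stand-in for whatever flight-control command the navigation module 214 actually issues:

```python
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float

def fly_route(waypoints: list[Waypoint], goto) -> None:
    """Visit each waypoint in order; the last waypoint is the final destination."""
    for wp in waypoints:
        goto(wp)  # command the flight controller toward the next set of coordinates

route = [
    Waypoint(37.4220, -122.0841, 120.0),
    Waypoint(37.4300, -122.0900, 120.0),
    Waypoint(37.4380, -122.0960, 30.0),   # descend near the delivery area
]
fly_route(route, goto=lambda wp: print(f"heading to {wp}"))
```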

[0068] In a further aspect, the navigation module 214 and/or other components and systems of the UAV 200 may be configured for "localization" to more precisely navigate to the scene of a target location. More specifically, it may be desirable in certain situations for a UAV to be within a threshold distance of the target location where a payload 220 is being delivered by a UAV (e.g., within a few feet of the target destination). To this end, a UAV may use a two-tiered approach in which it uses a more-general location-determination technique to navigate to a general area that is associated with the target location, and then use a more-refined location-determination technique to identify and/or navigate to the target location within the general area.

[0069] For example, the UAV 200 may navigate to the general area of a target destination where a payload 220 is being delivered using waypoints and/or map-based navigation. The UAV may then switch to a mode in which it utilizes a localization process to locate and travel to a more specific location. For instance, if the UAV 200 is to deliver a payload to a user's home, the UAV 200 may need to be substantially close to the target location in order to avoid delivery of the payload to undesired areas (e.g., onto a roof, into a pool, onto a neighbor's property, etc.). However, a GPS signal may only get the UAV 200 so far (e.g., within a block of the user's home). A more precise location-determination technique may then be used to find the specific target location.

[0070] Various types of location-determination techniques may be used to accomplish localization of the target delivery location once the UAV 200 has navigated to the general area of the target delivery location. For instance, the UAV 200 may be equipped with one or more sensory systems, such as, for example, ultrasonic sensors 204, infrared sensors (not shown), and/or other sensors, which may provide input that the navigation module 214 utilizes to navigate autonomously or semi-autonomously to the specific target location.

[0071] As another example, once the UAV 200 reaches the general area of the target delivery location (or of a moving subject such as a person or their mobile device), the UAV 200 may switch to a "fly-by-wire" mode where it is controlled, at least in part, by a remote operator, who can navigate the UAV 200 to the specific target location. To this end, sensory data from the UAV 200 may be sent to the remote operator to assist them in navigating the UAV 200 to the specific location.

[0072] As yet another example, the UAV 200 may include a module that is able to signal to a passer-by for assistance in reaching the specific target delivery location; for example, the UAV 200 may display a visual message requesting such assistance in a graphic display, or play an audio message or tone through speakers to indicate the need for such assistance, among other possibilities. Such a visual or audio message might indicate that assistance is needed in delivering the UAV 200 to a particular person or a particular location, and might provide information to assist the passer-by in delivering the UAV 200 to the person or location (e.g., a description or picture of the person or location, and/or the person or location's name), among other possibilities. Such a feature can be useful in a scenario in which the UAV is unable to use sensory functions or another location-determination technique to reach the specific target location. However, this feature is not limited to such scenarios.

[0073] In some embodiments, once the UAV 200 arrives at the general area of a target delivery location, the UAV 200 may utilize a beacon from a user's remote device (e.g., the user's mobile phone) to locate the person. Such a beacon may take various forms. As an example, consider the scenario where a remote device, such as the mobile phone of a person who requested a UAV delivery, is able to send out directional signals (e.g., via an RF signal, a light signal, and/or an audio signal). In this scenario, the UAV 200 may be configured to navigate by "sourcing" such directional signals - in other words, by determining where the signal is strongest and navigating accordingly. As another example, a mobile device can emit a frequency, either in the human range or outside the human range, and the UAV 200 can listen for that frequency and navigate accordingly. As a related example, if the UAV 200 is listening for spoken commands, then the UAV 200 could utilize spoken statements, such as "I'm over here!", to source the specific location of the person requesting delivery of a payload.
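"Sourcing" a directional beacon, as described in paragraph [0073], amounts to steering toward whichever heading yields the strongest received signal. The sketch below is a simplified illustration of that idea; the sample values and the yaw-and-sample strategy are assumptions, not details from the application:

```python
def sourcing_heading(rssi_by_heading_deg: dict[float, float]) -> float:
    """Return the heading (degrees) at which the beacon's signal strength is greatest."""
    return max(rssi_by_heading_deg, key=rssi_by_heading_deg.get)

# Signal-strength samples (dBm) taken while yawing in place; -52 dBm is the strongest.
samples = {0.0: -70.0, 90.0: -64.0, 180.0: -52.0, 270.0: -61.0}
print(f"steer toward {sourcing_heading(samples):.0f} degrees")
```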

[0074] In an alternative arrangement, a navigation module may be implemented at a remote computing device, which communicates wirelessly with the UAV 200. The remote computing device may receive data indicating the operational state of the UAV 200, sensor data from the UAV 200 that allows it to assess the environmental conditions being experienced by the UAV 200, and/or location information for the UAV 200. Provided with such information, the remote computing device may determine altitudinal and/or directional adjustments that should be made by the UAV 200 and/or may determine how the UAV 200 should adjust its mechanical features (e.g., its rudder(s), elevator(s), aileron(s), and/or the speed of its propeller(s)) in order to effectuate such movements. The remote computing system may then communicate such adjustments to the UAV 200 so it can move in the determined manner.

C. Communication Systems

[0075] In a further aspect, the UAV 200 includes one or more communication systems 216. The communication systems 216 may include one or more wireless interfaces and/or one or more wireline interfaces, which allow the UAV 200 to communicate via one or more networks. Such wireless interfaces may provide for communication under one or more wireless communication protocols, such as Bluetooth, WiFi (e.g., an IEEE 802.11 protocol), Long-Term Evolution (LTE), WiMAX (e.g., an IEEE 802.16 standard), a radio-frequency ID (RFID) protocol, near-field communication (NFC), and/or other wireless communication protocols. Such wireline interfaces may include an Ethernet interface, a Universal Serial Bus (USB) interface, or similar interface to communicate via a wire, twisted pair of wires, a coaxial cable, an optical link, a fiber-optic link, or other physical connection to a wireline network.

[0076] In some embodiments, a UAV 200 may include communication systems 216 that allow for both short-range communication and long-range communication. For example, the UAV 200 may be configured for short-range communications using Bluetooth and for long-range communications under a CDMA protocol. In such an embodiment, the UAV 200 may be configured to function as a "hot spot;" or in other words, as a gateway or proxy between a remote support device and one or more data networks, such as a cellular network and/or the Internet. Configured as such, the UAV 200 may facilitate data communications that the remote support device would otherwise be unable to perform by itself.

[0077] For example, the UAV 200 may provide a WiFi connection to a remote device, and serve as a proxy or gateway to a cellular service provider's data network, which the UAV might connect to under an LTE or a 3G protocol, for instance. The UAV 200 could also serve as a proxy or gateway to a high-altitude balloon network, a satellite network, or a combination of these networks, among others, which a remote device might not be able to otherwise access.

D. Power Systems

[0078] In a further aspect, the UAV 200 may include power system(s) 218. The power system 218 may include one or more batteries for providing power to the UAV 200. In one example, the one or more batteries may be rechargeable and each battery may be recharged via a wired connection between the battery and a power supply and/or via a wireless charging system, such as an inductive charging system that applies an external time-varying magnetic field to an internal battery.

E. Payloads

[0079] The UAV 200 may employ various systems and configurations in order to transport a payload 220. In some implementations, the payload 220 of a given UAV 200 may include or take the form of a "package" designed to transport various goods to a target delivery location. For example, the UAV 200 can include a compartment, in which an item or items may be transported. Such a package may include one or more food items, purchased goods, medical items, or any other object(s) having a size and weight suitable to be transported between two locations by the UAV. In other embodiments, a payload 220 may simply be the one or more items that are being delivered (e.g., without any package housing the items).

[0080] In some embodiments, the payload 220 may be attached to the UAV and located substantially outside of the UAV during some or all of a flight by the UAV. For example, the package may be tethered or otherwise releasably attached below the UAV during flight to a target location. In an embodiment where a package carries goods below the UAV, the package may include various features that protect its contents from the environment, reduce aerodynamic drag on the system, and prevent the contents of the package from shifting during UAV flight.

[0081] For instance, when the payload 220 takes the form of a package for transporting items, the package may include an outer shell constructed of water-resistant cardboard, plastic, or any other lightweight and water-resistant material. Further, in order to reduce drag, the package may feature smooth surfaces with a pointed front that reduces the frontal cross-sectional area. Further, the sides of the package may taper from a wide bottom to a narrow top, which allows the package to serve as a narrow pylon that reduces interference effects on the wing(s) of the UAV. This may move some of the frontal area and volume of the package away from the wing(s) of the UAV, thereby preventing the reduction of lift on the wing(s) caused by the package. Yet further, in some embodiments, the outer shell of the package may be constructed from a single sheet of material in order to reduce air gaps or extra material, both of which may increase drag on the system. Additionally or alternatively, the package may include a stabilizer to dampen package flutter. This reduction in flutter may allow the package to have a less rigid connection to the UAV and may cause the contents of the package to shift less during flight.

[0082] In order to deliver the payload, the UAV may include a retractable delivery system that lowers the payload to the ground while the UAV hovers above. For instance, the UAV may include a tether that is coupled to the payload by a release mechanism. A winch can unwind and wind the tether to lower and raise the release mechanism. The release mechanism can be configured to secure the payload while being lowered from the UAV by the tether and release the payload upon reaching ground level. The release mechanism can then be retracted to the UAV by reeling in the tether using the winch.

[0083] In some implementations, the payload 220 may be passively released once it is lowered to the ground. For example, a passive release mechanism may include one or more swing arms adapted to retract into and extend from a housing. An extended swing arm may form a hook on which the payload 220 may be attached. Upon lowering the release mechanism and the payload 220 to the ground via a tether, a gravitational force as well as a downward inertial force on the release mechanism may cause the payload 220 to detach from the hook, allowing the release mechanism to be raised upwards toward the UAV. The release mechanism may further include a spring mechanism that biases the swing arm to retract into the housing when there are no other external forces on the swing arm. For instance, a spring may exert a force on the swing arm that pushes or pulls the swing arm toward the housing such that the swing arm retracts into the housing once the weight of the payload 220 no longer forces the swing arm to extend from the housing. Retracting the swing arm into the housing may reduce the likelihood of the release mechanism snagging the payload 220 or other nearby objects when raising the release mechanism toward the UAV upon delivery of the payload 220.

[0084] Active payload release mechanisms are also possible. For example, sensors such as a barometric pressure based altimeter and/or accelerometers may help to detect the position of the release mechanism (and the payload) relative to the ground. Data from the sensors can be communicated back to the UAV and/or a control system over a wireless link and used to help in determining when the release mechanism has reached ground level (e.g., by detecting a measurement with the accelerometer that is characteristic of ground impact). In other examples, the UAV may determine that the payload has reached the ground based on a weight sensor detecting a threshold low downward force on the tether and/or based on a threshold low measurement of power drawn by the winch when lowering the payload.

[0085] Other systems and techniques for delivering a payload, in addition or in the alternative to a tethered delivery system, are also possible. For example, a UAV 200 could include an air-bag drop system or a parachute drop system. Alternatively, a UAV 200 carrying a payload could simply land on the ground at a delivery location. Other examples are also possible.

IV. Illustrative UAV Deployment Systems

[0086] UAV systems may be implemented in order to provide various UAV-related services. In particular, UAVs may be provided at a number of different launch sites that may be in communication with regional and/or central control systems. Such a distributed UAV system may allow UAVs to be quickly deployed to provide services across a large geographic area (e.g., that is much larger than the flight range of any single UAV). For example, UAVs capable of carrying payloads may be distributed at a number of launch sites across a large geographic area (possibly even throughout an entire country, or even worldwide), in order to provide on-demand transport of various items to locations throughout the geographic area. Figure 3 is a simplified block diagram illustrating a distributed UAV system 300, according to an example embodiment.

[0087] In the illustrative UAV system 300, an access system 302 may allow for interaction with, control of, and/or utilization of a network of UAVs 304. In some embodiments, an access system 302 may be a computing system that allows for human-controlled dispatch of UAVs 304. As such, the control system may include or otherwise provide a user interface through which a user can access and/or control the UAVs 304.

[0088] In some embodiments, dispatch of the UAVs 304 may additionally or alternatively be accomplished via one or more automated processes. For instance, the access system 302 may dispatch one of the UAVs 304 to transport a payload to a target location, and the UAV may autonomously navigate to the target location by utilizing various on-board sensors, such as a GPS receiver and/or other various navigational sensors.

[0089] Further, the access system 302 may provide for remote operation of a UAV. For instance, the access system 302 may allow an operator to control the flight of a UAV via its user interface. As a specific example, an operator may use the access system 302 to dispatch a UAV 304 to a target location. The UAV 304 may then autonomously navigate to the general area of the target location. At this point, the operator may use the access system 302 to take control of the UAV 304 and navigate the UAV to the target location (e.g., to a particular person to whom a payload is being transported). Other examples of remote operation of a UAV are also possible.

[0090] In an illustrative embodiment, the UAVs 304 may take various forms. For example, each of the UAVs 304 may be a UAV such as those illustrated in Figures 1, 2, 3, or 4. However, UAV system 300 may also utilize other types of UAVs without departing from the scope of the invention. In some implementations, all of the UAVs 304 may be of the same or a similar configuration. However, in other implementations, the UAVs 304 may include a number of different types of UAVs. For instance, the UAVs 304 may include a number of types of UAVs, with each type of UAV being configured for a different type or types of payload delivery capabilities.

[0091] The UAV system 300 may further include a remote device 306, which may take various forms. Generally, the remote device 306 may be any device through which a direct or indirect request to dispatch a UAV can be made. (Note that an indirect request may involve any communication that may be responded to by dispatching a UAV, such as requesting a package delivery.) In an example embodiment, the remote device 306 may be a mobile phone, tablet computer, laptop computer, personal computer, or any network-connected computing device. Further, in some instances, the remote device 306 may not be a computing device. As an example, a standard telephone, which allows for communication via plain old telephone service (POTS), may serve as the remote device 306. Other types of remote devices are also possible.

[0092] Further, the remote device 306 may be configured to communicate with access system 302 via one or more types of communication network(s) 308. For example, the remote device 306 may communicate with the access system 302 (or a human operator of the access system 302) by communicating over a POTS network, a cellular network, and/or a data network such as the Internet. Other types of networks may also be utilized.

[0093] In some embodiments, the remote device 306 may be configured to allow a user to request delivery of one or more items to a desired location. For example, a user could request UAV delivery of a package to their home via their mobile phone, tablet, or laptop. As another example, a user could request dynamic delivery to wherever they are located at the time of delivery. To provide such dynamic delivery, the UAV system 300 may receive location information (e.g., GPS coordinates, etc.) from the user's mobile phone, or any other device on the user's person, such that a UAV can navigate to the user's location (as indicated by their mobile phone).

[0094] In an illustrative arrangement, the central dispatch system 310 may be a server or group of servers, which is configured to receive dispatch message requests and/or dispatch instructions from the access system 302. Such dispatch messages may request or instruct the central dispatch system 310 to coordinate the deployment of UAVs to various target locations. The central dispatch system 310 may be further configured to route such requests or instructions to one or more local dispatch systems 312. To provide such functionality, the central dispatch system 310 may communicate with the access system 302 via a data network, such as the Internet or a private network that is established for communications between access systems and automated dispatch systems.

[0095] In the illustrated configuration, the central dispatch system 310 may be configured to coordinate the dispatch of UAVs 304 from a number of different local dispatch systems 312. As such, the central dispatch system 310 may keep track of which UAVs 304 are located at which local dispatch systems 312, which UAVs 304 are currently available for deployment, and/or which services or operations each of the UAVs 304 is configured for (in the event that a UAV fleet includes multiple types of UAVs configured for different services and/or operations). Additionally or alternatively, each local dispatch system 312 may be configured to track which of its associated UAVs 304 are currently available for deployment and/or are currently in the midst of item transport.
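The bookkeeping described in paragraph [0095], which UAVs sit at which local dispatch system, which are currently available, and what service each is configured for, can be pictured as a small registry. The sketch below is illustrative only; the record fields and the selection rule are assumptions for the example, not the application's design:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UavRecord:
    uav_id: str
    local_dispatch: str
    available: bool
    service: str            # e.g. "package delivery" or "food delivery"

class CentralDispatchRegistry:
    def __init__(self) -> None:
        self._fleet: dict[str, UavRecord] = {}

    def register(self, record: UavRecord) -> None:
        self._fleet[record.uav_id] = record

    def select(self, service: str) -> Optional[UavRecord]:
        """Pick any currently available UAV configured for the requested service."""
        return next((r for r in self._fleet.values()
                     if r.available and r.service == service), None)

registry = CentralDispatchRegistry()
registry.register(UavRecord("uav-1", "local-dispatch-east", True, "package delivery"))
registry.register(UavRecord("uav-2", "local-dispatch-west", False, "food delivery"))
print(registry.select("package delivery"))
```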

[0096] In some cases, when the central dispatch system 310 receives a request for UAV-related service (e.g., transport of an item) from the access system 302, the central dispatch system 310 may select a specific UAV 304 to dispatch. The central dispatch system 310 may accordingly instruct the local dispatch system 312 that is associated with the selected UAV to dispatch the selected UAV. The local dispatch system 312 may then operate its associated deployment system 314 to launch the selected UAV. In other cases, the central dispatch system 310 may forward a request for a UAV-related service to a local dispatch system 312 that is near the location where the support is requested and leave the selection of a particular UAV 304 to the local dispatch system 312.

[0097] In an example configuration, the local dispatch system 312 may be implemented as a computing system at the same location as the deployment system(s) 314 that it controls. For example, the local dispatch system 312 may be implemented by a computing system installed at a building, such as a warehouse, where the deployment system(s) 314 and UAV(s) 304 that are associated with the particular local dispatch system 312 are also located. In other embodiments, the local dispatch system 312 may be implemented at a location that is remote to its associated deployment system(s) 314 and UAV(s) 304.

[0098] Numerous variations on and alternatives to the illustrated configuration of the UAV system 300 are possible. For example, in some embodiments, a user of the remote device 306 could request delivery of a package directly from the central dispatch system 310. To do so, an application may be implemented on the remote device 306 that allows the user to provide information regarding a requested delivery, and generate and send a data message to request that the UAV system 300 provide the delivery. In such an embodiment, the central dispatch system 310 may include automated functionality to handle requests that are generated by such an application, evaluate such requests, and, if appropriate, coordinate with an appropriate local dispatch system 312 to deploy a UAV.

[0099] Further, some or all of the functionality that is attributed herein to the central dispatch system 310, the local dispatch system(s) 312, the access system 302, and/or the deployment system(s) 314 may be combined in a single system, implemented in a more complex system, and/or redistributed among the central dispatch system 310, the local dispatch system(s) 312, the access system 302, and/or the deployment system(s) 314 in various ways.

[0100] Yet further, while each local dispatch system 312 is shown as having two associated deployment systems 314, a given local dispatch system 312 may alternatively have more or fewer associated deployment systems 314. Similarly, while the central dispatch system 310 is shown as being in communication with two local dispatch systems 312, the central dispatch system 310 may alternatively be in communication with more or fewer local dispatch systems 312.

[0101] In a further aspect, the deployment systems 314 may take various forms. In general, the deployment systems 314 may take the form of or include systems for physically launching one or more of the UAVs 304. Such launch systems may include features that provide for an automated UAV launch and/or features that allow for a human-assisted UAV launch. Further, the deployment systems 314 may each be configured to launch one particular UAV 304, or to launch multiple UAVs 304.

[0102] The deployment systems 314 may further be configured to provide additional functions, including for example, diagnostic-related functions such as verifying system functionality of the UAV, verifying functionality of devices that are housed within a UAV (e.g., a payload delivery apparatus), and/or maintaining devices or other items that are housed in the UAV (e.g., by monitoring a status of a payload such as its temperature, weight, etc.).

[0103] In some embodiments, the deployment systems 314 and their corresponding UAVs 304 (and possibly associated local dispatch systems 312) may be strategically distributed throughout an area such as a city. For example, the deployment systems 314 may be strategically distributed such that each deployment system 314 is proximate to one or more payload pickup locations (e.g., near a restaurant, store, or warehouse). However, the deployment systems 314 (and possibly the local dispatch systems 312) may be distributed in other ways, depending upon the particular implementation. As an additional example, kiosks that allow users to transport packages via UAVs may be installed in various locations. Such kiosks may include UAV launch systems, and may allow a user to provide their package for loading onto a UAV and pay for UAV shipping services, among other possibilities. Other examples are also possible.

[0104] In a further aspect, the UAV system 300 may include or have access to a user-account database 316. The user-account database 316 may include data for a number of user accounts, which are each associated with one or more persons. For a given user account, the user-account database 316 may include data related to or useful in providing UAV-related services. Typically, the user data associated with each user account is optionally provided by an associated user and/or is collected with the associated user's permission.

[0105] Further, in some embodiments, a person may be required to register for a user account with the UAV system 300 if they wish to be provided with UAV-related services by the UAVs 304 from UAV system 300. As such, the user-account database 316 may include authorization information for a given user account (e.g., a user name and password), and/or other information that may be used to authorize access to a user account.

[0106] In some embodiments, a person may associate one or more of their devices with their user account, such that they can access the services of UAV system 300. For example, when a person uses an associated mobile phone to, e.g., place a call to an operator of the access system 302 or send a message requesting a UAV-related service to a dispatch system, the phone may be identified via a unique device identification number, and the call or message may then be attributed to the associated user account. Other examples are also possible.

V. Browsing Available Items

[0107] As discussed above, a user may submit a transport request via a remote device (e.g., a smart phone, laptop, personal computer, or any other client computing device), and a UAV system may responsively deliver one or more items to a target location based on the transport request. In order to facilitate placing the transport request, a client computing device may provide an application having various interfaces for browsing and selecting items available for UAV delivery. Figures 4A-4C illustrate examples of such interfaces, and while the illustrated examples are related to UAV delivery of food items, other examples may relate to UAV delivery of various other items or goods.

[0108] Figure 4A illustrates an example interface 400 for browsing and selecting a vendor (e.g., a restaurant, store, etc.). As shown, interface 400 includes a list of available restaurants 402 that provide one or more food items available for UAV delivery. The list of available restaurants 402 may be based on a desired delivery time and location. For instance, interface 400 shows a list of restaurants 402 with one or more food items available to be delivered around 4:30 PM to the user's current location, which may be determined, for instance, based on a GPS receiver of the client computing device. A restaurant may be included in or excluded from the list 402 based on its proximity to the target delivery location, an expected preparation time of a food item, an expected transit time from the restaurant to the target delivery location, and/or an availability of UAVs to the restaurant. Other factors may also be considered.

[0109] For each restaurant included in the list, interface 400 may display a restaurant name (e.g., "Sandwich Shop," "Mexican Grill," "Pizza Place"), an expected delivery time, and a status of the availability of UAVs to deliver one or more food items from the restaurant (e.g., displaying "airways open" for a restaurant that is associated with one or more UAVs available to deliver). Other information associated with each restaurant may be displayed as well, including but not limited to a closing time for each restaurant (e.g., displaying "last flight departs at 9pm" for a restaurant that closes at 9pm).

[0110] Through interface 400, the user may select one of the available restaurants, and the client computing device may provide another interface for selecting one or more food items from the selected restaurant, as shown by example interface 410 in Figure 4B. For instance, in response to selecting "Sandwich Shop" through interface 400, interface 410 may display a menu that includes a list of food items provided by the Sandwich Shop. In some examples, the displayed menu may be limited to only include items that are available for UAV delivery. Such available items may be determined based on a weight, a size, a temperature, a container, or a fluidity of a particular food item. Other characteristics of a food item may be relevant as well.

[0111] Further, while the user is selecting one or more items for UAV delivery through interface 410, the client computing device may consider an overall size and/or weight of the selected items. For instance, the device may determine that the selected items exceed a predetermined size or weight threshold (e.g., a size or weight too large to be delivered by a UAV), and the device may indicate this to the user (e.g., through interface 410). The user may then de-select one or more items until the device determines that the size and/or weight thresholds are not exceeded. In some examples, interface 410 may indicate a weight associated with each food item as well as the maximum threshold weight in order to help the user determine which food items to select or not select.
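As a non-limiting illustration, a client-side weight check of the kind described above could take the following form; the payload limit and the item weights are assumptions for the example only and are not values from the present disclosure.

```python
# Minimal sketch, assuming each selected item carries a known weight in grams
# and a single-UAV payload limit; the 2500 g limit is an assumption.
MAX_PAYLOAD_G = 2500  # assumed single-UAV payload limit

def order_within_limit(item_weights_g, limit_g=MAX_PAYLOAD_G):
    """Return (ok, total) so the interface can warn the user to de-select items."""
    total = sum(item_weights_g)
    return total <= limit_g, total

ok, total = order_within_limit([450, 900, 1300])
print(ok, total)  # False, 2650 -> interface prompts the user to remove an item
```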

[0112] Alternatively or additionally, in response to detecting that the user has selected various items having a total weight that exceeds a weight limit of a single UAV, interface 410 may indicate that delivery will require multiple UAVs. This may result in greater fees and/or increased delivery times, so interface 410 may provide the user with an option to approve the multi-UAV delivery or modify the order to reduce the total order weight below the weight limit of a single UAV. If the user approves the multi-UAV delivery, then the selected items may be divided amongst a number of UAVs and delivered in multiple flights.
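Purely as an illustrative sketch of dividing an approved multi-UAV order into flights, a simple first-fit heuristic could be used; the disclosure states only that items may be divided amongst a number of UAVs, so the packing strategy and values below are assumptions.

```python
# A minimal sketch of dividing an approved multi-UAV order among flights,
# using a simple first-fit heuristic; the strategy itself is illustrative.
def split_among_uavs(item_weights_g, limit_g):
    """Greedy first-fit: place each item in the first flight with room."""
    flights = []
    for w in sorted(item_weights_g, reverse=True):
        for flight in flights:
            if sum(flight) + w <= limit_g:
                flight.append(w)
                break
        else:
            flights.append([w])  # start a new flight (another UAV)
    return flights

print(split_among_uavs([450, 900, 1300], limit_g=2500))
# [[1300, 900], [450]] -> two flights, so fees and delivery time may increase
```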

VI. Transport Request Process and Interface

[0113] Figure 5 is a flow chart of an example method 500 that could be used to place a UAV transport request. The example method 500 may include one or more operations, functions, or actions, as depicted by one or more of blocks 502, 504, 506, and/or 508, each of which may be carried out by any of the devices or systems disclosed herein; however, other configurations could also be used.

[0114] Further, those skilled in the art will understand that the flow chart described herein illustrates functionality and operation of certain implementations of example embodiments. In this regard, each block of the flow chart may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process. The program code may be stored on any type of computer readable medium, for example, such as a storage device including a disk or hard drive. In addition, each block may represent circuitry that is wired to perform the specific logical functions in the process. Alternative implementations are included within the scope of the example embodiments of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art.

[0115] Method 500 begins at block 502, which includes receiving, by a client computing device, a transport request for transport of one or more items by an unmanned aerial vehicle. The client computing device may take various forms, such as a smartphone, a tablet, or a personal computer, and the transport request may include a purchase order of one or more items via an online marketplace. For example, the transport request may include a purchase order of one or more food items from a restaurant as described above by way of Figures 4A and 4B.

[0116] Method 500 continues at block 504, which includes, responsive to receiving the transport request, the client computing device (i) determining one or more UAV-accessible sub-areas within a geographic area associated with the client computing device, and (ii) displaying, on a graphic display, a graphic map interface indicating the one or more UAV-accessible sub-areas, as well as at block 506, which includes receiving, via the graphic map interface of the client computing device, a selection of one of the UAV-accessible sub-areas.

[0117] Identifying one or more UAV-accessible sub-areas may first involve identifying the geographic area associated with the client computing device. In some examples, the geographic area associated with the client computing device may include an area surrounding the client computing device. In other examples, the geographic area associated with the client computing device may be determined based on user input specifying a target delivery location through an interface of the client computing device.

[0118] For example, Figure 6A shows an example interface 600 of an application for selecting a target UAV delivery location. Interface 600 includes a graphic map interface 602. The map interface 602 may include an overhead view of a geographic area. For instance, the geographic area may be an area surrounding a location of the client computing device, which may be indicated by an icon 604 displayed on the map interface 602.

[0119] In order to select the target delivery location via interface 600, the user may interact with the map interface 602. For instance, the map interface 602 may be displayed on a touchscreen of the client computing device such that the user may select a desired target delivery location by touching a corresponding location on the map interface 602. In other examples, the user may click on the map interface 602 using a mouse or stylus, or the user may enter an address of the desired target delivery location into an address field 606 of interface 600.

[0120] The client computing device may determine whether there are any UAV-accessible delivery locations at or near the desired target delivery location, for instance, by referencing a database of UAV-accessible delivery locations. A particular location may be deemed UAV-accessible if it is determined that a UAV can deliver a payload to that location. For example, a location may be UAV-accessible if it is tether-accessible (i.e., if a UAV can deliver a payload to the location by lowering the payload from the UAV to the ground via a tether). Alternatively, a location may be UAV-accessible if a UAV can land at the location, release a payload, and take off from the location.

[0121] A topographical map of a geographic area may be used to determine whether the geographic area includes any UAV-accessible locations. For instance, the topographical map may indicate that a sub-area of the geographic area has an unobstructed vertical path between the ground and the sky. In order for the sub-area to be UAV-accessible, the unobstructed vertical path may be large enough to accommodate the delivery of one or more items by the UAV (e.g., the unobstructed path over the sub-area may be large enough for the UAV to lower an item to the ground via a tether and/or large enough for the UAV to land and take off). In some examples, in order for the sub-area to be UAV-accessible, the unobstructed vertical path may cover an area on the ground equal to or larger than a circular area having a diameter of three meters. Other examples are possible as well.
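For illustration, the three-meter clearance test described above could be expressed as follows, assuming an obstruction map can report the radius of the largest unobstructed circle over a candidate sub-area (a hypothetical input, not specified in the disclosure).

```python
# Sketch of the clearance test described above; function names are hypothetical.
import math

MIN_CLEARANCE_DIAMETER_M = 3.0  # example figure given in the text

def has_delivery_clearance(unobstructed_radius_m,
                           min_diameter_m=MIN_CLEARANCE_DIAMETER_M):
    """True if the vertical corridor covers at least the minimum circular area."""
    required_area = math.pi * (min_diameter_m / 2) ** 2
    available_area = math.pi * unobstructed_radius_m ** 2
    return available_area >= required_area

print(has_delivery_clearance(1.2))  # False: corridor narrower than 3 m across
print(has_delivery_clearance(2.0))  # True: 4 m wide corridor is sufficient
```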

[0122] Further, the topographical map may indicate one or more surface features of a sub-area of the geographic area. For instance, the sub-area may include a water surface feature (e.g., a lake, pond, river, swimming pool, etc.), a ground surface feature (e.g., dirt, grass, plants, concrete, asphalt, etc. located at ground level), or a building surface feature (e.g., a house, garage, shed, apartment, condo, commercial building, etc.). A particular sub-area may or may not be UAV-accessible based on its surface features. For instance, in examples where a UAV is delivering a food item, a sub-area having a water feature may not be UAV-accessible because the water feature may damage the food item. Similarly, a sub-area having a building feature may not be UAV-accessible because delivering the food item to the roof of a building may lead to a poor user experience.

[0123] In some examples, in addition to or in the alternative to using a topographical map to determine surface features of a geographic area, various image processing techniques may be applied to a satellite image of a geographic area to determine its surface features. For example, image processing may be applied to the image to determine that a particular surface of a geographic area includes grass, concrete, asphalt, gravel, water, plants, building structures, etc. Further, image processing may be applied to the image to identify three-dimensional characteristics of the surface features. Projective geometry, for instance, may be used to determine a three-dimensional shape of one or more surface features in a two-dimensional satellite image. A sub-area of the geographic area may thus be identified as UAV-accessible based on a three-dimensional shape of one or more surface features depicted in an image of the geographic area.

[0124] Further, a sub-area of the geographic area may be identified as UAV-accessible based on various onboard sensors of a UAV. As discussed above, a UAV may include one or more downward-facing cameras (e.g., as part of an optical flow navigational system). These cameras may capture an image and/or video of an area below the UAV. Based on the captured image and/or video, a computing system of the UAV may determine one or more surface features of the area (e.g., whether the area includes water, earth, a building, or some other feature). The computing system may further determine, based on the captured image and/or video, whether the area includes an unobstructed path between the surface feature (e.g., the ground, the roof of a building, etc.) and the sky. Alternatively or additionally, the UAV may include a LIDAR system for detecting the environment around the UAV to determine whether an area is UAV-accessible. Other examples are possible as well.

[0125] In some examples, a map of property lines for a geographic area may alternatively or additionally be used to determine whether the geographic area includes any UAV-accessible locations, and a particular sub-area within the geographic area may or may not be UAV-accessible based on its proximity to one or more property lines. For instance, if the user specifies his or her home as the target delivery location, it may be desirable to avoid delivering the food item to the user's neighbor's property (e.g., to avoid disputes between neighbors, to ensure delivery to the correct address, etc.). Thus, in some examples, a sub-area may not be UAV-accessible if it is located within three meters of a property line.
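A minimal sketch of the property-line setback check might look like the following, assuming parcel boundaries are available as line segments in a local metric frame; the geometry helper, the coordinates, and the three-meter setback default are illustrative assumptions.

```python
# Minimal sketch of a property-line setback check; 3 m is the example figure
# from the text, and all other values are assumptions for illustration.
import math

def point_segment_distance(p, a, b):
    """Distance from point p to segment ab (all 2-D tuples in metres)."""
    px, py = p
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def respects_setback(candidate, boundary_segments, setback_m=3.0):
    """Reject sub-areas whose centre lies within the setback of any lot line."""
    return all(point_segment_distance(candidate, a, b) >= setback_m
               for a, b in boundary_segments)

boundary = [((0, 0), (0, 20)), ((0, 20), (15, 20))]
print(respects_setback((7.5, 10.0), boundary))  # True: well inside the parcel
print(respects_setback((1.0, 10.0), boundary))  # False: within 3 m of a lot line
```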

[0126] In further examples, a particular location may or may not be designated as UAV-accessible based on whether the location includes a hazard (i.e., whether the location is a safe location for a person or a UAV to occupy). For example, a roadway may provide an unobstructed vertical path to the sky; however, it would be unsafe for a UAV to land or otherwise deliver a payload to the surface of a roadway, or for a person to retrieve the payload from the roadway. Various other locations may also not be designated as UAV-accessible based on safety concerns, such as construction sites, railways, bridges, etc.

[0127] Additionally, a weather condition or other environmental condition of a geographic area may be considered in determining whether it includes a UAV-accessible location. For example, when an area is experiencing strong winds, a location within the area may not be deemed UAV-accessible, or, alternatively, in order to be UAV-accessible, the location may need to include a larger unobstructed path between the ground and the sky (e.g., providing a large clearance from trees, buildings, or other objects that may emit potentially damaging debris in windy conditions). Other factors may be taken into account as well, including, but not limited to, an extent of rainfall, snow, hail, sleet, or lightning at a particular location. As such, a particular location may be dynamically identified as UAV-accessible based on various weather and/or environmental conditions.
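As a hedged illustration of such dynamic, weather-dependent accessibility, the required clearance could be scaled with wind speed and delivery ruled out entirely in severe conditions; the thresholds and scaling factor below are assumptions, not values from the disclosure.

```python
# Sketch of dynamically adjusting the accessibility test for weather: in strong
# wind a larger unobstructed corridor is required. Thresholds are illustrative.
def required_clearance_m(base_diameter_m=3.0, wind_speed_mps=0.0,
                         lightning_nearby=False):
    """Return the clearance diameter needed, or None if delivery is ruled out."""
    if lightning_nearby:
        return None                   # location not UAV-accessible right now
    if wind_speed_mps > 12.0:
        return None                   # winds too strong for delivery at all
    if wind_speed_mps > 6.0:
        return base_diameter_m * 2.0  # demand extra clearance from trees, etc.
    return base_diameter_m

print(required_clearance_m(wind_speed_mps=2.0))   # 3.0 m in calm conditions
print(required_clearance_m(wind_speed_mps=8.0))   # 6.0 m in gusty conditions
print(required_clearance_m(wind_speed_mps=15.0))  # None: not accessible now
```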

[0128] In accordance with the above examples, a number of UAV-accessible sub-areas associated with a geographic area may be determined and stored in a database (e.g., by storing GPS coordinates for each UAV-accessible sub-area). Referring back to Figure 6A, the client computing device may refer to the database of UAV-accessible sub-areas to determine whether there are any UAV-accessible sub-areas at or near the desired target delivery location selected by the user. If there are no UAV-accessible sub-areas within a threshold distance of the selected delivery location, the interface 600 may display an indication as such and/or may display an indication 608 of the nearest UAV-accessible sub-area. Alternatively, if there are one or more UAV-accessible sub-areas at or near the selected delivery location, then the client computing device may display a more detailed graphic map interface for selecting one of the UAV-accessible sub-areas. Examples of such an interface are shown in Figures 6B and 6C.
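For illustration, a lookup against such a database of stored sub-areas might be sketched as follows, assuming each record holds a name and GPS coordinates and that a great-circle distance is sufficient; the 100-meter threshold and the sample coordinates are assumptions.

```python
# Minimal sketch of looking up stored UAV-accessible sub-areas near a chosen
# delivery location; the database schema and threshold are illustrative.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_sub_areas(target, sub_areas, threshold_m=100.0):
    """Return stored sub-areas within the threshold of the selected location."""
    lat, lon = target
    return [s for s in sub_areas
            if haversine_m(lat, lon, s[1], s[2]) <= threshold_m]

stored = [("Backyard", 37.4221, -122.0841), ("Patio", 37.4223, -122.0843)]
print(nearby_sub_areas((37.4220, -122.0840), stored))  # both within 100 m
```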

[0129] Figure 6B illustrates an example interface 610 for selecting a target UAV delivery location. Interface 610 includes a graphic map interface 612 displaying an overhead view of a geographic area. As mentioned above, the geographic area depicted in map interface 612 may represent a portion of the geographic area depicted in map interface 602. The overhead view may be of a real estate property, such as a particular residence, business, park, municipal building, or some other location that may be selected via interface 600 (e.g., by interacting with map interface 602 or by inputting an address of the location).

[0130] In one example, a user may select via interface 600 his or her home as the target delivery location, and map interface 612 may display an overhead view depicting one or more UAV-accessible delivery locations at or near the user's home. For instance, as depicted in Figure 6B, the UAV-accessible delivery locations near the user's home may include a backyard location 614 and a patio location 616, and these delivery locations may be indicated via map interface 612. In other examples, there may be more or fewer UAV-accessible delivery locations than those illustrated, and map interface 612 may display more or fewer UAV-accessible delivery locations in various locations, including the backyard, patio, and/or other areas (e.g., front yard, driveway, porch, etc.).

[0131] Map interface 612 may display the UAV-accessible delivery locations in various manners. For instance, map interface 612 may display one or more graphics 618 associated with each UAV-accessible delivery location. The graphics 618 may indicate a boundary of each delivery location. For instance, in examples where a UAV-accessible delivery location comprises a circular area having an unobstructed vertical path to the sky, the graphic 618 may include a circle superimposed on the circular area. Other shapes or arrangements are possible as well.

[0132] Further, map interface 612 may display a name of each UAV-accessible delivery location. For instance, map interface 612 may display the name "Backyard" near the indicated backyard delivery location 614 and the name "Patio" near the indicated patio delivery location 616. In examples where map interface 612 displays different or additional delivery locations, map interface 612 may also display corresponding names for each displayed delivery location (e.g., "Front Yard," "Driveway," "Porch," etc.).

[0133] In some examples, the names of the UAV-accessible delivery locations may be specified by a user (e.g., by inputting the names via interface 610). For instance, the user may determine that the backyard delivery location 614 is located in the user's backyard and responsively label the location as "Backyard." In other examples, the names of the UAV-accessible delivery locations may be determined by applying image processing techniques to satellite images. For instance, as discussed above, various image processing techniques may be applied to a satellite image to determine that a particular surface of a geographic area includes grass, concrete, asphalt, gravel, water, plants, building structures, etc. Based on the relative position of the determined surface features, identities and/or names may be associated with particular areas. As an example, a grass area may be identified as a backyard, or a concrete area may be identified as a patio, based on their relative locations to a building structure. Other examples are possible as well.

[0134] Of the displayed UAV-accessible delivery locations, a user may select one (e.g., via touchscreen or mouse input) as a target delivery location. Interface 610 may provide a visual indication of the selected delivery location. For instance, as depicted in Figure 6B, responsive to receiving a selection of the backyard delivery location 614, interface 610 may display the graphic 618 of the backyard delivery location 614 in a particular color that is different from the unselected patio delivery location 616. Further, interface 610 may display an icon 620 overlaid on the selected backyard delivery location 614. While the icon 620 is depicted as an aerodynamically shaped package having a handle for attaching to a UAV, the icon 620 may take various other forms as well.

[0135] While map interface 612 displays an overhead view of the geographic area selected via interface 600, Figure 6C illustrates an example interface 630 that includes a graphic map interface 632 displaying an oblique view of the geographic area. A user may access interface 630, for instance, by selecting an oblique view button 634 via interface 610. Similarly, a user may access interface 610 by selecting an overhead view button 636 via interface 630.

[0136] Map interface 632 may display an oblique view of the same or similar geographic area and the same or similar UAV-accessible delivery locations as those displayed by map interface 612. For instance, as illustrated, map interface 632 displays an oblique view of the user's home that was selected via interface 600.

[0137] Map interface 632 may further display the available UAV-accessible delivery locations near the user's home, which may include the backyard location 614 and the patio location 616. The UAV-accessible delivery locations may be indicated by graphics 618 displayed by interface 630. As illustrated, the graphics 618 may include a boundary of the UAV-accessible delivery locations, such as an ellipse or circle superimposed on the ground. The graphics 618 may further include one or more lines extending from the boundary (e.g., the ellipse or circle) upwards towards the sky, and the lines may fade from opaque at the boundary to transparent closer to the sky. In this manner, the graphics 618 may appear as holographic cylinders that fade into transparency as they extend above the ground.

[0138] Similar to map interface 612, a user may select one of the UAV-accessible delivery locations displayed via map interface 632 as a target delivery location. Interface 630 may provide a visual indication of the selected delivery location, for instance, by displaying the graphic 618 of the selected delivery location in a particular color that is different from the unselected delivery location and/or by displaying an icon (e.g., a UAV delivery package) overlaid on the selected delivery location.

[0139] Referring next to Figure 6D, an example interface 640 is shown for selecting a target delivery time and location. Interface 640 may include a delivery time field 642 for selecting a target delivery time. For instance, a user may select a time corresponding to immediate or as-soon-as-possible delivery (e.g., on the next available UAV), or the user may select a time corresponding to some time in the future. As illustrated, the time field 642 may take the form of a dropdown menu, but other examples are possible as well.

[0140] In some examples, the time field 642 may allow selection of location-based future delivery. In location-based future delivery, rather than initiating or completing a delivery at a user-specified time, the delivery may be carried out based on a user-specified location. For example, if the user chooses location-based future delivery and selects the user's home as the destination, a UAV system may initiate the delivery once it is determined that the user has arrived at his or her home (e.g., based on GPS coordinates of the client computing device). Alternatively, the UAV system may predict a time when the user will arrive at his or her home (e.g., based on a calendar associated with the user, a current location of the user, current and/or prior movement patterns of the user, etc.), and carry out the delivery request such that the delivery is completed at the predicted time.
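A minimal sketch of the location-based trigger described above could compare the device's reported coordinates against the chosen destination; the arrival radius, the distance approximation, and the sample coordinates are assumptions for illustration only.

```python
# Sketch of a location-based trigger for future delivery, assuming the client
# device periodically reports GPS coordinates and that being within ~50 m of
# the destination counts as "arrived". All names and values are illustrative.
import math

def approx_distance_m(p1, p2):
    """Equirectangular approximation, adequate over short distances."""
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6371000.0 * math.hypot(x, y)

def should_initiate_delivery(user_position, destination, arrival_radius_m=50.0):
    """True once the reported position is within the arrival radius of home."""
    return approx_distance_m(user_position, destination) <= arrival_radius_m

home = (37.4220, -122.0840)
print(should_initiate_delivery((37.5000, -122.2000), home))  # False: still away
print(should_initiate_delivery((37.4221, -122.0841), home))  # True: dispatch UAV
```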

[0141] Interface 640 may further include a list 644 of available UAV-accessible delivery locations, and a user may select a target delivery location from the delivery location list 644. The list 644 may identify the locations by an address and/or a name associated with each location. For instance, a user may specify that a particular address is associated with the user's home, and the list 644 may identify that address by the name "Home." Other examples are possible as well.

[0142] In some examples, the list 644 may be populated based on the user's order history and/or user preferences associated with an account of the user. For instance, the list 644 may include recently selected delivery locations, frequently selected delivery locations, and/or delivery locations identified as preferred by the user. Alternatively or additionally, the delivery location list 644 may identify one or more UAV-accessible delivery locations within a threshold distance of the user (e.g., based on GPS coordinates of the client computing device) and/or within a threshold distance of a location specified by the user (e.g., via interface 600).

[0143] Further, a user may select from the list 644 a current location of the client computing device as the target delivery location. Such a selection may cause the UAV to deliver its payload to the UAV-accessible delivery location that is nearest to the current location of the client computing device.

[0144] Referring next to Figure 6E, an example interface 650 is shown for placing a UAV delivery request. Interface 650 may include a price field 652, a payment field 654, and a destination field 656. The price field 652 may display a total price of the UAV delivery request, which may be itemized to display a cost of the ordered items, a tax, and any delivery fees or other fees. The payment field 654 may provide an interface for a user to input payment information, such as a credit/debit card number. The destination field 656 may identify the selected target delivery location (e.g., by an address and/or a name associated with the selected target delivery location). In some examples, the selected target delivery location may be a delivery location selected through interface 610 and/or 630. In other examples, the selected target delivery location may be a default location based on the user's order history (e.g., most recent or most frequent delivery location) or user preferences associated with an account of the user (e.g., a location identified as preferred by the user).

[0145] Referring back to Figure 5, method 500 continues at block 508, which includes, responsive to receiving a selection of one of the UAV-accessible sub-areas, the client device causing the UAV to transport the one or more items to the selected UAV-accessible sub-area. For instance, once a user has placed an order for one or more goods (e.g., via one or more of the interfaces depicted in Figures 4A and 4B) and selected a target delivery location (e.g., via one or more of the interfaces depicted in Figures 6A-6E), a UAV system, such as UAV system 300, may dispatch one or more UAVs to the target delivery location. A payload package containing the ordered goods may be attached to the UAV, and the UAV may navigate to the target delivery location where the payload may be delivered by the UAV (e.g., by lowering the payload to the ground via a tether) and retrieved by the user.

VII. Interface Features for Tracking Fulfillment Process for UAV Transport Request

A. Status Update Methods

[0146] Figures 7A and 7B are flow charts illustrating methods for providing a client-facing application for a UAV transport service, according to an example embodiment. In particular, Figure 7A illustrates a method 700 that can be implemented by a client device to provide status information and related functionality during fulfillment of a UAV transport request. Correspondingly, Figure 7B illustrates a method 750 that may be implemented by a UAV transport service provider's server system to facilitate the functionality of a client-facing application shown in Figure 7A. (Of course, it should be understood that the client-facing functionality of method 700 is not required in the context of the server-side method illustrated in Figure 7B.)

[0147] Referring to Figure 7A in greater detail, method 700 involves a client device receiving a transport request for transport of one or more items to a target location by a UAV, as shown by block 702. This request may be received via a client-facing application as described above, or in another manner. Responsive to receipt of the transport request, the client device displays a preparation status screen including: (a) a transport-preparation status corresponding to transport of the one or more selected items, and (b) a first arrival-time estimate, as shown by block 704. (Examples of preparation status screens are provided below with reference to Figures 8B and 8C.)

[0148] Next, the client device receives a status message indicating that the one or more selected items are loaded on a first UAV for transport to the target location, as shown by block 706. In response, the client device displays a flight progress screen including: (a) a map with a flight path visualization and a location indication for the first UAV, and (b) an updated arrival time, as shown by block 708. The updated arrival time is based on an updated flight time estimate for transport of the requested items by a UAV, which may be determined at or near the time when loading of the items onto the UAV is complete. (Examples of flight progress screens are provided later with reference to Figures 8D to 8H.)

[0149] Continuing with method 700, the client device may subsequently receive a status message indicating that the UAV has released the one or more selected items in a ground delivery area associated with the target location, as shown by block 710. In response, the client device displays a delivery confirmation screen, as shown by block 712. (An example of a delivery confirmation screen is provided later with reference to Figure 8H.)

[0150] Referring now to Figure 7B, method 750 may be implemented by the server system, in conjunction with the implementation of method 700 by a client device. In particular, the server system may implement method 750 in order to provide the client device with status updates, and other transport-related functionality, which facilitates the client-side application and functionality of method 700.

[0151] More specifically, method 750 involves a server system receiving a transport request for transport of one or more items to a target location by a UAV, as shown by block 752. This transport request may be received from a client device, which in turn may have received the request via a client-facing application, using techniques such as those described above. The transport request received at block 752 may also originate from another source, via a different technique for initiating the request.

[0152] In response to receipt of the transport request, the server system determines a total transport time, as shown by block 754. The total transport time is determined based on a combination of at least (a) a preparation-to-loading time estimate for the one or more selected items, and (b) a flight time estimate for transport of the one or more items to the target location by a UAV. Techniques for determining total transport time are discussed in greater detail in section VII(C) below.
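For illustration, the combination described above might be computed as follows; the durations are placeholders and the helper names are not taken from the disclosure.

```python
# Minimal sketch of the total-transport-time combination: preparation-to-loading
# estimate plus flight-time estimate, yielding an arrival time. Values assumed.
from datetime import datetime, timedelta

def total_transport_time(prep_to_loading_min, flight_min):
    """Combine the two phase estimates into one total, in minutes."""
    return prep_to_loading_min + flight_min

def estimated_arrival(now, prep_to_loading_min, flight_min):
    """Turn the total transport time into an arrival-time estimate."""
    return now + timedelta(minutes=total_transport_time(prep_to_loading_min,
                                                        flight_min))

now = datetime(2017, 6, 26, 16, 0)
print(estimated_arrival(now, prep_to_loading_min=18, flight_min=12))
# 2017-06-26 16:30:00 -> sent to the client as the initial arrival-time estimate
```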

[0153] After determining the total transport time, the server system sends a first status message to a client device, as shown by block 756. The first status message includes: (a) a transport-preparation status for the one or more selected items, and (b) an arrival time corresponding to the total transport time. As such, the first status message may serve as an indication to the client device to display a preparation status screen indicating the transport-preparation status and the arrival time. (The first status message may thus be utilized by the client device to carry out block 704 of method 700.)

[0154] Subsequently, the server system receives an indication that the one or more selected items are loaded on a first UAV, and ready for transport to the target location, as shown by block 758. In response, the server system: (i) determines an updated total transport time, as shown by block 760, and (ii) sends a second status message to the client device, which indicates an updated arrival time, as shown by block 762. The updated total transport time may be determined by first determining an updated flight time estimate, and then determining a corresponding arrival time. Since the items are loaded at this point in time, the transport preparation phase may be considered complete, such that the updated total transport time no longer includes a preparation-to-loading time estimate.

[0155] Since the second status message indicates the requested items are loaded on the UAV, this indicates that the items are ready for transport (or perhaps that the UAV flight has just begun). Accordingly, the second status message may serve as an indication to the client device to display a flight progress screen. For example, the second status message may serve as an indication to display a screen in accordance with block 708 of method 700 (e.g., a screen including (a) a map with a flight path visualization and a location indication for the UAV, and (b) the updated arrival time indicated in the second status message).

[0156] Continuing now with method 750, the server system may subsequently receive an indication that the UAV has released the one or more selected items in a ground delivery area associated with the target location, as shown by block 764. In response, the server system may send a third status message indicating to the client device to display a delivery confirmation screen, as shown by block 766. As such, the third status message may provide the indication to the client device at block 710 of method 700, which prompts the client device to display a delivery confirmation screen at block 712 of method 700.

[0157] Methods 700 and 750 involve status updates at least three times during the fulfillment of a UAV transport request. Specifically, status updates are provided when the order is initially confirmed, when pre-flight tasks are complete and the item(s) are loaded on the UAV and the UAV's flight is about to begin, and when delivery is completed such that the ordering party can retrieve the requested items at the target location. It should be understood, however, that additional status updates may be provided at various points during the pre-flight and/or flight phase (e.g., in response to various events in the fulfillment process).

B. Phases of UAV Transport Process

[0158] As noted above, the fulfillment process for a UAV transport request may be characterized by two primary phases: (i) a pre-flight phase and (ii) a flight phase. Further, status information and delivery-related functionality may be designed around these two primary phases of the UAV transport process.

[0159] Generally, the pre-flight phase may include tasks such as order processing, item retrieval, item preparation and packaging, transport from the packaging location to the UAV loading area, and/or loading time, among others. In the specific example of food delivery, the pre-flight phase may include tasks such as food preparation (e.g., cooking) and food packaging, among other possibilities. Other examples are of course possible.

[0160] To facilitate status updates and improve delivery timing estimates, the pre-flight phase may be conceptually divided into a number of defined sub-phases. In an example embodiment, the pre-flight sub-phases may generally be organized around the different tasks required to prepare and load requested items onto a UAV for transport. For instance, the pre-flight phase may include an order-processing sub-phase, an item-preparation sub-phase, and/or a loading sub-phase, among other possibilities. As described later in reference to Figures 8A to 8H, a server system may send updates, and a client-facing application may update displayed information and available functionality, according to the current sub-phase of a particular UAV delivery. Further, by considering timing for certain sub-phases individually, a service provider system may be able to improve the accuracy of the preparation-to-loading time estimate for the entire pre-flight phase.

[0161] The flight phase may also be conceptually divided into sub-phases. In an example embodiment, these sub-phases may be defined, at least in part, to correspond to certain portions of the UAV flight where status updates are deemed appropriate. For example, the flight phase may include a flight-initiation sub-phase, a mid-flight sub-phase, an arrival sub-phase, an in-progress delivery sub-phase, and a delivery-complete sub-phase. Additional details and examples of flight sub-phases are described later in reference to Figures 8A to 8H.

C. Dynamically Determining Transport Timing Information

[0162] As noted above, method 750 may involve a service-provider system determining timing information related to fulfillment of a transport request by a UAV during item preparation and UAV flight. In particular, the service-provider system may determine a total transport time upon receipt of a UAV transport request, and may update the total transport time to refine the estimated arrival time as the fulfillment process progresses. Herein, the total transport time should be understood to be the amount of time between a current time (e.g., when the total transport time is calculated) and the estimated time that delivery of the requested items is completed (e.g., an estimated arrival time). As such, the service-provider system and/or the client device may calculate total transport time periodically, continuously, and/or in response to various predetermined events (e.g., transitions between phases or sub-phases) during the fulfillment process.

[0163] For example, once a transport request (e.g., an order) is placed via an exemplary client application and received by the server system at block 752 of method 750, the server system may base the determination of a total transport time (e.g., at block 754) on the combination of timing estimates that are separately determined for two distinct phases of the UAV food delivery process. In particular, the server system may calculate separate time estimates for (i) the pre-flight phase and (ii) the flight phase, and use the combination of these estimates to determine the total transport time.

[0164] As a specific example, in the context of UAV food delivery, the delivery service system may estimate timing of the pre-flight phase by calculating a preparation-to-loading time estimate for the specific food items that were ordered. And, for the flight phase, the delivery service system may estimate a flight time for the UAV to transport the food items to the delivery location and lower the food items to the ground in the delivery area via tether. The preparation-to-loading time estimate and the flight time estimate may then be combined to determine the total transport time for UAV delivery of the food items.

[0165] In a further aspect, the preparation-to-loading time may include an estimated time for food preparation at the food source (i.e., at the restaurant). This estimate may be based on general knowledge about the type of food ordered and/or the type of restaurant from which the food was ordered. Further, data regarding deliveries that have been made may be aggregated, and machine-learning processes may be applied to improve the accuracy with which food type and/or restaurant type factors into the preparation-to-loading time. Additionally, records may be kept for specific restaurants, for specific food items, and/or for specific combinations of food items from specific food sources (or perhaps even combinations of items from multiple food sources). Machine learning processes may likewise be applied to improve the accuracy with which this source-specific information affects the preparation-to-loading time.
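As a simple, non-learning baseline for the record-keeping idea above, historical preparation times per source and item could be averaged, with a fallback default when no history exists; the machine-learning refinement mentioned in the text is not modelled, and all names and values in this sketch are assumptions.

```python
# Baseline sketch: estimate an order's preparation time from historical
# per-item prep times at that source, with a default when no history exists.
from statistics import mean

DEFAULT_PREP_MIN = 15.0  # assumed fallback when a source/item has no history

def estimate_prep_minutes(order_items, history, default=DEFAULT_PREP_MIN):
    """history maps (source, item) -> list of observed prep times in minutes."""
    per_item = [mean(history[key]) if history.get(key) else default
                for key in order_items]
    return max(per_item) if per_item else default  # items prepared in parallel

history = {("Sandwich Shop", "club sandwich"): [6.0, 8.0, 7.0],
           ("Sandwich Shop", "soup"): [4.0, 5.0]}
order = [("Sandwich Shop", "club sandwich"), ("Sandwich Shop", "soup")]
print(estimate_prep_minutes(order, history))  # 7.0: limited by the slowest item
```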

[0166] Further, while a UAV preferably arrives just as or shortly before the food items are ready to be loaded onto the UAV, this timing may not always be possible. Accordingly, a food delivery service may use data relating to UAV availability and/or flight time to the food source for pickup to more accurately estimate the preparation-to-loading time for a particular order.

[0167] The flight time for a particular order may also be calculated using various types of information. For example, the flight time may be determined based on UAV capabilities, UAV traffic on or near the flight path from the food source to the target delivery location, the type of area/location where the food items will be delivered, limitations on flight speed and trajectory for the particular food items being transported, and/or priority of the particular order, among other possibilities.

[0168] As noted above, the pre-flight phase and/or the flight phase of the delivery process may be further divided into various sub-phases. In some implementations, the server system may determine separate timing data for some or all sub-phases in the pre-flight phase, and use the timing data for these sub-phases to determine the total preparation-to-loading time. Similarly, the server system may determine separate timing data for some or all sub-phases in the flight phase, and use the timing data for these sub-phases to determine the estimated flight time.

[0169] As a specific example, at a time when demand for UAV transport services is high, it may not be possible for a UAV to arrive at the item source location at the earliest time the requested items are ready for loading. In such an embodiment, the preparation-to-loading time may be extended to allow time for a UAV to arrive to pick up the items from the source location (e.g., from a restaurant). Further, in such an embodiment, an example method may further involve the server system using a demand indication for UAV transport services to determine a preparation-to-loading time estimate.
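For illustration, this demand adjustment could be expressed as taking the later of the item-readiness time and the UAV's expected arrival at the source before loading begins; the numbers below are assumptions, not values from the disclosure.

```python
# Sketch of the demand adjustment: if the earliest UAV cannot reach the pickup
# point before the items are ready, the preparation-to-loading estimate is
# stretched to the UAV's expected arrival. Values are illustrative.
def prep_to_loading_minutes(item_prep_min, uav_arrival_min, loading_min=2.0):
    """Loading cannot start before both the items and a UAV are at the source."""
    return max(item_prep_min, uav_arrival_min) + loading_min

print(prep_to_loading_minutes(item_prep_min=10, uav_arrival_min=6))   # 12.0
print(prep_to_loading_minutes(item_prep_min=10, uav_arrival_min=25))  # 27.0: high demand
```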

[0170] If the level of demand is such that an estimated item preparation time is less than an expected arrival time for a UAV at the item source location, then the server system may further send a message to an account associated with the item source location, which indicates the expected arrival time of the UAV at the item source location for item pickup. In the context of food delivery, the message may further suggest that preparation of the requested food items be delayed so that the items are prepared just in time for (or shortly after) the time the UAV is expected to arrive for pickup at the source restaurant. Yet further, the server system may determine an expected item preparation time for the particular food items indicated in the transport request, and based thereon, may include a suggested time at which preparation of the requested food items should begin. Other examples and variations on this specific example are also possible.

[0171] In a further aspect, the service provider system may update the estimation of the total transport time (and the corresponding arrival time) throughout the fulfillment process. The updated total transport time may be determined based on the sub-phases of the pre-flight phase and/or flight phase that remain to be completed at the time the update is calculated.

D. Example Interface Screens for UAV Transport Status

[0172] In accordance with example embodiments, a client-facing application may update a GUI as the transport process progresses through its various phases and/or sub-phases. In some implementations, some or all updates to the GUI may be provided in real time.

[0173] To facilitate such updates, a server may detect when transitions between the various phases and/or sub-phases occur, and send status update messages to the client device associated with the particular UAV transport request. Note that such status update messages may be sent every time a new sub-phase begins, or only sent when some (but not all) sub-phases begin. Further, each time a status update message is sent, the server system may update estimated timing information corresponding to the transport request (e.g., total transport time, estimated arrival time, etc.) and provide the updated transport timing information to the client device. Alternatively, the server system might include updated transport timing information in some, but not all, status messages.
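A minimal sketch of such transition-driven status messages follows; the phase labels echo the screen text described in this section, while the message format, the callback, and the sample order identifier are assumptions for illustration.

```python
# Sketch of sub-phase transition updates, using a simple callback that stands
# in for a push to the client device. Message fields are illustrative only.
from enum import Enum

class Phase(Enum):
    ORDER_PROCESSING = "confirming flight"
    ITEM_PREPARATION = "at the gate"
    LOADING = "now boarding"
    IN_FLIGHT = "in flight"
    DELIVERED = "delivered"

def notify_transition(send, order_id, new_phase, eta=None):
    """Push a status-update message to the client for this transport request."""
    message = {"order": order_id, "status": new_phase.value}
    if eta is not None:          # timing info may be included in some updates
        message["eta"] = eta
    send(message)

notify_transition(print, "A1B2", Phase.LOADING, eta="4:42 PM")
# {'order': 'A1B2', 'status': 'now boarding', 'eta': '4:42 PM'}
```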

[0174] Figures 8A to 8H show a sequence of GUI screens that may be displayed by an example client-facing application. These screens may be displayed by a client-facing application as a UAV food delivery process progresses through its various phases and sub-phases. Each screen may include timing estimates determined in accordance with methodology described herein. Further, each screen shown in Figures 8A to 8H may include information and/or provide access to functionality related to a current phase and/or sub-phase of the fulfillment process for a UAV transport request. Additionally, the estimated time of arrival (ETA) shown in Figures 8A to 8H may be updated frequently (and possibly pushed to the client device in real time) during the course of the delivery. In so doing, the service-provider system that provides the ETA estimations may utilize data and consider factors as described herein, so as to provide a highly accurate ETA (which will typically become even more accurate as the delivery process progresses).

[0175] Note that the example screens shown in Figures 8A to 8H characterize certain phases and sub-phases of the UAV food delivery using terminology derived from commercial airline flights, instead of using terminology that is typical for food delivery. By characterizing the phases and sub-phases of the UAV food delivery in terms derived from commercial airline flights, an example client-facing application may enhance the user experience. In particular, since the UAV food delivery process can differ significantly from traditional food delivery processes, such characterization may help users better understand the status of their order by analogizing to a different but familiar context. It should be understood, however, that embodiments in which an application does not include such characterizations are also possible.

i. Illustrative Status Screens for Pre-Flight Phase

[0176] Referring now to Figure 8A, a screen 800 is shown. Screen 800 may be displayed by the client application during an order processing sub-phase. For example, screen 800 may be displayed while waiting for a restaurant to confirm receipt of a UAV transport request for food item(s) and/or while waiting for verification of a payment corresponding to a transport request. Screen 800 includes status information 802. In the illustrated example, the status during the order processing sub-phase is characterized as "confirming flight". Of course, other characterizations of status are also possible during the order processing sub-phase.

[0177] Referring now to Figure 8B, Figure 8B shows a screen 810 from an example client application for a UAV transport service. Screen 810 may be displayed by the client application during an item-preparation sub-phase of the fulfillment process for a UAV transport request. Screen 810 thus provides an example of a preparation status screen, which may be displayed by a client device at block 704 of method 700. In the context of a UAV food delivery application, screen 810 may be displayed while requested food items are being prepared and/or packaged at the restaurant. Screen 810 includes order identification information 811, status information 812, and an arrival time estimate 814 for arrival of the UAV with the requested items at the target location.

[0178] In the illustrated example, the status information 812 characterizes the status during the item-preparation sub-phase as "at the gate". Of course, other characterizations of status are also possible during an item-preparation sub-phase.

[0179] In a further aspect, the arrival-time estimate 814 may be determined upon confirming the order (and before preparation of items begins). In particular, the arrival-time estimate 814 may be determined based on the current time and an estimate of total transport time determined in accordance with block 754 of method 750. For instance, arrival-time estimate 814 may be based on a total transport time that is determined based on a combination of at least (a) a preparation-to-loading time estimate for the one or more selected items, and (b) a flight time estimate for transport of the one or more items to the target location by a UAV. Thus, at the time screen 810 is displayed, the arrival-time estimate 814 may account for the pre-flight phase and the flight phase. As such, the preparation-to-loading time used to determine estimated timing of the pre-flight phase, and ultimately the arrival time estimate 814, may be determined based on timing estimates for and/or current status of the item-preparation phase.

[0180] In another aspect, a service-provider system may update the preparation-to-loading time, such that the client-facing application can update the corresponding arrival-time estimate 814 in screen 810, as item preparation progresses. For example, before preparation of food items begins at a restaurant, the preparation-to-loading time may be calculated based on an estimated preparation time for such food items. The preparation-to-loading time could then be updated as certain food items are prepared. Specifically, when a food item is prepared and ready for loading (or ready for packaging), the preparation-to-loading time may be updated based on the actual time at which the food item was prepared. The total transport time, and the corresponding arrival-time estimate 814, may then be updated according to the updated preparation-to-loading time.

[0181] To facilitate such updates in an item-preparation screen 810, a service-provider system may be configured to interface with an item-provider application. Such an item-provider application may be utilized at the food source to provide updates during item preparation, and possibly at other times as well (such as when food items are loaded onto a UAV). As a specific example, an item-provider application may be installed on a restaurant's computing device, and may allow the restaurant to provide update(s) as food item(s) are prepared and ready for loading on a UAV. Other examples and variations on the above-described example are possible.

[0182] Referring now to Figure 8C, Figure 8C shows a screen 820 from an example client application for a UAV transport service. Screen 820 may be displayed by the client application during a loading sub-phase of the fulfillment process for a UAV transport request. Further, screen 820 includes order identification information 811, status information 822, and an arrival time estimate 824 for arrival of a UAV with the requested items at the target location.

[0183] Screen 820 provides an example of a loading screen, which may be displayed by a client device in some implementations of method 700 (e.g., after displaying a preparation status screen, and before displaying a screen corresponding to the flight phase). In the context of a UAV food delivery application, screen 820 may be displayed after the requested food items are prepared (e.g., cooked), while the food items are being packaged and loaded at the restaurant. Alternatively, packaging of items may be considered part of the item-preparation sub-phase, such that screen 820 is displayed after the requested food items are prepared and packaged, while the food items are being transported to and loaded onto a UAV for transport to the target location.

[0184] In the illustrated example, the status information 822 on loading status screen 820 characterizes the status during the loading sub-phase as "now boarding." Of course, other characterizations of status are also possible during the loading sub-phase.

[0185] In a further aspect, the arrival-time estimate 824 for screen 820 may initially be determined after preparation of requested items is completed, and before loading of items begins. (And, in an embodiment where packaging of items is considered part of the loading phase, before packaging begins.) As such, the arrival-time estimate 824 may be based on an updated total transport time, which is determined after an initial determination of total transport time at block 754 of method 750, and before a subsequent update at block 760 of method 750. For instance, arrival-time estimate 824 may be based on a total transport time that is determined based on a combination of at least (a) an updated preparation-to-loading time estimate corresponding to the remaining portion of the pre-flight phase (e.g., the loading phase), and (b) an updated flight time estimate for transport of the one or more items to the target location by a UAV.

[0186] In a further aspect, the loading time, and the corresponding arrival-time estimate 824 in screen 820, can be updated as the loading phase progresses. For example, before preparation of food items begins at a restaurant, the preparation-to-loading time may be calculated based on estimated timing for tasks such as transporting the food items from a food preparation location (e.g., a restaurant kitchen) to a UAV pick-up location, packaging items, placing items in packaging, placing items or a package containing the items into a UAV compartment, and/or attaching items or a package containing the items to a UAV (e.g., to a tether), among other possibilities. The loading time estimate could then be updated as certain tasks from the loading phase are completed, to reflect the time remaining in the loading phase. The total transport time, and the corresponding arrival-time estimate 824, may then be updated according to the updated loading time.
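As an illustration of this task-based update, the sketch below recomputes the remaining loading time from the set of loading tasks that the item-provider application has reported as complete. The task names and per-task estimates are hypothetical placeholders, not values described in any embodiment.

    LOADING_TASK_ESTIMATES_MIN = {
        "package_items": 3,
        "carry_to_pickup_location": 2,
        "attach_package_to_tether": 1,
    }

    def remaining_loading_minutes(completed_tasks):
        # Remaining loading time is the sum of the estimates for tasks the
        # item provider has not yet reported complete.
        return sum(minutes for task, minutes in LOADING_TASK_ESTIMATES_MIN.items()
                   if task not in completed_tasks)

    # Example: packaging is done, so only carrying and tether attachment remain.
    remaining = remaining_loading_minutes({"package_items"})   # -> 3 minutes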

[0187] To facilitate such updates in a loading screen 820, a service-provider system may be configured to interface with an item-provider application, as noted above. Such an item-provider application may be utilized at the food source (and/or at a UAV pick-up location) to provide updates during item loading, transport to a UAV pick-up location, and/or item packaging. As a specific example, an item-provider application may be installed on a restaurant's computing device (e.g., a mobile phone or tablet), and may allow a restaurant employee to provide update(s) as food item(s) are packaged, transported to a UAV pick-up location for loading, and/or loaded on the UAV. Other examples and variations on the above-described example are possible.

VII. Illustrative Status Screens for Flight Phase

[0188] Figure 8D shows a screen 830 from an example client application for a UAV transport service. Screen 830 includes order identification information 811, status information 832, an arrival time estimate 834 for arrival of a UAV with the requested items at the target location, and a map feature 836 with a flight path visualization 837. Further, screen 830 may be displayed by the client application during a departing (or flight-initiation) sub-phase of the fulfillment process for a UAV transport request. As such, screen 830 provides an example of a flight-progress screen, which may be displayed by a client device at block 708 of method 700.

[0189] In the illustrated example, the status information 832 characterizes the status during the flight-initiation sub-phase as "departing". Of course, other characterizations of status are also possible during the flight-initiation sub-phase of the flight phase.

[0190] In the context of a UAV food delivery application, screen 830 may initially be displayed when the service-provider system receives an indication that requested food items are loaded on a UAV (e.g., at block 758 of method 750). Such an indication may be provided by the item source (e.g., by a restaurant) via an item-provider application, and/or by the UAV itself, which may be configured to detect when items are loaded and communicate an indication to this effect to a service-provider system.

[0191] In a further aspect, the arrival-time estimate 834 shown in screen 830 may be initially determined upon receipt of the indication that items have been loaded onto the UAV. In particular, the arrival-time estimate 834 may be determined based on the current time and an updated estimate of total transport time determined in accordance with block 760 of method 750. For instance, at block 760, the pre-flight phase may be considered complete, such that the total transport time is based on the estimated flight time to the target location, and possibly a separate time estimate for delivery (e.g., time required to lower the food items to the ground via tether and/or for the UAV to move away from the delivery area). Accordingly, when screen 830 is initially displayed, arrival-time estimate 834 may reflect this updated total transport time estimate.
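For the update just described, once the service-provider system learns that items are loaded, the pre-flight component drops out of the total transport time and a separate delivery allowance may be added. A minimal sketch, assuming minute-based estimates and a fixed allowance for tethered lowering:

    from datetime import datetime, timedelta

    def post_loading_arrival_estimate(flight_min, delivery_min, now=None):
        # After loading, total transport time = remaining flight time plus an
        # allowance for the delivery itself (e.g., lowering items by tether
        # and the UAV moving clear of the delivery area).
        now = now or datetime.now()
        return now + timedelta(minutes=flight_min + delivery_min)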

[0192] In a further aspect, as the flight-initiation phase progresses, the flight time estimate, and the corresponding arrival-time estimate 834 displayed in screen 830, may be updated to reflect timing for the remaining portion of the flight-initiation phase (and the remainder of the flight phase). For example, the flight-initiation phase may be defined to include any time the UAV spends on the ground after items are loaded, and a take-off process (e.g., the process of the UAV ascending to a certain altitude). Accordingly, during the flight-initiation phase, the flight time estimate, and the corresponding arrival-time estimate 834, may be updated to reflect the actual take-off time and/or the amount of time it takes for the UAV to ascend to a predetermined altitude (or to fly a certain distance from the item source location).

[0193] Figure 8E shows another screen 840 from an example client application for a UAV transport service. Screen 840 includes order identification information 811, status information 842, an arrival time estimate 844 for arrival of a UAV with the requested items at the target location, and a map feature 846 with a flight path visualization 847. Further, screen 840 may be displayed by the client application during an "en-route" (or mid-flight) sub-phase of the fulfillment process for a UAV transport request. As such, screen 840 provides another example of a flight-progress screen, which may be displayed by a client device at block 708 of method 700.

[0194] In the illustrated example, the status information 842 characterizes the status during the mid-flight sub-phase as "en route". Of course, other characterizations of status are also possible during the mid-flight sub-phase of the flight phase.

[0195] In the context of a UAV food delivery application, screen 840 may initially be displayed when the service-provider system receives an indication that UAV flight has begun, and provides the client-facing application with an indication to this effect. The service-provider system may be provided with such an indication by the item provider (e.g., by a restaurant via an item-provider application). Additionally or alternatively, such an indication may be provided to the service-provider system by the UAV itself, which may be configured to send a message to a service-provider system when a UAV flight begins. Alternatively, a UAV may simply report its location information (e.g., GPS coordinates) to a service-provider system. The service-provider system may then determine that the flight has begun when the reported location information indicates that the UAV has left the source location, or is more than a predetermined distance from the source location. Further, the service-provider system may consider the mid-flight phase of flight to continue until it determines that the UAV is at or within a predetermined distance from the target location.

[0196] In a further aspect, the arrival-time estimate 844 shown in screen 840 may be initially determined upon receipt of the indication that UAV flight has begun. In particular, the arrival-time estimate 844 may be determined based on the current time and an updated estimate of total transport time based on an estimate of the remaining flight time and a delivery time estimate. Further, as the UAV's flight to the target location progresses, the flight time estimate, and the corresponding arrival-time estimate 844 displayed in screen 840, may be updated to reflect timing for the remaining portion of the flight.
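The location-report approach described here (and again below for the approach sub-phase) can be illustrated with a short distance check. The sketch below uses a great-circle (haversine) distance; the threshold values and function names are assumptions introduced only for illustration.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance in meters between two latitude/longitude points.
        r = 6371000.0
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    DEPARTURE_THRESHOLD_M = 50     # hypothetical "has left the source" distance
    APPROACH_THRESHOLD_M = 300     # hypothetical "near the target" distance

    def flight_sub_phase_from_report(uav_pos, source_pos, target_pos):
        # Classify the flight sub-phase from a single reported GPS position,
        # each position being a (latitude, longitude) pair.
        if haversine_m(*uav_pos, *source_pos) <= DEPARTURE_THRESHOLD_M:
            return "departing"
        if haversine_m(*uav_pos, *target_pos) <= APPROACH_THRESHOLD_M:
            return "approaching"
        return "en route"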

[0197] Figure 8F shows another screen 850 from an example client application for a UAV transport service. Screen 850 includes order identification information 811, status information 852, an arrival time estimate 854 for arrival of a UAV with the requested items at the target location, and a map feature 856 with a flight path visualization 857. Further, screen 850 may be displayed by the client application during an approach sub-phase of the fulfillment process for a UAV transport request. As such, screen 850 provides another example of a flight-progress screen, which may be displayed by a client device at block 708 of method 700.

[0198] In the illustrated example, the status information 852 characterizes the status during the approach sub-phase as "final approach - approaching your destination". Of course, other characterizations of status are also possible during the approach sub-phase of the flight phase.

[0199] In the context of a UAV food delivery application, screen 850 may initially be displayed when the service-provider system determines or receives an indication that the UAV is within a certain predetermined distance from the target location. For instance, such an indication may be provided to the service-provider system by the UAV itself, which may be configured to send a message to a service-provider system when the UAV determines it is within a certain predetermined distance from the target location. Alternatively, a UAV may simply report its location information (e.g., GPS coordinates) to a service-provider system, and the service-provider system may use such location reports to determine when the UAV is within a certain predetermined distance from the target location.

[0200] In a further aspect, the arrival-time estimate 854 shown in screen 850 may be initially determined upon receipt of the indication that the UAV is within a certain predetermined distance from the target location. In particular, the arrival-time estimate 854 may be determined based on the current time and an updated estimate of total transport time, which is determined based on the remaining flight time and a delivery time estimate.

[0201] Figure 8G shows another screen 860 from an example client application for a UAV transport service. Screen 860 includes order identification information 811, status information 862, an arrival time estimate 864 for arrival of a UAV with the requested items at the target location, and a graphic map interface 866 showing a delivery area at the target location.

[0202] Screen 860 may be displayed by the client application upon arrival of the UAV at the target location, during an in-progress delivery phase of the fulfillment process for a UAV transport request. As such, screen 860 may be considered a delivery status screen, and may be displayed by a client device as part of method 700 (e.g., after block 708). In the illustrated example, the status information 862 characterizes the status during the delivery phase as "pulling up to the gate - your food is being lowered". Of course, other characterizations of status are also possible during the delivery phase. Further, in some embodiments, the delivery status screen 860 may share some of (or perhaps even all of) the functionality of the screen 610 shown in Figure 6B.

[0203] Figure 8H shows another screen 870 from an example client application for a UAV transport service. In particular, screen 870 may be displayed by the client application when delivery is complete, and the order is ready to be picked up by the user. For example, screen 870 may be displayed when requested food items have been lowered to the ground via a tether, and released from the tether (or the tether is released from the UAV).

[0204] Note that the distinction between the approach sub-phase, in-progress delivery sub-phase, and delivery completion may be particularly beneficial to the user experience in the context of tethered UAV delivery. More specifically, tethered delivery can involve finding the appropriate area within the delivery location and/or the appropriate time (e.g., avoiding winds) to lower the food items. As such, tethered delivery can take longer, and is a more significant part of the larger delivery process than, e.g., a delivery driver walking a pizza to a purchaser's front door. Further, for safety reasons, it may be desirable for the user to be clear of the delivery area until the food items are on the ground and have been released from the tether (or the tether is released from the UAV), and possibly until the UAV has flown a certain distance from the delivery area. At the same time, users typically appreciate having access to their food items as quickly as possible. By providing distinct graphical indications for arrival, in-progress tethered delivery, and completion of a tethered delivery, the user is better informed as to the status of their food delivery, such that they can safely pick up their food at the earliest time possible.

VIII. Additional Aspects of Client-facing Application for UAV Transport

A. Status Information for Multi-UAV Deliveries

[0205] As noted above, there may be instances where multiple UAVs are utilized to fulfill a single order request. In such instances, a client-facing application may be operable to provide information related to the multiple UAVs fulfilling the same transport request.

[0206] For example, the client-facing application may show the locations of all UAVs fulfilling the same order on a map feature, such as those shown in Figures 8D to 8G. Alternatively, the client-facing application may only show the location of one of the UAVs that is fulfilling a given order. As an example, the client-facing application may only show the location of the UAV that is the furthest from the target location and/or that is scheduled to be the last UAV to lower its food items to the ground in the delivery area. By only indicating the last UAV in the group, the client-facing application may provide an indication of the expected time that the entire order will be ready for retrieval from the delivery area (since safety considerations will likely prevent the user from retrieving any food items before all UAVs have lowered their food items to the ground).
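One way to realize the "show only the last UAV" behavior described above is to pick the UAV currently furthest from the target location, reusing the distance helper sketched earlier. This is an illustrative heuristic only; a scheduled delivery order, if known, could be used instead.

    def uav_to_display(uav_positions, target_pos):
        # uav_positions: mapping of UAV identifier -> (latitude, longitude).
        # Returns the identifier of the UAV furthest from the target, as a
        # proxy for the UAV expected to lower its items last.
        return max(uav_positions,
                   key=lambda uav_id: haversine_m(*uav_positions[uav_id], *target_pos))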

B. Multi-Tasking Features

[0207] In a further aspect, an example client-facing application may provide multi-tasking features. One such multi-tasking feature may allow a user to track the status of their order while simultaneously using features other than status tracking (e.g., while using features that do not allow for full-screen status tracking such as illustrated in Figures 8A to 8H).

[0208] As an example, Figure 9 is an illustration of a screen 900 from an example client application for a UAV transport service. Screen 900 includes a dynamic status bar 902. The status bar 902 may be displayed, after a user has submitted a UAV transport request, while the user is using interface features other than a status tracking interface such as illustrated in Figures 8A to 8H. As such, if a user is viewing a full status tracking interface, such as shown in Figures 8A to 8H, and then navigates away from the full status tracking interface, status bar 902 may be displayed so that the user can continue to track the status of the fulfillment process for their UAV transport request while using other features of the application.

[0209] Status bar 902 includes status information 908 and an arrival time estimate 906. The status information 908 and the arrival time estimate 906 may be dynamically updated in status bar 902, in a similar manner as the status information and arrival time estimate are updated in Figures 8A to 8H.

[0210] Further, status bar 902 includes a flight-progress feature 904. The flight-progress feature 904 may be displayed or become active once the flight phase of the fulfillment process has begun. As the UAV flight progresses, the arrow-shaped UAV graphic on the flight-progress feature 904 may be updated to indicate a current or recent distance of the UAV from the target location.
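The position of the arrow-shaped UAV graphic on flight-progress feature 904 could, for instance, be derived from the fraction of the route already flown. A minimal sketch, assuming the total route length and the current distance to the target are known:

    def flight_progress_fraction(distance_to_target_m, total_route_m):
        # 0.0 at take-off, 1.0 at the target; clamped so late route changes or
        # noisy position reports never move the marker off the bar.
        if total_route_m <= 0:
            return 1.0
        return max(0.0, min(1.0, 1.0 - distance_to_target_m / total_route_m))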

C. Delivery Area Obstructions

[0211] There may be situations where a UAV arrives at a target location but is unable to lower the requested items to the ground due to obstructions in the delivery area. For instance, a child or dog may be located on the ground in the delivery area, preventing delivery for safety reasons. Or, when a UAV arrives at a target location and surveys the specified delivery area, the UAV may discover that rainwater has pooled in the delivery area, rendering delivery undesirable at that location. Other examples are also possible.

[0212] Accordingly, an example client-facing application may provide status updates and functionality related to situations where delivery is delayed or possibly even canceled due to unforeseen obstructions in the delivery area. In scenarios where the UAV arrives and finds the delivery area is obstructed, the UAV may communicate and provide status information to a service-provider system, which in turn may provide information and support functionality related to alternative delivery options and/or canceling delivery.

[0213] For example, once an obstruction is detected in the delivery area, the client-facing application may display a screen indicating that an obstruction will delay the requested items being lowered to the ground. If the UAV determines that it can search for an alternate delivery area (e.g., associated with the same target location), the client application may also display an indication to this effect, which may identify the alternate delivery area. The client-facing application could also display an interface that allows the user to specify an alternate delivery location. Such an interface could include a map that identifies alternate delivery areas, such that the user can provide input specifying a particular one of the alternate delivery areas (e.g., an interface that functions similarly to the interfaces shown in Figures 6A to 6C).

[0214] Additionally or alternatively, a UAV or a service-provider system could determine that delivery is not possible due to an obstruction in the delivery area. Such a determination could be made with or without considering the possibility of alternate delivery areas. In either case, the client-facing application could display a screen indicating that the UAV delivery is being canceled due to an obstruction in the delivery area. Further, the client application may display interface features that allow the user to indicate a subsequent action, such as: (a) attempting delivery of the same physical items after the UAV returns to a distribution center or warehouse (which may be undesirable for some items, such as hot food), (b) restarting the fulfillment process so that replacement item(s) are transported by a UAV (e.g., such that a restaurant prepares the same food items again, and a UAV transports the newly-prepared food to a different delivery area as soon as possible), or (c) canceling the order completely. Other options may also be provided.
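The two broad outcomes just described, offering alternate delivery areas or canceling and presenting follow-up options, can be sketched on the service-provider side as follows. The push_to_client helper and the option identifiers are hypothetical names introduced only for illustration.

    def handle_obstructed_delivery(order_id, alternate_areas, push_to_client):
        # alternate_areas: UAV-accessible sub-areas at the same target location
        # that the UAV (or service-provider system) considers usable, if any.
        if alternate_areas:
            push_to_client(order_id,
                           status="delivery delayed - obstruction in delivery area",
                           alternate_areas=alternate_areas)  # user may pick one on a map
        else:
            push_to_client(order_id,
                           status="delivery canceled - obstruction in delivery area",
                           options=["retry_after_uav_returns",
                                    "restart_with_replacement_items",
                                    "cancel_order"])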

D. Post-Delivery Issue Resolution

[0215] Once items have been delivered (e.g., lowered to the ground in the delivery area), an example client-facing application may display interface features that allow a user to report defective items and/or select corrective actions. In some cases, such interface features may allow the user that ordered the items to communicate directly with the item source (e.g., the restaurant from which food items were ordered).

[0216] Further, two-way issue resolution functionality may be provided via the combination of a client-facing application and an item-provider application. For example, the UAV food-delivery application and a corresponding food-provider application may allow a user to capture image(s) of an item they believe is defective or damaged, using a camera on the device that is providing the client-facing application. As a specific example, a user could take picture(s) of food that has spilled or has leaked during flight, food that lacks requested modifications from the standard menu, or an entire order that is lacking item(s) that were paid for. The image(s) may be sent to the food provider's device and displayed by the food-provider application. The food-provider application may then provide interface features that allow the food provider to approve or deny UAV transport of replacement item(s). Other functionality for issue resolution and variations on the examples described above are of course possible.

IX. Conclusion

[0217] While various aspects of the disclosure have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. Accordingly, the embodiments disclosed herein are for purposes of illustration, and are not intended to be limiting, with the true scope and spirit of the disclosure being indicated by the following claims.