

Title:
UNMANNED AERIAL VEHICLE
Document Type and Number:
WIPO Patent Application WO/2023/242745
Kind Code:
A1
Abstract:
An unmanned aerial vehicle for remote harvesting, the unmanned aerial vehicle comprising: a camera assembly attached to the unmanned aerial vehicle; a cutting assembly attached to the unmanned aerial vehicle; and a controller, configured to: guide the unmanned aerial vehicle to a cutting location; capture an image using the camera assembly; annotate the image using a training data set of images; transmit the annotated image to a remote web application for correction; receive from the remote web application, a corrected image; parse the corrected image for instructions; and control the cutting assembly based on the parsed instructions.

Inventors:
ANKUR APRATIM (IN)
RYU ANNEMARIE ELISE (US)
PALANGE RAHUL SHRIKANT (IN)
Application Number:
PCT/IB2023/056114
Publication Date:
December 21, 2023
Filing Date:
June 14, 2023
Assignee:
GLOBAL VILLAGE FRUIT INC (US)
International Classes:
A01D46/30; A01D46/24; G05D1/12; A01B79/02; A01G3/08; B64D1/22; G06V10/764; G06V20/17
Foreign References:
US 2019/0150357 A1 (2019-05-23)
US 2020/0132477 A1 (2020-04-30)
US 2013/0325346 A1 (2013-12-05)
US 2017/0055433 A1 (2017-03-02)
US 2017/0359943 A1 (2017-12-21)
US 2021/0112723 A1 (2021-04-22)
US 2020/0364456 A1 (2020-11-19)
Other References:
REINECKE ET AL.: "The influence of drone monitoring on crop health and harvest size", 2017 1st International Conference on Next Generation Computing Applications (NextComp), Mauritius, 2017, pages 5-10, XP033145675, DOI: 10.1109/NEXTCOMP.2017.8016168
Claims:

1. An unmanned aerial vehicle for remote harvesting, the unmanned aerial vehicle comprising: a flying assembly; a camera assembly attached to the flying assembly; a cutting assembly attached to the flying assembly; a navigation controller; a cutting policy controller; and a target selection controller, configured to: capture an image using the camera assembly; annotate the image using a training data set of images; transmit the annotated image to a remote web application for correction; receive from the remote web application, a corrected image; parse the corrected image for instructions; and send the parsed instructions to the navigation controller, wherein the navigation controller is able to guide the unmanned aerial vehicle to a cutting location.

2. The unmanned aerial vehicle of claim 1, wherein the target selection controller is configured to classify the image by: comparing the image to pre-trained images present in the training data set; determining a state of vegetation based on the image compared with the pre-trained images; and annotating the image based on the state of the vegetation.

3. The unmanned aerial vehicle of claim 2, wherein the state of the vegetation is either at a desired stage of maturation for harvesting or not at the desired stage of maturation for harvesting.

4. The unmanned aerial vehicle of claims 1-3, wherein the training data set is generated by: parsing the corrected image for corrections made to the annotated image; and saving the corrections to the training data set.

5. The unmanned aerial vehicle of claims 1-4, wherein the corrections to the image are made by a human operator.

6. The unmanned aerial vehicle of claim 5, wherein the corrections are selected from one or more of repositioning annotations, deleting annotations, or adding annotations.

7. The unmanned aerial vehicle of claims 1-6, wherein the cutting assembly includes: a motor holder; a motor; a cutter mount; and a cutting wheel, wherein the motor holder fixes the motor to the unmanned aerial vehicle, and wherein the cutting wheel is attached to the unmanned aerial vehicle by the cutter mount.

8. The unmanned aerial vehicle of claim 7, wherein the cutting wheel is composed of blades selected from one or more of a snipe scissor, a disk blade cutter, a blade cutter, or a jigsaw blade.

9. The unmanned aerial vehicle of claims 1-8, wherein one or more of the navigation controller, cutting policy controller, and target selection controller is further configured to connect the unmanned aerial vehicle to an unmanned ground vehicle using a tether management unit and standard drone cables.

10. The unmanned aerial vehicle of claim 9, wherein the unmanned aerial vehicle is connected to the unmanned ground vehicle using a radio connection.

11. The unmanned aerial vehicle of claim 10, wherein the unmanned ground vehicle allocates a configurable workspace zone around the unmanned aerial vehicle.

12. The unmanned aerial vehicle of claim 11, wherein the unmanned aerial vehicle: transmits to the unmanned ground vehicle, a request to move out of the allocated workspace zone and an estimated power requirement; and receives from the unmanned ground vehicle, instructions for movement based on available power at the unmanned aerial vehicle, and the estimated power requirement.

13. The unmanned aerial vehicle of claims 1-12, wherein the navigation controller is configured to guide the unmanned aerial vehicle to the cutting location using a navigation system, and wherein the navigation system is configured to: transmit a current location of the unmanned aerial vehicle to a remote operator; receive instructions from the remote operator; and maneuver the unmanned aerial vehicle based on the received instructions.

14. The unmanned aerial vehicle of claims 1-13, wherein the navigation controller is configured to guide the unmanned aerial vehicle to a cutting location and engage in a loiter state adjacent to a locked target determined from the corrected image.

15. The unmanned aerial vehicle of claims 1-14, wherein the cutting policy controller is able to align cutting blades of the cutting assembly with a locked target and cut or prune the locked target.

16. The unmanned aerial vehicle of claims 14-15, wherein the locked target is a target fruit.

17. The unmanned aerial vehicle of claim 16, wherein cutting or pruning the locked target comprises cutting or slicing a stem or branch of the target fruit.

18. The unmanned aerial vehicle of claims 14-17, wherein the locked target comprises a jackfruit, coconut, palm fruit, apple, pear, orange, grapefruit, banana, mango, galgal fruit, apricot, guava, lychee, avocado or papaya.

19. The unmanned aerial vehicle of claims 1-18, wherein the target selection controller is configured to scan an environment surrounding the unmanned aerial vehicle by rotating the unmanned aerial vehicle at a certain height.

20. The unmanned aerial vehicle of claim 19, wherein the target selection controller is configured to capture a video stream comprising one or more image frames, and classify each image by: a) comparing the image to fruit images provided in a pre-trained data set; b) determining a state of vegetation within the image based on the image compared with the fruit images provided in the pre-trained data set; and c) annotating the image based on the state of the vegetation.

21. The unmanned aerial vehicle of claims 19-20, wherein the annotated images are sent to a remote human facing web application for correction of location, size, boundary, labels, and combinations thereof of the annotations received in the application.

22. The unmanned aerial vehicle of claims 1-21, wherein the cutting location is located ten feet or higher above the ground.

23. A method for controlling an unmanned aerial vehicle comprising a camera assembly and a cutting assembly, the method comprising: guiding the unmanned aerial vehicle to a cutting location; capturing an image using the camera assembly; annotating the image using a training data set of images; transmitting the annotated image to a remote web application for correction; receiving from the remote web application, a corrected image; parsing the corrected image for instructions; and controlling the cutting assembly based on the parsed instructions.

24. The method of claim 23, wherein controlling the cutting assembly comprises aligning cutting blades of the cutting assembly with a locked target and cutting or pruning the locked target.

25. The method of claim 23, wherein the locked target is a target fruit and the step of cutting or pruning the locked target comprises cutting or slicing a stem or branch of the target fruit.

26. The method of claims 23-25, further comprising: comparing the image to fruit images present in the training data set; determining a state of vegetation within the image based on the image compared with the fruit images provided in the training data set; and annotating the image based on the state of the vegetation.

27. The method of claim 26, wherein the state of the vegetation is either at a desired stage of maturation for harvesting or not at the desired stage of maturation for harvesting.

28. The method of claims 23-27, wherein the training data set is generated by: parsing the corrected image for corrections made to the annotated image; and saving the corrections to the training data set.

29. The method of claim 28, wherein the corrections to the image are made by a human operator.

30. The method of claim 29, wherein the corrections are one or more of repositioning annotations, deleting annotations, or adding annotations to the image.

31. The method of claims 23-30, further comprising guiding the unmanned aerial vehicle to a cutting location and engaging in a loiter state adjacent to a locked target.

32. The method of claims 23-31, further comprising scanning an environment surrounding the unmanned aerial vehicle by rotating the unmanned aerial vehicle at a certain height and collecting images.

33. The method of claims 24-32, wherein the locked target comprises a jackfruit, coconut, palm fruit, apple, pear, orange, grapefruit, banana, mango, galgal fruit, apricot, guava, lychee, avocado or papaya.

34. The method of claim 33, wherein the locked target comprises jackfruit.

35. The method of claims 23-34, wherein the cutting location is located ten feet or higher above the ground.

36. A non-transitory computer readable medium for controlling an unmanned aerial vehicle with a camera assembly and a cutting assembly, the non-transitory computer readable medium storing instructions that when executed by a processor of the unmanned aerial vehicle configures the unmanned aerial vehicle for performing steps comprising: guiding the unmanned aerial vehicle to a cutting location; capturing an image using the camera assembly; annotating the image using a training data set of images; transmitting the annotated image to a remote web application for correction; receiving from the remote web application, a corrected image; parsing the corrected image for instructions; and controlling the cutting assembly based on the parsed instructions.

Description:
UNMANNED AERIAL VEHICLE

FIELD OF THE INVENTION

Embodiments of this disclosure relate generally to systems and methods for using an unmanned aerial vehicle to harvest fruit.

BACKGROUND OF THE INVENTION

Harvesting fruit manually on a large scale is laborious, time consuming, and can be risky, especially for fruits harvested from the tops of trees. When fruit ripens, there is a small window of optimal time within which it should be harvested to ensure the longest shelf life. At the time of harvesting, farmers have to cover large areas of farmland within that small window of time to ensure that all fruit is harvested at the optimum time. In rushing through the harvesting of fruit, not only are they under considerable physical stress, but they are also more prone to making mistakes in selecting the fruits at the desired stage of maturation to harvest.

SUMMARY OF THE INVENTION

A first aspect of the present disclosure provides an unmanned aerial vehicle for remote harvesting, the unmanned aerial vehicle comprising: a flying assembly; a camera assembly attached to the flying assembly; a cutting assembly attached to the flying assembly; a navigation controller able to guide the unmanned aerial vehicle to a cutting location; a cutting policy controller; and a target selection controller able to classify and lock fruit or branch locations in the environment for cutting or pruning. In an embodiment, the target selection controller is configured to capture an image using the camera assembly; annotate the image using a training data set of images; transmit the annotated image to a remote web application for correction; receive from the remote web application, a corrected image; parse the corrected image for instructions; and send the parsed instructions to the navigation controller. The navigation controller is able to guide the unmanned aerial vehicle to a cutting location and preferably engage in a loiter state adjacent to a locked target determined from the corrected image. Preferably, the locked target is a target fruit. In an embodiment, the target fruit is a jackfruit, coconut, palm fruit, or similar fruit harvested from trees. In an embodiment, the cutting location is located five feet or higher, ten feet or higher, twenty feet or higher, thirty feet or higher, or even fifty feet or higher, above the ground. In an embodiment, the target fruit to be harvested includes, but is not limited to, jackfruit, coconut, palm fruit, apples, pears, oranges, grapefruit, bananas, mangos, galgal fruit, apricots, guava, lychee, avocado and papaya. In an embodiment, the target fruit is jackfruit.

In an embodiment, the navigation controller is able to guide the unmanned aerial vehicle to a cutting location and engage in a loiter state about 0.5 to 5 feet from the locked target, preferably 0.5 to 3 feet from the locked target, preferably 1-2 feet from the locked target. In an embodiment, the cutting assembly comprises one or more cutting blades, where the cutting policy controller is able to align the cutting blades with the locked target, including but not limited to a target fruit’s stem or branch, and apply an angle of attack for cutting or pruning the target. Preferably, the cutting assembly and one or more cutting blades are selected based on the target fruit’s stem or branch to be cut for harvesting. For example, larger and heavier cutting blades may be selected for fruit having larger or more dense stems or branches.

According to an implementation of the first aspect, the target selection controller is configured to scan an environment surrounding the unmanned aerial vehicle by rotating the unmanned aerial vehicle at a certain height (where the specific height of the environment scan is preferably based on the height of surrounding trees or vegetation), capture a video stream comprising one or more image frames, and classify each image by: comparing the image to fruit images present in a pre-trained data set within the training data set; determining a state of the vegetation within the image based on the image compared with a database of pre-trained images; and annotating the image based on the state of the vegetation. In an embodiment, the annotated images are sent to a remote human-facing web application for correction of the location, size, boundary, labels, and combinations thereof of the annotations received in the application, and for training a reinforcement learning agent to select from the same action space (i.e., the location, size/boundary, and labels of annotations) to match the changes made to the annotations in the application.
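
By way of illustration only, the following Python sketch outlines the scan-classify-annotate flow described above. All identifiers (TargetSelectionController, capture_frame, classify, receive_corrected, and so on) are hypothetical stand-ins; the application does not specify an implementation.

    # Illustrative sketch only: camera, classifier, and web_app are
    # placeholder interfaces, not components named in the application.
    from dataclasses import dataclass, field

    @dataclass
    class Annotation:
        bbox: tuple    # (x, y, w, h) bounding rectangle around the fruit
        labels: dict   # e.g., {"type": "jackfruit", "state": "mature"}

    @dataclass
    class AnnotatedFrame:
        image: object                                   # raw camera frame
        annotations: list = field(default_factory=list)

    class TargetSelectionController:
        def __init__(self, camera, classifier, web_app):
            self.camera = camera          # camera assembly interface
            self.classifier = classifier  # supervised, pre-trained classifier
            self.web_app = web_app        # remote web application client

        def scan_and_annotate(self, yaw_steps=8):
            """Rotate in place, capture one frame per yaw step, and
            annotate each frame against the pre-trained data set."""
            frames = []
            for _ in range(yaw_steps):
                image = self.camera.capture_frame()
                # classify() is assumed to return a bbox and maturation state
                result = self.classifier.classify(image)
                frame = AnnotatedFrame(image=image)
                frame.annotations.append(
                    Annotation(bbox=result["bbox"],
                               labels={"state": result["state"]}))
                frames.append(frame)
            return frames

        def request_correction(self, frame):
            """Send an annotated frame to the web application and wait
            for the operator-corrected version."""
            self.web_app.send(frame)
            return self.web_app.receive_corrected()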

According to an implementation of the first aspect, the state of the vegetation is either at a desired stage of maturation or not at the desired stage of maturation.

According to an implementation of the first aspect, the training data set is generated by: parsing the corrected image for corrections made to the annotated image; and saving the corrections to the training data set.
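
A minimal sketch of this generation step, reusing the illustrative Annotation and AnnotatedFrame shapes from the sketch above; representing corrections as tagged tuples is an assumption made here for clarity only.

    def update_training_set(annotated, corrected, training_set):
        """Parse the corrected frame for corrections made to the annotated
        frame and append those corrections to the training data set."""
        before = {a.bbox: a for a in annotated.annotations}
        after = {a.bbox: a for a in corrected.annotations}
        for bbox, ann in after.items():
            if bbox not in before or before[bbox].labels != ann.labels:
                training_set.append(("added_or_relabelled", ann))
        for bbox, ann in before.items():
            if bbox not in after:
                training_set.append(("deleted", ann))  # negative example
        return training_set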

According to an implementation of the first aspect, the corrections to the image are made by a human operator. According to an implementation of the first aspect, the corrections are selected from one or more of repositioning annotations, resizing/changing boundaries of annotations, deleting annotations, or adding labels to annotations.

According to an implementation of the first aspect, the cutting assembly includes: 2 motor holders; 2 motors; a cutter mount; and a cutting wheel, wherein the motor holders fix the motors to the unmanned aerial vehicle, and wherein the cutting wheel is attached to the unmanned aerial vehicle by the cutter mount. In an embodiment, the cutter mount includes: a motor controller that drives the cutting wheel motors and cutter translation motors; a separate power source (e.g., a separate onboard LiPo battery on the cutter); mounts for 3 additional motors for translating the cutting blades; a depth camera (e.g., RealSense™ camera technology); a mount for an additional monocamera; 2 mini laser pointers for close-distance target locking of the stem and the cutter; a wired communication interface to the navigation controller in the UAV; and a mechanical damping interface at the end where it connects with the UAV.

According to an implementation of the first aspect, the cutting wheel is composed of blades selected from one or more of a snipe scissor, a disk blade cutter, a blade cutter, or a jigsaw blade. The cutting blades are mounted on a guided wheel or bar structure rotating at very high rpm. The cutter mount translates the blades forwards-backwards, laterally and axially for the cutting blades to come in physical contact with the stem of the target fruit or branch to be cut or pruned.

According to an implementation of the first aspect, the navigation controller of the unmanned aerial vehicle is further configured to connect to an unmanned ground vehicle for navigation guidance, telemetry and return to home (RTH).

According to an implementation of the first aspect, the unmanned aerial vehicle is connected to the unmanned ground vehicle using a wireless radio transmitter-receiver connection, including but not limited to a wireless radio 2.4 GHz FHSS transmitter-receiver connection.

According to an implementation of the first aspect, the unmanned ground vehicle allocates a configurable workspace zone around the unmanned aerial vehicle.

According to an implementation of the first aspect, the unmanned aerial vehicle: transmits to the unmanned ground vehicle, a request to move out of the allocated workspace zone and an estimated power requirement; and receives from the unmanned ground vehicle, instructions for movement based on available power at the unmanned aerial vehicle, and the estimated power requirement. According to an implementation of the first aspect, the navigation controller is configured to guide the unmanned aerial vehicle to the cutting location using visual odometry, and wherein the navigation system is configured to: transmit a current location of the unmanned aerial vehicle to a remote operator; receive instructions from the remote operator; and maneuver the unmanned aerial vehicle based on the received instructions.
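
The following sketch illustrates one possible shape for the workspace-exit handshake and power check described above. The JSON message fields, method names, and the 1.25 safety margin are assumptions for illustration, not details from the application.

    import json

    def request_workspace_exit(radio, predicted_location, available_power_wh,
                               estimated_power_wh):
        """UAV side: ask the UGV for permission to leave the allocated
        workspace zone, reporting the estimated power requirement."""
        radio.send(json.dumps({
            "type": "exit_request",
            "predicted_location": predicted_location,
            "available_power_wh": available_power_wh,
            "estimated_power_wh": estimated_power_wh,
        }))
        return json.loads(radio.receive())  # movement instructions from UGV

    def handle_exit_request(msg, safety_margin=1.25):
        """UGV side: grant the request only if the UAV's available power
        covers the estimated requirement with a safety margin."""
        if msg["available_power_wh"] >= msg["estimated_power_wh"] * safety_margin:
            return {"action": "proceed"}
        return {"action": "assisted_movement"}  # UGV relocates or assists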

A second aspect of the present disclosure provides a method for controlling any of the unmanned aerial vehicles described above, where the unmanned aerial vehicle comprises a camera assembly and a cutting assembly, and where the method comprises: guiding the unmanned aerial vehicle to a cutting location; capturing an image using the camera assembly; annotating the image using a training data set of images; transmitting the annotated image to a remote web application for correction; receiving from the remote web application, a corrected image; parsing the corrected image for instructions; and controlling the cutting assembly based on the parsed instructions.

In an embodiment, controlling the cutting assembly comprises aligning cutting blades of the cutting assembly with a locked target and cutting or pruning the locked target. Preferably, the locked target is a target fruit and the step of cutting or pruning the locked target comprises cutting or slicing a stem or branch of the target fruit. In an embodiment, the target fruit is a jackfruit, coconut, palm fruit, or other fruit harvested from trees, and the stem or branch to be cut is located five feet or higher, ten feet or higher, twenty feet or higher, thirty feet or higher, or fifty feet or higher, above the ground.

According to an implementation of the second aspect, the method further comprises comparing the image to fruit images present in the training data set; determining a state of the vegetation within the image based on the image compared with the fruit images provided in the training data set; and annotating the image based on the state of the vegetation, where the state of the vegetation is either at a desired stage of maturation for harvesting or not at the desired stage of maturation for harvesting. In an embodiment, the training data set is generated by: parsing the corrected image for corrections made to the annotated image; and saving the corrections to the training data set. The corrections include, but are not limited to, repositioning annotations, deleting annotations, or adding annotations to the image. Optionally, the corrections to the image are made by a human operator.

According to an implementation of the second aspect, the method further comprises guiding the unmanned aerial vehicle to a cutting location and engaging in a loiter state adjacent to a locked target. In an embodiment, the cutting location and position where the unmanned vehicle is engaged in the loiter state is located five feet or higher, ten feet or higher, twenty feet or higher, thirty feet or higher, or fifty feet or higher, above the ground. Optionally, the method further comprises scanning an environment surrounding the unmanned aerial vehicle by rotating the unmanned aerial vehicle at a certain height and collecting images (where the specific height of the environment scan is preferably based on the height of surrounding trees or vegetation).

A third aspect of the present disclosure provides a non-transitory computer readable medium for controlling an unmanned aerial vehicle as described in the first and second aspects of the invention. According to an implementation of the third aspect, the unmanned aerial vehicle comprises a camera assembly and a cutting assembly, and the non-transitory computer readable medium is able to store instructions that when executed by a processor of the unmanned aerial vehicle configures the unmanned aerial vehicle for operation. In an embodiment, the non-transitory computer readable medium configures the unmanned aerial vehicle to perform any of the method steps described in the first and second aspects of the invention. In an embodiment, the non-transitory computer readable medium configures the unmanned aerial vehicle for performing steps comprising: guiding the unmanned aerial vehicle to a cutting location; capturing an image using the camera assembly; annotating the image using a training data set of images; transmitting the annotated image to a remote web application for correction; receiving from the remote web application, a corrected image; parsing the corrected image for instructions; and controlling the cutting assembly based on the parsed instructions.
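
As a hedged illustration of these steps, the sketch below strings them into a single control-loop function. Every collaborator passed in (navigation, camera, classifier, web_app, cutter) and the parse_instructions helper are hypothetical placeholders for the subsystems described above.

    def harvesting_step(navigation, camera, classifier, web_app, cutter,
                        cutting_location):
        """One pass of the claimed control loop."""
        navigation.goto(cutting_location)             # guide UAV to location
        image = camera.capture()                      # capture an image
        annotated = classifier.annotate(image)        # annotate from training data
        web_app.send(annotated)                       # transmit for correction
        corrected = web_app.receive_corrected()       # receive corrected image
        instructions = parse_instructions(corrected)  # parse for instructions
        if instructions.get("harvest"):
            cutter.cut()                              # control cutting assembly
        return instructions

    def parse_instructions(corrected):
        """Placeholder: derive a harvest/skip decision from the corrected
        annotations (assumed to carry a maturation 'state' label)."""
        labels = (a.labels for a in corrected.annotations)
        return {"harvest": any(l.get("state") == "mature" for l in labels)}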

BRIEF DESCRIPTION OF DRAWINGS

Embodiments of the present disclosure will be described in even greater detail below based on the exemplary figures. The present disclosure is not limited to the exemplary embodiments. All features described and/or illustrated herein can be used alone or combined in different combinations in embodiments of the present disclosure. The features and advantages of various embodiments of the present disclosure will become apparent by reading the following detailed description with reference to the attached drawings which illustrate the following:

Fig. 1 is a simplified block diagram of a system to harvest fruit including an unmanned aerial vehicle, an unmanned ground vehicle, and a remote web application according to one or more examples of the present application.

Fig. 2 is a simplified diagram depicting an exemplary environment for using an unmanned aerial vehicle to harvest fruit according to one or more examples of the present application.

Fig. 3 is an exemplary process for using an unmanned aerial vehicle to harvest fruit according to one or more examples of the present application.

Fig. 4 is another exemplary process for using an unmanned aerial vehicle to harvest fruit according to one or more examples of the present application.

DETAILED DESCRIPTION OF THE INVENTION

Examples of the presented application will now be described more fully hereinafter with reference to the accompanying Figs., in which some, but not all, examples of the application are shown. Indeed, the application may be exemplified in different forms and should not be construed as limited to the examples set forth herein; rather, these examples are provided so that the application will satisfy applicable legal requirements. Where possible, any terms expressed in the singular form herein are meant to also include the plural form and vice versa, unless explicitly stated otherwise. Also, as used herein, the term “a” and/or “an” shall mean “one or more” even though the phrase “one or more” is also used herein. Furthermore, when it is said herein that something is “based on” something else, it may be based on one or more other things as well. In other words, unless expressly indicated otherwise, as used herein “based on” means “based at least in part on” or “based at least partially on”.

Harvesting fruit on a large scale manually is laborious, time consuming, and risky. At the time of harvest, there is a small window of time within which fruits should be harvested for the longest shelf life. During harvest, farmers have to cover vast areas of farms within the small window of time to ensure that they harvest all the fruit at the optimum time. In rushing through the harvesting of fruit, not only do they stress themselves, but they are also more prone to making mistakes in selecting the fruits at the desired stage of maturation to harvest and leaving the fruits not at the desired stage of maturation for later. Introducing automation into the selection and harvesting of fruit may make this process more efficient, less labor intensive, and more accurate in selection.

Furthermore, some fruit trees can grow up to over seventy (70) feet tall. Climbing such trees to select and harvest fruit could be fatal. Using unmanned aerial vehicles to examine and harvest fruits on such tall trees could eliminate the need for humans to climb such trees and reduce risk associated with harvesting.

Fig. 1 is a simplified block diagram of a system 100 to harvest fruit including an unmanned aerial vehicle, an unmanned ground vehicle, and a remote web application according to one or more examples of the present application. System 100 shown in Fig. 1 depicts an unmanned aerial vehicle (UAV) 102, an unmanned ground vehicle (UGV) 104, and remote web application 106. The operations of UAV 102 are controlled by a controller 102c, which may incorporate a navigation controller, cutting policy controller, and target selection controller either as integrated or separate components or modules. The controller 102c may communicate with the different components of the UAV 102 for operation of the UAV 102. UAV 102, as shown in Fig. 1, includes a camera assembly 102a, a cutting assembly 102b, a flying assembly 102e, memory 102d, classifier 102f, and antenna 102g. A 2.4 GHz FHSS radio with antenna 102g of UAV 102 may primarily be used to establish wireless contact with the unmanned ground vehicle (UGV) 104 and the remote web application 106. The UAV 102 may send information to, and receive information and instructions from, the unmanned ground vehicle (UGV) 104 and the remote web application 106. The antenna 102g may be configured to use various wireless network interfaces to establish contact with the remote web application 106 and the UGV 104, such as Wi-Fi, Bluetooth, infrared connections, cellular network connections (such as 2G, 3G, 4G, 5G, LTE, etc.), and the like. Using this connection, the UAV 102 may be controlled by the UGV 104 or the remote web application 106. In some embodiments, the remote web application 106 may be operated by a human operator that controls the UGV 104 and the UAV 102 using the remote web application 106.

Camera assembly 102a of UAV 102 may include a depth camera and a mono camera that are physically and electrically coupled to the UAV 102. In some embodiments, the camera assembly 102a of UAV 102 may include a camera mount that attaches a camera to the UAV 102. In such embodiments, the camera mount may also include a motor configured to swivel in different directions to maneuver the camera over a large field of view. In such embodiments, the camera assembly 102a may also include camera controls that control camera operations such as focus, lighting, mode of operation, taking a picture, recording a video, etc. In some embodiments, the operation of the camera of the camera assembly 102a may be controlled by controller 102c. In such embodiments, the controller 102c may be configured to instruct the camera assembly 102a to snap a picture of the camera’s field of view. The camera assembly 102a may then provide the image to controller 102c for further processing. In some embodiments, the controller 102c may even be configured to instruct the camera assembly 102a to reposition the camera to alter the field of view of the camera, or to change the size of the aperture to change what the camera focuses on. In some embodiments, the controller 102c may transmit the images received from the camera to the remote web application 106 via the connection established between the UAV 102 and the remote web application 106 via antenna 102g. In such embodiments, the controller 102c of UAV 102 may receive instructions to operate the camera from a human operator operating the remote web application 106 via its user input/output interface 106a. The instructions may be received via the connection between the remote web application 106 and the UAV 102 via antenna 102g. In some embodiments, the controller 102c may instruct antenna 102g of UAV 102 to transmit a live video feed from camera assembly 102a to the remote web application 106. The live stream provided by the camera assembly 102a may be examined by the human operator at the remote web application 106 to control the operation of the UAV 102.

Cutting assembly 102b of UAV 102 may include 2 motor holders that affix 2 motors to UAV 102 and a cutter mount that attaches the cutting wheel to the UAV 102. In such embodiments, the cutting wheel may include blades selected from one or more of a snipe scissor, a disk blade cutter, a blade cutter, or a jigsaw blade, mounted on a guided wheel or bar structure. In some embodiments, the selection of the blade and guiding bar for the cutting assembly is based on the type of fruit that is to be harvested. For example, the blade and cutting assembly may be selected to harvest large fruit, including but not limited to coconut and jackfruit, smaller fruit, or even clusters of fruit, including but not limited to palm fruit and banana. In such embodiments, the cutting assembly 102b of UAV 102 may be configured to be compatible with all the different blades listed previously. In such embodiments, one blade and its guiding structure may be substituted for another based on the fruit that the UAV 102 will be used to harvest. In some embodiments, controller 102c may instruct cutting assembly 102b to begin operating the cutting wheel to separate a selected fruit from a tree, and in such embodiments, the controller 102c may also instruct the cutting assembly 102b to end operation of the cutting wheel once the fruit is separated from the tree. In such embodiments, the controller 102c may analyze the live stream provided by camera assembly 102a to determine whether the selected fruit is separated from the tree in order to stop the operation of the cutting wheel of cutting assembly 102b. In some embodiments, controller 102c of UAV 102 may receive instructions to start and stop operating the cutting wheel of cutting assembly 102b from a human operator operating remote web application 106 via the user I/O interface 106a. In such embodiments, the instructions received from the human operator via the remote web application 106 may be based on the live video feed received at the remote web application 106 from UAV 102.
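
A possible sketch of this cut-until-separated behavior, with a caller-supplied is_separated vision check standing in for the live-stream analysis; the timeout and polling interval are illustrative assumptions.

    import time

    def cut_target(cutting_assembly, camera, is_separated, timeout_s=30.0,
                   poll_s=0.2):
        """Spin the cutting wheel until the target is observed to separate
        from the tree, or until a safety timeout expires."""
        cutting_assembly.start_wheel()
        deadline = time.monotonic() + timeout_s
        try:
            while time.monotonic() < deadline:
                frame = camera.capture_frame()
                if is_separated(frame):    # vision check on the live stream
                    return True
                time.sleep(poll_s)
            return False                   # timed out; defer to the operator
        finally:
            cutting_assembly.stop_wheel()  # always stop the blades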

In some embodiments, the UAV 102 may include an interface to physically connect with cutting assemblies 102b. In such embodiments, the UAV 102 may also include a configuration application that configures the UAV 102 to the specific cutting mechanism it is carrying, since any one cutting mechanism design would not be adequate to operate on different types of fruits and the different nature of fruit locations on a tree.

Flying assembly 102e of UAV 102 may include propeller mounts that affix propellers to the UAV 102. In some embodiments, the propeller mounts affix single or coaxial propellers to the UAV. In some embodiments, the propellers of the flying assembly 102e may be used to lift the UAV 102 off the ground, and may also be used to turn the UAV 102 in various directions while the UAV 102 is in the air. In some embodiments, controller 102c may instruct flying assembly 102e to begin operating the propellers to lift the UAV 102 off the ground and maneuver it in the air, and in such embodiments, the controller 102c may also instruct the flying assembly 102e to end operation of the propellers. In such embodiments, the controller 102c instructs the UAV to lift to a certain altitude followed by a yaw motion to scan the surrounding environment. The height at which the preliminary scan is executed is based on the height of the surrounding trees/foliage and/or instructions or preset configurations from the remote web application. In such embodiments, the controller 102c may analyze the live stream provided by camera assembly 102a to determine whether UAV 102 should be moved closer to or away from a tree in order to examine the various fruits on the tree. In some embodiments, controller 102c of UAV 102 may receive instructions to start and stop operating the propellers of flying assembly 102e from a human operator operating remote web application 106 via the user VO interface 106a. In such embodiments, the instructions received from the human operator via the remote web application 106 may be based on the live video feed received at the remote web application 106 from UAV 102.

As the controller 102c of UAV 102 receives an image of a fruit hanging on a tree from camera assembly 102a, the controller 102c sends this image to the classifiers 102f for annotation. In some embodiments, the controller 102c may receive a live stream of video instead of an image. In such embodiments, the controller 102c may extract an image from the live stream of video for analysis. In some embodiments, the classifiers 102f are part of a machine learning algorithm and include a first supervised classifier based on pre-trained data of fruit images, and a second reinforcement learning classifier trained on rewards for matching the annotation updates made in the human-in-the-loop operator application. The first classifier receives the image of the fruit from the controller 102c. The first classifier of classifiers 102f annotates the image. In some embodiments, the first classifier may annotate the received image to show that the fruit is at the desired stage of maturation for harvesting or not at the desired stage of maturation for harvesting. In some embodiments, the first classifier may highlight portions of the fruit shown in the image to determine whether a fruit is at the desired stage of maturation or not at the desired stage of maturation for harvesting. In some embodiments, a fruit may be at a desired stage of maturation when the fruit is ripe. In some other embodiments, a fruit may be at the desired stage of maturation even at a certain unripe stage. In some embodiments, annotations may also include a bounding rectangle around the fruit, and the position of the fruit with respect to the camera frame. Additionally, annotations may also include multiple labels representing the type of fruit, category, estimated weight, bounding dimensions, stalk colour, bounding dimensions of the stalk, and other attributes of the fruit. In some embodiments, the classifiers 102f may be configured to provide more or fewer labels for each of the annotations in the images, or to limit the number of annotations captured/sent to the remote web application.
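
For illustration, one plausible data shape gathering the annotation attributes listed above; the field names are assumptions and are not taken from the filing.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class FruitAnnotation:
        bbox: tuple                         # (x, y, w, h) bounding rectangle
        position: tuple                     # position w.r.t. the camera frame
        fruit_type: str                     # e.g., "jackfruit"
        category: Optional[str] = None      # e.g., at desired maturation or not
        est_weight_kg: Optional[float] = None
        stalk_colour: Optional[str] = None
        stalk_bbox: Optional[tuple] = None  # bounding dimensions of the stalk
        extra_labels: dict = field(default_factory=dict)  # open-ended labels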

Once the first classifier of classifiers 102f annotates the images, the controller 102c transmits the annotated image to the remote web application 106 using the antenna 102g.

Remote web application 106 includes a user input/output (I/O) 106a and a controller 106b. The controller 106b may be configured to receive an annotated image from UAV 102. The received image may be displayed to a human operator operating the remote web application 106. In some embodiments, the remote web application 106 may be executed on mobile devices such as smartphones, tablets, or portable computers. In such embodiments, the user I/O 106a may render the remote web application 106 on a display of the mobile devices described above. In such embodiments, the display may be interactive and be receptive of user inputs. In some other embodiments, the human operator may interact with the remote web application 106 using other user input devices such as mouse, keyboard, stylus or other input device. In some embodiments, the remote web application 106 may render the annotated image received from the UAV 102 on the display of the mobile device that executes the remote web application 106. The human operator may correct the annotated image. The human operator may reposition the existing annotations if the human operator believes they were placed incorrectly, adjust size, i.e., bounding box of the annotations, delete annotations that the human operator believes to be incorrect, or add json labels to annotations that the human operator believes were missing from the image, to capture any metadata to the annotations that could be utilized as feedback to train the classifiers for future operations.

Once the human operator is done correcting the annotations in the image frameset, the human operator may instruct the controller 106b of the remote web application 106 to transmit the corrected image back to the UAV 102. The image is received by the antenna 102g of the UAV 102. The corrected image is sent back to the classifiers 102f, specifically to the second classifier of the classifiers 102f. The second classifier is responsible for training using the corrected annotations received from the remote web application 106. In some embodiments, the second classifier of the classifiers 102f is a reinforcement learning agent with a pre-defined action space matching the set of actions allowed on the annotated image framesets (viz., repositioning annotations in the frame, resizing an annotation’s bounding box, deleting annotations, and adding/modifying labels in the annotations) and a reward function that optimizes action selection policies by matching the output annotated image frameset received from the remote web application 106 against the input annotated image frameset received from the first classifier. Any differences between the annotations made by the first classifier and those received from the remote web application are saved in a training dataset in memory 102d. The training data set is used to train the controller 102c of the UAV 102 to annotate further images. The controller 102c parses the corrected image received from the remote web application 106 for instructions on whether to harvest the fruit in the corrected image. If the controller 102c parses the instructions from the corrected image and determines that the fruit should be harvested, the controller 102c instructs the cutting assembly 102b to cut the stem of the fruit and separate it from the tree.
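
A minimal sketch of the reward idea described above, assuming each edit is encoded as a hashable (action, bbox, label) tuple; the action names and scoring are illustrative only.

    # Action space mirroring the edits allowed in the operator application.
    ACTIONS = ("reposition", "resize_bbox", "delete", "add_or_modify_label")

    def reward(agent_edits, operator_edits):
        """+1 for each agent edit that matches an operator edit, -1 for
        each spurious agent edit and each operator edit the agent missed."""
        agent, operator = set(agent_edits), set(operator_edits)
        return (len(agent & operator)
                - len(agent - operator)    # edits the operator did not make
                - len(operator - agent))   # operator edits the agent missed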

If the controller 102c parses the instructions from the corrected image and determines that the fruit should not be harvested, the controller 102c instructs the flying assembly 102e to maneuver the drone to a different fruit in the tree. Once the drone is repositioned, the controller 102c instructs camera assembly 102a to snap an image of the different fruit and that image is sent to the first classifier of classifiers 102f for annotation. The annotated image is then transmitted by the antenna 102g to the remote web application 106 and a corrected image is received from the remote web application 106. The controller 102c parses the corrected image for instructions relating to the harvesting of the different fruit in the image. Based on the received instructions, the controller 102c controls the operation of the UAV 102.

The UAV 102 is connected to the unmanned ground vehicle (UGV) 104. The primary communication interfaces between the UGV and UAV include a 2.4 GHz FHSS radio and an 802.11 Wi-Fi radio. In some embodiments, additional wireless communication interfaces may connect the UAV 102 and the UGV 104 over a radio connection such as 4G LTE, or the like. In some embodiments, the UGV 104 includes an antenna 104c that assists in establishing the connections between the UGV 104 and the UAV 102. In some embodiments, the antenna 104c may also be used to establish a connection between UGV 104 and the remote web application 106. In some embodiments, the UGV 104 may be controlled by a controller 104a located within the UGV 104. Alternatively, the UGV 104 may be controlled by a human operator that controls the remote web application 106. In some embodiments, the UGV 104 is connected to the UAV 102 through drone cables. In some embodiments, the connection between the UAV 102 and the UGV 104 is controlled by tether unit 104b. In such embodiments, the tether unit 104b controls the physical and wireless connections between the UAV 102 and the UGV 104. In some embodiments, the UAV 102 may connect to the UGV 104 without the use of any drone cables. In such embodiments, the connection between UAV 102 and UGV 104 may be entirely wireless. The UGV is electrically powered using an onboard Li-ion battery (120-200 Ah). The UGV moves the UAV to different locations on a farm. The UGV dispatches the UAV from the closest accessible ground to the target set of trees, usually in the range of 20-400 feet from the trees. After operation, or when power runs low, the UAV is recalled to the UGV, where the larger battery of the UGV is used to charge spare battery sets that are manually swapped into the UAV.

In some embodiments, the UGV includes a UAV guidance unit 104d that allocates a configurable workspace zone around the UGV 104. In such embodiments, the UAV guidance unit 104d utilizes a Lidar, a depth camera, and an associated computer to create a spherical volume of about 200 feet radius around the UGV 104 for operation. In such embodiments, the UAV 102 may only operate in the configured workspace around the UGV 104. In some embodiments, the UAV 102 continuously sends its activity-state and position data to the UGV 104. In some embodiments, if the UAV 102 wishes to move out of the configurable workspace, the UAV 102 sends a request to the UGV 104 before heading out of the allocated workspace zone. In such embodiments, the UAV guidance unit 104d of the UGV 104 acts on this request based on evaluation of available power in the UAV 102 and the estimated power required in order to venture out of the configured workspace. In some embodiments, the estimated power may be calculated based on a predicted location that the UAV 102 may transmit to the UGV 104. In such embodiments, the UAV guidance unit 104d accordingly attempts to relocate and/or sends a notification for assisted movement to the UAV 102.
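
A simple containment test consistent with the spherical 200-foot workspace described above; representing positions as (x, y, z) coordinates in feet is an assumption for illustration.

    import math

    WORKSPACE_RADIUS_FT = 200.0  # spherical volume given in the text

    def inside_workspace(uav_pos, ugv_pos, radius_ft=WORKSPACE_RADIUS_FT):
        """True if the UAV position lies within the spherical workspace
        zone centred on the UGV."""
        return math.dist(uav_pos, ugv_pos) <= radius_ft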

Fig. 2 is a simplified diagram depicting an exemplary environment for using an unmanned aerial vehicle to harvest fruit according to one or more examples of the present application. UAV 202 shown in Fig. 2 is connected to UGV 206 using drone cables 204. The drone cables 204 are a part of the tether unit 104b of UGV 206 that are used to connect the UAV 202 to the UGV 206. In some embodiments, the UAV 202 may connect to the UGV 206 without the use of any drone cables 204. In such embodiments, the connection between UAV 202 and UGV 206 may be entirely wireless. The UAV 202 includes a camera assembly 102a and a cutting assembly 102b. The UAV 202 is maneuvered close to fruit 210 of tree 208 in order to determine which of the fruit 210 on tree 208 may be harvested. In some embodiments, the UAV 202 may be controlled by a human operator 212 via the remote web application rendered on a mobile device 214.

Fig. 3 is an exemplary process for using an unmanned aerial vehicle to harvest fruit according to one or more examples of the present application. The process 300 may be performed by components of the environment 100 of Fig. 1, such as the controller 102c. However, it will be recognized that any of the following blocks may be performed in any suitable order and that the process 300 may be performed in any environment and by any suitable computing device and/or controller.

At block 302, the controller (e.g., 102c) guides the unmanned aerial vehicle (e.g., UAV 102) to a cutting location on a tree. At block 304, the controller 102c captures an image using the camera assembly (e.g., camera assembly 102a). At block 306, the controller 102c annotates the image using a training data set of images. At block 308, the controller 102c transmits the annotated image to the remote web application for correction. At block 310, the controller 102c receives from the remote web application, a corrected image. At block 312, the controller 102c parses the corrected image for instructions. At block 314, the controller 102c controls the cutting assembly (e.g., cutting assembly 102b) based on the parsed instructions.

Fig. 4 is another exemplary process for using an unmanned aerial vehicle to harvest fruit according to one or more examples of the present application.

The process 400 may be performed by components of the environment 100 of Fig. 1, such as the controller 102c. However, it will be recognized that any of the following blocks may be performed in any suitable order and that the process 400 may be performed in any environment and by any suitable computing device and/or controller.

At block 402 and with reference to the structure of Fig. 1, the controller 102c receives an image captured by the camera assembly 102a of a fruit hanging on a tree. At block 404, the controller 102c transmits the image to the first classifier of classifiers 102f. At block 406, the first classifier of classifiers 102f provides preliminary annotations to the received image. At block 407, the data related to the preliminary annotations is fused with the captured image data. In some embodiments, fusing the data includes overlaying the preliminary annotations of block 406 on the captured image from block 402. At block 408, the controller 102c, relying on the antenna 102g and its associated wireless network interface, transmits the annotated image to the remote web application for review. Simultaneously, the controller 102c transmits the captured image from block 402 to the second classifier of the classifiers 102f at block 410. The controller 102c then receives the corrected image from the remote web application via the antenna 102g and its associated wireless network interface. The second classifier of classifiers 102f compares the corrected image to the annotated image. The changes made to the annotated image in the web application are extracted and saved in a training data set for training the classifiers 102f further. At block 412, the changes made in the corrected image and the annotated image are fused together, and the corrected image is generated and sent to the controller 102c for action, i.e., cutting or not cutting the fruit. In some embodiments, the changes to the annotated image are also sent from the corrected image to the first classifier of classifiers 102f to train the first classifier for future annotations.
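
To illustrate blocks 410-412, the sketch below diffs the corrected frame against the annotated frame and fuses the operator's changes back in, reusing the illustrative annotation shapes from earlier; keying annotations by bounding box is an assumption made here for simplicity.

    def extract_changes(annotated, corrected):
        """Return annotations the operator added, removed, or altered,
        keyed by bounding box (assumed stable identifier)."""
        before = {a.bbox: a for a in annotated.annotations}
        after = {a.bbox: a for a in corrected.annotations}
        return {
            "added":   [after[b] for b in after.keys() - before.keys()],
            "removed": [before[b] for b in before.keys() - after.keys()],
            "altered": [after[b] for b in after.keys() & before.keys()
                        if after[b].labels != before[b].labels],
        }

    def fuse(annotated, changes):
        """Apply the operator's changes onto the annotated frame."""
        drop = ({a.bbox for a in changes["removed"]}
                | {a.bbox for a in changes["altered"]})
        kept = [a for a in annotated.annotations if a.bbox not in drop]
        annotated.annotations = kept + changes["added"] + changes["altered"]
        return annotated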

While embodiments of the invention have been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. It will be understood that changes and modifications may be made by those of ordinary skill within the scope of the following claims. In particular, the present invention covers further embodiments with any combination of features from different embodiments described above and below. For example, the various embodiments of the kinematic, control, electrical, mounting, and user interface subsystems can be used interchangeably without departing from the scope of the invention. Additionally, statements made herein characterizing the invention refer to an embodiment of the invention and not necessarily all embodiments.

The terms used in the claims should be construed to have the broadest reasonable interpretation consistent with the foregoing description. For example, the use of the article “a” or “the” in introducing an element should not be interpreted as being exclusive of a plurality of elements. Likewise, the recitation of “or” should be interpreted as being inclusive, such that the recitation of “A or B” is not exclusive of “A and B,” unless it is clear from the context or the foregoing description that only one of A and B is intended. Further, the recitation of “at least one of A, B and C” should be interpreted as one or more of a group of elements consisting of A, B and C, and should not be interpreted as requiring at least one of each of the listed elements A, B and C, regardless of whether A, B and C are related as categories or otherwise. Moreover, the recitation of “A, B and/or C” or “at least one of A, B or C” should be interpreted as including any singular entity from the listed elements, e.g., A, any subset from the listed elements, e.g., A and B, or the entire list of elements A, B and C.

When a group of materials, compositions, components or compounds is disclosed herein, it is understood that all individual members of those groups and all subgroups thereof are disclosed separately. Every combination of components described or exemplified herein can be used to practice the invention, unless otherwise stated. Whenever a range is given in the specification, for example, a temperature range, a time range, or a composition range, all intermediate ranges and subranges, as well as all individual values included in the ranges given are intended to be included in the disclosure. Additionally, the end points in a given range are to be included within the range. In the disclosure and the claims, “and/or” means additionally or alternatively. Moreover, any use of a term in the singular also encompasses plural forms.

As used herein, “comprising” is synonymous with "including," "containing," or "characterized by," and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps. As used herein, "consisting of" excludes any element, step, or ingredient not specified in the claim element. As used herein, "consisting essentially of" does not exclude materials or steps that do not materially affect the basic and novel characteristics of the claim. Any recitation herein of the term “comprising”, particularly in a description of components of a composition or in a description of elements of a device, is understood to encompass those compositions and methods consisting essentially of and consisting of the recited components or elements.

One of ordinary skill in the art will appreciate that starting materials, device elements, analytical methods, mixtures and combinations of components other than those specifically exemplified can be employed in the practice of the invention without resort to undue experimentation. All art-known functional equivalents of any such materials and methods are intended to be included in this invention. The terms and expressions which have been employed are used as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding any equivalents of the features shown and described or portions thereof, but it is recognized that various modifications are possible within the scope of the invention claimed. The invention illustratively described herein suitably may be practiced in the absence of any element or elements, limitation or limitations which is not specifically disclosed herein. Headings are used herein for convenience only.