

Title:
APPARATUS AND SYSTEM FOR SELECTIVE CROP HARVESTING
Document Type and Number:
WIPO Patent Application WO/2022/254203
Kind Code:
A1
Abstract:
Broadly speaking, embodiments of the present techniques provide a robotic end-effector for selective crop harvesting, which is particularly suitable for harvesting crops that grow in dense clusters, such as strawberries.

Inventors:
GHALAMZAN ESFAHANI AMIR MASOUD (GB)
Application Number:
PCT/GB2022/051386
Publication Date:
December 08, 2022
Filing Date:
June 01, 2022
Assignee:
UNIV OF LINCOLN (GB)
International Classes:
A01D46/30
Domestic Patent References:
WO2020159123A1 (2020-08-06)
WO2016055552A1 (2016-04-14)
Foreign References:
US20200008355A1 (2020-01-09)
CN110393089A (2019-11-01)
FR2760595A1 (1998-09-18)
Attorney, Agent or Firm:
APPLEYARD LEES IP LLP (GB)
Claims:
CLAIMS

1. A robotic end-effector for fruit harvesting, the robotic end-effector comprising: a vision system for identifying a location of a ripe fruit on a plant; a first pair of fingers for moving any objects that at least partly occlude the identified ripe fruit on the plant; a second pair of fingers for gripping a stem of the identified ripe fruit, the second pair of fingers comprising a sensor for indicating when the stem is located between the second pair of fingers in a position suitable for gripping; and a cutting mechanism for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.

2. The robotic end-effector as claimed in claim 1 wherein the cutting mechanism is provided in proximity to the second pair of fingers, such that when the stem of the identified ripe fruit is cut, the second pair of fingers grips a portion of the stem that remains attached to the fruit.

3. The robotic end-effector as claimed in claim 2 wherein the cutting mechanism cuts the stem at a point where the stem protrudes from the second pair of fingers.

4. The robotic end-effector as claimed in claim 2 wherein one of the fingers of the second pair of fingers comprises a slot in the finger, and the cutting mechanism is arranged to move through the slot to cut a stem gripped by the second pair of fingers.

5. The robotic end-effector as claimed in claim 4 wherein the other of the fingers of the second pair of fingers comprises a groove for receiving a cutting edge of the cutting mechanism when the cutting mechanism moves through the slot to cut the stem.

6. The robotic end-effector as claimed in claim 4 wherein the other of the fingers of the second pair of fingers comprises a gripping surface, and wherein the gripping surface comprises an angled portion for receiving a cutting edge of the cutting mechanism when the cutting mechanism moves through the slot to cut the stem.

7. The robotic end-effector as claimed in any preceding claim wherein the vision system comprises a depth sensor for generating a three-dimensional map of the plant, and wherein the vision system uses the three-dimensional map to identify locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits.

8. The robotic end-effector as claimed in claim 7 wherein the depth sensor is an RGB-D camera.

9. The robotic end-effector as claimed in claim 7 or 8 further comprising: a first actuation mechanism for controlling the actuation of the first pair of fingers.

10. The robotic end-effector as claimed in claim 9 wherein, responsive to receiving the location of an object that at least partly occludes a fruit, the first actuation mechanism controls the first pair of fingers to push away the object, by increasing a separation distance between the first pair of fingers.

11. The robotic end-effector as claimed in claim 9 or 10 wherein the first pair of fingers are haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.

12. The robotic end-effector as claimed in claim 11 wherein the forces measured by the sensors of the haptic fingers are used by the first actuation mechanism to control the actuation of the haptic fingers.

13. The robotic end-effector as claimed in any preceding claim wherein the vision system comprises: at least one image sensor provided in the vicinity of the second pair of fingers for determining when a stem of the ripe fruit is located between the second pair of fingers.

14. The robotic end-effector as claimed in claim 13 wherein the at least one image sensor is an RGB sensor.

15. The robotic end-effector as claimed in claim 13 wherein the at least one image sensor is two RGB sensors provided in the vicinity of the second pair of fingers to enable stereo vision.

16. The robotic end-effector as claimed in any preceding claim further comprising: a second actuation mechanism for controlling the actuation of the second pair of fingers and the cutting mechanism.

17. The robotic end-effector as claimed in claim 16, wherein, responsive to receiving feedback from the sensor that the stem of the ripe fruit is located between the second pair of fingers, the second actuation mechanism controls the second pair of fingers to grip the stem, by decreasing a separation distance between the second pair of fingers.

18. The robotic end-effector as claimed in any preceding claim wherein the sensor of the second pair of fingers is a proximity sensor.

19. A robotic system for fruit harvesting, the robotic system comprising: at least one picking arm; a control system for controlling the at least one picking arm; and a robotic end-effector, according to any one of claims 1 to 18, coupled to the at least one picking arm.

20. The robotic system as claimed in claim 19 further comprising at least one container for receiving fruit harvested by the robotic end-effector, wherein after the cutting mechanism has cut the stem of a ripe fruit, the control system controls the at least one picking arm to move the robotic end-effector to above the at least one container, and wherein the second actuation mechanism controls the second pair of fingers to release the stem and the attached fruit, by increasing a separation distance between the second pair of fingers, so that the fruit drops into the container.

21. The robotic system as claimed in claim 19 or 20 further comprising a mechanism for moving the robotic system.

22. A method for fruit harvesting using a robotic end-effector, the method comprising: identifying, using a vision system of the robotic end-effector, a location of a ripe fruit on a plant; moving, using a first pair of fingers of the robotic end-effector, any objects that at least partly occlude the identified ripe fruit on the plant; gripping a stem of the identified ripe fruit, using a second pair of fingers of the robotic end-effector; sensing, using a sensor of the second pair of fingers, when the stem is located between the second pair of fingers; and cutting, using a cutting mechanism, the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.

23. The method as claimed in claim 22 wherein moving any objects that at least partly occlude the identified ripe fruit comprises controlling a first actuation mechanism to increase a separation distance between the first pair of fingers.

24. The method as claimed in claim 22 or 23 wherein the first pair of fingers are haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers, and wherein moving any objects that at least partly occlude the identified ripe fruit comprises using the forces measured by the sensors to control the first actuation mechanism.

25. The method as claimed in claim 22, 23 or 24 wherein gripping a stem of the identified ripe fruit comprises: receiving feedback from the sensor that the stem of the identified ripe fruit is located between the second pair of fingers; and controlling a second actuation mechanism to decrease a separation distance between the second pair of fingers.

26. The method as claimed in any of claims 22 to 25 further comprising: identifying all ripe fruits on the plant; selecting a ripe fruit from the identified fruits to harvest; determining whether the selected ripe fruit is in a cluster; and determining a harvesting schedule when the selected ripe fruit is determined to be in a cluster, the harvesting schedule defining an order in which ripe fruits in the cluster are to be harvested.

27. A non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out the method of any of claims 22 to 26.

Description:
Apparatus and System for Selective Crop Harvesting

Field

The present techniques generally relate to an apparatus, system and method for selective crop harvesting. In particular, the present techniques provide a robotic end-effector for fruit harvesting that is able to detect, select and cut edible crops that grow in dense clusters.

Background

Picking fruits in different growing conditions is a challenging problem for selective harvesting technology, as it is difficult to design and build an effective robotic device (known as an end-effector or picking head) which is able to deal with complex picking operations. The human hand enables dexterous manipulation of a fruit with 27 degrees of freedom, and over 80% of its grasping information can be encoded into just six eigen-grasps. In contrast, conventional robotic end-effectors are customised for specific applications, such as pick-and-place operations in industrial environments.

Currently, there are two types of picking heads available for robotic harvesting of high value crops: (i) a picking head having a parallel jaw gripper, which may not be suitable for all types of crops, and (ii) a picking head that has a customised design for picking particular fruit in a very specific picking scenario, which is only suitable for a specific type of crop or method of harvesting. Consequently, the effectiveness of commonly available robotic picking heads is limited, as different robotic picking heads may be needed for different crop types.

Some robotic picking heads are used to pick soft fruits such as strawberries. Some of the robotic picking heads that are currently available for picking strawberries are cup-shaped picking heads, which have opening parts that locate the peduncle of a strawberry and position the strawberry in front of cutting scissors in order to harvest the strawberry. The cutting action causes the strawberry to detach from the plant and fall into a punnet for collecting the strawberries. In this example, the picking head does not directly touch the flesh of the strawberry, which minimises bruising. However, because the strawberry falls from a height into the punnet, the harvesting can inadvertently cause damage/bruising to the fruit. Furthermore, fruit placement within the punnet is not controlled, which may result in uneven distribution of the fruit in the punnet (which may also cause damage to fruit that are below other fruit).

Similarly, the design of the cup-shaped picking head, and the design of other types of picking head, may not be suitable for harvesting crops that grow in dense clusters.

The present applicant has therefore identified the need for an improved apparatus for automatic detection, selection and harvesting of crops that grow in dense clusters.

Summary

In a first approach of the present techniques, there is provided a robotic end-effector for fruit harvesting, the robotic end-effector comprising: a vision system for identifying a location of a ripe fruit on a plant; a first pair of fingers for moving any objects that at least partly occlude the identified ripe fruit on the plant; a second pair of fingers for gripping a stem of the identified ripe fruit, the second pair of fingers comprising a sensor for indicating when the stem is located between the second pair of fingers in a position suitable for gripping; and a cutting mechanism for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.

The present techniques advantageously enable ripe fruit to be harvested without bruising or damaging the fruit. The present techniques are particularly advantageous for harvesting fruit that grows in dense clusters, such as strawberries. It will be understood that this is an example, and non-limiting, fruit that may be harvested using the robotic end-effector of the present techniques. More generally, the present techniques may be used to harvest different types of fruit and vegetable crop, including those which grow individually and those which grow in clusters.

The cutting mechanism is preferably provided in proximity to the second pair of fingers, such that when the stem of the identified ripe fruit is cut by the cutting mechanism, the second pair of fingers continues to grip a portion of the stem that remains attached to the fruit. In other words, when a cutting operation performed by the cutting mechanism is complete, the fruit is not immediately dropped into a container for collecting the harvested fruit. Instead, the fruit continues to be gripped, via a portion of the stem that is still attached to the fruit, by the second pair of fingers. This is advantageous because the robotic end-effector may be controlled to gently place the harvested fruit in a container. The second pair of fingers may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.

Preferably, the cutting mechanism may be partially or fully covered or encased for safety reasons, i.e. to avoid any risk of a human operator being able to come into contact with the cutting mechanism.

When the stem of the identified ripe fruit is gripped by the second pair of fingers, the identified ripe fruit may be in proximity to a first side (e.g. a bottom side) of the second pair of fingers. The cutting mechanism may be provided in proximity to a second, opposite side (e.g. a top side) of the second pair of fingers. Thus, the cutting mechanism may be positioned relative to the second pair of fingers such that it cuts the stem at a point where the stem protrudes from the second pair of fingers. In this way, a portion of the stem that is still attached to the fruit remains gripped by the second pair of fingers.

Alternatively, one of the fingers of the second pair of fingers may comprise a slot in the finger, which extends all the way through the finger, from an edge of the finger to a gripping surface of the finger. The cutting mechanism may be arranged to move through the slot to cut a stem gripped by the second pair of fingers. This may be advantageous because a portion of the stem which is gripped by the second pair of fingers may be held more firmly and/or may be substantially straight (compared to the stem which protrudes from the second pair of fingers), which may make it easier for the cutting mechanism to cut through the stem.

The other of the fingers of the second pair of fingers may comprise a groove for receiving a cutting edge of the cutting mechanism when the cutting mechanism moves through the slot to cut the stem. This may enable the cutting mechanism to fully cut through the stem, as the groove provides a space for the cutting edge of the cutting mechanism to pass through the stem.

In another example, the other of the fingers of the second pair of fingers may comprise a gripping surface. The gripping surface may comprise an angled portion for receiving a cutting edge of the cutting mechanism when the cutting mechanism moves through the slot to cut the stem. This may enable the cutting mechanism to cut the stem at an angle, which may advantageously enable the stem to be cut using a single cutting action.

The vision system may comprise a depth sensor for generating a three-dimensional map of the plant. The vision system may use the three-dimensional map to identify locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits. The depth sensor may be an RGB-D (red-green-blue-depth) camera.
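By way of illustration only, the passage above can be sketched in code. The colour threshold, camera intrinsics and function name below are all hypothetical values invented for this sketch (the patent does not prescribe any particular ripeness test or camera model); the sketch simply shows how red-dominant pixels in an RGB image might be back-projected through their depth values to give three-dimensional fruit locations.

```python
# Hypothetical sketch: locate ripe (red-dominant) pixels in an RGB image and
# use the aligned depth image to place each detection in camera coordinates.
# Intrinsics (fx, fy, cx, cy) are illustrative pinhole-camera values.

def locate_ripe_fruit(rgb, depth, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Return (x, y, z) camera-frame coordinates of ripe-looking pixels.

    rgb:   rows of (r, g, b) tuples; depth: rows of metres from the sensor.
    """
    locations = []
    for v, row in enumerate(rgb):
        for u, (r, g, b) in enumerate(row):
            if r > 150 and r > 2 * g and r > 2 * b:  # crude "ripe" colour test
                z = depth[v][u]
                if z > 0:
                    # Back-project the pixel through the pinhole model
                    locations.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return locations
```

In practice the segmentation would come from a trained detector rather than a fixed colour threshold, but the back-projection step is the same.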

The robotic end-effector may further comprise a first actuation mechanism for controlling the actuation of the first pair of fingers. Thus, a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers.

Responsive to receiving the location of an object that at least partly occludes a fruit, the first actuation mechanism may control the first pair of fingers to push away the object, by increasing a separation distance between the first pair of fingers. Thus, the first pair of fingers may be close together when the robotic end-effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit. The first pair of fingers may be moved further apart when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers can grip a stem of the fruit.
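The open/close behaviour described above can be sketched as a minimal actuator model. The class name and the separation distances are assumptions made for illustration; the patent does not specify particular values.

```python
# Illustrative sketch of the first-finger actuation logic: stay narrow while
# approaching or imaging, widen to push occluding material aside.

class FirstFingerActuator:
    def __init__(self, closed_mm=5.0, open_mm=60.0):
        self.closed_mm = closed_mm     # separation while approaching/imaging
        self.open_mm = open_mm         # separation used to push occluders away
        self.separation_mm = closed_mm

    def update(self, occluder_detected):
        # Increase the separation distance only when an occluder is reported.
        self.separation_mm = self.open_mm if occluder_detected else self.closed_mm
        return self.separation_mm
```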

The first pair of fingers may be non-sensing fingers. That is, the first pair of fingers may not themselves obtain any feedback about the objects they contact. Alternatively, the first pair of fingers may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers. This may be advantageous because the sensors can measure interactions between the first pair of fingers and the plant, which enables intelligent manipulation of a cluster of fruits when other sensors (e.g. visual sensors) cannot see what the first pair of fingers are interacting with due to occlusion. The forces measured by the sensors of the haptic fingers may be used by the first actuation mechanism to control the actuation of the haptic fingers, and the movements of a robotic manipulator. This may be advantageous because more effective manipulation movements may be generated to push away occluding matter, which increases the success rate of cluster manipulation. Furthermore, it may avoid exerting large or excessive forces on soft fruits, thereby minimising the risk of bruising the fruits during the picking process.
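A force-limited control step of the kind described above might look as follows. The force threshold and step size are invented for this sketch; the point is only that the measured haptic force caps how aggressively the fingers open, so excessive force is never applied to soft fruit.

```python
# Hedged sketch of force-limited haptic-finger actuation: advance the finger
# separation towards its target, but retreat whenever the measured contact
# force exceeds a safety threshold. All thresholds are illustrative.

def next_separation(current_mm, target_mm, measured_force_n,
                    max_force_n=2.0, step_mm=2.0):
    """Return the next finger separation given one force measurement."""
    if measured_force_n > max_force_n:
        return max(current_mm - step_mm, 0.0)    # back off to relieve the fruit
    return min(current_mm + step_mm, target_mm)  # otherwise keep pushing open
```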

The vision system may comprise at least one image sensor for capturing images of the fruit/cluster of fruits. The at least one image sensor may be an RGB sensor. In some cases, the at least one image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing). Stereo vision may be useful because it provides the robotic end-effector with richer sensory information. Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.

The robotic end-effector may further comprise a second actuation mechanism for controlling the actuation of the second pair of fingers and the cutting mechanism. Thus, a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers and the cutting mechanism. As the cutting mechanism is only operated when it is confirmed that the second pair of fingers are gripping a stem of a fruit to be harvested, a single actuation mechanism can advantageously be used to control both the second pair of fingers and the cutting mechanism, thereby reducing complexity and the number of components needed to control the robotic end-effector.

As noted above, the second pair of fingers comprise a sensor for determining when a stem is located between the second pair of fingers. Advantageously, the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested. Responsive to receiving feedback from the sensor that the stem of the ripe fruit is located between the second pair of fingers, the second actuation mechanism may control the second pair of fingers to grip the stem, by decreasing a separation distance between the second pair of fingers.
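The sequencing described above (sense the stem, then grip, then cut) can be expressed as a small state machine. The state names and transition rule are hypothetical, chosen only to illustrate that the cutter is enabled strictly after the sensor has confirmed a grip.

```python
# Hypothetical state machine for the second pair of fingers and the cutter:
# 'waiting' -> 'gripping' (stem sensed; separation decreased) -> 'cutting'.

def harvest_step(state, stem_sensed):
    """Advance one control step; the cutter only runs after a confirmed grip."""
    if state == "waiting" and stem_sensed:
        return "gripping"    # decrease finger separation to grip the stem
    if state == "gripping":
        return "cutting"     # stem held firmly; actuate the cutting mechanism
    return state
```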

The sensor of the second pair of fingers may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example of a proximity sensor, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers in a position suitable for gripping.

In a second approach of the present techniques, there is provided a robotic system for fruit harvesting, the robotic system comprising: at least one picking arm; a control system for controlling the at least one picking arm; and a robotic end-effector, of the type described herein, coupled to the at least one picking arm.

The robotic system may further comprise at least one container for receiving fruit harvested by the robotic end-effector, wherein after the cutting mechanism has cut the stem of a ripe fruit, the control system controls the at least one picking arm to move the robotic end-effector to above the at least one container, and wherein the second actuation mechanism controls the second pair of fingers to release the stem and the attached fruit, by increasing a separation distance between the second pair of fingers, so that the fruit drops into the container.
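The drop-off behaviour above can be sketched as an ordered sequence: the arm positions the end-effector above the container before the fingers open, so the fruit falls only a short distance. The function names, pose value and separation distance are invented for this sketch and do not appear in the patent.

```python
# Illustrative drop-off sequence: move above the container first, then widen
# the second pair of fingers so the stem (and attached fruit) is released.
# `move_arm_to` and `set_finger_separation` stand in for the control system
# and second actuation mechanism respectively.

def deliver_fruit(move_arm_to, set_finger_separation,
                  container_pose=(0.0, 0.4, 0.3), open_mm=20.0):
    events = []
    move_arm_to(container_pose)       # position end-effector above container
    events.append("above_container")
    set_finger_separation(open_mm)    # increase separation: fruit drops in
    events.append("released")
    return events
```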

The robotic system may further comprise a mechanism for moving the robotic system towards, between and/or around plants. The mechanism may be a tracked or wheeled rover, or a vehicle capable of navigating autonomously.

In a third approach of the present techniques, there is provided a method for fruit harvesting using a robotic end-effector, the method comprising: identifying, using a vision system of the robotic end-effector, a location of a ripe fruit on a plant; moving, using a first pair of fingers of the robotic end-effector, any objects that at least partly occlude the identified ripe fruit on the plant; gripping a stem of the identified ripe fruit, using a second pair of fingers of the robotic end-effector; sensing, using a sensor of the second pair of fingers, when the stem is located between the second pair of fingers; and cutting, using a cutting mechanism, the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
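The five steps of the method above can be sketched as a single pipeline. Each callable is a stand-in for a subsystem of the end-effector; none of these names are from the patent, and the early-return behaviour (give up if no fruit or no stem is sensed) is an assumption for illustration.

```python
# Hedged sketch of the claimed harvesting method as a pipeline of subsystem
# callables: vision -> occluder removal -> stem sensing -> grip -> cut.

def harvest(vision, push_occluders, stem_sensed, grip_stem, cut_stem):
    fruit = vision()          # 1. identify a ripe fruit's location
    if fruit is None:
        return None
    push_occluders(fruit)     # 2. move objects occluding the fruit
    if not stem_sensed():     # 3. sense the stem between the second fingers
        return None
    grip_stem()               # 4. grip the stem
    cut_stem()                # 5. cut; fingers keep the attached stem portion
    return fruit
```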

The step of moving any objects that at least partly occlude the identified ripe fruit may comprise controlling a first actuation mechanism to increase a separation distance between the first pair of fingers.

The first pair of fingers may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers. The step of moving any objects that at least partly occlude the identified ripe fruit may comprise using the forces measured by the sensors to control the first actuation mechanism, as well as the movements of a robotic manipulator. As mentioned above, this may be advantageous because the measured forces may enable more effective manipulation movements to be generated to push away occluding matter, which yields increased success of cluster manipulation. Furthermore, it may avoid exerting large or excessive forces on soft fruits, thereby minimising the risk of bruising the fruits during the picking process.

The step of gripping a stem of the identified ripe fruit may comprise: receiving feedback from the sensor that the stem of the identified ripe fruit is located between the second pair of fingers; and controlling a second actuation mechanism to decrease a separation distance between the second pair of fingers. Advantageously, the feedback from the sensor (which may be a proximity sensor) may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.

The method may comprise identifying, using the vision system, all ripe fruits on the plant, and selecting one such ripe fruit to harvest. In some cases, the identified ripe fruit may be located on its own on a plant, such that harvesting the fruit is relatively straightforward. In other cases, the selected ripe fruit may be in a cluster of fruits (as is the case with strawberries, for example). In these cases, the identified ripe fruit might not be easy to harvest, as it may be occluded by other fruits in the cluster. Thus, the method may comprise determining whether the selected ripe fruit is in a cluster; and determining a harvesting schedule when the selected ripe fruit is determined to be in a cluster, the harvesting schedule defining an order in which ripe fruits in the cluster are to be harvested.
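One plausible ordering rule for the harvesting schedule mentioned above is "least occluded first", since removing an accessible ripe fruit tends to reduce occlusion of the rest of the cluster. This rule and the occlusion score are assumptions for this sketch; the patent does not prescribe a particular scheduling criterion.

```python
# Minimal harvesting-schedule sketch: order the ripe fruits in a cluster by
# an occlusion score (hypothetically supplied by the vision system), picking
# the least-occluded fruit first.

def harvesting_schedule(cluster):
    """cluster: list of (fruit_id, is_ripe, occlusion_score) tuples.

    Returns fruit ids in the order they should be harvested.
    """
    ripe = [(fid, occ) for fid, is_ripe, occ in cluster if is_ripe]
    ripe.sort(key=lambda item: item[1])   # least occluded is picked first
    return [fid for fid, _ in ripe]
```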

In a related approach of the present techniques, there is provided a non-transitory data carrier carrying processor control code to implement any of the methods, processes and techniques described herein.

As will be appreciated by one skilled in the art, the present techniques may be embodied as a system, method or computer program product. Accordingly, present techniques may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.

Furthermore, the present techniques may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Computer program code for carrying out operations of the present techniques may be written in any combination of one or more programming languages, including object oriented programming languages and conventional procedural programming languages. Code components may be embodied as procedures, methods or the like, and may comprise sub-components which may take the form of instructions or sequences of instructions at any of the levels of abstraction, from the direct machine instructions of a native instruction set to high-level compiled or interpreted language constructs.

Embodiments of the present techniques also provide a non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out any of the methods described herein.

The techniques further provide processor control code to implement the above-described methods, for example on a general purpose computer system or on a digital signal processor (DSP). The techniques also provide a carrier carrying processor control code to, when running, implement any of the above methods, in particular on a non-transitory data carrier. The code may be provided on a carrier such as a disk, a microprocessor, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier.

Code (and/or data) to implement embodiments of the techniques described herein may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (RTM) or VHDL (Very high speed integrated circuit Hardware Description Language). As the skilled person will appreciate, such code and/or data may be distributed between a plurality of coupled components in communication with one another.

The techniques may comprise a controller which includes a microprocessor, working memory and program memory coupled to one or more of the components of the system.

It will also be clear to one of skill in the art that all or part of a logical method according to embodiments of the present techniques may suitably be embodied in a logic apparatus comprising logic elements to perform the steps of the above-described methods, and that such logic elements may comprise components such as logic gates in, for example, a programmable logic array or application-specific integrated circuit.
Such a logic arrangement may further be embodied in enabling elements for temporarily or permanently establishing logic structures in such an array or circuit using, for example, a virtual hardware descriptor language, which may be stored and transmitted using fixed or transmittable carrier media.

In an embodiment, the present techniques may be implemented using multiple processors or control circuits. The present techniques may be adapted to run on, or integrated into, the operating system of an apparatus.

In an embodiment, the present techniques may be realised in the form of a data carrier having functional data thereon, said functional data comprising functional computer data structures to, when loaded into a computer system or network and operated upon thereby, enable said computer system to perform all the steps of the above-described method.

Brief description of the drawings

Implementations of the present techniques will now be described, by way of example only, with reference to the accompanying drawings, in which:

Figure 1A is a perspective view of a first example robotic end-effector;

Figure 1B is a perspective view of the first example robotic end-effector;

Figures 2A and 2B show, respectively, a side view and a bottom view of the first example robotic end-effector;

Figure 3 is a top view of the second pair of fingers and the cutting mechanism of the first example robotic end-effector;

Figure 4A is a perspective view of the first and second pairs of fingers of the first example robotic end-effector;

Figure 4B is a perspective view of the first pair of fingers of the first example robotic end-effector;

Figure 5A shows a perspective view of an alternative form of the first example robotic end-effector;

Figures 5B and 5C show, respectively, a perspective view and a side view of a second example robotic end-effector;

Figure 5D shows a side view of an alternative form of the second example robotic end-effector;

Figure 6A shows a perspective view of the first pair of fingers of the second example robotic end-effector;

Figure 6B shows a perspective view of the second pair of fingers and cutting mechanism of the second example robotic end-effector;

Figures 7A and 7B show perspective views of the second pair of fingers and cutting mechanism of the second example robotic end-effector;

Figure 8 is a block diagram of a robotic system comprising the robotic end-effector of the present techniques;

Figure 9 is a flowchart of example steps to harvest fruit using the robotic end-effector of the present techniques;

Figures 10A and 10B show perspective views of an alternative form of the first pair of fingers;

Figures 11A and 11B show, respectively, a perspective view and a front view of an alternative form of the second pair of fingers; and

Figure 12 is a flowchart of example steps to pick fruits.

Detailed description of the drawings

Broadly speaking, embodiments of the present techniques provide a robotic end-effector for selective crop harvesting, which is particularly suitable for harvesting crops that grow in dense clusters, such as strawberries.

The term "picking head" is used interchangeably herein with the term "robotic end-effector".

Selective harvesting of crops using robotic technology aims to address the societal and economic challenges of agricultural labour shortages. Existing robotic solutions for selective harvesting have limited capability because of complex picking requirements in different picking scenarios, e.g. picking strawberries in dense clusters. Most of the available solutions are developed for a very specific picking scenario, e.g. picking strawberries in isolation. However, most of the economically-viable (e.g. high-yielding and/or disease-resistant) varieties of strawberry are grown in dense clusters. The bottleneck of these robotic solutions is the picking head they use. Most of the available picking heads for selective harvesting are capable of performing only two actions: opening the picking head, and closing the picking head. The available picking heads are limited in their ability to harvest strawberries in a dense cluster where a ripe strawberry to be picked is occluded. As a result, some ripe strawberries may not be detected using existing picking heads and, even if they are detected, it may not be possible to segment, assess and localise the ripe strawberry using existing picking heads due to their limited range of motion.

Moreover, existing picking heads cannot reach the ripe strawberry if it is surrounded by other unripe strawberries, and if those unripe strawberries cannot be easily pushed away by the picking head to reach the ripe strawberry. That is, ripe strawberry displacement is inevitable during the move/push of the unripe ones, which makes it difficult to successfully harvest the ripe strawberry.

The present techniques address the above-mentioned issues with conventional picking heads by providing a picking head or robotic end-effector that is able to pick fruits such as, but not limited to, strawberries which grow in dense clusters, without bruising or damaging the fruit. The robotic end-effector of the present techniques comprises a pair of fingers for removing occlusions, another pair of fingers that are used solely to grip the stem of the ripe fruit, and a cutting mechanism that effectively and gently cuts the stem of the gripped fruit. As will be described in more detail below, the robotic end-effector comprises a vision system for identifying ripe fruit and determining when a stem of a ripe fruit is located between the pair of fingers used for gripping the stem.

Figure 1A is a perspective view of a first example robotic end-effector 100 for fruit harvesting, while Figure 1B is an upside-down perspective view of the first example robotic end-effector 100. The robotic end-effector 100 comprises a vision system (not shown) for identifying a location of a ripe fruit on a plant. The robotic end-effector 100 comprises a first pair of fingers 102 for moving any objects that at least partly occlude the identified ripe fruit on the plant. The robotic end-effector 100 comprises a second pair of fingers 104 for gripping a stem of the identified ripe fruit. The second pair of fingers 104 comprise a sensor 108 for indicating when the stem is located between the second pair of fingers 104. As shown in Figure 1B, the sensor 108 may be positioned in the middle of the second pair of fingers 104, so that the sensor 108 can determine whether a stem is located between the second pair of fingers. The robotic end-effector 100 comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 104 after the stem has been cut.

The robotic end-effector 100 may further comprise a first actuation mechanism (not shown) for controlling the actuation of the first pair of fingers 102. Thus, a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers 102.

In response to receiving the location of an object that at least partly occludes a fruit, the first actuation mechanism may control the first pair of fingers 102 to push away the object, by increasing a separation distance between the first pair of fingers. Thus, the individual fingers 102a, 102b of the first pair of fingers 102 may be close together when the robotic end-effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit. The individual fingers 102a, 102b of the first pair of fingers 102 may be moved further apart from each other when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers 104 can grip a stem of the fruit. In Figure 1, the individual fingers 102a, 102b are shown as being close together. In Figure 1, arrows A and B indicate, respectively, the direction fingers 102a and 102b need to be moved to increase the separation distance between the first pair of fingers 102. Preferably, the first actuation mechanism causes the individual fingers 102a, 102b to move in unison.
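The open/close behaviour of the first pair of fingers described above can be sketched as a minimal control routine. The following Python sketch is purely illustrative: the class name `FirstFingerPair`, its method names, and the separation distances are assumptions for illustration, not details taken from the application.

```python
class FirstFingerPair:
    """Hypothetical model of the occlusion-removal fingers 102a, 102b."""

    CLOSED_MM = 5.0   # close together: imaging the plant / approaching a fruit
    OPEN_MM = 60.0    # far apart: pushing occluding objects aside

    def __init__(self):
        self.separation_mm = self.CLOSED_MM

    def set_separation(self, target_mm):
        # Both fingers move in unison, as described for arrows A and B.
        self.separation_mm = target_mm

    def push_away_occlusion(self, occlusion_detected):
        """Open the fingers only when an occluding object is reported."""
        if occlusion_detected:
            self.set_separation(self.OPEN_MM)
        else:
            self.set_separation(self.CLOSED_MM)
        return self.separation_mm


fingers = FirstFingerPair()
assert fingers.push_away_occlusion(True) == 60.0   # occluder present: open
assert fingers.push_away_occlusion(False) == 5.0   # clear view: close again
```

The key point the sketch captures is that the first actuation mechanism changes only a single separation distance, matching the two-state (close together / far apart) behaviour described in the text.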

Figures 2A and 2B show, respectively, a side view and a bottom view of the first example robotic end-effector. As mentioned above, the robotic end-effector 100 comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 104 after the stem has been cut.

Figure 3 is a top view of the second pair of fingers 104a, b and the cutting mechanism 106 of the first example robotic end-effector. Other features of the robotic end-effector, including the first pair of fingers, have been removed from the image to more clearly show the second pair of fingers and the cutting mechanism. It can be seen that when a stem is gripped by the second pair of fingers 104a, b, the cutting mechanism 106 cuts the stem at a location above the second pair of fingers, which means the stem (and the attached fruit) continues to be gripped by the second pair of fingers 104a, b after the stem has been cut.

The robotic end-effector 100 may further comprise a second actuation mechanism (not shown) for controlling the actuation of the second pair of fingers 104 and the cutting mechanism 106. Thus, a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers 104 and the cutting mechanism 106. As the cutting mechanism 106 is only operated when it is confirmed that the second pair of fingers 104 are gripping a stem of a fruit to be harvested, a single actuation mechanism can advantageously be used to control both the second pair of fingers 104 and the cutting mechanism 106, thereby reducing complexity and the number of components needed to control the robotic end-effector 100. The sensor 108 of the second pair of fingers 104 may also be used to determine when the second pair of fingers are firmly gripping a stem.
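The grip-then-cut interlock described above, in which the cutter operates only once the fingers confirm a grip, can be sketched as follows. This is a minimal sketch under assumed names; `GripAndCutUnit` and its methods are illustrative stand-ins, not details from the application.

```python
class GripAndCutUnit:
    """Hypothetical model of the second actuation mechanism, which drives
    both the gripping fingers (104) and the cutting mechanism (106)."""

    def __init__(self):
        self.stem_gripped = False
        self.stem_cut = False

    def grip(self, sensor_reports_stem):
        # Close the fingers only when the proximity sensor reports a stem
        # positioned between them.
        self.stem_gripped = bool(sensor_reports_stem)
        return self.stem_gripped

    def cut(self):
        # Interlock: the cutter fires only once a firm grip is confirmed,
        # which is why one actuation mechanism can safely drive both parts.
        if not self.stem_gripped:
            return False
        self.stem_cut = True
        return True


unit = GripAndCutUnit()
assert unit.cut() is False            # no grip yet: cutting is refused
unit.grip(sensor_reports_stem=True)
assert unit.cut() is True             # grip confirmed: stem cut, still held
```

The interlock is what allows one actuator to serve two functions safely: cutting without a confirmed grip is simply unreachable in the control logic.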

The vision system may comprise at least one image sensor for capturing images of the fruit or clusters of fruit. The at least one image sensor may be an RGB sensor. In some cases, the at least one image sensor may be two RGB sensors 110a, b (see Figure 1B) provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing). Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.

As noted above, the second pair of fingers 104a, b comprise a sensor 108 for determining when a stem is located between the second pair of fingers. Advantageously, the sensor 108 may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested. Responsive to receiving feedback from the sensor 108 that the stem of the ripe fruit is located between the second pair of fingers 104, the second actuation mechanism may control the second pair of fingers 104 to grip the stem, by decreasing a separation distance between the second pair of fingers.

In Figure 3, individual fingers 104a, 104b of the second pair of fingers 104 are shown as being far apart. This position or separation distance may be adopted so that the second pair of fingers 104 are able to receive a stem of a fruit to be harvested. The individual fingers 104a, 104b may be actuated to decrease the separation distance between the second pair of fingers 104 when a stem located between the fingers is to be gripped. In Figure 3, arrows C and D indicate, respectively, the direction fingers 104a and 104b need to be moved to decrease the separation distance between the second pair of fingers 104. Preferably, the second actuation mechanism causes the individual fingers 104a, 104b to move in unison.

As shown in Figure 3, the cutting mechanism 106 is preferably provided in proximity to the second pair of fingers 104, such that when the stem of the identified ripe fruit is cut by the cutting mechanism 106, the second pair of fingers 104 continues to grip a portion of the stem that remains attached to the fruit. In other words, when a cutting operation performed by the cutting mechanism 106 is complete, the fruit is not immediately dropped into a container for collecting the harvested fruit. Instead, the fruit continues to be gripped, via a portion of the stem that is still attached to the fruit, by the second pair of fingers 104. This is advantageous because the robotic end-effector may be controlled to gently place the harvested fruit in a container. The second pair of fingers 104 may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.

As mentioned above, the second pair of fingers 104 comprise a sensor (not shown here) for indicating when the stem is located between the second pair of fingers 104. The sensor of the second pair of fingers 104 may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers in a position suitable for gripping.

When the stem of the identified ripe fruit is gripped by the second pair of fingers 104, the identified ripe fruit may be in proximity to a first side (e.g. a bottom side) of the second pair of fingers 104. The cutting mechanism may be provided in proximity to a second, opposite side (e.g. a top side) of the second pair of fingers, as shown in Figure 3. Thus, the cutting mechanism 106 may be positioned relative to the second pair of fingers such that it cuts the stem at a point where the stem protrudes from the second pair of fingers. In this way, a portion of the stem that is still attached to the fruit remains gripped by the second pair of fingers 104.

Figure 4A is a perspective view of the first and second pairs of fingers of the first example robotic end-effector, and Figure 4B is a perspective view of the first pair of fingers of the first example robotic end-effector. In both Figures, other components of the robotic end-effector have been removed to more clearly show the fingers. In Figure 4A, the individual fingers 102a, 102b of the first pair of fingers 102 are shown as being close together, while the individual fingers 104a, 104b of the second pair of fingers 104 are shown as being far apart. This may be the configuration used when the robotic end-effector is being moved towards a ripe fruit or to identify ripe fruits. The opposite configuration may be used when the robotic end-effector is used to harvest a ripe fruit. That is, during the harvesting process, the individual fingers 102a, 102b of the first pair of fingers 102 may be far apart (while they push away any objects that occlude the specific fruit to be harvested), while the individual fingers 104a, 104b of the second pair of fingers 104 may be close together (while they grip a stem of the specific fruit to be harvested). In this way, the first pair of fingers may be out of the way of fruit connected to the stem that is being gripped by the second pair of fingers.

The first pair of fingers 102 may be non-sensing fingers. That is, the first pair of fingers 102 may not themselves obtain any feedback about the objects they contact. Alternatively, the first pair of fingers 102 may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.

Figure 5A shows a perspective view of an alternative form 100' of the first example robotic end-effector. The robotic end-effector 100' comprises a first pair of fingers 102' (having individual fingers 102a', 102b') and a second pair of fingers 104'. The functionality of the robotic end-effector 100' is the same as that of robotic end-effector 100, and is not described again for the sake of conciseness.

Figures 5B and 5C show, respectively, a perspective view and a side view of a second example robotic end-effector 200. It will be understood that the first and second robotic end-effectors operate in the same way, and differ only in the precise design of particular features.

The robotic end-effector 200 comprises a vision system (not shown) for identifying a location of a ripe fruit on a plant. The robotic end-effector 200 comprises a first pair of fingers 202 for moving any objects that at least partly occlude the identified ripe fruit on the plant. The robotic end-effector 200 comprises a second pair of fingers 204 for gripping a stem of the identified ripe fruit. The second pair of fingers 204 comprise a sensor (not shown) for indicating when the stem is gripped between the second pair of fingers 204. The sensor may be located between the second pair of fingers 204 in a similar manner to sensor 108 in Figure 1B. The robotic end-effector 200 comprises a cutting mechanism (not visible here) for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 204, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 204 after the stem has been cut.

The robotic end-effector 200 may further comprise a first actuation mechanism (not shown) for controlling the actuation of the first pair of fingers 202. Thus, a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers 202.

In response to receiving the location of an object that at least partly occludes a fruit, the first actuation mechanism may control the first pair of fingers 202 to push away the object, by increasing a separation distance between the first pair of fingers. Thus, the individual fingers 202a, 202b of the first pair of fingers 202 may be close together when the robotic end-effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit. The individual fingers 202a, 202b of the first pair of fingers 202 may be moved further apart from each other when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers 204 can grip a stem of the fruit. In Figure 5B, the individual fingers 202a, 202b are shown as being close together.

Figure 5D shows a side view of an alternative form of the second example robotic end-effector 200'. It will be understood that the second robotic end-effectors 200 and 200' operate in the same way, and differ only in the precise design of particular features. Specifically, the robotic end-effector 200' has a first pair of fingers 202' which are relatively smaller than the first pair of fingers 202 of the robotic end-effector 200 (see e.g. Figure 5C for comparison). The first pair of fingers 202' are smaller in size and do not protrude as far relative to the second pair of fingers. This may provide a more compact robotic end-effector, which may be advantageous when picking certain fruits.

Figure 6A shows a perspective view of the first pair of fingers 202 of the second example robotic end-effector 200, and Figure 6B shows a perspective view of the second pair of fingers 204 and cutting mechanism 206 of the second example robotic end-effector 200. In Figure 6A, both individual fingers 202a, 202b of the first pair of fingers 202 are shown, while in Figure 6B, only finger 202a is shown and finger 202b is hidden so that the second pair of fingers 204 can be more clearly seen. The cutting mechanism 206 is for cutting the stem of the identified ripe fruit in response to a signal from the sensor (of the second pair of fingers 204) indicating that the stem is gripped between the second pair of fingers 204, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 204 after the stem has been cut.

The first pair of fingers 202 may be non-sensing fingers. That is, the first pair of fingers 202 may not themselves obtain any feedback about the objects they contact. Alternatively, the first pair of fingers 202 may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.

Figures 7A and 7B show perspective views of the second pair of fingers 204 and cutting mechanism 206 of the second example robotic end-effector 200.

The robotic end-effector 200 may further comprise a second actuation mechanism (not shown) for controlling the actuation of the second pair of fingers 204 and the cutting mechanism 206. Thus, a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers 204 and the cutting mechanism 206. As the cutting mechanism 206 is only operated when it is confirmed that the second pair of fingers 204 are gripping a stem of a fruit to be harvested, a single actuation mechanism can advantageously be used to control both the second pair of fingers 204 and the cutting mechanism 206, thereby reducing complexity and the number of components needed to control the robotic end-effector 200. The sensor of the second pair of fingers 204 may also be used to determine when the second pair of fingers are firmly gripping a stem.

The vision system may comprise at least one image sensor for capturing images of the fruit/cluster of fruits. The at least one image sensor may be an RGB sensor. In some cases, the at least one image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing). Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.

The second pair of fingers comprise a sensor for determining when a stem is located between the second pair of fingers. Advantageously, the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested. Responsive to receiving feedback from the sensor that the stem of the ripe fruit is located between the second pair of fingers 204, the second actuation mechanism may control the second pair of fingers 204 to grip the stem, by decreasing a separation distance between the second pair of fingers.

In Figures 7A and 7B, individual fingers 204a, 204b of the second pair of fingers 204 are shown as being close together. This position or separation distance may be adopted when the second pair of fingers 204 are gripping a stem of a fruit to be harvested. The individual fingers 204a, 204b may be actuated to increase the separation distance between the second pair of fingers 204 to enable a stem to be received or positioned between the fingers 204a, 204b prior to gripping.

As shown in Figures 7A and 7B, the cutting mechanism 206 is preferably provided in proximity to the second pair of fingers 204, such that when the stem of the identified ripe fruit is cut by the cutting mechanism 206, the second pair of fingers 204 continues to grip a portion of the stem that remains attached to the fruit. In other words, when a cutting operation performed by the cutting mechanism 206 is complete, the fruit is not immediately dropped into a container for collecting the harvested fruit. Instead, the fruit continues to be gripped, via a portion of the stem that is still attached to the fruit, by the second pair of fingers 204. This is advantageous because the robotic end-effector may be controlled to gently place the harvested fruit in a container. The second pair of fingers 204 may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.

As mentioned above, the second pair of fingers 204 comprise a sensor (not shown here, but see Figure 1B for where the sensor may be located) for indicating when the stem is gripped between the second pair of fingers 204. The sensor of the second pair of fingers 204 may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers in a position suitable for gripping.

As shown in Figures 7A and 7B, one of the fingers 204b of the second pair of fingers may comprise a slot 208 in the finger, which extends all the way through the finger 204b, from an edge of the finger to a (stem) gripping surface of the finger 204b. The cutting mechanism 206 may be arranged to move through the slot 208 to cut a stem gripped by the second pair of fingers 204. This may be advantageous because a portion of the stem which is gripped by the second pair of fingers 204 may be held more firmly and/or may be substantially straight (compared to the stem which protrudes from the second pair of fingers), which may make it easier for the cutting mechanism to cut through the stem. Furthermore, as the cutting mechanism 206 is located in the slot 208, the cutting mechanism is at least partially covered or encased for safety reasons, i.e. to avoid any risk of a human operator being able to come into contact with the cutting mechanism.

The other of the fingers 204a of the second pair of fingers 204 may comprise a groove 210 for receiving a cutting edge 206a of the cutting mechanism 206 when the cutting mechanism 206 moves through the slot 208 to cut the stem. This may enable the cutting mechanism 206 to fully cut through the stem, as the groove 210 provides a space for the cutting edge 206a of the cutting mechanism 206 to pass all the way through the stem.

In both the first end-effector 100 and second end-effector 200, surfaces of the first pair of fingers 102 and/or second pair of fingers 104 and/or other components of the end-effectors may be substantially smooth for ease of cleaning and to avoid the accumulation of dirt. This is advantageous because dirt on the components may prevent the end-effector from operating correctly. For example, dirt on the second pair of fingers may prevent the sensor of the second pair of fingers from correctly determining when a stem is gripped between the fingers.

Generally speaking, the robotic end-effector described herein benefits from 2.5 degrees of freedom, which is more than available picking heads offer. This added degree of freedom allows the robotic end-effector to deal with complex picking scenarios in which the available picking heads fail. The robotic end-effector benefits from an effective combination of actuation systems and sensors to resolve the limitations of currently available picking heads. The robotic end-effector includes three separate movements (moving objects, gripping a stem, and cutting a stem) that are actuated using two actuators. This is useful as the capability of the robotic end-effector is increased without significantly increasing the complexity or component count of the device.

Embodiments of the robotic end-effector benefit from an effective configuration of RGB, RGB-D and IR proximity sensors that helps to efficiently detect and localise the ripe strawberries. In addition, the combined sensory information can be used to estimate the size and weight of the strawberries to be picked, and to sort them by quality.

Figure 8 is a block diagram of a robotic system 300 for fruit harvesting comprising the robotic end-effector 100, 200 of the present techniques. While Figure 8 shows the robotic system 300 comprising the first example robotic end-effector 100, it will be understood that this is merely illustrative. The robotic system 300 comprises at least one picking arm 302 and a control system 304 for controlling the at least one picking arm 302. The control system 304 may comprise at least one processor coupled to memory. The at least one processor may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit. The memory may comprise volatile memory, such as random access memory (RAM), for use as temporary memory, and/or non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data, programs, or instructions, for example.

The robotic system comprises a robotic end-effector 100 of the types described herein, that is coupled to the at least one picking arm 302.

As shown in Figure 8, the robotic end-effector 100 comprises a vision system 120 for identifying a location of a ripe fruit on a plant. The vision system 120 may enable a three-dimensional map of a plant to be generated. The vision system 120 may use the three-dimensional map to identify the locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits. As described above, the vision system 120 may comprise a depth sensor, which may be, for example, an RGB-D (red-green-blue-depth) camera. The RGB-D camera may be mounted on the robotic end-effector in a position that enables the plant to be imaged from a distance. The vision system 120 may further comprise at least one further image sensor in the vicinity of the second pair of fingers 104. The image sensor may enable images to be captured of the environment of the second pair of fingers 104. This may be advantageous because, when the robotic end-effector 100 is being used to harvest a fruit, the RGB-D sensor may be obscured by foliage of the plant and so it may not be able to determine, for example, whether a stem of the ripe fruit is positioned between the second pair of fingers. Thus, the image sensor provides redundancy in the vision system 120. In some cases, the image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers 104 to enable stereo vision (i.e. depth sensing).
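To give a concrete flavour of how a vision system might flag ripe strawberries in an RGB image, the sketch below uses a crude red-dominance heuristic. This is purely illustrative: the function names, colour thresholds, and heuristic are assumptions, and a real system of the kind described would typically combine depth data with learned detectors.

```python
def ripeness_ratio(pixels):
    """Fraction of pixels that look 'ripe red': R dominant over G and B.

    `pixels` is a list of (r, g, b) tuples with channels in 0-255. This
    crude heuristic stands in for the detection pipeline in the text.
    """
    if not pixels:
        return 0.0
    ripe = sum(1 for r, g, b in pixels
               if r > 120 and r > 1.5 * g and r > 1.5 * b)
    return ripe / len(pixels)


def is_ripe(pixels, threshold=0.6):
    # A fruit region is flagged ripe when most of its pixels are red.
    return ripeness_ratio(pixels) >= threshold


red_fruit = [(200, 40, 30)] * 9 + [(60, 140, 50)]   # mostly red region
green_fruit = [(70, 150, 60)] * 10                  # unripe region
assert is_ripe(red_fruit)        # 90% red pixels -> ripe
assert not is_ripe(green_fruit)  # no red pixels -> not ripe
```

In practice the per-fruit pixel regions would come from segmenting the three-dimensional map, with the depth channel used to localise each candidate fruit for the picking arm.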

The vision system 120 may also be able to determine the size and quality of each fruit that is to be harvested, or which is harvested. This may enable the fruit to be deposited into a suitable container. For example, this may enable large fruits to be placed into a container with other large fruits (so that they do not damage or squash smaller fruits), or it may enable large fruits to be dispersed among different containers (so that each container contains a mixture of fruit sizes). This may also enable any slightly damaged or rotten fruits to be discarded or separated from other higher quality fruits.
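The size- and quality-based routing described above can be sketched as a simple assignment rule. The container labels, weight limits, and function name below are hypothetical, chosen only to illustrate the strategy of discarding damaged fruit and grouping fruit of similar size.

```python
def assign_container(fruit_size_g, quality_ok, containers):
    """Route a harvested fruit to a container.

    Damaged or rotten fruit is discarded; otherwise the fruit goes into
    the smallest container class whose limit it fits, so large fruit is
    kept with large fruit. `containers` maps a label to a maximum size.
    """
    if not quality_ok:
        return "discard"
    # Check container classes from smallest limit to largest.
    for label, max_g in sorted(containers.items(), key=lambda kv: kv[1]):
        if fruit_size_g <= max_g:
            return label
    return "oversize"


bins = {"small": 15, "medium": 25, "large": 40}
assert assign_container(12, True, bins) == "small"
assert assign_container(30, True, bins) == "large"
assert assign_container(30, False, bins) == "discard"
assert assign_container(55, True, bins) == "oversize"
```

The alternative strategy mentioned in the text, dispersing large fruits across containers for mixed sizes, would simply replace the size lookup with a round-robin over the candidate containers.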

The robotic end-effector comprises a first pair of fingers 102 for moving any objects that at least partly occlude the identified ripe fruit on the plant. The first pair of fingers are operated by a first actuation mechanism 122. The robotic end-effector comprises a second pair of fingers 104 for gripping a stem of the identified ripe fruit. The second pair of fingers 104 comprise a sensor 108 for indicating when the stem is located between the second pair of fingers 104. The robotic end-effector comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut. The second pair of fingers 104 and cutting mechanism 106 are operated by a second actuation mechanism 124. For the sake of conciseness, the operation of the robotic end-effector 100 will not be described here again.

The robotic system 300 may further comprise at least one container 306 for receiving fruit harvested by the robotic end-effector 100. After the cutting mechanism 106 has cut the stem of a ripe fruit, the control system 304 may control the at least one picking arm 302 to move the robotic end-effector 100 to above the at least one container 306. Once in position, the second actuation mechanism 124 controls the second pair of fingers 104 to release the stem and the attached fruit, by increasing a separation distance between the second pair of fingers, so that the fruit drops into the container. The robotic system 300 may further comprise a mechanism 308 for moving the robotic system towards, between and/or around plants. The mechanism 308 may be a tracked or wheeled rover, or a vehicle capable of navigating autonomously.

Figure 9 is a flowchart of example steps to harvest fruit using the robotic end-effector of the present techniques. The method begins by identifying, using a vision system of the robotic end-effector, a location of a ripe fruit on a plant (step S100).

The method comprises moving, using a first pair of fingers of the robotic end-effector, any objects that at least partly occlude the identified ripe fruit on the plant (step S102). The step of moving any objects that at least partly occlude the identified ripe fruit may comprise controlling a first actuation mechanism to increase a separation distance between the first pair of fingers.

The method comprises gripping a stem of the identified ripe fruit, using a second pair of fingers of the robotic end-effector (step S104). The step of gripping a stem of the identified ripe fruit may comprise: receiving feedback from the sensor 108 that the stem of the identified ripe fruit is located between the second pair of fingers; and controlling a second actuation mechanism to decrease a separation distance between the second pair of fingers. Advantageously, the feedback from the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.

The method comprises sensing, using a sensor of the second pair of fingers, when the stem is located between the second pair of fingers (step S106).

The method comprises cutting, using a cutting mechanism, the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers (step S108), wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
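The sequence of steps S102 to S108 described above can be sketched in code. The following is a minimal, illustrative Python sketch only; the `Fingers` and `StemSensor` classes, the `harvest_one` function, and all parameter values are hypothetical stand-ins introduced for this example and do not appear in the patent.

```python
class Fingers:
    """Hypothetical finger-pair interface; separation is in millimetres."""
    def __init__(self, separation=0.0):
        self.separation = separation

    def increase_separation(self, amount=20.0):
        self.separation += amount

    def decrease_separation(self, amount=20.0):
        self.separation = max(0.0, self.separation - amount)


class StemSensor:
    """Stand-in for the sensor 108: reports when a stem lies between the fingers."""
    def __init__(self, stem_present=False):
        self.stem_present = stem_present

    def detects_stem(self):
        return self.stem_present


def harvest_one(first_fingers, second_fingers, sensor, cut_stem):
    """Steps S102-S108 for a single ripe fruit whose location is known (S100)."""
    first_fingers.increase_separation()   # S102: move occluding objects aside
    if not sensor.detects_stem():         # S106: stem must sit between the fingers
        return False                      # not reachable yet; caller repositions
    second_fingers.decrease_separation()  # S104: grip the stem
    cut_stem()                            # S108: cut; the stem stub stays gripped
    return True
```

Note that the sketch cuts only after the sensor confirms the stem is between the second pair of fingers, mirroring the feedback-driven gripping described at step S104.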

As explained above, the robotic end-effectors comprise a first pair of fingers for moving any objects that at least partly occlude the identified fruit on the plant. Figures 10A and 10B show perspective views of an alternative form of the first pair of fingers that may be used with the end-effectors described herein. Specifically, Figure 10A shows an individual finger 1002a of the pair of fingers, and Figure 10B shows another individual finger 1002b of the pair of fingers. In use, the fingers 1002a, 1002b operate similarly to the fingers 202a, 202b described above with respect to Figures 5B to 6B. That is, when the individual fingers 1002a, 1002b are close together, the robotic end-effector is being used to image a plant and identify ripe fruits, and/or is moving towards an identified ripe fruit. The individual fingers 1002a, 1002b may be moved further apart from each other when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine whether it is suitable for harvesting) and/or so that the second pair of fingers can grip a stem of the fruit. It can be seen from Figures 10A and 10B that this form of the first pair of fingers is smaller than those described with respect to the first and second example robotic end-effectors. Specifically, the fingers 1002a, 1002b are thinner and shorter than the first pair of fingers described up to now. The more compact fingers 1002a, 1002b are advantageous because they enable more effective interaction with clusters of fruits such as strawberries.

As explained above, the robotic end-effectors comprise a second pair of fingers for gripping a stem of the identified ripe fruit. Figures 11A and 11B show, respectively, a perspective view and a front view of an alternative form of the second pair of fingers that may be used with the end-effectors described herein. Each finger of the second pair of fingers 104 and 204 described above has a contact or gripping surface which contacts a stem to be gripped and cut. The gripping surfaces of the two fingers in each pair 104, 204 are substantially parallel to each other. Furthermore, the cutting mechanism 106, 206 moves in a plane normal (perpendicular) to the gripping surfaces of the pairs of fingers 104, 204. However, it has been determined during experiments that multiple cutting actions are required to cut a stem when the cutting mechanism moves in a plane normal to the gripping surfaces of the second pair of fingers. It is thought that this arises from the cellular or tissue structure of the stem being non-uniform and/or the stem not being perfectly linear along its length. Multiple cutting actions per stem are undesirable because they slow down the operation of the robotic system and thereby reduce the potential yield in a given time period.

Thus, Figures 11A and 11B show an improved gripping finger 1004a of a second pair of fingers. It can be seen that the gripping finger 1004a comprises a shaped contact/gripping surface. The gripping surface of finger 1004a comprises a curved or angled portion 1000 and a straight or flat portion 1002. The straight portion 1002 enables the finger 1004a to grip the stem of a ripe fruit. The straight portion 1002 of finger 1004a faces a straight portion of a gripping surface of another finger in the second pair of fingers, or faces a straight gripping surface of the other finger. The angled portion 1000 enables the cutting mechanism to cut the stem at an angle. The angled portion 1000 may remove the need for a groove in the gripping surface in which to receive a cutting edge of the cutting mechanism. As shown in Figure 11A, the angled portion 1000 may be angled relative to the straight portion 1002. In the illustrated example, the angle away from an axis of the straight portion 1002 may be 20° ± 0.5°. However, it will be understood that this is an illustrative example and other suitable angles may be used which enable a stem to be cut using a single cutting action.
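The geometry of the angled portion can be expressed numerically. The short Python sketch below is purely illustrative (the function names, the choice of the straight portion's axis as the y-axis, and the tolerance check are assumptions, not part of the patent); it computes the unit direction of the angled portion relative to the straight portion and checks a measured angle against the 20° ± 0.5° example given in the text.

```python
import math

def angled_surface_direction(angle_deg=20.0):
    """Unit vector of the angled portion 1000, taking the straight portion
    1002's axis as the y-axis (illustrative 2D geometry only)."""
    a = math.radians(angle_deg)
    return (math.sin(a), math.cos(a))  # tilted angle_deg away from the axis

def within_tolerance(measured_deg, nominal_deg=20.0, tol_deg=0.5):
    """Check a manufactured angle against the 20 deg +/- 0.5 deg example."""
    return abs(measured_deg - nominal_deg) <= tol_deg
```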

Each individual finger of the second pair of fingers may have a form like that shown in Figures 11A and 11B. For example, with reference to Figures 7A and 7B, both the finger 204b of the second pair of fingers which comprises a slot 208 that extends all the way through the finger 204b, and the finger 204a which comprises a groove 210 for receiving a cutting edge of the cutting mechanism, may take the form shown in Figures 11A and 11B.

Alternatively, only one of the individual fingers of the second pair of fingers may take this form. For example, with reference to Figures 7A and 7B, the finger 204b of the second pair of fingers which comprises a slot 208 that extends all the way through the finger 204b may be unchanged, while the finger 204a which comprises a groove 210 for receiving a cutting edge of the cutting mechanism may instead take the form shown in Figures 11A and 11B.

Figure 12 is a flowchart of example steps to pick fruits. The basic process to harvest a ripe fruit is described above with reference to Figure 9. A more detailed process is now described. As explained above, the robotic end-effector 100 comprises a vision system 120 for identifying a location of a ripe fruit on a plant. The vision system 120 may alternatively be provided on the picking arm 302 in the vicinity of the robotic end-effector (e.g. above or below the robotic end-effector and mounted so that it is able to visualise a scene in front of the end-effector).

Wherever the vision system is located, the vision system may enable a three-dimensional map of a plant to be generated. The vision system may use the three-dimensional map to identify the locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits. The vision system 120 may comprise a depth sensor, which may be, for example, an RGB-D (red-green-blue-depth) camera. The RGB-D camera may be mounted on the robotic end-effector (or on the picking arm in the vicinity of the robotic end-effector) in a position that enables the plant to be imaged from a distance. The RGB-D camera may, in conjunction with image processing and analysis software, be used to segment images, detect ripe fruits, estimate a ripeness of the fruits, and select and localise a ripe fruit for picking. The picking arm may be set to be in a default ("home") configuration in which the RGB-D camera may be able to survey a plant and its fruits from a distance.
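One standard way an RGB-D detection could be turned into a 3D location for the map is pinhole-camera back-projection. The following Python sketch is illustrative only and is not taken from the patent; the function name and the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are the conventional pinhole-model quantities, with example values chosen for the test.

```python
def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Back-project an image pixel (u, v) with a depth reading into
    camera-frame 3D coordinates using the standard pinhole model.
    fx, fy: focal lengths in pixels; cx, cy: principal point."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)
```

Applying this to each detected-fruit pixel (with the camera pose from the picking arm) would yield the kind of three-dimensional plant map the text describes.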

The vision system 120 may also be able to determine the size, weight and/or quality of each fruit that is to be harvested, or which is harvested. This may enable the fruit to be deposited into a suitable container. For example, this may enable large fruits to be placed into a container with other large fruits (so that they do not damage or squash smaller fruits), or it may enable large fruits to be dispersed among different containers (so that each container contains a mixture of fruit sizes). This may also enable any slightly damaged or rotten fruits to be discarded or separated from other higher quality fruits.
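The container-selection logic described above can be sketched as a simple policy function. This Python sketch is a hypothetical illustration of the two strategies mentioned (grouping similar sizes together, or spreading sizes across containers); the function name, the dictionary layout, and the scoring rule are assumptions introduced here.

```python
def choose_container(fruit_grams, containers, mix_sizes=False):
    """Pick a container for a harvested fruit.

    containers: list of dicts, each with a 'sizes' list of deposited weights (g).
    mix_sizes=False: group similar weights together (large with large).
    mix_sizes=True: spread fruits across containers (choose the emptiest)."""
    if not mix_sizes:
        # Choose the container whose mean deposited weight is closest to
        # this fruit's weight (empty containers score as an exact match).
        def score(c):
            s = c["sizes"]
            mean = sum(s) / len(s) if s else fruit_grams
            return abs(mean - fruit_grams)
        best = min(containers, key=score)
    else:
        best = min(containers, key=lambda c: len(c["sizes"]))
    best["sizes"].append(fruit_grams)
    return best
```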

Thus, the process begins by identifying, using the vision system, the location of fruits on a plant (step S200). This may comprise placing bounding boxes around every identifiable fruit on a plant. Each bounding box may be associated with a corresponding confidence value indicating how likely it is that the box contains a fruit. The confidence value may be low when a fruit is partly occluded by another fruit or another object (e.g. a leaf). To improve the confidence values, the picking arm may be moved relative to the plant to capture images of the plant from different angles. Thus, although the images may originally be captured while the picking arm is in the home configuration, the picking arm may be moved (e.g. to the left and right) in order to better visualise the plant. Fruits that are partly occluded when viewed from the home configuration may be more clearly seen when viewed from a different position/angle. Thus, a position and/or orientation of the picking arm and, therefore, the vision system, may be changed in order to identify the location of fruits on the plant.
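One simple way the detections from several arm poses could be fused is to treat overlapping bounding boxes as the same fruit and keep the highest confidence seen for it. The Python sketch below is an assumption-laden stand-in for the multi-view refinement described above (it assumes boxes have already been registered to a common image frame, and the IoU threshold of 0.5 is an arbitrary example value).

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def merge_views(detections_per_view, iou_thresh=0.5):
    """Fuse (box, confidence) detections from several viewpoints: boxes that
    overlap above the threshold count as one fruit with its best confidence."""
    merged = []  # list of [box, confidence]
    for view in detections_per_view:
        for box, conf in view:
            for m in merged:
                if iou(m[0], box) >= iou_thresh:
                    m[1] = max(m[1], conf)
                    break
            else:
                merged.append([box, conf])
    return merged
```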

The process may further comprise determining which of the identified fruits are ripe fruits, and selecting one such fruit to pick (step S202).

The process may comprise determining whether the selected/target fruit is in a cluster (step S204).

If the target fruit is not in a cluster, it may be relatively straightforward to pick the fruit using the end-effector. In this case, the process may comprise generating, using a motion planning module, a trajectory for the robotic arm to move the end-effector towards the selected/target fruit (step S208). The bounding box around the target fruit includes 2D and 3D coordinate information which is used by the motion planning module to move the end-effector towards the selected fruit. The trajectory may be generated to move the end-effector so that it is in the vicinity of the target fruit, e.g. 5-10 cm away from the target fruit. This may be advantageous because finer positioning may be determined once the end-effector is in the vicinity of the target fruit, when the target fruit can be more clearly seen by the sensor 108 in the vicinity of the second pair of fingers.
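The coarse approach target described above (stopping within roughly 5-10 cm of the fruit so the near-field sensor can take over) can be sketched as follows. This Python function is illustrative only: the straight-line approach, the function name, and the 8 cm default standoff are assumptions, not the patent's motion-planning method.

```python
import math

def approach_waypoint(fruit_xyz, tool_xyz, standoff_m=0.08):
    """Return a pre-grasp point `standoff_m` short of the fruit along the
    straight line from the current tool position. Fine positioning near the
    fruit is then handled separately, using the near-field sensor."""
    delta = [f - t for f, t in zip(fruit_xyz, tool_xyz)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist <= standoff_m:
        return tool_xyz  # already within the standoff band; stay put
    scale = (dist - standoff_m) / dist
    return tuple(t + d * scale for t, d in zip(tool_xyz, delta))
```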

When the end-effector is in the vicinity of the target fruit, finer control of the end-effector may be performed to move the end-effector closer to the target fruit. This may comprise moving the picking arm as well as moving the end-effector. The image sensor 108 in the vicinity of the second pair of fingers is used to provide further information about the environment around the target fruit, such as whether the stem of the target fruit can be seen and accessed, and whether any objects are preventing the second pair of fingers from reaching the stem. Thus, the process may comprise using the vision system to actuate the first pair of fingers to move any occluding objects out of the way of the target fruit (step S210), and to actuate the second pair of fingers to grip a stem of the target fruit (step S212). Once the second pair of fingers are gripping a stem of the target fruit, the cutting mechanism may be deployed.

If at step S204 it is determined that the target fruit is in a cluster, it may not be straightforward to pick the fruit using the end-effector. In this case, there may be other fruits in the cluster which prevent the second pair of fingers from reaching the stem of the target fruit. There may be other fruits which are also ripe and can be picked and which are occluding the target fruit. When the target fruit is in a cluster, it may be more appropriate to generate a picking schedule to determine an order in which fruits are to be harvested. Thus, the process may comprise determining an order in which to pick ripe fruits in the cluster (step S206).

A scheduling module may determine which ripe fruit to pick first, based on which fruit is the easiest to reach and pick. This determination may be based on the bounding box information. For example, the easiest fruit to pick may be that whose bounding box has the maximum distance to the bounding boxes of the other fruits in the cluster. Picking the nearest and/or easiest to reach fruit first may also enable other more difficult fruits to be picked which may be located behind the nearest/easiest fruits.
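The "easiest fruit first" rule can be made concrete by scoring each bounding box by its distance to the other boxes in the cluster. The Python sketch below is one plausible reading of the rule described above (scoring each box by its minimum centre-to-centre distance to the others and picking the most isolated first); the function names and the centroid-distance metric are assumptions introduced for illustration.

```python
def pick_order(boxes):
    """Order cluster fruits so the most isolated (easiest to reach) comes
    first. boxes: list of (x1, y1, x2, y2) bounding boxes."""
    def centre(b):
        return ((b[0] + b[2]) / 2.0, (b[1] + b[3]) / 2.0)

    def isolation(i):
        # Distance to the nearest other fruit in the cluster.
        cx, cy = centre(boxes[i])
        return min(
            ((cx - centre(boxes[j])[0]) ** 2 + (cy - centre(boxes[j])[1]) ** 2) ** 0.5
            for j in range(len(boxes)) if j != i
        )

    if len(boxes) == 1:
        return [0]
    return sorted(range(len(boxes)), key=isolation, reverse=True)
```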

The scheduling module may also estimate the size or weight of each ripe fruit. This enables control of the picking arm to deposit the harvested fruit into the most appropriate container or punnet, as mentioned above.

Once the scheduling module has determined an order in which to pick the ripe fruit, the process continues to step S208 to generate a trajectory to move the end-effector towards the first target fruit in the schedule. Steps S208 to S212 are repeated until each ripe fruit in the schedule has been picked. It will be understood that during the picking process, the schedule may be changed to take into account the fact that once a fruit has been harvested it may be easier to see which fruit is easier to harvest. Similarly, the picking process may cause some fruits to become occluded, which may impact the schedule. Thus, after step S212 has been completed for the first fruit, the process may return to step S206 to check the schedule and update it if necessary.
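The overall cluster-picking loop, including the rescheduling after each pick, can be sketched as follows. This Python function is a hypothetical illustration of the S206 to S212 loop described above: `schedule` stands in for the scheduling module (it returns an ordered list of remaining fruits) and `pick` stands in for steps S208 to S212 for one fruit; neither name comes from the patent.

```python
def harvest_cluster(fruits, schedule, pick):
    """Pick every fruit in a cluster, re-running the scheduler after each
    pick so the order reflects newly revealed or newly occluded fruits."""
    picked = []
    remaining = list(fruits)
    while remaining:
        order = schedule(remaining)  # S206: (re)compute the picking schedule
        target = order[0]            # S208-S212: pick the first scheduled fruit
        pick(target)
        picked.append(target)
        remaining.remove(target)
    return picked
```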

Those skilled in the art will appreciate that while the foregoing has described what is considered to be the best mode, and where appropriate other modes, of performing the present techniques, the present techniques should not be limited to the specific configurations and methods disclosed in this description of the preferred embodiment. Those skilled in the art will recognise that the present techniques have a broad range of applications, and that the embodiments may take a wide range of modifications without departing from any inventive concept as defined in the appended claims.