GB2607326A - Apparatus and systems for selective crop harvesting
- Publication number
- GB2607326A (application GB2107942.1A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- fingers
- pair
- stem
- fruit
- effector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D46/00—Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs
- A01D46/24—Devices for picking apples or like fruit
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D46/00—Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs
- A01D46/30—Robotic devices for individually picking crops
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/0045—Manipulators used in the food industry
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J15/00—Gripping heads and other end effectors
- B25J15/0052—Gripping heads and other end effectors multiple gripper units or multiple end effectors
- B25J15/0066—Gripping heads and other end effectors multiple gripper units or multiple end effectors with different types of end effectors, e.g. gripper and welding gun
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45003—Harvester
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Life Sciences & Earth Sciences (AREA)
- Environmental Sciences (AREA)
- Multimedia (AREA)
- Food Science & Technology (AREA)
- Harvesting Machines For Specific Crops (AREA)
Abstract
A robotic end-effector 100 for fruit harvesting comprises a vision system for locating a ripe fruit on a plant, a pair of clearing arms 102 for pushing aside any obstacles that occlude the identified fruit, a pair of jaws 104 for gripping a stem of the fruit, a sensor detecting when the stalk is located between the jaws, and a cutting mechanism 106 for slicing the stem so that the jaws hold the fruit by its stem during and after cutting, e.g. gripping it immediately below the cutter. The cutter may engage a groove in one jaw through a slot (208, Fig. 7) in the other jaw. Actuating the clearing arms may drive them apart, parting foliage or sweeping other obstructions away from the identified fruit, and they may be fingers with haptic sensors for measuring forces exerted on the moved objects, e.g. as feedback for force-controlled actuation of the clearing arms. The sensor may be a proximity or image sensor, e.g. a pair of RGB image sensors providing stereo vision of the stalk, and may trigger jaw closure to grip the stem. The manipulator is particularly suitable for harvesting crops that grow in dense clusters, such as strawberries.
Description
Apparatus and System for Selective Crop Harvesting
Field
The present techniques generally relate to an apparatus, system and method for selective crop harvesting. In particular, the present techniques provide a robotic end-effector for fruit harvesting that is able to detect, select and cut edible crops that grow in dense clusters.
Background
Picking fruits in different growing conditions is a challenging problem for selective harvesting technology, as it is difficult to design and build an effective robotic device (known as an end-effector or picking head) which is able to deal with complex picking operations. The human hand enables dexterous manipulation of a fruit with 27 degrees of freedom, and over 80% of its grasping information can be encoded into just six eigengrasps. In contrast, conventional robotic end-effectors are customised for specific applications, such as pick-and-place operations in industrial environments.
Currently, there are two types of picking heads available for robotic harvesting of high value crops: (i) a picking head having a parallel jaw gripper, which may not be suitable for all types of crops, and (ii) a picking head that has a customised design for picking particular fruit in a very specific picking scenario, which is only suitable for a specific type of crop or method of harvesting. Consequently, the effectiveness of commonly available robotic picking heads is limited, as different robotic picking heads may be needed for different crop types.
Some robotic picking heads are used to pick soft fruits such as strawberries.
Some of the robotic picking heads that are currently available for picking strawberries are cup-shaped picking heads, which have opening parts that locate the peduncle of a strawberry and position the strawberry in front of cutting scissors in order to harvest the strawberry. The cutting action causes the strawberry to detach from the plant and fall into a punnet for collecting the strawberries. In this example, the picking head does not directly touch the flesh of the strawberry, which minimises bruising. However, because the strawberry falls from a height into the punnet, the harvesting can inadvertently cause damage/bruising to the fruit. Furthermore, fruit placement within the punnet is not controlled, which may result in uneven distribution of the fruit in the punnet (which may also cause damage to fruit that are below other fruit).
Similarly, the design of the cup-shaped picking head, and the design of other types of picking head, may not be suitable for harvesting crops that grow in dense clusters.
The present applicant has therefore identified the need for an improved apparatus for automatic detection, selection and harvesting of crops that grow in dense clusters.
Summary
In a first approach of the present techniques, there is provided a robotic end-effector for fruit harvesting, the robotic end-effector comprising: a vision system for identifying a location of a ripe fruit on a plant; a first pair of fingers for moving any objects that at least partly occlude the identified ripe fruit on the plant; a second pair of fingers for gripping a stem of the identified ripe fruit, the second pair of fingers comprising a sensor for indicating when the stem is located between the second pair of fingers in a position suitable for gripping; and a cutting mechanism for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
The present techniques advantageously enable ripe fruit to be harvested without bruising or damaging the fruit. The present techniques are particularly advantageous for harvesting fruit that grows in dense clusters, such as strawberries. It will be understood that this is an example, and non-limiting, fruit that may be harvested using the robotic end-effector of the present techniques. More generally, the present techniques may be used to harvest different types of fruit and vegetable crop, including those which grow individually and those which grow in clusters.
The cutting mechanism is preferably provided in proximity to the second pair of fingers, such that when the stem of the identified ripe fruit is cut by the cutting mechanism, the second pair of fingers continues to grip a portion of the stem that remains attached to the fruit. In other words, when a cutting operation performed by the cutting mechanism is complete, the fruit is not immediately dropped into a container for collecting the harvested fruit. Instead, the fruit continues to be gripped, via a portion of the stem that is still attached to the fruit, by the second pair of fingers. This is advantageous because the robotic end-effector may be controlled to gently place the harvested fruit in a container. The second pair of fingers may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.
Preferably, the cutting mechanism may be partially or fully covered or encased for safety reasons, i.e. to avoid any risk of a human operator being able to come into contact with the cutting mechanism.
When the stem of the identified ripe fruit is gripped by the second pair of fingers, the identified ripe fruit may be in proximity to a first side (e.g. a bottom side) of the second pair of fingers. The cutting mechanism may be provided in proximity to a second, opposite side (e.g. a top side) of the second pair of fingers. Thus, the cutting mechanism may be positioned relative to the second pair of fingers such that it cuts the stem at a point where the stem protrudes from the second pair of fingers. In this way, a portion of the stem that is still attached to the fruit remains gripped by the second pair of fingers.
Alternatively, one of the fingers of the second pair of fingers may comprise a slot in the finger, which extends all the way through the finger, from an edge of the finger to a gripping surface of the finger. The cutting mechanism may be arranged to move through the slot to cut a stem gripped by the second pair of fingers. This may be advantageous because a portion of the stem which is gripped by the second pair of fingers may be held more firmly and/or may be substantially straight (compared to the stem which protrudes from the second pair of fingers), which may make it easier for the cutting mechanism to cut through the stem.
The other of the fingers of the second pair of fingers may comprise a groove for receiving a cutting edge of the cutting mechanism when the cutting mechanism moves through the slot to cut the stem. This may enable the cutting mechanism to fully cut through the stem, as the groove provides a space for the cutting edge of the cutting mechanism to pass through the stem.
The vision system may comprise a depth sensor for generating a three-dimensional map of the plant. The vision system may use the three-dimensional map to identify the locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits. The depth sensor may be an RGB-D (red-green-blue-depth) camera.
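By way of illustration only, the following sketch shows one way such an RGB-D frame could be processed to localise ripe fruit. The patent does not specify a detection algorithm, so the HSV colour thresholding, the OpenCV calls and the camera-intrinsics parameters (fx, fy, cx, cy) here are assumptions, not the claimed method.

```python
# Illustrative sketch only: ripe-fruit localisation from an RGB-D frame
# using simple red-hue thresholding; a production system would likely
# use a learned detector instead.
import cv2
import numpy as np

def locate_ripe_fruit(rgb, depth, fx, fy, cx, cy):
    """Return camera-frame (x, y, z) positions of candidate ripe fruits.

    rgb   : HxWx3 uint8 BGR image, aligned with `depth`
    depth : HxW float32 depth map in metres
    fx, fy, cx, cy : pinhole camera intrinsics (assumed known)
    """
    hsv = cv2.cvtColor(rgb, cv2.COLOR_BGR2HSV)
    # Red hue wraps around 0 in HSV, so combine two ranges.
    mask = (cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) |
            cv2.inRange(hsv, (170, 120, 70), (180, 255, 255)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    fruits = []
    for c in contours:
        if cv2.contourArea(c) < 200:          # discard specks of noise
            continue
        u, v = np.mean(c.reshape(-1, 2), axis=0).astype(int)
        z = float(depth[v, u])
        if z <= 0:                            # no valid depth at this pixel
            continue
        # Back-project the pixel through the pinhole model.
        fruits.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return fruits
```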
The robotic end-effector may further comprise a first actuation mechanism for controlling the actuation of the first pair of fingers. Thus, a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers.
Responsive to receiving the location of an object that at least partly occludes a fruit, the first actuation mechanism may control the first pair of fingers to push away the object, by increasing a separation distance between the first pair of fingers. Thus, the first pair of fingers may be close together when the robotic end-effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit. The first pair of fingers may be moved further apart when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers can grip a stem of the fruit.
The first pair of fingers may be non-sensing fingers. That is, the first pair of fingers may not themselves obtain any feedback about the objects they contact. Alternatively, the first pair of fingers may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.
This may be advantageous because the sensors can determine interactions between the first pair of fingers and the plant, which enables intelligent manipulation of a cluster of fruits when other sensors (e.g. visual sensors) may not be able to see what the first pair of fingers are interacting with due to occlusion. The forces measured by the sensors of the haptic fingers may be used by the first actuation mechanism to control the actuation of the haptic fingers, and the movements of a robotic manipulator. This may be advantageous because more effective manipulation movements may be generated to push away occluding matter, which yields increased success of cluster manipulation. Furthermore, it may avoid exerting large or excessive forces on soft fruits, thereby minimising the risk of bruising the fruits during the picking process.
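A minimal sketch of such force-controlled actuation is given below, assuming hypothetical `actuator` and `haptics` driver objects (the patent describes the behaviour, not an API); the force threshold and step size are illustrative values only.

```python
# Sketch only: force-limited opening of the clearing fingers. The
# fingers open in small steps and back off whenever the measured
# reaction force approaches an assumed bruising limit.
MAX_FORCE_N = 2.0   # assumed safe reaction-force limit for soft fruit
STEP_MM = 1.0       # assumed per-iteration change in finger separation

def clear_occlusion(actuator, haptics, target_gap_mm, max_steps=200):
    gap = actuator.current_gap_mm()
    for _ in range(max_steps):
        if gap >= target_gap_mm:
            return True                       # occluders pushed clear
        if max(haptics.read_forces()) > MAX_FORCE_N:
            gap = max(0.0, gap - STEP_MM)     # ease off the cluster
        else:
            gap += STEP_MM
        actuator.set_gap_mm(gap)
    return False                              # report failure upstream
```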
The vision system may comprise at least one image sensor for capturing images of the fruit/cluster of fruits. The at least one image sensor may be an RGB sensor. In some cases, the at least one image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing). Stereo vision may be useful because it provides the robotic end-effector with richer sensory information. Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.
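As an illustrative sketch of how two such finger-level RGB sensors could yield depth, the following uses OpenCV's block matcher as a stand-in for whatever stereo algorithm a real system would employ; rectified inputs, the focal length and the baseline are all assumptions.

```python
# Sketch only: depth from a rectified stereo pair, depth = f * B / d.
import cv2

def stereo_depth(left_bgr, right_bgr, focal_px, baseline_m):
    left = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left, right).astype("float32") / 16.0
    disparity[disparity <= 0] = float("nan")   # mask invalid matches
    return focal_px * baseline_m / disparity
```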
The robotic end-effector may further comprise a second actuation mechanism for controlling the actuation of the second pair of fingers and the cutting mechanism. Thus, a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers and the cutting mechanism. As the cutting mechanism is only operated when it is confirmed that the second pair of fingers are gripping a stem of a fruit to be harvested, advantageously a single actuation mechanism is used to control both the second pair of fingers and cutting mechanism, thereby reducing complexity and the number of components needed to control the robotic end-effector.
As noted above, the second pair of fingers comprise a sensor for determining when a stem is located between the second pair of fingers. Advantageously, the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested. Responsive to receiving feedback from the sensor that the stem of the ripe fruit is located between the second pair of fingers, the second actuation mechanism may control the second pair of fingers to grip the stem, by decreasing a separation distance between the second pair of fingers.
The sensor of the second pair of fingers may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example proximity sensor, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers in a position suitable for gripping.
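The following sketch illustrates how such a proximity reading could gate the grip-and-cut sequence; the sensor and actuator interfaces are hypothetical stand-ins, and the threshold is an assumed, illustrative value.

```python
# Sketch only: close the gripping fingers once the stem is sensed
# between them, then drive the cutter while the stem remains gripped.
STEM_PRESENT = 0.8   # assumed normalised IR reading meaning "stem seen"

def grip_and_cut(ir_sensor, second_actuator):
    if ir_sensor.read() < STEM_PRESENT:
        return False                        # no stem between the fingers yet
    second_actuator.close_fingers()         # grip the stem
    if second_actuator.grip_confirmed():    # e.g. position/force check
        second_actuator.drive_cutter()      # sever the stem above the grip
        return True
    return False
```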
In a second approach of the present techniques, there is provided a robotic system for fruit harvesting, the robotic system comprising: at least one picking arm; a control system for controlling the at least one picking arm; and a robotic end-effector, of the type described herein, coupled to the at least one picking arm.
The robotic system may further comprise at least one container for receiving fruit harvested by the robotic end-effector, wherein after the cutting mechanism has cut the stem of a ripe fruit, the control system controls the at least one picking arm to move the robotic end-effector to above the at least one container, and wherein the second actuation mechanism controls the second pair of fingers to release the stem and the attached fruit, by increasing a separation distance between the second pair of fingers, so that the fruit drops into the container.
The robotic system may further comprise a mechanism for moving the robotic system towards, between and/or around plants. The mechanism may be a tracked or wheeled rover, or a vehicle capable of navigating autonomously.
In a third approach of the present techniques, there is provided a method for fruit harvesting using a robotic end-effector, the method comprising: identifying, using a vision system of the robotic end-effector, a location of a ripe fruit on a plant; moving, using a first pair of fingers of the robotic end-effector, any objects that at least partly occlude the identified ripe fruit on the plant; gripping a stem of the identified ripe fruit, using a second pair of fingers of the robotic end-effector; sensing, using a sensor of the second pair of fingers, when the stem is located between the second pair of fingers; and cutting, using a cutting mechanism, the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
The step of moving any objects that at least partly occlude the identified ripe fruit may comprise controlling a first actuation mechanism to increase a separation distance between the first pair of fingers.
The first pair of fingers may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.
The step of moving any objects that at least partly occlude the identified ripe fruit may comprise using the forces measured by the sensors to control the first actuation mechanism, as well as the movements of a robotic manipulator. As mentioned above, this may be advantageous because the measured forces may enable more effective manipulation movements to be generated to push away occluding matter, which yields increased success of cluster manipulation. Furthermore, it may avoid exerting large or excessive forces on soft fruits, thereby minimising the risk of bruising the fruits during the picking process.
The step of gripping a stem of the identified ripe fruit may comprise: receiving feedback from the sensor that the stem of the identified ripe fruit is located between the second pair of fingers; and controlling a second actuation mechanism to decrease a separation distance between the second pair of fingers. Advantageously, the feedback from the sensor (which may be a proximity sensor) may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.
In a related approach of the present techniques, there is provided a non-transitory data carrier carrying processor control code to implement any of the methods, processes and techniques described herein.
As will be appreciated by one skilled in the art, the present techniques may be embodied as a system, method or computer program product. Accordingly, present techniques may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
Furthermore, the present techniques may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present techniques may be written in any combination of one or more programming languages, including object oriented programming languages and conventional procedural programming languages. Code components may be embodied as procedures, methods or the like, and may comprise sub-components which may take the form of instructions or sequences of instructions at any of the levels of abstraction, from the direct machine instructions of a native instruction set to high-level compiled or interpreted language constructs.
Embodiments of the present techniques also provide a non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out any of the methods described herein.
The techniques further provide processor control code to implement the above-described methods, for example on a general purpose computer system or on a digital signal processor (DSP). The techniques also provide a carrier carrying processor control code to, when running, implement any of the above methods, in particular on a non-transitory data carrier. The code may be provided on a carrier such as a disk, a microprocessor, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier. Code (and/or data) to implement embodiments of the techniques described herein may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (RTM) or VHDL (Very high speed integrated circuit Hardware Description Language). As the skilled person will appreciate, such code and/or data may be distributed between a plurality of coupled components in communication with one another. The techniques may comprise a controller which includes a microprocessor, working memory and program memory coupled to one or more of the components of the system.
It will also be clear to one of skill in the art that all or part of a logical method according to embodiments of the present techniques may suitably be embodied in a logic apparatus comprising logic elements to perform the steps of the above-described methods, and that such logic elements may comprise components such as logic gates in, for example a programmable logic array or application-specific integrated circuit. Such a logic arrangement may further be embodied in enabling elements for temporarily or permanently establishing logic structures in such an array or circuit using, for example, a virtual hardware descriptor language, which may be stored and transmitted using fixed or transmittable carrier media.
In an embodiment, the present techniques may be implemented using multiple processors or control circuits. The present techniques may be adapted to run on, or integrated into, the operating system of an apparatus.
In an embodiment, the present techniques may be realised in the form of a data carrier having functional data thereon, said functional data comprising functional computer data structures to, when loaded into a computer system or network and operated upon thereby, enable said computer system to perform all the steps of the above-described method.
Brief description of the drawings
Implementations of the present techniques will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1A is a perspective view of a first example robotic end-effector;
Figure 1B is a perspective view of the first example robotic end-effector;
Figures 2A and 2B show, respectively, a side view and a bottom view of the first example robotic end-effector;
Figure 3 is a top view of the second pair of fingers and the cutting mechanism of the first example robotic end-effector;
Figure 4A is a perspective view of the first and second pairs of fingers of the first example robotic end-effector;
Figure 4B is a perspective view of the first pair of fingers of the first example robotic end-effector;
Figure 5A shows a perspective view of an alternative form of the first example robotic end-effector;
Figures 5B and 5C show, respectively, a perspective view and a side view of a second example robotic end-effector;
Figure 5D shows a side view of an alternative form of the second example robotic end-effector;
Figure 6A shows a perspective view of the first pair of fingers of the second example robotic end-effector;
Figure 6B shows a perspective view of the second pair of fingers and cutting mechanism of the second example robotic end-effector;
Figures 7A and 7B show perspective views of the second pair of fingers and cutting mechanism of the second example robotic end-effector;
Figure 8 is a block diagram of a robotic system comprising the robotic end-effector of the present techniques; and
Figure 9 is a flowchart of example steps to harvest fruit using the robotic end-effector of the present techniques.
Detailed description of the drawings
Broadly speaking, embodiments of the present techniques provide a robotic end-effector for selective crop harvesting, which is particularly suitable for harvesting crops that grow in dense clusters, such as strawberries.
The term "picking head" is used interchangeably herein with the term "robotic end-effector".
Selective harvesting of crops using robotic technology aims to address the societal and economic challenges of agricultural labour shortages. Existing robotic solutions for selective harvesting have limited capability because of complex picking requirements in different picking scenarios, e.g. picking strawberries in dense clusters. Most of the available solutions are developed for a very specific picking scenario, e.g. picking strawberries in isolation. However, most of the economically-viable (e.g. high-yielding and/or disease resistant) varieties of strawberry are grown in dense clusters. The bottleneck of the robotic solutions is the picking head they use. Most of the available picking heads for selective harvesting are capable of performing only two actions: opening the picking head, and closing the picking head. The available picking heads are limited in their ability to harvest strawberries in a dense cluster where a ripe strawberry to be picked is occluded. As a result, some ripe strawberries may not be detected using existing picking heads and, even if they are detected, it may not be possible to segment, assess and localise the ripe strawberry using existing picking heads due to their limited range of motion.
Moreover, existing picking heads cannot reach a ripe strawberry if it is surrounded by other unripe strawberries which cannot easily be pushed away by the picking head. That is, displacement of the ripe strawberry is inevitable while the unripe ones are moved or pushed aside, which makes it difficult to successfully harvest the ripe strawberry.
The present techniques address the above-mentioned issues with conventional picking heads by providing a picking head or robotic end-effector that is able to pick fruits such as, but not limited to, strawberries which grow in dense clusters, without bruising or damaging the fruit. The robotic end-effector of the present techniques comprises a pair of fingers for removing occlusion, another pair of fingers that are used solely to grip the stem of a ripe fruit, and a cutting mechanism that effectively and gently cuts the stem of the gripped fruit. As will be described in more detail below, the robotic end-effector comprises a vision system for identifying ripe fruit and determining when a stem of a ripe fruit is located between the pair of fingers used for gripping the stem.
Figure 1A is a perspective view of a first example robotic end-effector 100 for fruit harvesting, while Figure 1B is a perspective upside-down view of the first example robotic end-effector 100. The robotic end-effector 100 comprises a vision system (not shown) for identifying a location of a ripe fruit on a plant. The robotic end-effector 100 comprises a first pair of fingers 102 for moving any objects that at least partly occlude the identified ripe fruit on the plant. The robotic end-effector 100 comprises a second pair of fingers 104 for gripping a stem of the identified ripe fruit. The second pair of fingers 104 comprise a sensor 108 for indicating when the stem is located between the second pair of fingers 104. As shown in Figure 1B, the sensor 108 may be positioned in the middle of the second pair of fingers 104, so that the sensor 108 can determine whether a stem is located between the second pair of fingers. The robotic end-effector 100 comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 104 after the stem has been cut.
The robotic end-effector 100 may further comprise a first actuation mechanism (not shown) for controlling the actuation of the first pair of fingers 102. Thus, a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers 102.
In response to receiving the location of an object that at least partly occludes a fruit, the first actuation mechanism may control the first pair of fingers 102 to push away the object, by increasing a separation distance between the first pair of fingers. Thus, the individual fingers 102a, 102b of the first pair of fingers 102 may be close together when the robotic end-effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit. The individual fingers 102a, 102b of the first pair of fingers 102 may be moved further apart from each other when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers 104 can grip a stem of the fruit. In Figure 1A, the individual fingers 102a, 102b are shown as being close together, and arrows A and B indicate, respectively, the direction fingers 102a and 102b need to be moved to increase the separation distance between the first pair of fingers 102. Preferably, the first actuation mechanism causes the individual fingers 102a, 102b to move in unison.
Figures 2A and 2B show, respectively, a side view and a bottom view of the first example robotic end-effector. As mentioned above, the robotic end-effector 100 comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 104 after the stem has been cut.
Figure 3 is a top view of the second pair of fingers 104a,b and the cutting mechanism 106 of the first example robotic end-effector. Other features of the robotic end-effector, including the first pair of fingers, are removed from the image to more clearly show the second pair of fingers and the cutting mechanism. It can be seen that when a stem is gripped by the second pair of fingers 104a,b, the cutting mechanism 106 cuts the stem at a location above the second pair of fingers, which means the stem (and the attached fruit) continues to be gripped by the second pair of fingers 104a,b after the stem has been cut.
The robotic end-effector 100 may further comprise a second actuation mechanism (not shown) for controlling the actuation of the second pair of fingers 104 and the cutting mechanism 106. Thus, a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers 104 and the cutting mechanism 106. As the cutting mechanism 106 is only operated when it is confirmed that the second pair of fingers 104 are gripping a stem of a fruit to be harvested, advantageously a single actuation mechanism is used to control both the second pair of fingers 104 and cutting mechanism 106, thereby reducing complexity and the number of components needed to control the robotic end-effector 100. The sensor 108 of the second pair of fingers 104 may also be used to determine when the second pair of fingers are firmly gripping a stem.
The vision system may comprise at least one image sensor for capturing images of the fruit or clusters of fruit. The at least one image sensor may be an RGB sensor. In some cases, the at least one image sensor may be two RGB sensors 110a,b (see Figure 1B) provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing). Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.
As noted above, the second pair of fingers 104a,b comprise a sensor 108 for determining when a stem is located between the second pair of fingers. Advantageously, the sensor 108 may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.
Responsive to receiving feedback from the sensor 108 that the stem of the ripe fruit is located between the second pair of fingers 104, the second actuation mechanism may control the second pair of fingers 104 to grip the stem, by decreasing a separation distance between the second pair of fingers.
In Figure 3, individual fingers 104a, 104b of the second pair of fingers 104 are shown as being far apart. This position or separation distance may be adopted so that the second pair of fingers 104 are able to receive a stem of a fruit to be harvested. The individual fingers 104a, 104b may be actuated to decrease the separation distance between the second pair of fingers 104 when a stem located between the fingers is to be gripped. In Figure 3, arrows C and D indicate, respectively, the direction fingers 104a and 104b need to be moved to decrease the separation distance between the second pair of fingers 104. Preferably, the second actuation mechanism causes the individual fingers 104a, 104b to move in unison.
As shown in Figure 3, the cutting mechanism 106 is preferably provided in proximity to the second pair of fingers 104, such that when the stem of the identified ripe fruit is cut by the cutting mechanism 106, the second pair of fingers 104 continues to grip a portion of the stem that remains attached to the fruit. In other words, when a cutting operation performed by the cutting mechanism 106 is complete, the fruit is not immediately dropped into a container for collecting the harvested fruit. Instead, the fruit continues to be gripped, via a portion of the stem that is still attached to the fruit, by the second pair of fingers 104. This is advantageous because the robotic end-effector may be controlled to gently place the harvested fruit in a container. The second pair of fingers 104 may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.
As mentioned above, the second pair of fingers 104 comprise a sensor (not shown here) for indicating when the stem is located between the second pair of fingers 104. The sensor of the second pair of fingers 104 may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example proximity sensor, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers in a position suitable for gripping.
When the stem of the identified ripe fruit is gripped by the second pair of fingers 104, the identified ripe fruit may be in proximity to a first side (e.g. a bottom side) of the second pair of fingers 104. The cutting mechanism may be provided in proximity to a second, opposite side (e.g. a top side) of the second pair of fingers, as shown in Figure 3. Thus, the cutting mechanism 106 may be positioned relative to the second pair of fingers such that it cuts the stem at a point where the stem protrudes from the second pair of fingers. In this way, a portion of the stem that is still attached to the fruit remains gripped by the second pair of fingers 104.
Figure 4A is a perspective view of the first and second pairs of fingers of the first example robotic end-effector, and Figure 4B is a perspective view of the first pair of fingers of the first example robotic end-effector. In both Figures, other components of the robotic end-effector have been removed to more clearly show the fingers. In Figure 4A, the individual fingers 102a, 102b of the first pair of fingers 102 are shown as being close together, while the individual fingers 104a, 104b of the second pair of fingers 104 are shown as being far apart. This may be the configuration used when the robotic end-effector is being moved towards a ripe fruit or to identify ripe fruits. The opposite configuration may be used when the robotic end-effector is used to harvest a ripe fruit. That is, during the harvesting process, the individual fingers 102a, 102b of the first pair of fingers 102 may be far apart (while they push away any objects that occlude the specific fruit to be harvested), while the individual fingers 104a, 104b of the second pair of fingers 104 may be close together (while they grip a stem of the specific fruit to be harvested). In this way, the first pair of fingers may be out of the way of fruit connected to the stem that is being gripped by the second pair of fingers.
The first pair of fingers 102 may be non-sensing fingers. That is, the first pair of fingers 102 may not themselves obtain any feedback about the objects they contact. Alternatively, the first pair of fingers 102 may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.
Figure 5A shows a perspective view of an alternative form 100' of the first example robotic end-effector. The robotic end-effector 100' comprises a first pair of fingers 102' (having individual fingers 102a', 102b') and a second pair of fingers 104'. The functionality of the robotic end-effector 100' is the same as that of robotic end-effector 100, and is not described again for the sake of conciseness.
Figures 5B and 5C show, respectively, a perspective view and a side view of a second example robotic end-effector 200. It will be understood that the first and second robotic end-effectors operate in the same way, and differ only in the precise design of particular features.
The robotic end-effector 200 comprises a vision system (not shown) for identifying a location of a ripe fruit on a plant. The robotic end-effector 200 comprises a first pair of fingers 202 for moving any objects that at least partly occlude the identified ripe fruit on the plant. The robotic end-effector 200 comprises a second pair of fingers 204 for gripping a stem of the identified ripe fruit. The second pair of fingers 204 comprise a sensor (not shown) for indicating when the stem is gripped between the second pair of fingers 204. The sensor may be located between the second pair of fingers 204 in a similar manner to sensor 108 in Figure 1B. The robotic end-effector 200 comprises a cutting mechanism (not visible here) for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 204, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 204 after the stem has been cut.
The robotic end-effector 200 may further comprise a first actuation mechanism (not shown) for controlling the actuation of the first pair of fingers 202. Thus, a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers 202.
In response to receiving the location of an object that at least partly occludes a fruit, the first actuation mechanism may control the first pair of fingers 202 to push away the object, by increasing a separation distance between the first pair of fingers. Thus, the individual fingers 202a, 202b of the first pair of fingers 202 may be close together when the robotic end-effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit. The individual fingers 202a, 202b of the first pair of fingers 202 may be moved further apart from each other when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers 204 can grip a stem of the fruit. In Figure 5B, the individual fingers 202a, 202b are shown as being close together.
Figure 5D shows a side view of an alternative form of the second example robotic end-effector 200'. It will be understood that the second robotic end-effectors 200 and 200' operate in the same way, and differ only in the precise design of particular features. Specifically, the robotic end-effector 200' has a first pair of fingers 202' which are relatively smaller than the first pair of fingers 202 of the robotic end-effector 200 (see e.g. Figure 5C for comparison). The first pair of fingers 202' are smaller in size and do not protrude as far relative to the second pair of fingers. This may provide a more compact robotic end-effector, which may be advantageous when picking certain fruits.
Figure 6A shows a perspective view of the first pair of fingers 202 of the second example robotic end-effector 200, and Figure 6B shows a perspective view of the second pair of fingers 204 and cutting mechanism 206 of the second example robotic end-effector 200. In Figure 6A, both individual fingers 202a, 202b of the first pair of fingers 202 are shown, while in Figure 6B, only finger 202a is shown and finger 202b is hidden so that the second pair of fingers 204 can be more clearly seen. The cutting mechanism 206 is for cutting the stem of the identified ripe fruit in response to a signal from the sensor (of the second pair of fingers 204) indicating that the stem is gripped between the second pair of fingers 204, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 204 after the stem has been cut.
The first pair of fingers 202 may be non-sensing fingers. That is, the first pair of fingers 202 may not themselves obtain any feedback about the objects they contact. Alternatively, the first pair of fingers 202 may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.
Figures 7A and 7B show perspective views of the second pair of fingers 204 and cutting mechanism 206 of the second example robotic end-effector 200.
The robotic end-effector 200 may further comprise a second actuation mechanism (not shown) for controlling the actuation of the second pair of fingers 204 and the cutting mechanism 206. Thus, a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers 204 and the cutting mechanism 206. As the cutting mechanism 206 is only operated when it is confirmed that the second pair of fingers 204 are gripping a stem of a fruit to be harvested, advantageously a single actuation mechanism is used to control both the second pair of fingers 204 and cutting mechanism 206, thereby reducing complexity and the number of components needed to control the robotic end-effector 200. The sensor of the second pair of fingers 204 may also be used to determine when the second pair of fingers are firmly gripping a stem.
The vision system may comprise at least one image sensor for capturing images of the fruit/cluster of fruits. The at least one image sensor may be an RGB sensor. In some cases, the at least one image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing). Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.
The second pair of fingers comprise a sensor for determining when a stem is located between the second pair of fingers. Advantageously, the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested. Responsive to receiving feedback from the sensor that the stem of the ripe fruit is located between the second pair of fingers 204, the second actuation mechanism may control the second pair of fingers 204 to grip the stem, by decreasing a separation distance between the second pair of fingers.
In Figures 7A and 7B, individual fingers 204a, 204b of the second pair of fingers 204 are shown as being close together. This position or separation distance may be adopted when the second pair of fingers 204 are gripping a stem of a fruit to be harvested. The individual fingers 204a, 204b may be actuated to increase the separation distance between the second pair of fingers 204 to enable a stem to be received or positioned between the fingers 204a, 204b prior to gripping.
As shown in Figures 7A and 7B, the cutting mechanism 206 is preferably provided in proximity to the second pair of fingers 204, such that when the stem of the identified ripe fruit is cut by the cutting mechanism 206, the second pair of fingers 204 continues to grip a portion of the stem that remains attached to the fruit. In other words, when a cutting operation performed by the cutting mechanism 206 is complete, the fruit is not immediately dropped into a container for collecting the harvested fruit. Instead, the fruit continues to be gripped, via a portion of the stem that is still attached to the fruit, by the second pair of fingers 204. This is advantageous because the robotic end-effector may be controlled to gently place the harvested fruit in a container. The second pair of fingers 204 may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.
As mentioned above, the second pair of fingers 204 comprise a sensor (not shown here, but see Figure 1B for where the sensor may be located) for indicating when the stem is gripped between the second pair of fingers 204. The sensor of the second pair of fingers 204 may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example proximity sensor, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers in a position suitable for gripping.
As shown in Figures 7A and 7B, one of the fingers 204b of the second pair of fingers may comprise a slot 208 in the finger, which extends all the way through the finger 204b, from an edge of the finger to a (stem) gripping surface of the finger 204b. The cutting mechanism 206 may be arranged to move through the slot 208 to cut a stem gripped by the second pair of fingers 204. This may be advantageous because a portion of the stem which is gripped by the second pair of fingers 204 may be held more firmly and/or may be substantially straight (compared to the stem which protrudes from the second pair of fingers), which may make it easier for the cutting mechanism to cut through the stem. Furthermore, as the cutting mechanism 206 is located in the slot 208, the cutting mechanism is at least partially covered or encased for safety reasons, i.e. to avoid any risk of a human operator being able to come into contact with the cutting mechanism.
The other of the fingers 204a of the second pair of fingers 204 may comprise a groove 210 for receiving a cutting edge 206a of the cutting mechanism 206 when the cutting mechanism 206 moves through the slot 208 to cut the stem. This may enable the cutting mechanism 206 to fully cut through the stem, as the groove 210 provides a space for the cutting edge 206a of the cutting mechanism 206 to pass all the way through the stem.
In both the first end-effector 100 and second end-effector 200, surfaces of the first pair of fingers 102 and/or second pair of fingers 104 and/or other components of the end-effectors may be substantially smooth for ease of cleaning and to avoid the accumulation of dirt. This is advantageous because dirt on the components may prevent the end-effector from operating correctly. For example, dirt on the second pair of fingers may prevent the sensor of the second pair of fingers from correctly determining when a stem is gripped between the fingers.
Generally speaking, the robotic end-effector described herein benefits from 2.5 degrees of freedom, which is higher than those of available picking heads. This added degree of freedom allows the robotic end-effector to deal with complex picking scenarios where the available picking heads fail. The robotic end-effector benefits from an effective combination of actuation systems and sensors to resolve the limitations of currently available picking heads. The robotic end-effector includes three separate movements (moving objects, gripping a stem, and cutting a stem) that are actuated using two actuators. This is useful as the ability of the robotic end-effector is increased without also significantly increasing the complexity or component count of the device.
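The sketch below illustrates one way three motions could share two actuators as described above. The coupling of grip and cut onto a single axis via travel position is an assumption for illustration; the patent states only that one mechanism actuates both the gripping fingers and the cutter.

```python
class DualActuatorHead:
    """Sketch only: three motions (clear, grip, cut) on two actuators.
    Axis objects and their move_to() method are hypothetical stand-ins."""
    GRIP_POSITION = 0.6   # assumed normalised travel where the grip is firm
    CUT_POSITION = 1.0    # assumed full travel that completes the cut

    def __init__(self, clearing_axis, grip_cut_axis):
        self.clearing_axis = clearing_axis    # actuator 1: clearing fingers
        self.grip_cut_axis = grip_cut_axis    # actuator 2: grip + cutter

    def clear(self, open_fraction):
        self.clearing_axis.move_to(open_fraction)

    def grip(self):
        self.grip_cut_axis.move_to(self.GRIP_POSITION)

    def cut(self):
        # Continue into the cutting range only after a grip is confirmed.
        self.grip_cut_axis.move_to(self.CUT_POSITION)
```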
Embodiments of the robotic end-effector benefit from an effective configuration of RGB, RGB-D and IR proximity sensors that helps to efficiently detect and localise the ripe strawberries. In addition, the combined sensory information can be used to estimate the size and weight of the strawberries to be picked, and to sort them by quality.
Figure 8 is a block diagram of a robotic system 300 for fruit harvesting comprising the robotic end-effector 100, 200 of the present techniques. While Figure 8 shows the robotic system 300 comprising the first example robotic end-effector 100, it will be understood that this is merely illustrative. The robotic system 300 comprises at least one picking arm 302 and a control system 304 for controlling the at least one picking arm 302. The control system 304 may comprise at least one processor coupled to memory. The at least one processor may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit. The memory may comprise volatile memory, such as random access memory (RAM), for use as temporary memory, and/or non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data, programs, or instructions, for example.
The robotic system comprises a robotic end-effector 100 of the type described herein, coupled to the at least one picking arm 302.
As shown in Figure 8, the robotic end-effector 100 comprises a vision system 120 for identifying a location of a ripe fruit on a plant. The vision system 120 may enable a three-dimensional map of a plant to be generated. The vision system 120 may use the three-dimensional map to identify the locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits. As described above, the vision system 120 may comprise a depth sensor, which may be, for example, an RGB-D (red-green-blue-depth) camera. The RGB-D camera may be mounted on the robotic end-effector in a position that enables the plant to be imaged from a distance. The vision system 120 may further comprise at least one further image sensor in the vicinity of the second pair of fingers 104. The image sensor may enable images to be captured of the environment of the second pair of fingers 104. This may be advantageous because, when the robotic end-effector 100 is being used to harvest a fruit, the RGB-D sensor may be obscured by foliage of the plant and so it may not be able to determine, for example, whether a stem of the ripe fruit is positioned between the second pair of fingers. Thus, the image sensor provides redundancy in the vision system 120. In some cases, the image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers 104 to enable stereo vision (i.e. depth sensing).
The vision system 120 may also be able to determine the size and quality of each fruit that is to be harvested, or which is harvested. This may enable the fruit to be deposited into a suitable container. For example, this may enable large fruits to be placed into a container with other large fruits (so that they do not damage or squash smaller fruits), or it may enable large fruits to be dispersed among different containers (so that each container contains a mixture of fruit sizes). This may also enable any slightly damaged or rotten fruits to be discarded or separated from other higher quality fruits.
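A short sketch of how such estimates could drive container selection follows; the size and quality values are assumed to come from the vision system, and the thresholds are purely illustrative.

```python
def choose_container(size_mm, quality_score):
    """Route a harvested fruit using assumed vision-system estimates.

    quality_score : 0.0 (rotten/damaged) to 1.0 (perfect), assumed scale
    """
    if quality_score < 0.5:
        return "discard"                 # separate damaged or rotten fruit
    return "large" if size_mm >= 30 else "standard"
```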
The robotic end-effector comprises a first pair of fingers 102 for moving any objects that at least partly occlude the identified ripe fruit on the plant. The first pair of fingers are operated by a first actuation mechanism 122. The robotic end-effector comprises a second pair of fingers 104 for gripping a stem of the identified ripe fruit. The second pair of fingers 104 comprise a sensor 108 for indicating when the stem is located between the second pair of fingers 104. The robotic end-effector comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut. The second pair of fingers 104 and cutting mechanism 106 are operated by a second actuation mechanism 124. For the sake of conciseness, the operation of the robotic end-effector 100 will not be described here again.
The robotic system 300 may further comprise at least one container 306 for receiving fruit harvested by the robotic end-effector 100. After the cutting mechanism 106 has cut the stem of a ripe fruit, the control system 304 may control the at least one picking arm 302 to move the robotic end-effector 100 to above the at least one container 306. Once in position, the second actuation mechanism 124 controls the second pair of fingers 104 to release the stem and the attached fruit, by increasing a separation distance between the second pair of fingers, so that the fruit drops into the container.
The robotic system 300 may further comprise a mechanism 308 for moving the robotic system towards, between and/or around plants. The mechanism 308 may be a tracked or wheeled rover, or a vehicle capable of navigating autonomously.
Figure 9 is a flowchart of example steps to harvest fruit using the robotic end-effector of the present techniques. The method begins by identifying, using a vision system of the robotic end-effector, a location of a ripe fruit on a plant (step S100).
The method comprises moving, using a first pair of fingers of the robotic end-effector, any objects that at least partly occlude the identified ripe fruit on the plant (step S102). The step of moving any objects that at least partly occlude the identified ripe fruit may comprise controlling a first actuation mechanism to increase a separation distance between the first pair of fingers.
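One way this leaf-parting move might be scripted is sketched below, using the same style of hypothetical arm and finger interfaces as above; the finger-gap value is illustrative only.

```python
def clear_occlusion(arm, first_fingers, occluder_pose):
    """Push aside foliage occluding a fruit (hypothetical interfaces)."""
    first_fingers.close()              # approach with the fingers together
    arm.move_to(occluder_pose)         # insert the fingers at the occluding object
    first_fingers.open(gap_mm=80)      # spread the fingers to push the object aside
```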
The method comprises gripping a stem of the identified ripe fruit, using a second pair of fingers of the robotic end-effector (step S104). The step of gripping a stem of the identified ripe fruit may comprise: receiving feedback from the sensor 108 that the stem of the identified ripe fruit is located between the second pair of fingers; and controlling a second actuation mechanism to decrease a separation distance between the second pair of fingers. Advantageously, the feedback from the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.
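This sensor-gated grip might be sketched as follows; the proximity-sensor and finger-actuator interfaces, the polling rate and the timeout are assumptions for illustration.

```python
import time

def grip_stem(stem_sensor, second_fingers, timeout_s=5.0, grip_gap_mm=2):
    """Close the second pair of fingers once the stem sensor fires."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if stem_sensor.stem_detected():               # stem is between the fingers
            second_fingers.close(gap_mm=grip_gap_mm)  # decrease separation to grip
            return True
        time.sleep(0.01)                              # poll at roughly 100 Hz
    return False                                      # stem never reached the fingers
```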
The method comprises sensing, using a sensor of the second pair of fingers, when the stem is located between the second pair of fingers (step S106).
The method comprises cutting, using a cutting mechanism, the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers (step S108), wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
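Steps S100 to S108 may be composed into a single harvesting cycle. The sketch below shows one possible composition, reusing the hypothetical helpers from the earlier sketches; motion planning and error handling are omitted for brevity.

```python
def harvest_cycle(vision, arm, first_fingers, second_fingers, cutter, stem_sensor):
    """One pass over a plant, composing steps S100-S108 (illustrative only)."""
    for fruit in vision.locate_ripe_fruit_on_plant():              # step S100
        for occluder_pose in vision.occluder_poses(fruit):
            clear_occlusion(arm, first_fingers, occluder_pose)     # step S102
        arm.move_to(fruit.stem_approach_pose)
        if grip_stem(stem_sensor, second_fingers):                 # steps S104, S106
            cutter.cut()                                           # step S108
            # The stem stub stays gripped by the second pair of fingers,
            # so the fruit can be carried to a container undamaged.
```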
Those skilled in the art will appreciate that while the foregoing has described what is considered to be the best mode and, where appropriate, other modes of performing the present techniques, the present techniques should not be limited to the specific configurations and methods disclosed in this description of the preferred embodiment. Those skilled in the art will recognise that the present techniques have a broad range of applications, and that the embodiments may take a wide range of modifications without departing from any inventive concept as defined in the appended claims.
Claims (25)
- CLAIMS
- 1. A robotic end-effector for fruit harvesting, the robotic end-effector comprising: a vision system for identifying a location of a ripe fruit on a plant; a first pair of fingers for moving any objects that at least partly occlude the identified ripe fruit on the plant; a second pair of fingers for gripping a stem of the identified ripe fruit, the second pair of fingers comprising a sensor for indicating when the stem is located between the second pair of fingers in a position suitable for gripping; and a cutting mechanism for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
- 2. The robotic end-effector as claimed in claim 1 wherein the cutting mechanism is provided in proximity to the second pair of fingers, such that when the stem of the identified ripe fruit is cut, the second pair of fingers grips a portion of the stem that remains attached to the fruit.
- 3. The robotic end-effector as claimed in claim 2 wherein the cutting mechanism cuts the stem at a point where the stem protrudes from the second pair of fingers.
- 4. The robotic end-effector as claimed in claim 2 wherein one of the fingers of the second pair of fingers comprises a slot in the finger, and the cutting mechanism is arranged to move through the slot to cut a stem gripped by the second pair of fingers.
- 5. The robotic end-effector as claimed in claim 4 wherein the other of the fingers of the second pair of fingers comprises a groove for receiving a cutting edge of the cutting mechanism when the cutting mechanism moves through the slot to cut the stem.
- 6. The robotic end-effector as claimed in any preceding claim wherein the vision system comprises a depth sensor for generating a three-dimensional map of the plant, and wherein the vision system uses the three-dimensional map to identify the locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits.
- 7. The robotic end-effector as claimed in claim 6 wherein the depth sensor is an RGB-D camera.
- 8. The robotic end-effector as claimed in claim 6 or 7 further comprising: a first actuation mechanism for controlling the actuation of the first pair of fingers.
- 9. The robotic end-effector as claimed in claim 8 wherein, responsive to receiving the location of an object that at least partly occludes a fruit, the first actuation mechanism controls the first pair of fingers to push away the object, by increasing a separation distance between the first pair of fingers.
- 10. The robotic end-effector as claimed in claim 8 or 9 wherein the first pair of fingers are haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.
- 11. The robotic end-effector as claimed in claim 10 wherein the forces measured by the sensors of the haptic fingers are used by the first actuation mechanism to control the actuation of the haptic fingers.
- 12. The robotic end-effector as claimed in any preceding claim wherein the vision system comprises: at least one image sensor provided in the vicinity of the second pair of fingers for determining when a stem of the ripe fruit is located between the second pair of fingers.
- 13. The robotic end-effector as claimed in claim 12 wherein the at least one image sensor is an RGB sensor.
- 14. The robotic end-effector as claimed in claim 12 wherein the at least one image sensor is two RGB sensors provided in the vicinity of the second pair of fingers to enable stereo vision.
- 15. The robotic end-effector as claimed in any preceding claim further comprising: a second actuation mechanism for controlling the actuation of the second pair of fingers and the cutting mechanism.
- 16. The robotic end-effector as claimed in claim 15, wherein, responsive to receiving feedback from the sensor that the stem of the ripe fruit is located between the second pair of fingers, the second actuation mechanism controls the second pair of fingers to grip the stem, by decreasing a separation distance between the second pair of fingers.
- 17. The robotic end-effector as claimed in any preceding claim wherein the sensor of the second pair of fingers is a proximity sensor.
- 18. A robotic system for fruit harvesting, the robotic system comprising: at least one picking arm; a control system for controlling the at least one picking arm; and a robotic end-effector, according to any one of claims 1 to 17, coupled to the at least one picking arm.
- 19. The robotic system as claimed in claim 18 further comprising at least one container for receiving fruit harvested by the robotic end-effector, wherein after the cutting mechanism has cut the stem of a ripe fruit, the control system controls the at least one picking arm to move the robotic end-effector to above the at least one container, and wherein the second actuation mechanism controls the second pair of fingers to release the stem and the attached fruit, by increasing a separation distance between the second pair of fingers, so that the fruit drops into the container.
- 20. The robotic system as claimed in claim 18 or 19 further comprising a mechanism for moving the robotic system.
- 21. A method for fruit harvesting using a robotic end-effector, the method comprising: identifying, using a vision system of the robotic end-effector, a location of a ripe fruit on a plant; moving, using a first pair of fingers of the robotic end-effector, any objects that at least partly occlude the identified ripe fruit on the plant; gripping a stem of the identified ripe fruit, using a second pair of fingers of the robotic end-effector; sensing, using a sensor of the second pair of fingers, when the stem is located between the second pair of fingers; and cutting, using a cutting mechanism, the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
- 22. The method as claimed in claim 21 wherein moving any objects that at least partly occlude the identified ripe fruit comprises controlling a first actuation mechanism to increase a separation distance between the first pair of fingers.
- 23. The method as claimed in claim 21 or 22 wherein the first pair of fingers are haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers, and wherein moving any objects that at least partly occlude the identified ripe fruit comprises using the forces measured by the sensors to control the first actuation mechanism.
- 24. The method as claimed in claim 21, 22 or 23 wherein gripping a stem of the identified ripe fruit comprises: receiving feedback from the sensor that the stem of the identified ripe fruit is located between the second pair of fingers; and controlling a second actuation mechanism to decrease a separation distance between the second pair of fingers.
- 25. A non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out the method of any of claims 21 to 24.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2107942.1A GB2607326B (en) | 2021-06-03 | 2021-06-03 | Apparatus and systems for selective crop harvesting |
GBGB2114906.7A GB202114906D0 (en) | 2021-06-03 | 2021-10-19 | Apparatus and system for selective crop harvesting |
PCT/GB2022/051386 WO2022254203A1 (en) | 2021-06-03 | 2022-06-01 | Apparatus and system for selective crop harvesting |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2107942.1A GB2607326B (en) | 2021-06-03 | 2021-06-03 | Apparatus and systems for selective crop harvesting |
Publications (3)
Publication Number | Publication Date |
---|---|
GB202107942D0 GB202107942D0 (en) | 2021-07-21 |
GB2607326A true GB2607326A (en) | 2022-12-07 |
GB2607326B GB2607326B (en) | 2023-07-19 |
Family
ID=76838790
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2107942.1A Active GB2607326B (en) | 2021-06-03 | 2021-06-03 | Apparatus and systems for selective crop harvesting |
GBGB2114906.7A Ceased GB202114906D0 (en) | 2021-06-03 | 2021-10-19 | Apparatus and system for selective crop harvesting |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GBGB2114906.7A Ceased GB202114906D0 (en) | 2021-06-03 | 2021-10-19 | Apparatus and system for selective crop harvesting |
Country Status (1)
Country | Link |
---|---|
GB (2) | GB2607326B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007088225A1 (en) * | 2006-01-31 | 2007-08-09 | Universidad Politécnica de Madrid | Computer vision system for picking small row-cultivated fruits |
US20100292841A1 (en) * | 2009-05-13 | 2010-11-18 | Wickham Joseph S | Robot with 3d grasping capability |
JP2011211969A (en) * | 2010-03-31 | 2011-10-27 | National Agriculture & Food Research Organization | Peduncle removing apparatus and fruit harvesting apparatus |
CN102577755A (en) * | 2012-02-17 | 2012-07-18 | 中国农业大学 | Accurate picking actuating mechanism of strawberry picking robot and ridge-culture strawberry picking robot |
KR101405858B1 (en) * | 2013-12-04 | 2014-06-16 | 안성훈 | Robot apparatus for harvesting fruit |
CN108551883A (en) * | 2018-06-25 | 2018-09-21 | 南京工程学院 | A kind of terminal executor of picking robot |
EP3539735A1 (en) * | 2018-03-13 | 2019-09-18 | Soluciones Robóticas Agrícolas S.L. | Robotic arm end effector for fruit harvesting |
US20210212257A1 (en) * | 2019-01-30 | 2021-07-15 | Shenzhen University | Fruit and vegetable picking method and device based on machine vision and storage medium |
WO2021144955A1 (en) * | 2020-01-17 | 2021-07-22 | Agrist株式会社 | Harvesting device and harvesting system |
- 2021-06-03 GB GB2107942.1A patent/GB2607326B/en active Active
- 2021-10-19 GB GBGB2114906.7A patent/GB202114906D0/en not_active Ceased
Also Published As
Publication number | Publication date |
---|---|
GB202107942D0 (en) | 2021-07-21 |
GB202114906D0 (en) | 2021-12-01 |
GB2607326B (en) | 2023-07-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Hohimer et al. | Design and field evaluation of a robotic apple harvesting system with a 3D-printed soft-robotic end-effector | |
Lehnert et al. | Autonomous sweet pepper harvesting for protected cropping systems | |
EP3863814B1 (en) | Autonomous crop harvester | |
Williams et al. | Robotic kiwifruit harvesting using machine vision, convolutional neural networks, and robotic arms | |
US10779472B2 (en) | Robotic fruit picking system | |
US8306663B2 (en) | Robot with 3D grasping capability | |
Chua et al. | Robotic manipulation of food products–a review | |
KR20190122227A (en) | Automatic harvester effector | |
WO2016055552A1 (en) | Automated harvesting apparatus | |
WO2010063075A1 (en) | Crop picking device and method | |
US11477942B2 (en) | Robotic fruit harvesting machine with fruit-pair picking and hybrid motorized-pneumatic robot arms | |
WO2022254203A1 (en) | Apparatus and system for selective crop harvesting | |
NL1042547B1 (en) | HARVESTER FOR BROCCOLI | |
Qiu et al. | Tendon-driven soft robotic gripper with integrated ripeness sensing for blackberry harvesting | |
Parsa et al. | Modular autonomous strawberry picking robotic system | |
Park et al. | Human-centered approach for an efficient cucumber harvesting robot system: Harvest ordering, visual servoing, and end-effector | |
Hemming et al. | Field test of different end-effectors for robotic harvesting of sweet-pepper | |
GB2607326A (en) | Apparatus and systems for selective crop harvesting | |
Parsa et al. | Autonomous strawberry picking robotic system (robofruit) | |
Jianqiao et al. | Research status and development direction of design and control technology of fruit and vegetable picking robot system | |
Xiong et al. | Push and drag: An active obstacle separation method for fruit harvesting robots | |
Pool et al. | An end-effector for robotic removal of citrus from the tree | |
Lehnert et al. | Lessons learnt from field trials of a robotic sweet pepper harvester | |
Guo | Review of current mechanical design in agricultural end effector | |
Lehnert et al. | Lessons learnt from field trials of a robotic sweet pepper harvester for protected cropping systems |