WO2022254203A1 - Apparatus and system for selective crop harvesting - Google Patents
- Publication number: WO2022254203A1
- Application: PCT/GB2022/051386
- Authority: WIPO (PCT)
Classifications
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D46/00—Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs
- A01D46/30—Robotic devices for individually picking crops
Definitions
- the present techniques generally relate to an apparatus, system and method for selective crop harvesting.
- the present techniques provide a robotic end-effector for fruit harvesting that is able to detect, select and cut edible crops that grow in dense clusters.
- There are two types of picking heads commonly available for robotic harvesting of high-value crops: (i) a picking head having a parallel jaw gripper, which may not be suitable for all types of crops, and (ii) a picking head with a customised design for picking a particular fruit in a very specific picking scenario, which is only suitable for that crop or method of harvesting. Consequently, the effectiveness of commonly available robotic picking heads is limited, as different picking heads may be needed for different crop types.
- Some robotic picking heads are used to pick soft fruits such as strawberries.
- Some of the robotic picking heads that are currently available for picking strawberries are cup-shaped picking heads, which have opening parts that locate the peduncle of a strawberry and position the strawberry in front of cutting scissors in order to harvest the strawberry.
- the cutting action causes the strawberry to detach from the plant and fall into a punnet for collecting the strawberries.
- the picking head does not directly touch the flesh of the strawberry, which minimises bruising.
- Because the strawberry falls from a height into the punnet, the harvesting process can inadvertently cause damage or bruising to the fruit.
- fruit placement within the punnet is not controlled, which may result in uneven distribution of the fruit in the punnet (which may also cause damage to fruit that are below other fruit).
- The design of the cup-shaped picking head, and of other types of picking head, may not be suitable for harvesting crops that grow in dense clusters.
- the present applicant has therefore identified the need for an improved apparatus for automatic detection, selection and harvesting of crops that grow in dense clusters.
- a robotic end-effector for fruit harvesting comprising: a vision system for identifying a location of a ripe fruit on a plant; a first pair of fingers for moving any objects that at least partly occlude the identified ripe fruit on the plant; a second pair of fingers for gripping a stem of the identified ripe fruit, the second pair of fingers comprising a sensor for indicating when the stem is located between the second pair of fingers in a position suitable for gripping; and a cutting mechanism for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
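The claimed sequence (detect, clear occlusion, sense stem, grip, cut, release) can be sketched as a simple control flow. The step names and the `harvest_sequence` function below are hypothetical labels for the actuations described in the claims; the patent does not prescribe any software interface.

```python
# Illustrative sketch of the claimed harvesting sequence.
# All step names here are hypothetical; the patent describes the
# actuations but not a software API.

def harvest_sequence(occluded, stem_sensed):
    """Return the ordered actuation steps for one identified ripe fruit."""
    steps = []
    if occluded:
        # First pair of fingers opens to push occluding matter aside.
        steps.append("open_first_fingers")
    steps.append("approach_stem")
    if not stem_sensed:
        # The proximity sensor has not confirmed a stem between the
        # second pair of fingers, so gripping and cutting do not trigger.
        return steps + ["reposition"]
    steps.append("close_second_fingers")   # grip the stem
    steps.append("cut_stem")               # cut above the grip; the fruit
                                           # remains held by its stem
    steps.append("move_above_container")
    steps.append("open_second_fingers")    # fruit drops gently into the punnet
    return steps

print(harvest_sequence(occluded=True, stem_sensed=True))
```

Note that the fruit itself is never touched: only the stem is gripped, which is the feature that avoids bruising.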
- the present techniques advantageously enable ripe fruit to be harvested without bruising or damaging the fruit.
- the present techniques are particularly advantageous for harvesting fruit that grows in dense clusters, such as strawberries. It will be understood that this is an example, and non-limiting, fruit that may be harvested using the robotic end-effector of the present techniques. More generally, the present techniques may be used to harvest different types of fruit and vegetable crop, including those which grow individually and those which grow in clusters.
- the cutting mechanism is preferably provided in proximity to the second pair of fingers, such that when the stem of the identified ripe fruit is cut by the cutting mechanism, the second pair of fingers continues to grip a portion of the stem that remains attached to the fruit.
- the second pair of fingers may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.
- the cutting mechanism may be partially or fully covered or encased for safety reasons, i.e. to avoid any risk of a human operator being able to come into contact with the cutting mechanism.
- When the stem of the identified ripe fruit is gripped by the second pair of fingers, the identified ripe fruit may be in proximity to a first side (e.g. a bottom side) of the second pair of fingers.
- the cutting mechanism may be provided in proximity to a second, opposite side (e.g. a top side) of the second pair of fingers.
- the cutting mechanism may be positioned relative to the second pair of fingers such that it cuts the stem at a point where the stem protrudes from the second pair of fingers. In this way, a portion of the stem that is still attached to the fruit remains gripped by the second pair of fingers.
- one of the fingers of the second pair of fingers may comprise a slot in the finger, which extends all the way through the finger, from an edge of the finger to a gripping surface of the finger.
- the cutting mechanism may be arranged to move through the slot to cut a stem gripped by the second pair of fingers. This may be advantageous because a portion of the stem which is gripped by the second pair of fingers may be held more firmly and/or may be substantially straight (compared to the stem which protrudes from the second pair of fingers), which may make it easier for the cutting mechanism to cut through the stem.
- the other of the fingers of the second pair of fingers may comprise a groove for receiving a cutting edge of the cutting mechanism when the cutting mechanism moves through the slot to cut the stem. This may enable the cutting mechanism to fully cut through the stem, as the groove provides a space for the cutting edge of the cutting mechanism to pass through the stem.
- the other of the fingers of the second pair of fingers may comprise a gripping surface.
- the gripping surface may comprise an angled portion for receiving a cutting edge of the cutting mechanism when the cutting mechanism moves through the slot to cut the stem. This may enable the cutting mechanism to cut the stem at an angle, which may advantageously enable the stem to be cut using a single cutting action.
- the vision system may comprise a depth sensor for generating a three-dimensional map of the plant.
- the vision system may use the three-dimensional map to identify the locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits.
- the depth sensor may be an RGB-D (red-green-blue-depth) camera.
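As a minimal sketch of how ripeness might be scored from the camera's colour channels: the rule, thresholds and function names below are illustrative assumptions only; a real system would likely use a trained model over the RGB-D data rather than a hand-tuned colour test.

```python
# Hypothetical ripeness test: treat a pixel as ripe-strawberry red when
# the red channel clearly dominates green and blue. The thresholds are
# illustrative, not taken from the patent.

def is_ripe(r, g, b, min_red=120, ratio=1.6):
    """True when the pixel is plausibly ripe-strawberry red."""
    return r >= min_red and r >= ratio * g and r >= ratio * b

def ripe_fraction(pixels):
    """Fraction of pixels classified as ripe; a detected fruit region
    with a high fraction would be proposed as a harvesting target."""
    if not pixels:
        return 0.0
    return sum(is_ripe(*p) for p in pixels) / len(pixels)

print(ripe_fraction([(200, 40, 30), (210, 60, 50), (90, 140, 60)]))
```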
- the robotic end-effector may further comprise a first actuation mechanism for controlling the actuation of the first pair of fingers.
- a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers.
- the first actuation mechanism may control the first pair of fingers to push away the object, by increasing a separation distance between the first pair of fingers.
- the first pair of fingers may be close together when the robotic end- effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit.
- the first pair of fingers may be moved further apart when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers can grip a stem of the fruit.
- the first pair of fingers may be non-sensing fingers. That is, the first pair of fingers may not themselves obtain any feedback about the objects they contact.
- the first pair of fingers may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers. This may be advantageous because the sensors can detect interactions between the first pair of fingers and the plant, which enables intelligent manipulation of a cluster of fruits when other sensors (e.g. visual sensors) may not be able to see what the first pair of fingers are interacting with due to occlusion.
- the forces measured by the sensors of the haptic fingers may be used by the first actuation mechanism to control the actuation of the haptic fingers, and the movements of a robotic manipulator.
- This may be advantageous because more effective manipulation movements may be generated to push away occluding matter, which yields increased success of cluster manipulation. Furthermore, it may avoid exerting large or excessive forces on soft fruits, thereby minimising the risk of bruising the fruits during the picking process.
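The force-moderated pushing described above can be sketched as a loop that opens the first pair of fingers in small increments and stops before the sensed force reaches a bruising level. The `read_force` callable, step size and force cap are all illustrative assumptions; the patent claims the use of measured finger forces but fixes none of these values.

```python
# Sketch of force-limited occlusion pushing with haptic fingers.
# read_force is a hypothetical callable mapping finger separation (mm)
# to the force (N) sensed on the fingers.

def push_occluder(read_force, max_force=2.0, step_mm=5, max_sep_mm=80):
    """Increase finger separation in small steps until the maximum
    opening is reached, or the measured force would reach a safety cap
    (to avoid bruising soft fruit). Returns the final separation in mm."""
    sep = 0
    while sep + step_mm <= max_sep_mm:
        if read_force(sep + step_mm) >= max_force:
            break          # stop before exerting a bruising force
        sep += step_mm
    return sep

# Stand-in force model: resistance grows once a leaf is engaged at 30 mm.
print(push_occluder(lambda mm: 0.0 if mm < 30 else (mm - 30) * 0.1))
```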
- the vision system may comprise at least one image sensor for capturing images of the fruit/cluster of fruits.
- the at least one image sensor may be an RGB sensor.
- the at least one image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing).
- Stereo vision may be useful because it provides the robotic end-effector with richer sensory information.
- Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.
- the robotic end-effector may further comprise a second actuation mechanism for controlling the actuation of the second pair of fingers and the cutting mechanism.
- a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers and the cutting mechanism.
- a single actuation mechanism is used to control both the second pair of fingers and cutting mechanism, thereby reducing complexity and the number of components needed to control the robotic end-effector.
- the second pair of fingers comprise a sensor for determining when a stem is located between the second pair of fingers.
- the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.
- the second actuation mechanism may control the second pair of fingers to grip the stem, by decreasing a separation distance between the second pair of fingers.
- the sensor of the second pair of fingers may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example of a proximity sensor, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers in a position suitable for gripping.
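The sensor-gated grip can be sketched as a loop that only commands the second pair of fingers to close once the proximity sensor has reported a stem. The debounce count below is an illustrative assumption (to reject spurious readings), not a feature of the claims.

```python
# Sketch of proximity-sensor-gated gripping. Each reading is True when
# the (e.g. infrared) sensor reports a stem between the second fingers.

def grip_when_stem_present(readings, required=3):
    """Scan successive sensor readings and return the index at which the
    fingers are commanded to close (after `required` consecutive
    positive readings), or None if no stable detection occurs."""
    consecutive = 0
    for i, stem_seen in enumerate(readings):
        consecutive = consecutive + 1 if stem_seen else 0
        if consecutive >= required:
            return i   # command: decrease finger separation (grip)
    return None

print(grip_when_stem_present([False, True, True, False, True, True, True]))
```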
- a robotic system for fruit harvesting comprising: at least one picking arm; a control system for controlling the at least one picking arm; and a robotic end-effector, of the type described herein, coupled to the at least one picking arm.
- the robotic system may further comprise at least one container for receiving fruit harvested by the robotic end-effector, wherein after the cutting mechanism has cut the stem of a ripe fruit, the control system controls the at least one picking arm to move the robotic end-effector to above the at least one container, and wherein the second actuation mechanism controls the second pair of fingers to release the stem and the attached fruit, by increasing a separation distance between the second pair of fingers, so that the fruit drops into the container.
- the robotic system may further comprise a mechanism for moving the robotic system towards, between and/or around plants.
- the mechanism may be a tracked or wheeled rover, or a vehicle capable of navigating autonomously.
- a method for fruit harvesting using a robotic end-effector comprising: identifying, using a vision system of the robotic end-effector, a location of a ripe fruit on a plant; moving, using a first pair of fingers of the robotic end-effector, any objects that at least partly occlude the identified ripe fruit on the plant; gripping a stem of the identified ripe fruit, using a second pair of fingers of the robotic end-effector; sensing, using a sensor of the second pair of fingers, when the stem is located between the second pair of fingers; and cutting, using a cutting mechanism, the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
- the step of moving any objects that at least partly occlude the identified ripe fruit may comprise controlling a first actuation mechanism to increase a separation distance between the first pair of fingers.
- the first pair of fingers may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.
- the step of moving any objects that at least partly occlude the identified ripe fruit may comprise using the forces measured by the sensors to control the first actuation mechanism, as well as the movements of a robotic manipulator.
- this may be advantageous because the measured forces may enable more effective manipulation movements to be generated to push away occluding matter, which yields increased success of cluster manipulation.
- it may avoid exerting large or excessive forces on soft fruits, thereby minimising the risk of bruising the fruits during the picking process.
- the step of gripping a stem of the identified ripe fruit may comprise: receiving feedback from the sensor that the stem of the identified ripe fruit is located between the second pair of fingers; and controlling a second actuation mechanism to decrease a separation distance between the second pair of fingers.
- the feedback from the sensor (which may be a proximity sensor) may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.
- the method may comprise identifying, using the vision system, all ripe fruits on the plant, and selecting one such ripe fruit to harvest.
- the identified ripe fruit may be located on its own on a plant, such that harvesting the fruit is relatively straightforward.
- the selected ripe fruit may be in a cluster of fruits (as is the case with strawberries, for example). In these cases, the identified ripe fruit might not be easy to harvest, as it may be occluded by other fruits in the cluster.
- the method may comprise determining whether the selected ripe fruit is in a cluster; and determining a harvesting schedule when the selected ripe fruit is determined to be in a cluster, the harvesting schedule defining an order in which ripe fruits in the cluster are to be harvested.
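One plausible way to build such a schedule is to harvest the least-occluded ripe fruit first, since removing it frees access to its neighbours. This heuristic and the data representation below are illustrative assumptions; the patent claims a harvesting schedule but does not fix the ordering rule.

```python
# Sketch of a cluster harvesting schedule: pick the fruit currently
# occluded by the fewest others, then update the cluster. The
# occlusion-graph representation is a hypothetical choice.

def harvesting_schedule(cluster):
    """cluster: dict mapping fruit_id -> set of fruit_ids occluding it.
    Returns the order in which the fruits are to be harvested."""
    remaining = {f: set(occ) for f, occ in cluster.items()}
    order = []
    while remaining:
        # Choose the fruit occluded by the fewest others (ties by id).
        nxt = min(remaining, key=lambda f: (len(remaining[f]), f))
        order.append(nxt)
        del remaining[nxt]
        for occ in remaining.values():
            occ.discard(nxt)   # harvesting nxt unblocks its neighbours
    return order

# "a" is unoccluded, "b" is behind "a", "c" is behind both.
print(harvesting_schedule({"a": set(), "b": {"a"}, "c": {"a", "b"}}))
```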
- The present techniques may be embodied as a system, method or computer program product. Accordingly, the present techniques may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
- the present techniques may take the form of a computer program product embodied in a computer readable medium having computer readable program code embodied thereon.
- the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
- a computer readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
- Computer program code for carrying out operations of the present techniques may be written in any combination of one or more programming languages, including object oriented programming languages and conventional procedural programming languages.
- Code components may be embodied as procedures, methods or the like, and may comprise sub-components which may take the form of instructions or sequences of instructions at any of the levels of abstraction, from the direct machine instructions of a native instruction set to high-level compiled or interpreted language constructs.
- Embodiments of the present techniques also provide a non-transitory data carrier carrying code which, when implemented on a processor, causes the processor to carry out any of the methods described herein.
- the techniques further provide processor control code to implement the above-described methods, for example on a general purpose computer system or on a digital signal processor (DSP).
- the techniques also provide a carrier carrying processor control code to, when running, implement any of the above methods, in particular on a non-transitory data carrier.
- the code may be provided on a carrier such as a disk, a microprocessor, CD- or DVD-ROM, programmed memory such as non-volatile memory (e.g. Flash) or read-only memory (firmware), or on a data carrier such as an optical or electrical signal carrier.
- Code (and/or data) to implement embodiments of the techniques described herein may comprise source, object or executable code in a conventional programming language (interpreted or compiled) such as C, or assembly code, code for setting up or controlling an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array), or code for a hardware description language such as Verilog (RTM) or VHDL (Very high speed integrated circuit Hardware Description Language).
- The techniques may be implemented using a controller which includes a microprocessor, working memory and program memory coupled to one or more of the components of the system.
- a logical method may suitably be embodied in a logic apparatus comprising logic elements to perform the steps of the above-described methods, and such logic elements may comprise components such as logic gates in, for example, a programmable logic array or application-specific integrated circuit.
- Such a logic arrangement may further be embodied in enabling elements for temporarily or permanently establishing logic structures in such an array or circuit using, for example, a virtual hardware descriptor language, which may be stored and transmitted using fixed or transmittable carrier media.
- the present techniques may be implemented using multiple processors or control circuits.
- the present techniques may be adapted to run on, or integrated into, the operating system of an apparatus.
- the present techniques may be realised in the form of a data carrier having functional data thereon, said functional data comprising functional computer data structures to, when loaded into a computer system or network and operated upon thereby, enable said computer system to perform all the steps of the above-described method.
- Figure 1A is a perspective view of a first example robotic end-effector
- Figure 1B is a perspective view of the first example robotic end-effector
- Figures 2A and 2B show, respectively, a side view and a bottom view of the first example robotic end-effector
- Figure 3 is a top view of the second pair of fingers and the cutting mechanism of the first example robotic end-effector
- Figure 4A is a perspective view of the first and second pairs of fingers of the first example robotic end-effector
- Figure 4B is a perspective view of the first pair of fingers of the first example robotic end-effector
- Figure 5A shows a perspective view of an alternative form of the first example robotic end-effector
- Figures 5B and 5C show, respectively, a perspective view and a side view of a second example robotic end-effector
- Figure 5D shows a side view of an alternative form of the second example robotic end-effector
- Figure 6A shows a perspective view of the first pair of fingers of the second example robotic end-effector
- Figure 6B shows a perspective view of the second pair of fingers and cutting mechanism of the second example robotic end-effector
- Figures 7A and 7B show perspective views of the second pair of fingers and cutting mechanism of the second example robotic end-effector
- Figure 8 is a block diagram of a robotic system comprising the robotic end- effector of the present techniques
- Figure 9 is a flowchart of example steps to harvest fruit using the robotic end-effector of the present techniques.
- Figures 10A and 10B show perspective views of an alternative form of the first pair of fingers
- Figures 11A and 11B show, respectively, a perspective view and a front view of an alternative form of the second pair of fingers.
- Figure 12 is a flowchart of example steps to pick fruits.
- embodiments of the present techniques provide a robotic end-effector for selective crop harvesting, which is particularly suitable for harvesting crops that grow in dense clusters, such as strawberries.
- The term "picking head" is used interchangeably herein with the term "robotic end-effector".
- the available picking heads are limited in their ability to harvest strawberries in a dense cluster where a ripe strawberry to be picked is occluded. As a result, some ripe strawberries may not be detected using existing picking heads and, even if they are detected, it may not be possible to segment, assess and localise the ripe strawberry using existing picking heads due to their limited range of motion.
- the present techniques address the above-mentioned issues with conventional picking heads by providing a picking head or robotic end-effector that is able to pick fruits such as, but not limited to, strawberries which grow in dense clusters, without bruising or damaging the fruit.
- The robotic end-effector of the present techniques comprises a pair of fingers for removing occlusions, another pair of fingers used solely to grip the stem of a ripe fruit, and a cutting mechanism that effectively and gently cuts the stem of the gripped fruit.
- the robotic end-effector comprises a vision system for identifying ripe fruit and determining when a stem of a ripe fruit is located between the pair of fingers used for gripping the stem.
- Figure 1A is a perspective view of a first example robotic end-effector 100 for fruit harvesting
- Figure 1B is a perspective upside-down view of the first example robotic end-effector 100.
- the robotic end-effector 100 comprises a vision system (not shown) for identifying a location of a ripe fruit on a plant.
- the robotic end-effector 100 comprises a first pair of fingers 102 for moving any objects that at least partly occlude the identified ripe fruit on the plant.
- the robotic end- effector 100 comprises a second pair of fingers 104 for gripping a stem of the identified ripe fruit.
- the second pair of fingers 104 comprise a sensor 108 for indicating when the stem is located between the second pair of fingers 104.
- the sensor 108 may be positioned in the middle of the second pair of fingers 104, so that the sensor 108 can determine whether a stem is located between the second pair of fingers.
- the robotic end-effector 100 comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 104 after the stem has been cut.
- the robotic end-effector 100 may further comprise a first actuation mechanism (not shown) for controlling the actuation of the first pair of fingers 102.
- a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers 102.
- the first actuation mechanism may control the first pair of fingers 102 to push away the object, by increasing a separation distance between the first pair of fingers.
- the individual fingers 102a, 102b of the first pair of fingers 102 may be close together when the robotic end-effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit.
- the individual fingers 102a, 102b of the first pair of fingers 102 may be moved further apart from each other when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers 104 can grip a stem of the fruit.
- the individual fingers 102a, 102b are shown as being close together.
- arrows A and B indicate, respectively, the direction fingers 102a and 102b need to be moved to increase the separation distance between the first pair of fingers 102.
- the first actuation mechanism causes the individual fingers 102a, 102b to move in unison.
- Figures 2A and 2B show, respectively, a side view and a bottom view of the first example robotic end-effector.
- the robotic end-effector 100 comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 104 after the stem has been cut.
- Figure 3 is a top view of the second pair of fingers 104a, b and the cutting mechanism 106 of the first example robotic end-effector. Other features of the robotic end-effector, including the first pair of fingers, are removed from the image to more clearly show the second pair of fingers and the cutting mechanism. It can be seen that when a stem is gripped by the second pair of fingers 104a, b, the cutting mechanism 106 cuts the stem at a location above the second pair of fingers, which means the stem (and the attached fruit) continues to be gripped by the second pair of fingers 104a, b after the stem has been cut.
- the robotic end-effector 100 may further comprise a second actuation mechanism (not shown) for controlling the actuation of the second pair of fingers 104 and the cutting mechanism 106.
- a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers 104 and the cutting mechanism 106.
- a single actuation mechanism is used to control both the second pair of fingers 104 and cutting mechanism 106, thereby reducing complexity and the number of components needed to control the robotic end-effector 100.
- the sensor 108 of the second pair of fingers 104 may also be used to determine when the second pair of fingers are firmly gripping a stem.
- the vision system may comprise at least one image sensor for capturing images of the fruit or clusters of fruit.
- the at least one image sensor may be an RGB sensor.
- the at least one image sensor may be two RGB sensors 110a, b (see Figure 1B) provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing).
- Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.
- the second pair of fingers 104a, b comprise a sensor 108 for determining when a stem is located between the second pair of fingers.
- the sensor 108 may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.
- the second actuation mechanism may control the second pair of fingers 104 to grip the stem, by decreasing a separation distance between the second pair of fingers.
- individual fingers 104a, 104b of the second pair of fingers 104 are shown as being far apart. This position or separation distance may be adopted so that the second pair of fingers 104 are able to receive a stem of a fruit to be harvested.
- the individual fingers 104a, 104b may be actuated to decrease the separation distance between the second pair of fingers 104 when a stem located between the fingers is to be gripped.
- arrows C and D indicate, respectively, the direction fingers 104a and 104b need to be moved to decrease the separation distance between the second pair of fingers 104.
- the second actuation mechanism causes the individual fingers 104a, 104b to move in unison.
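The sensor-triggered gripping behaviour described above can be sketched as a simple polling loop. This is a minimal sketch under stated assumptions: the callables, the normalised sensor reading, and the threshold are all hypothetical names introduced for illustration, not the patent's control interface.

```python
# Hypothetical control sketch: close the second pair of fingers in unison
# once the proximity sensor indicates a stem is located between them.
def grip_when_stem_detected(read_sensor, close_fingers, threshold=0.8, max_polls=100):
    """Poll the stem sensor; actuate the gripper once a stem is detected.

    read_sensor   -- callable returning a normalised proximity reading (0..1)
    close_fingers -- callable that moves both individual fingers together
    Returns True if a stem was gripped, False if polling timed out.
    """
    for _ in range(max_polls):
        if read_sensor() >= threshold:
            close_fingers()
            return True
    return False
```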
- the cutting mechanism 106 is preferably provided in proximity to the second pair of fingers 104, such that when the stem of the identified ripe fruit is cut by the cutting mechanism 106, the second pair of fingers 104 continues to grip a portion of the stem that remains attached to the fruit.
- the robotic end-effector may be controlled to gently place the harvested fruit in a container.
- the second pair of fingers 104 may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.
- the second pair of fingers 104 comprise a sensor (not shown here) for indicating when the stem is located between the second pair of fingers 104.
- the sensor of the second pair of fingers 104 may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example proximity sensor, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers, so that the fingers can be closed into a position suitable for gripping.
- When the stem of the identified ripe fruit is gripped by the second pair of fingers 104, the identified ripe fruit may be in proximity to a first side (e.g. a bottom side) of the second pair of fingers 104.
- the cutting mechanism may be provided in proximity to a second, opposite side (e.g. a top side) of the second pair of fingers, as shown in Figure 3.
- the cutting mechanism 106 may be positioned relative to the second pair of fingers such that it cuts the stem at a point where the stem protrudes from the second pair of fingers. In this way, a portion of the stem that is still attached to the fruit remains gripped by the second pair of fingers 104.
- Figure 4A is a perspective view of the first and second pairs of fingers of the first example robotic end-effector
- Figure 4B is a perspective view of the first pair of fingers of the first example robotic end-effector.
- other components of the robotic end-effector have been removed to more clearly show the fingers.
- the individual fingers 102a, 102b of the first pair of fingers 102 are shown as being close together, while the individual fingers 104a, 104b of the second pair of fingers 104 are shown as being far apart.
- This may be the configuration used when the robotic end-effector is being moved towards a ripe fruit or to identify ripe fruits.
- the opposite configuration may be used when the robotic end-effector is used to harvest a ripe fruit.
- the individual fingers 102a, 102b of the first pair of fingers 102 may be far apart (while they push away any objects that occlude the specific fruit to be harvested), while the individual fingers 104a, 104b of the second pair of fingers 104 may be close together (while they grip a stem of the specific fruit to be harvested). In this way, the first pair of fingers may be out of the way of fruit connected to the stem that is being gripped by the second pair of fingers.
- the first pair of fingers 102 may be non-sensing fingers. That is, the first pair of fingers 102 may not themselves obtain any feedback about the objects they contact.
- the first pair of fingers 102 may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.
- Figure 5A shows a perspective view of an alternative form 100' of the first example robotic end-effector.
- the robotic end-effector 100' comprises a first pair of fingers 102' (having individual fingers 102a', 102b') and a second pair of fingers 104'.
- the functionality of the robotic end-effector 100' is the same as that of robotic end-effector 100, and is not described again for the sake of conciseness.
- Figures 5B and 5C show, respectively, a perspective view and a side view of a second example robotic end-effector 200. It will be understood that the first and second robotic end-effectors operate in the same way, and differ only in the precise design of particular features.
- the robotic end-effector 200 comprises a vision system (not shown) for identifying a location of a ripe fruit on a plant.
- the robotic end-effector 200 comprises a first pair of fingers 202 for moving any objects that at least partly occlude the identified ripe fruit on the plant.
- the robotic end-effector 200 comprises a second pair of fingers 204 for gripping a stem of the identified ripe fruit.
- the second pair of fingers 204 comprise a sensor (not shown) for indicating when the stem is gripped between the second pair of fingers 204.
- the sensor may be located between the second pair of fingers 204 in a similar manner to sensor 108 in Figure 1B.
- the robotic end-effector 200 comprises a cutting mechanism (not visible here) for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 204, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 204 after the stem has been cut.
- the robotic end-effector 200 may further comprise a first actuation mechanism (not shown) for controlling the actuation of the first pair of fingers 202.
- a dedicated actuation mechanism is used to control the movement and operation of the first pair of fingers 202.
- the first actuation mechanism may control the first pair of fingers 202 to push away the object, by increasing a separation distance between the first pair of fingers.
- the individual fingers 202a, 202b of the first pair of fingers 202 may be close together when the robotic end-effector is being used to image a plant and identify ripe fruits, and/or when the robotic end-effector is moving towards an identified ripe fruit.
- the individual fingers 202a, 202b of the first pair of fingers 202 may be moved further apart from each other when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers 204 can grip a stem of the fruit.
- the individual fingers 202a, 202b are shown as being close together.
- Figure 5D shows a side view of an alternative form of the second example robotic end-effector 200'.
- the robotic end-effector 200' has a first pair of fingers 202' which are relatively smaller than the first pair of fingers 202 of the robotic end-effector 200 (see e.g. Figure 5C for comparison).
- the first pair of fingers 202' are smaller in size and do not protrude as far relative to the second pair of fingers. This may provide a more compact robotic end-effector, which may be advantageous when picking certain fruits.
- Figure 6A shows a perspective view of the first pair of fingers 202 of the second example robotic end-effector 200
- Figure 6B shows a perspective view of the second pair of fingers 204 and cutting mechanism 206 of the second example robotic end-effector 200.
- both individual fingers 202a, 202b of the first pair of fingers 202 are shown, while in Figure 6B, only finger 202a is shown and finger 202b is hidden so that the second pair of fingers 204 can be more clearly seen.
- the cutting mechanism 206 is for cutting the stem of the identified ripe fruit in response to a signal from the sensor (of the second pair of fingers 204) indicating that the stem is gripped between the second pair of fingers 204, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers 204 after the stem has been cut.
- the first pair of fingers 202 may be non-sensing fingers. That is, the first pair of fingers 202 may not themselves obtain any feedback about the objects they contact.
- the first pair of fingers 202 may be haptic fingers comprising sensors for measuring forces exerted on the fingers by objects being moved by the fingers.
- Figures 7A and 7B show perspective views of the second pair of fingers 204 and cutting mechanism 206 of the second example robotic end-effector 200.
- the robotic end-effector 200 may further comprise a second actuation mechanism (not shown) for controlling the actuation of the second pair of fingers 204 and the cutting mechanism 206.
- a separate, dedicated actuation mechanism is used to control the movement and operation of the second pair of fingers 204 and the cutting mechanism 206.
- a single actuation mechanism is used to control both the second pair of fingers 204 and cutting mechanism 206, thereby reducing complexity and the number of components needed to control the robotic end-effector 200.
- the sensor of the second pair of fingers 204 may also be used to determine when the second pair of fingers are firmly gripping a stem.
- the vision system may comprise at least one image sensor for capturing images of the fruit/cluster of fruits.
- the at least one image sensor may be an RGB sensor.
- the at least one image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers to enable stereo vision (i.e. depth sensing). Having the RGB sensor(s) in the vicinity of the second pair of fingers may be advantageous because the sensor(s) capture images of the fruit or cluster of fruits at fruit level, whereas other sensors of the vision system may view the fruit from a different perspective/angle. This also reduces the risk of every sensor of the vision system being occluded during the picking process, i.e. it provides some redundancy in the vision system.
- the second pair of fingers comprise a sensor for determining when a stem is located between the second pair of fingers.
- the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.
- the second actuation mechanism may control the second pair of fingers 204 to grip the stem, by decreasing a separation distance between the second pair of fingers.
- individual fingers 204a, 204b of the second pair of fingers 204 are shown as being close together. This position or separation distance may be adopted when the second pair of fingers 204 are gripping a stem of a fruit to be harvested.
- the individual fingers 204a, 204b may be actuated to increase the separation distance between the second pair of fingers 204 to enable a stem to be received or positioned between the fingers 204a, 204b prior to gripping.
- the cutting mechanism 206 is preferably provided in proximity to the second pair of fingers 204, such that when the stem of the identified ripe fruit is cut by the cutting mechanism 206, the second pair of fingers 204 continues to grip a portion of the stem that remains attached to the fruit.
- the robotic end-effector may be controlled to gently place the harvested fruit in a container.
- the second pair of fingers 204 may release their grip on the portion of the stem that remains attached to the fruit when the end-effector is close to the container.
- the second pair of fingers 204 comprise a sensor (not shown here, but see Figure 1B for where the sensor may be located) for indicating when the stem is gripped between the second pair of fingers 204.
- the sensor of the second pair of fingers 204 may be a proximity sensor, such as an infrared sensor. It will be understood that this is a non-limiting example proximity sensor, and that any suitable sensor may be used to determine when the stem is located between the second pair of fingers, so that the fingers can be closed into a position suitable for gripping.
- one of the fingers 204b of the second pair of fingers may comprise a slot 208 in the finger, which extends all the way through the finger 204b, from an edge of the finger to a (stem) gripping surface of the finger 204b.
- the cutting mechanism 206 may be arranged to move through the slot 208 to cut a stem gripped by the second pair of fingers 204. This may be advantageous because a portion of the stem which is gripped by the second pair of fingers 204 may be held more firmly and/or may be substantially straight (compared to the stem which protrudes from the second pair of fingers), which may make it easier for the cutting mechanism to cut through the stem.
- the cutting mechanism 206 is located in the slot 208, the cutting mechanism is at least partially covered or encased for safety reasons, i.e. to avoid any risk of a human operator being able to come into contact with the cutting mechanism.
- the other of the fingers 204a of the second pair of fingers 204 may comprise a groove 210 for receiving a cutting edge 206a of the cutting mechanism 206 when the cutting mechanism 206 moves through the slot 208 to cut the stem. This may enable the cutting mechanism 206 to fully cut through the stem, as the groove 210 provides a space for the cutting edge 206a of the cutting mechanism 206 to pass all the way through the stem.
- surfaces of the first pair of fingers 102 and/or second pair of fingers 104 and/or other components of the end-effectors may be substantially smooth for ease of cleaning and to avoid the accumulation of dirt. This is advantageous because dirt on the components may prevent the end-effector from operating correctly. For example, dirt on the second pair of fingers may prevent the sensor of the second pair of fingers from correctly determining when a stem is gripped between the fingers.
- the robotic end-effector described herein benefits from 2.5 degrees of freedom, which is more than that of currently available picking heads. This added degree of freedom allows the robotic end-effector to deal with complex picking scenarios where the available picking heads fail.
- the robotic end-effector benefits from an effective combination of actuation systems and sensors to resolve the limitations of currently available picking heads.
- the robotic end-effector includes three separate movements (moving objects, gripping a stem, and cutting a stem) that are actuated using two actuators. This is useful as the ability of the robotic end-effector is increased without also significantly increasing the complexity or component count of the device.
- Embodiments of the robotic end-effector benefit from an effective configuration of RGB, RGB-D and IR proximity sensors that helps to efficiently detect and localise the ripe strawberries.
- the combined sensory information can be used to estimate the size, weight and sort the quality of the strawberries to be picked.
- Figure 8 is a block diagram of a robotic system 300 for fruit harvesting comprising the robotic end-effector 100, 200 of the present techniques. While Figure 8 shows the robotic system 300 comprising the first example robotic end-effector 100, it will be understood that this is merely illustrative.
- the robotic system 300 comprises at least one picking arm 302 and a control system 304 for controlling the at least one picking arm 302.
- the control system 304 may comprise at least one processor coupled to memory.
- the at least one processor may comprise one or more of: a microprocessor, a microcontroller, and an integrated circuit.
- the memory may comprise volatile memory, such as random access memory (RAM), for use as temporary memory, and/or non-volatile memory such as Flash, read only memory (ROM), or electrically erasable programmable ROM (EEPROM), for storing data, programs, or instructions, for example.
- the robotic system comprises a robotic end-effector 100 of the types described herein, that is coupled to the at least one picking arm 302.
- the robotic end-effector 100 comprises a vision system 120 for identifying a location of a ripe fruit on a plant.
- the vision system 120 may enable a three-dimensional map of a plant to be generated.
- the vision system 120 may use the three-dimensional map to identify the locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits.
- the vision system 120 may comprise a depth sensor, which may be, for example, an RGB-D (red-green-blue-depth) camera.
- the RGB- D camera may be mounted on the robotic end-effector in a position that enables the plant to be imaged from a distance.
- the vision system 120 may further comprise at least one further image sensor in the vicinity of the second pair of fingers 104.
- the image sensor may enable images to be captured of the environment of the second pair of fingers 104. This may be advantageous because, when the robotic end-effector 100 is being used to harvest a fruit, the RGB-D sensor may be obscured by foliage of the plant and so it may not be able to determine, for example, whether a stem of the ripe fruit is positioned between the second pair of fingers. Thus, the image sensor provides redundancy in the vision system 120. In some cases, the image sensor may be two RGB sensors provided in the vicinity of the second pair of fingers 104 to enable stereo vision (i.e. depth sensing).
- the vision system 120 may also be able to determine the size and quality of each fruit that is to be harvested, or which is harvested. This may enable the fruit to be deposited into a suitable container. For example, this may enable large fruits to be placed into a container with other large fruits (so that they do not damage or squash smaller fruits), or it may enable large fruits to be dispersed among different containers (so that each container contains a mixture of fruit sizes). This may also enable any slightly damaged or rotten fruits to be discarded or separated from other higher quality fruits.
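The sorting idea described above can be sketched as a simple routing rule. This is a hedged sketch: the size threshold, quality labels and container names are illustrative assumptions, not values from the patent.

```python
# Illustrative only: route each harvested fruit to a container based on the
# size and quality estimated by the vision system. The 35 mm threshold and
# the label/container names are assumptions for the sake of the example.
def choose_container(size_mm: float, quality: str) -> str:
    if quality == "damaged":
        return "discard"          # separate damaged/rotten fruit from higher quality fruit
    if size_mm >= 35.0:
        return "large_punnet"     # keep large fruits together so they do not squash smaller ones
    return "standard_punnet"
```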
- the robotic end-effector comprises a first pair of fingers 102 for moving any objects that at least partly occlude the identified ripe fruit on the plant.
- the first pair of fingers are operated by a first actuation mechanism 122.
- the robotic end- effector comprises a second pair of fingers 104 for gripping a stem of the identified ripe fruit.
- the second pair of fingers 104 comprise a sensor 108 for indicating when the stem is located between the second pair of fingers 104.
- the robotic end-effector comprises a cutting mechanism 106 for cutting the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers 104, wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
- the second pair of fingers 104 and cutting mechanism 106 are operated by a second actuation mechanism 124.
- the operation of the robotic end- effector 100 will not be described here again.
- the robotic system 300 may further comprise at least one container 306 for receiving fruit harvested by the robotic end-effector 100.
- the control system 304 may control the at least one picking arm 302 to move the robotic end-effector 100 to above the at least one container 306.
- the second actuation mechanism 124 controls the second pair of fingers 104 to release the stem and the attached fruit, by increasing a separation distance between the second pair of fingers, so that the fruit drops into the container.
- the robotic system 300 may further comprise a mechanism 308 for moving the robotic system towards, between and/or around plants.
- the mechanism 308 may be a tracked or wheeled rover, or a vehicle capable of navigating autonomously.
- Figure 9 is a flowchart of example steps to harvest fruit using the robotic end-effector of the present techniques.
- the method begins by identifying, using a vision system of the robotic end-effector, a location of a ripe fruit on a plant (step S100).
- the method comprises moving, using a first pair of fingers of the robotic end-effector, any objects that at least partly occlude the identified ripe fruit on the plant (step S102).
- the step of moving any objects that at least partly occlude the identified ripe fruit may comprise controlling a first actuation mechanism to increase a separation distance between the first pair of fingers.
- the method comprises gripping a stem of the identified ripe fruit, using a second pair of fingers of the robotic end-effector (step S104).
- the step of gripping a stem of the identified ripe fruit may comprise: receiving feedback from the sensor 108 that the stem of the identified ripe fruit is located between the second pair of fingers; and controlling a second actuation mechanism to decrease a separation distance between the second pair of fingers.
- the feedback from the sensor may be used to determine when the second pair of fingers need to be actuated to grip the stem, so that the fruit can be harvested.
- the method comprises sensing, using a sensor of the second pair of fingers, when the stem is located between the second pair of fingers (step S106).
- the method comprises cutting, using a cutting mechanism, the stem of the identified ripe fruit when the stem is gripped between the second pair of fingers (step S108), wherein a portion of the stem that remains attached to the fruit remains gripped by the second pair of fingers after the stem has been cut.
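The sequence of steps S100 to S108 above can be sketched as a single control routine. This is a minimal sketch under stated assumptions: the five callables stand in for the vision system, actuators and sensor, and their names are illustrative, not the patent's API.

```python
# Minimal sketch of the Figure 9 harvesting sequence. Each callable is a
# hypothetical stand-in for a subsystem of the robotic end-effector.
def harvest_fruit(identify, push_occluders, stem_present, grip, cut):
    location = identify()        # S100: vision system locates a ripe fruit
    push_occluders(location)     # S102: first pair of fingers moves occluding objects
    if stem_present():           # S106: sensor feedback - stem between second pair
        grip()                   # S104: second pair of fingers grips the stem
        cut()                    # S108: cutting mechanism severs the stem; the
        return True              #       gripped stem portion keeps the fruit held
    return False
```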
- the robotic end-effectors comprise a first pair of fingers for moving any objects that at least partly occlude the identified fruit on the plant.
- Figures 10A and 10B show perspective views of an alternative form of the first pair of fingers that may be used with the end-effectors described herein. Specifically, Figure 10A shows an individual finger 1002a of the pair of fingers, and Figure 10B shows another individual finger 1002b of the pair of fingers. In use, the fingers 1002a, 1002b operate similarly to the fingers 202a, 202b described above with respect to Figures 5B to 6B.
- the robotic end-effector is being used to image a plant and identify ripe fruits, and/or is moving towards an identified ripe fruit.
- the individual fingers 1002a, 1002b may be moved further apart from each other when an object that at least partly occludes a fruit needs to be moved away, so that the fruit can be better seen (to determine if it is suitable for harvesting) and/or so that the second pair of fingers can grip a stem of the fruit.
- the form of the first pair of fingers is smaller than those described with respect to the first and second example robotic end-effectors.
- the fingers 1002a, 1002b are thinner and shorter than the first pair of fingers described up to now.
- the more compact fingers 1002a, 1002b are advantageous because they enable more effective interaction with clusters of fruits such as strawberries.
- the robotic end-effectors comprise a second pair of fingers for gripping a stem of the identified ripe fruit.
- Figures 11A and 11B show, respectively, a perspective view and a front view of an alternative form of the second pair of fingers that may be used with the end-effectors described herein.
- Each finger of the second pair of fingers 104 and 204 described above has a contact or gripping surface which contacts a stem to be gripped and cut.
- the gripping surfaces of the fingers in each pair 104, 204 are substantially parallel to each other.
- the cutting mechanism 106, 206 moves in a plane normal to (perpendicular to) the gripping surfaces of the pairs of fingers 104, 204.
- Figure 11A and 11B show an improved gripping finger 1004a of a second pair of fingers.
- the gripping finger 1004a comprises a shaped contact/gripping surface.
- the gripping surface of finger 1004a comprises a curved or angled portion 1000 and a straight or flat portion 1002.
- the straight portion 1002 enables the finger 1004a to grip the stem of a ripe fruit.
- the straight portion 1002 of finger 1004a faces a straight portion of a gripping surface of another finger in the second pair of fingers, or faces a straight gripping surface of the other finger.
- the angled portion 1000 enables the cutting mechanism to cut the stem at an angle.
- the angled portion 1000 may remove the need for a groove in the gripping surface in which to receive a cutting edge of the cutting mechanism. As shown in Figure 11A, the angled portion 1000 may be angled relative to the straight portion 1002. In the illustrated example, the angle away from an axis of the straight portion 1002 may be 20° ± 0.5°. However, it will be understood that this is an illustrative example and other suitable angles may be used which enable a stem to be cut using a single cutting action.
- Each individual finger of the second pair of fingers may have a form like that shown in Figures 11A and 11B.
- both the finger 204b of the second pair of fingers which comprises a slot 208 that extends all the way through the finger 204b, and the finger 204a which comprises a groove 210 for receiving a cutting edge of the cutting mechanism may take the form shown in Figures 11A and 11B.
- the finger 204b of the second pair of fingers which comprises a slot 208 that extends all the way through the finger 204b may be unchanged, while the finger 204a which comprises a groove 210 for receiving a cutting edge of the cutting mechanism may instead take the form shown in Figures 11A and 11B.
- Figure 12 is a flowchart of example steps to pick fruits.
- the basic process to harvest a ripe fruit is described above with reference to Figure 9. A more detailed process is now described.
- the robotic end-effector 100 comprises a vision system 120 for identifying a location of a ripe fruit on a plant.
- the vision system 120 may alternatively be provided on the picking arm 302 in the vicinity of the robotic end-effector (e.g. above or below the robotic end- effector and mounted so that it is able to visualise a scene in front of the end- effector).
- the vision system may enable a three-dimensional map of a plant to be generated.
- the vision system may use the three-dimensional map to identify the locations of ripe fruits on a plant and any objects that at least partly occlude the identified ripe fruits.
- the vision system 120 may comprise a depth sensor, which may be, for example, an RGB-D (red- green-blue-depth) camera.
- the RGB-D camera may be mounted on the robotic end-effector (or on the picking arm in the vicinity of the robotic end-effector) in a position that enables the plant to be imaged from a distance.
- the RGB-D camera may, in conjunction with image processing and analysis software, be used to segment images, detect ripe fruits, estimate a ripeness of the fruits, and select and localise a ripe fruit for picking.
- the picking arm may be set to be in a default ("home") configuration in which the RGB-D camera may be able to survey a plant and its fruits from a distance.
- the vision system 120 may also be able to determine the size, weight and/or quality of each fruit that is to be harvested, or which is harvested. This may enable the fruit to be deposited into a suitable container. For example, this may enable large fruits to be placed into a container with other large fruits (so that they do not damage or squash smaller fruits), or it may enable large fruits to be dispersed among different containers (so that each container contains a mixture of fruit sizes). This may also enable any slightly damaged or rotten fruits to be discarded or separated from other higher quality fruits.
- the process begins by identifying, using the vision system, the location of fruits on a plant (step S200). This may comprise placing bounding boxes around every identifiable fruit on a plant. Each bounding box may be associated with a corresponding confidence value indicating how likely it is that the box contains a fruit. The confidence value may be low when a fruit is partly occluded by another fruit or another object (e.g. leaf).
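The confidence-scored bounding boxes described above lend themselves to a simple filtering step. This is an illustrative sketch: the 0.6 threshold and the idea of setting low-confidence boxes aside for re-imaging from another angle are assumptions for the example, not values given in the patent.

```python
# Illustrative only: split detections into confident boxes and low-confidence
# boxes (e.g. fruit partly occluded by a leaf or another fruit) that may be
# revisited by imaging the plant from a different position/angle.
def filter_detections(boxes, min_conf=0.6):
    """boxes: list of (bounding_box, confidence) tuples."""
    confident = [(b, c) for b, c in boxes if c >= min_conf]
    revisit = [(b, c) for b, c in boxes if c < min_conf]
    return confident, revisit
```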
- the picking arm may be moved relative to the plant to capture images of the plant from different angles. Thus, although the images may originally be captured while the picking arm is in the home configuration, the picking arm may be moved (e.g. to the left and right) in order to better visualise the plant.
- Fruits that are partly occluded when viewed from the home configuration may be more clearly seen when viewed from a different position/angle.
- a position and/or orientation of the picking arm, and therefore of the vision system, may be changed in order to identify the location of fruits on the plant.
- the process may further comprise determining which of the identified fruits are ripe fruits, and selecting one such fruit to pick (step S202).
- the process may comprise determining whether the selected/target fruit is in a cluster (step S204).
- the process may comprise generating, using a motion planning module, a trajectory for the robotic arm to move the end-effector towards the selected/target fruit (step S208).
- the bounding box around the target fruit includes 2D and 3D coordinate information which is used by the motion planning module to move the end-effector towards the selected fruit.
- the trajectory may be generated to move the end-effector so that it is in the vicinity of the target fruit, e.g. 5-10 cm away from the target fruit. This may be advantageous because finer positioning may be determined once the end-effector is in the vicinity of the target fruit, when the target fruit can be more clearly seen by the sensor 108 in the vicinity of the second pair of fingers.
- finer control of the end-effector may be performed to move the end-effector closer to the target fruit.
- This may comprise moving the picking arm as well as moving the end- effector.
- the image sensor 108 in the vicinity of the second pair of fingers is used to provide further information about the environment around the target fruit, such as whether the stem of the target fruit can be seen and accessed, and whether any objects are preventing the second pair of fingers from reaching the stem.
- the process may comprise using the vision system to actuate the first pair of fingers to move any occluding objects out of the way of the target fruit (step S210), and to actuate the second pair of fingers to grip a stem of the target fruit (step S212). Once the second pair of fingers are gripping a stem of the target fruit, the cutting mechanism may be deployed.
- if it is determined at step S204 that the target fruit is in a cluster, it may not be straightforward to pick the fruit using the end-effector. In this case, there may be other fruits in the cluster which prevent the second pair of fingers from reaching the stem of the target fruit. There may also be other fruits which are ripe and can be picked and which are occluding the target fruit.
- the process may comprise determining an order in which to pick ripe fruits in the cluster (step S206).
- a scheduling module may determine which ripe fruit to pick first, based on which fruit is the easiest to reach and pick. This determination may be based on the bounding box information. For example, the easiest fruit to pick may be that whose bounding box has the maximum distance to the bounding boxes of the other fruits in the cluster. Picking the nearest and/or easiest to reach fruit first may also enable other more difficult fruits to be picked which may be located behind the nearest/easiest fruits.
- the scheduling module may also estimate the size or weight of each ripe fruit. This enables control of the picking arm to deposit the harvested fruit into the most appropriate container or punnet, as mentioned above.
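The scheduling heuristic described above, picking first the fruit whose bounding box is furthest from the others in the cluster, can be sketched as follows. This is a hedged sketch: representing each bounding box by its centre point and using Euclidean distance to the nearest neighbour are assumptions for illustration, not the patent's exact metric.

```python
import math

# Illustrative sketch of the scheduling heuristic: order fruits in a cluster
# by how isolated each one is, so the easiest-to-reach fruit is picked first.
def picking_order(centres):
    """centres: list of (x, y) bounding-box centre points for the cluster.

    Returns indices ordered so the fruit whose centre is furthest from its
    nearest neighbour comes first.
    """
    def isolation(i):
        return min(math.dist(centres[i], centres[j])
                   for j in range(len(centres)) if j != i)
    return sorted(range(len(centres)), key=isolation, reverse=True)
```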
- the process may then return to step S208 to generate a trajectory to move the end-effector towards the first target fruit in the schedule.
- Steps S208 to S212 are repeated until each ripe fruit in the schedule has been picked. It will be understood that during the picking process, the schedule may be changed to take into account the fact that once a fruit has been harvested it may be easier to see which fruit is easier to harvest. Similarly, the picking process may cause some fruits to become occluded, which may impact the schedule. Thus, after step S212 has been completed for the first fruit, the process may return to step S206 to check the schedule and update it if necessary.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Environmental Sciences (AREA)
- Harvesting Machines For Specific Crops (AREA)
Abstract
Generally, embodiments of the present invention relate to a robotic end-effector for selective crop harvesting, which is particularly suitable for harvesting crops that grow in dense clusters, such as strawberries.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB2107942.1A GB2607326B (en) | 2021-06-03 | 2021-06-03 | Apparatus and systems for selective crop harvesting |
GB2107942.1 | 2021-06-03 | ||
GBGB2114906.7A GB202114906D0 (en) | 2021-06-03 | 2021-10-19 | Apparatus and system for selective crop harvesting |
GB2114906.7 | 2021-10-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022254203A1 true WO2022254203A1 (fr) | 2022-12-08 |
Family
ID=82115977
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2022/051386 WO2022254203A1 (fr) | 2021-06-03 | 2022-06-01 | Appareil et système de récolte sélective de culture |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2022254203A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118176932A (zh) * | 2024-02-28 | 2024-06-14 | 北京市农林科学院智能装备技术研究中心 | Picking method of a breaking-off type picking manipulator, and breaking-off type picking manipulator
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2760595A1 (fr) * | 1997-03-13 | 1998-09-18 | Tecnoma | Method for harvesting objects carried by a stem, such as bunches of grapes |
WO2016055552A1 (fr) * | 2014-10-07 | 2016-04-14 | Katholieke Universiteit Leuven | Automated harvesting apparatus |
CN110393089A (zh) * | 2018-09-21 | 2019-11-01 | 湖南科技大学 | High-altitude fruit collection system and operation method thereof |
US20200008355A1 (en) * | 2017-03-14 | 2020-01-09 | Metomotion Ltd. | Automated harvester effector |
WO2020159123A1 (fr) * | 2019-01-31 | 2020-08-06 | 박재언 | End effector |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Hohimer et al. | Design and field evaluation of a robotic apple harvesting system with a 3D-printed soft-robotic end-effector | |
US10779472B2 (en) | Robotic fruit picking system | |
Lehnert et al. | Autonomous sweet pepper harvesting for protected cropping systems | |
Williams et al. | Robotic kiwifruit harvesting using machine vision, convolutional neural networks, and robotic arms | |
Yaguchi et al. | Development of an autonomous tomato harvesting robot with rotational plucking gripper | |
Xiong et al. | Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper | |
EP3863814B1 (fr) | Autonomous harvester | |
US8306663B2 (en) | Robot with 3D grasping capability | |
US9913429B1 (en) | Tagging of fruit-producing flowers for robotic selective harvesting | |
KR20190122227A (ko) | Automated harvester effector | |
US11477942B2 (en) | Robotic fruit harvesting machine with fruit-pair picking and hybrid motorized-pneumatic robot arms | |
WO2016055552A1 (fr) | Automated harvesting apparatus | |
WO2016132264A1 (fr) | Multi-robot harvesting machine | |
WO2010063075A1 (fr) | Device and method for harvesting agricultural products | |
Yoshida et al. | Fast detection of tomato peduncle using point cloud with a harvesting robot | |
WO2022254203A1 (fr) | Apparatus and system for selective crop harvesting | |
Parsa et al. | Modular autonomous strawberry picking robotic system | |
Park et al. | Human-centered approach for an efficient cucumber harvesting robot system: Harvest ordering, visual servoing, and end-effector | |
Ren et al. | Mobile robotics platform for strawberry sensing and harvesting within precision indoor farming systems | |
Oliveira et al. | End-effectors for harvesting manipulators-state of the art review | |
Xiong et al. | Push and drag: An active obstacle separation method for fruit harvesting robots | |
Parsa et al. | Autonomous strawberry picking robotic system (robofruit) | |
Au et al. | The monash apple retrieving system | |
GB2607326A (en) | Apparatus and systems for selective crop harvesting | |
Lehnert et al. | A sweet pepper harvesting robot for protected cropping environments |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22731774; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22731774; Country of ref document: EP; Kind code of ref document: A1