WO2022217177A1 - Automated precision pollen applicator for row crops - Google Patents

Automated precision pollen applicator for row crops

Info

Publication number
WO2022217177A1
WO2022217177A1 (PCT/US2022/070890)
Authority
WO
WIPO (PCT)
Prior art keywords
plant
pollinating
plants
image
unit
Prior art date
Application number
PCT/US2022/070890
Other languages
English (en)
French (fr)
Inventor
Martin Arbelbide
Brice FLOYD
Jeremy K JOHNSON
Nitin KANDPAL
Collin LAMKEY
Manuel E RUIDIAZ
Alexander SACK
Stacy THORSON
Jeffrey Dale Wille
Original Assignee
Pioneer Hi-Bred International, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Hi-Bred International, Inc. filed Critical Pioneer Hi-Bred International, Inc.
Priority to US18/263,810 priority Critical patent/US20240306570A1/en
Priority to BR112023020306A priority patent/BR112023020306A2/pt
Priority to EP22785610.1A priority patent/EP4319709A1/en
Publication of WO2022217177A1 publication Critical patent/WO2022217177A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01H NEW PLANTS OR NON-TRANSGENIC PROCESSES FOR OBTAINING THEM; PLANT REPRODUCTION BY TISSUE CULTURE TECHNIQUES
    • A01H1/00 Processes for modifying genotypes; Plants characterised by associated natural traits
    • A01H1/02 Methods or apparatus for hybridisation; Artificial pollination; Fertility
    • A01H1/027 Apparatus for pollination
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B69/00 Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
    • A01B69/007 Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow
    • A01B69/008 Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow, automatic
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining

Definitions

  • Embodiments of this invention pertain to the imaging of fruiting bodies, such as corn ears, on a live plant to detect characteristics that may be used for plant phenotyping and for automated pollination of crops such as maize.
  • Embodiments of this invention include dual side applicators and on-board real time graphic processing that allows multiple plant fruiting bodies on a single plant to be automatically pollinated in one pass.
  • Embodiments described herein involve an imaging system for identifying the location and/or other phenotypic characteristics of the plant fruit or flowers.
  • the imaging system may assess yield, yield potential (quantity), disease and overall health.
  • the imaging system is designed to account for image distortion and poor lighting as the imaging system is transported between rows of plants.
  • the image and location of the plant flower or fruit, such as an ear in the process of silking, may be utilized to direct automated pollination of the plants.
  • the plant will have two or more plant fruiting bodies, each of which may be pollinated in one pass by automated imaging and pollinating units on each side of the row of plants.
  • FIG. 1 provides an illustration of the orientation of the corn ear and corn silk detection device relative to the plant rows and its direction of travel through the field.
  • FIG. 2 provides an illustration of a dual side imaging system.
  • FIG. 3 further illustrates a dual side imaging system in a multi-row embodiment.
  • FIG. 4 provides an illustration of one embodiment of a camera with a semi-hemispherical lens.
  • FIG. 5 illustrates an in-field example of the detection of a corn ear and measurement of corn ear height using the imaging system via a side-facing camera mounted to an inter-row implement.
  • FIG. 6 illustrates an in-field example of the detection of corn silks using the imaging system via a forward-facing camera mounted to a human walking inter-row.
  • FIG. 7 illustrates an embodiment of a pollen applicator with a vertical adjustment and pivoting dual spray heads.
  • FIG. 8 illustrates an embodiment of an array of pollen applicators boom mounted on a transport device.
  • Embodiments described herein involve an imaging system for identifying the location and/or other phenotypic characteristics of the plant fruit or flowers, such as the fruiting bodies of hybrid cereal crops.
  • Hybrid cereal crops include, but are not limited to, wheat, maize, rice, barley, oats, rye and sorghum.
  • the imaging system is transported between rows of cereal crop plants.
  • corn is typically planted in rows spaced 15 to 30 inches from adjoining rows, although greater or lesser corn row spacing is also possible. Proper row spacing allows plants room to explore for nutrients and minimizes the adverse effects of competition from neighboring plants. In Iowa, and in most regions of the Midwest, 20 inches and 30 inches are the most common row spacing configurations.
  • an imaging system transported between the rows would be about 15 inches from the row of corn plants on each side, which tends to result in a limited field of view when a standard camera lens is used. Additional difficulties for imaging corn ears arise as a result of low or inconsistent lighting conditions that can be caused by clouds, time of day or night, the canopy formed by the uppermost leaves of the plant, by other leaves that obscure the camera’s view of the corn ear and its silks, by the need to image multiple ears of corn, and by movement of the camera as it is transported between the rows.
  • a semi-hemispherical lens is used to provide an adequate field of view to identify one or more fruiting bodies on a plant.
  • this lens causes significant distortion of the image which makes it especially difficult to determine the ear height and location of the fruiting bodies.
  • the image is flattened, followed by object recognition within the image.
  • an on-board image processing device is utilized for immediate recognition and location of the plant fruiting body.
  • the image from each camera opposite a corn plant is utilized to detect a corn ear and/or silks, which are associated with an x, y and optionally z coordinate position. Such coordinates may be utilized to direct an automated pollination sprayer to the location of the corn ear and silk.
  • a GPS system, and optionally an inertial measurement unit (IMU) may be associated with the camera on each side of the row to determine these coordinates.
  • the known height of the camera on the transport device on which it is mounted may also be used for coordinate determination.
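The coordinate determination described in the preceding bullets can be sketched as follows. This is an illustrative reconstruction only, not the disclosed implementation: the function name, the local east/north metric frame, and the sign conventions (positive lateral offset means the row lies to the right of the direction of travel) are all assumptions.

```python
import math

def ear_field_position(gps_east_m, gps_north_m, heading_deg,
                       lateral_offset_m, camera_height_m, ear_rel_height_m):
    """Estimate the field (x, y, z) position of a detected ear.

    gps_east_m / gps_north_m : camera position from GPS, in local metres
    heading_deg              : direction of travel, 0 = north, clockwise
    lateral_offset_m         : signed distance from camera to the plant row
                               (positive = right of the travel direction)
    camera_height_m          : known mounting height of the camera
    ear_rel_height_m         : ear height relative to the image centre,
                               recovered from the flattened image
    """
    heading = math.radians(heading_deg)
    # Unit vector pointing to the right of the direction of travel.
    right_e = math.cos(heading)   # east component
    right_n = -math.sin(heading)  # north component
    x = gps_east_m + lateral_offset_m * right_e
    y = gps_north_m + lateral_offset_m * right_n
    z = camera_height_m + ear_rel_height_m
    return x, y, z
```

In practice the IMU correction would be applied to the image before the relative ear height is measured, and the lateral offset would come from a distance sensor or the row-spacing assumption.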
  • One embodiment is a multi-row camera system. Each row will have a left and a right camera, and there may further be a plurality of such camera systems across several rows.
  • the camera may be mounted on a transport device that fits between the rows, or may be suspended from a boom (see figure 8) that allows the camera to be positioned under the canopy in a position suitable for imaging corn ears.
  • Transport devices include, but are not limited to, robotic vehicles (such as those produced by Boston Dynamics and NAIO Technologies), tractors, and drones.
  • the imaging system may further assess plant characteristics such as yield, yield potential (quantity), disease and overall health.
  • the image and location of the plant flower or fruit, such as an ear in the process of silking, may be utilized to direct automated pollination of the plants.
  • the location information from the imaging device would be utilized to direct a pollen application device to deliver pollen to the corn ear silks.
  • the imaging permits a precise application of pollen that results in less waste, more efficient pollen application, and improved seed set.
  • Figure 1 provides an illustration of a corn ear and corn silk detection device moving through the field parallel to rows.
  • the cameras are oriented at approximately 90 degrees to the left or right of the direction of travel, although any known angle of orientation may be used.
  • the camera detects objects in the plant rows closest to the imaging and detection system with a high degree of probability, while background and off-target rows have a lower probability of object detection due to their distance from the system, which results in reduced target size as well as greater image distortion and shrinkage of distant objects relative to the closest rows. Therefore, with a one-camera, one-row system, fruiting bodies occurring in the background have a greater potential to be missed by the image recognition software, and the second (or additional) fruiting body on a row crop plant commonly occurs in a distally related part of the plant.
  • the second ear is commonly located a few nodes from the first ear and oriented at a rotational axis of 90 to 180 degrees on the stalk and positioned lower on the plant, and therefore deeper in the canopy where pollen may not adequately shed.
  • dual-ear corn often does not result in a significant grain yield increase when it does occur in hybrid grain production.
  • some inbred varieties with proper spacing and growing conditions may be managed in a way that optimizes the production of a second ear. In the past this has not been done in the regular course of seed production due to the difficulty of obtaining sufficient seed yield on the second ear.
  • this invention, by assuring that the second ear receives sufficient pollen, serves as a potential enabler of dual-ear seed production. This can be of value in seed production, especially when seed quantities are low, such as when inbred breeder seed is scarce and every seed is needed for plant propagation and seed multiplication.
  • Figure 3 is similar to figure 2 and further illustrates a multi-row embodiment comprising both a left and right-side imaging and pollination unit in-between the interior rows.
  • This system may be used for other row crops which comprise multiple fruiting bodies, such as for wheat with a main stem and one or more tiller stems, as well as for crops such as rice, barley, oats, rye and sorghum.
  • Images are captured with a semi-hemispherical lens (14) as shown in Figure 4.
  • the lens feeds images into an on-board imaging system that processes the distorted hemispherical images into flattened and corrected images with identified plant reproductive parts, such as corn ears or silks, together with 3-dimensional location information sufficient to direct a pollinating device to the location of the plant reproductive part.
  • An optional second camera may be used to direct the pollinating portion of the device to the plant reproductive part.
  • the images may be captured at a suitable rate for the speed of the activity.
  • rates of image capture of up to 30 frames per second were achieved using an NVIDIA graphics card.
  • One graphics card per imaging device was used, although it is also possible to feed the images from two or more imaging devices into a single graphics card, which may be preferable when a 360-degree view of an individual plant or row of plants is desired.
  • Positional data associated with the images from the dual cameras on each side of a row may be used to construct a series of photos of the plant that represent a nearly 360-degree view of the individual plant or row of plants, and the graphics card and data structure may be optimized for this task.
  • an IMU may not be needed because the boom is relatively parallel to the ground.
  • the transport of the camera through the field results in the capture of images that are not consistently aligned on the x, y and z planes, resulting in warped images and incorrect ear height measurements. To correct this problem, further image adjustment was needed, and in some embodiments an IMU was added above the camera.
  • images were taken at a slight rightward angle, and IMU measurements were used to correct the rightward angle in order to obtain a corrected perspective image directly perpendicular to the target object.
  • Images were undistorted (flattened) with pre-calibrated parameters (K, D and FoV scale) and then a perspective warp was applied to straighten the image using inertial measurement unit data.
  • pixels were counted and height, distance and/or area were computed based on known pixel dimensions to allow measurement of vertical height of ears and silks as well as an assessment of depth.
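The flattening step described above can be illustrated for a single pixel under the common equidistant (r = f·θ) fisheye model, a plausible approximation for a semi-hemispherical lens. This is a sketch under that assumed model; the actual pre-calibrated K, D and FoV-scale parameters of the disclosed system are not reproduced here, and the function name is invented for illustration.

```python
import math

def flatten_point(u_fish, v_fish, cx, cy, f):
    """Map a pixel from an equidistant ("f-theta") fisheye image to the
    pixel it would occupy in an ideal pinhole (flattened) image sharing
    the same focal length f and principal point (cx, cy)."""
    dx, dy = u_fish - cx, v_fish - cy
    r_fish = math.hypot(dx, dy)
    if r_fish == 0:
        return cx, cy  # the optical centre is unchanged by undistortion
    theta = r_fish / f           # equidistant model: r = f * theta
    r_pin = f * math.tan(theta)  # pinhole model:     r = f * tan(theta)
    s = r_pin / r_fish
    return cx + dx * s, cy + dy * s
```

Because tan(θ) > θ for 0 < θ < 90 degrees, flattening pushes off-centre pixels outward, which is why objects near the edge of the raw hemispherical image appear compressed before correction.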
  • images may be scaled down from the full resolution image.
  • a laser distance sensor (or ultrasonic sensor, lidar, multidimensional camera or radar) may be used to detect distance to stalks, and optionally, to determine distance to ground. The latter may be particularly useful on boom mounted systems.
  • the video frame extracted from video was undistorted from a hemispherical view to a flattened view using an undistortion matrix (camera model) for that particular sensor.
  • An object detection model was then used to identify an object of interest from within the video frame.
  • the pixel coordinates of the detection's centroid or the bounds of that detection were recorded.
  • Measurement of the object height used a combination of the pixel coordinates and a camera’s intrinsic matrix.
  • the height at the center of the collected image was the same as the mounting height of the camera.
  • Measuring objects away from the center of the camera view required adding (for objects above center) or subtracting (for objects below center) a calibrated distance, which is calculated from the intrinsic matrix associated with the specific camera, with the object's distance from the camera (or depth) used as a multiplier. This process was done for each frame of a video.
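The height measurement described above reduces to a one-line formula: the detection's pixel offset from the image centre, divided by the vertical focal length and multiplied by the depth, is added to the known camera mounting height. The sketch below uses invented example numbers and names; it is an illustration of the stated relationship, not the disclosed code.

```python
def object_height(v_pixel, cy, fy, depth, camera_height):
    """Estimate the real-world height of a detected object.

    v_pixel       : vertical pixel coordinate of the detection
                    (row index, increasing downward)
    cy, fy        : principal-point row and vertical focal length in
                    pixels, from the camera intrinsic matrix
    depth         : camera-to-object distance in the same units as
                    camera_height (e.g. 15 inches for 30-inch rows)
    camera_height : known mounting height of the camera
    """
    # Pixels above the image centre correspond to points above the
    # camera mounting height; depth acts as the multiplier.
    offset = (cy - v_pixel) / fy * depth
    return camera_height + offset
```

For example, with fy = 500 px and a 15-inch depth, a detection 100 px above the centre of a camera mounted at 40 inches gives an ear height of 43 inches.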
  • the video frame extracted from video was undistorted from a hemispherical view to a flattened view using an undistortion matrix for that particular sensor and/or camera model.
  • the IMU was used to correct for variable camera angles encountered when operating the system by measuring the camera orientation in space relative to an artificial horizon.
  • Pitch, roll, and yaw measurements from the IMU were used in Euler equations to warp the perspective of the image back to a nominally positioned camera as if it was level to the horizon and perpendicular to the target object.
  • An object detection model was used to identify an object of interest.
  • the pixel coordinates of the detection's centroid or the bounds of that detection were recorded, and a flattened image matrix model was used to convert pixel coordinates to real-world measurements. This process was done for each frame of the video. This may be done either during or after video collection for determining ear or tiller height, potential yield or other plant characteristics, but must be done during video collection for use in directing automated pollination.
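The IMU-based perspective correction described above (warping the image back to a nominally level, perpendicular camera using pitch, roll and yaw) is, for a pure rotation, a homography H = K·Rᵀ·K⁻¹, where K is the camera intrinsic matrix and R the measured camera attitude. The sketch below assumes a Z-Y-X rotation composition and particular axis conventions, which are illustrative choices rather than details taken from the disclosure.

```python
import numpy as np

def level_homography(K, pitch, roll, yaw):
    """Homography that warps an image from a tilted camera back to a
    nominally level view perpendicular to the target: K @ R.T @ inv(K).

    Angles are in radians; the Z-Y-X composition order and axis
    conventions here are assumptions for illustration.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1.0, 0, 0], [0, cr, -sr], [0, sr, cr]])  # roll about x
    Ry = np.array([[cp, 0, sp], [0, 1.0, 0], [-sp, 0, cp]])  # pitch about y
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1.0]])  # yaw about z
    R = Rz @ Ry @ Rx                   # camera attitude from the IMU
    return K @ R.T @ np.linalg.inv(K)  # undo the rotation in image space

def warp_point(H, u, v):
    """Apply a homography to one pixel coordinate."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w
```

With zero pitch, roll and yaw the homography is the identity, i.e. a level camera needs no correction; the same matrix can be handed to a full image-warping routine instead of being applied point by point.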
  • cameras or IMUs that are associated with another transport device, such as a robotic vehicle (such as those produced by Boston Dynamics or NAIO Technologies), tractor or drone, can be used through an application program interface (API) rather than adding additional camera sensors.
  • Onboard computation may also be used for the processing of imagery through an API as well, obviating the need to add additional hardware resources.
  • a series of GNSS points were collected, with each point representing the geographic coordinates of where the image was taken by the imaging system.
  • several images were tagged with the same GPS position since the GPS system took about 10 positions per second, and the imaging system took about 30 frames per second.
  • a box image was created with a series of boxes, with each box representing a 17-foot-long row of corn with a width of 30 inches, and a point representing the location of the camera when each image was taken.
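Because the imaging system runs at about 30 frames per second while the GPS produces about 10 fixes per second, roughly three consecutive frames share each GPS position. That association can be sketched as a nearest-earlier-fix lookup; names and the tuple layout are assumptions for illustration.

```python
def tag_frames_with_gps(frame_times, gps_points):
    """Assign each video frame the most recent GPS fix at or before it.

    frame_times : sorted list of frame timestamps in seconds (~30 fps)
    gps_points  : sorted list of (timestamp, lat, lon) tuples (~10 Hz)
    """
    tagged, i = [], 0
    for t in frame_times:
        # Advance to the last GPS fix that is not later than this frame.
        while i + 1 < len(gps_points) and gps_points[i + 1][0] <= t:
            i += 1
        tagged.append((t, gps_points[i][1], gps_points[i][2]))
    return tagged
```

A single forward pass suffices because both streams are time-ordered, so the tagging is O(frames + fixes) and suitable for the real-time on-board processing described elsewhere in this document.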
  • Natural lighting was utilized. However, artificial lighting may also be utilized to assist in non-daylight hours when phenotyping and/or pollination is needed.
  • Ultraviolet or infrared lighting or thermal imaging may be utilized to enhance illumination of the corn silks, flowers, or other plant parts. While the camera may have some level of automatic gain and exposure to enhance imaging in low light conditions, this can also result in motion blur. To remedy this, the exposure can be limited to a threshold value and then gain may be used, or an external sensor can be used to adjust exposure and lighting.
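The exposure strategy described above (cap the shutter time at a threshold to limit motion blur, then make up the remaining brightness with gain) can be sketched in a few lines. The function name, units and default threshold are assumed for illustration.

```python
def exposure_and_gain(required_exposure_ms, max_exposure_ms=2.0):
    """Split a required total exposure into (shutter time, analog gain).

    Shutter time is capped at max_exposure_ms so that motion blur stays
    bounded while the platform moves between rows; the remaining
    brightness deficit is made up with gain.
    """
    exposure = min(required_exposure_ms, max_exposure_ms)
    gain = required_exposure_ms / exposure  # 1.0 when the cap is not hit
    return exposure, gain
```

The trade-off is that gain amplifies sensor noise, which is why the document also contemplates an external sensor to adjust exposure and lighting rather than relying on gain alone.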
  • Images taken at about 30 frames per second will show the same ear across several images, so the system tracks continuity of the ear (or other plant reproductive part) throughout the various images.
  • One embodiment that may be utilized to achieve this continuity of image is to locate the plant reproductive part relative to the center point of the image.
  • Images were taken of fixed height objects in order to correlate the height of objects in the raw image with those in a flattened and corrected image.
  • the distance between the camera and the object of interest must be known.
  • a 15-inch depth of field was used based on a standard plant row spacing of 30 inches. This distance assumption would be adjusted for the plant row spacing used.
  • Some camera movement also occurred since the camera was not always positioned in the center of the alley equally between the two rows. Modification of this distance assumption also requires tuning the camera intrinsic matrix and distortion coefficients model to the specific desired distance.
  • a system for camera stabilization such as camera gimbals or gyroscopic stabilizers may be added to maintain a stable camera position in 3D space while the transport system moves through the scene.
  • the camera drifting off center of the inter-row space or varying in its angles of view may be caused by either the irregular soil surface the camera is being moved along, or by other leaning or drifting of the apparatus.
  • a gimbal or related stabilizer could be used to alleviate such manipulation of the camera position.
  • a robotic arm can be used to direct a pollinating nozzle or spray tube.
  • FIG. 7 shows a close up of an individual spray unit, which comprises a sliding spray block (2) powered by a hydraulic lift (6) that can be rapidly raised and lowered along slide rail (4).
  • the range of vertical motion may be between 10 and 40 inches, likely centered around 24 inches.
  • the spray block (2) comprises two nozzles (3), each connected to a pollen transport tube (5) and in one embodiment, each nozzle is independently activated by a rotary servomotor inside the spray block (2) that allows for precise control of angular position and direction of pollen spray.
  • the top down orientation is especially beneficial to direct the spray nozzle to the approximate height of the second ear, where the spray can be directed to the silks without being blocked by the upper leaves of the plant.
  • a second camera may then be used to direct the pollen applicator more specifically to the plant reproductive part, and may be used to confirm application of the pollen to the silk as well.
  • the same type of camera, image processing and image recognition software may be used, or alternately, a standard camera lens with less curvature and need for image processing may be used.
  • the lead set of cameras may be physically positioned in front of the pollen applicator by some distance to allow time for image processing. A lead camera distance of about 3 feet will allow for sufficient latency time when the machine is traveling at 5 miles per hour.
  • An ethernet or USB camera may be used to avoid signal delay. Colorant and/or fluorescent dye may be added to the pollen for verification and to enhance image verification of pollination.
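The lead-distance figure above is a simple arithmetic budget: at 5 miles per hour the machine covers about 7.33 feet per second, so a 3-foot lead gives roughly 0.41 seconds for detection and aiming before the applicator reaches the plant. A minimal sketch (names assumed):

```python
def lead_time_s(lead_distance_ft, speed_mph):
    """Processing-time budget gained by mounting the imaging camera a
    given distance ahead of the pollen applicator."""
    speed_fps = speed_mph * 5280 / 3600  # mph -> feet per second
    return lead_distance_ft / speed_fps
```

The same formula can size the lead distance for a desired budget: at 5 mph, each additional foot of lead buys about 0.14 seconds of latency headroom.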
  • the device and methods described herein may be used to phenotype or characterize plants.
  • a GPS heat map of silk density and distribution may be generated, corn silks may be counted, 5%, 50%, 75% and/or 100% flowering time may be estimated, stem size can be measured or lodging resistance estimated, ear size and diameter may be measured, and grain yield can be estimated.
  • the device and methods can also be utilized to count total primary ears, total secondary ears, and/or total ears.
  • the total number of primary spikes, total number of tillers, and/or total number of wheat seed heads may be counted.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Environmental Sciences (AREA)
  • Primary Health Care (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Animal Husbandry (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • Mining & Mineral Resources (AREA)
  • Theoretical Computer Science (AREA)
  • Developmental Biology & Embryology (AREA)
  • Botany (AREA)
  • Agronomy & Crop Science (AREA)
  • Genetics & Genomics (AREA)
  • Soil Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Image Processing (AREA)
  • Breeding Of Plants And Reproduction By Means Of Culturing (AREA)
PCT/US2022/070890 2021-04-08 2022-03-01 Automated precision pollen applicator for row crops WO2022217177A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US18/263,810 US20240306570A1 (en) 2021-04-08 2022-03-01 Automated precision pollen applicator for row crops
BR112023020306A BR112023020306A2 (pt) 2021-04-08 2022-03-01 Automated precision pollen applicator for row crops
EP22785610.1A EP4319709A1 (en) 2021-04-08 2022-03-01 Automated precision pollen applicator for row crops

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163172197P 2021-04-08 2021-04-08
US63/172,197 2021-04-08

Publications (1)

Publication Number Publication Date
WO2022217177A1 true WO2022217177A1 (en) 2022-10-13

Family

ID=83545831

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/070890 WO2022217177A1 (en) 2021-04-08 2022-03-01 Automated precision pollen applicator for row crops

Country Status (4)

Country Link
US (1) US20240306570A1 (pt)
EP (1) EP4319709A1 (pt)
BR (1) BR112023020306A2 (pt)
WO (1) WO2022217177A1 (pt)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111183776A (zh) * 2020-02-28 2020-05-22 北方民族大学 Fertilizing apparatus and automatic dual-side fertilizing-rate control method
US20210090274A1 (en) * 2019-09-25 2021-03-25 Blue River Technology Inc. Identifying and treating plants using depth information in a single image
WO2021060374A1 (ja) * 2019-09-27 2021-04-01 HarvestX株式会社 Automatic pollination device, automatic pollination method, and automatic pollination system


Also Published As

Publication number Publication date
BR112023020306A2 (pt) 2023-11-21
US20240306570A1 (en) 2024-09-19
EP4319709A1 (en) 2024-02-14

Similar Documents

Publication Publication Date Title
US11071991B2 (en) Weed control systems and methods, and agricultural sprayer incorporating same
US10721859B2 (en) Monitoring and control implement for crop improvement
US11771077B2 (en) Identifying and avoiding obstructions using depth information in a single image
US10761211B2 (en) Plant treatment based on morphological and physiological measurements
US10255670B1 (en) Image sensor and module for agricultural crop improvement
Grenzdörffer Crop height determination with UAS point clouds
Murakami et al. Canopy height measurement by photogrammetric analysis of aerial images: Application to buckwheat (Fagopyrum esculentum Moench) lodging evaluation
US12080019B2 (en) Extracting feature values from point clouds to generate plant treatments
BR112021004212A2 Method for determining plant stress, system for determining stress in plants, system for managing irrigation, system for scheduling irrigation, method for scheduling irrigation of land, and method for obtaining information on atmospheric conditions related to plant stress
WO2021208407A1 Target object detection method and apparatus, and image acquisition method and apparatus
JP6996560B2 Crop cultivation support device
US20230306735A1 (en) Agricultural analysis robotic systems and methods thereof
US10602665B2 (en) Two armed robotic system for adjusting the height of an agricultural tool
US20230230202A1 (en) Agricultural mapping and related systems and methods
US20240306570A1 (en) Automated precision pollen applicator for row crops
US20220100996A1 (en) Ground Plane Compensation in Identifying and Treating Plants
CN113269050A Farmland profiling and pesticide-fertilizer big-data analysis method based on UAV remote sensing imagery
CN112837314A Fruit tree canopy parameter detection system and method based on 2D-LiDAR and Kinect
CN115451965B Method for detecting relative heading information of a rice transplanter planting system based on binocular vision
US20230016410A1 (en) System for detecting crop characteristics
Yang Maize and Sorghum Plant Detection at Early Growth Stages Using Proximity Laser and Time-of-Flight Sensors
Yamamoto et al. Onion Bulb Counting in a Large-scale Field Using a Drone with Real-Time Kinematic Global Navigation Satellite System
WO2024069631A1 (en) Plant phenotyping
Louargant et al. Aerial multispectral imagery for site specific weed management.
BR102022026786A2 Calibration of a camera assembly on an agricultural machine

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22785610

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18263810

Country of ref document: US

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112023020306

Country of ref document: BR

WWE Wipo information: entry into national phase

Ref document number: 2022785610

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2022785610

Country of ref document: EP

Effective date: 20231108

ENP Entry into the national phase

Ref document number: 112023020306

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20231002