WO2023223320A1 - Agricultural treatment system and method with real-time feedback - Google Patents

Agricultural treatment system and method with real-time feedback

Info

Publication number
WO2023223320A1
Authority
WO
WIPO (PCT)
Prior art keywords
treatment
sensors
spraying
tool
processor
Prior art date
Application number
PCT/IL2023/050503
Other languages
English (en)
Inventor
Yonatan HOROVITZ
Edo Reshef
Original Assignee
Agromentum Ltd.
Priority date
Filing date
Publication date
Application filed by Agromentum Ltd.
Publication of WO2023223320A1

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0089 Regulating or controlling systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4808 Evaluating distance, position or velocity data

Definitions

  • the present invention, in some embodiments thereof, relates to autonomous agricultural treatment tools and vehicles and, more particularly, to autonomous agricultural treatment tools and vehicles with real-time feedback.
  • plant protection spraying of pesticides for protecting crops against pests is performed using autonomous tractors fitted with a spraying system.
  • standard spraying systems spray an excessive amount of chemicals and other treatment substances, in order to ensure that all parts of a tree or plant receive at least a minimal required amount of the substance.
  • some parts, more exposed or closer to the spraying tool, receive an excessive amount of the treatment substance, which may result in damage to the plant on the one hand, and environmental impact and waste on the other hand.
  • Systems according to the present invention may include a modeling system, the modeling system may be configured to create a model, such as, for example, a three-dimensional (3D) model, of one or more plants in a treated area; one or more sensing units, each of the one or more sensing units may comprise one or more sensor arrays, each of the one or more sensor arrays may comprise sensors of at least one type; an autonomous treatment tool; and a processor, in active communication with the treatment tool and configured to control one or more operation parameters of the treatment tool at least according to one or more of: the model and data received from the one or more sensing units.
  • the modeling system may be configured to create the model based on information received from the one or more sensing units.
  • the 3D modeling system is a LiDAR-based modeling system.
  • the sensor unit may further comprise a power source and a communication unit configured to send sensed data to the processor in real time.
  • the one or more sensing units may be adapted to be deployed at known locations on plants in the treated area and the data sent from each sensing unit to the processor may include information regarding the location of the sensing unit sending the data.
  • the data received from the one or more sensing units may be indicative of the compliance of the treatment to a treatment protocol.
  • the sensor types may be selected from a list consisting of: light sensors, chemical sensors, pressure sensors, geo-location sensors, pH sensors, acceleration sensors, weight sensors, conductivity sensors, and humidity sensors.
  • the treatment tool may be selected from a list consisting of: a spraying tool, a pollination tool, a harvesting tool, and a shaking tool.
  • the treatment tool is a spraying tool comprising a plurality of spraying nozzles, each spraying nozzle having a different spraying direction, and the operation parameters comprise one or more of: spraying direction, spraying pressure, spraying pattern and spraying frequency.
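To make the role of these operation parameters concrete, the following Python sketch shows one possible, purely illustrative way to represent per-nozzle spraying parameters and to select the nozzles pointing toward a given side of the tool; the class, field and function names are hypothetical and are not taken from the disclosed embodiments.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class NozzleCommand:
    """Operation parameters for one spraying nozzle (illustrative placeholder names)."""
    nozzle_id: str
    direction_deg: float   # spraying direction, degrees relative to the tool's forward axis
    pressure_bar: float    # spraying pressure
    pattern: str           # spraying pattern, e.g. "cone" or "fan"
    frequency_hz: float    # spraying (pulse) frequency


def nozzles_toward(commands: List[NozzleCommand], target_bearing_deg: float,
                   tolerance_deg: float = 45.0) -> List[NozzleCommand]:
    """Return only the nozzles whose spraying direction is within a tolerance of a target bearing."""
    def angular_diff(a: float, b: float) -> float:
        # smallest absolute difference between two angles, in degrees
        return abs((a - b + 180.0) % 360.0 - 180.0)

    return [c for c in commands if angular_diff(c.direction_deg, target_bearing_deg) <= tolerance_deg]
```

Under this sketch, activating only the nozzles facing a detected tree amounts to calling nozzles_toward() with the bearing of that tree as taken from the model.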
  • a sensor unit for autonomous agricultural treatment systems.
  • a sensor unit may include: one or more sensor arrays; a power source; and a communication unit.
  • each of the one or more sensor arrays may include a plurality of sensors from at least one type, and the communication unit may be adapted to send sensed data to a processor of an autonomous treatment tool, e.g., in real time.
  • the sending of sensed data may be upon sensing of a predefined change in a sensed parameter by at least one sensor unit.
  • the change may be indicative of the level of compliance of the treatment to a treatment protocol.
  • the sensor types in a sensor unit may be selected from a list consisting of: light sensors, chemical sensors, pressure sensors, geo-location sensors, pH sensors, conductivity sensors and humidity sensors.
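As a rough illustration of a sensing unit that reports only when a sensed parameter changes by a predefined amount, consider the Python sketch below; the interface (read_fn, send_fn and the report fields) is assumed for the example and is not specified by the text above.

```python
import time


class SensingUnitReporter:
    """Illustrative sensing unit: transmits a report to the treatment-tool processor
    only when the sensed value changes by at least a predefined threshold."""

    def __init__(self, unit_id, location, read_fn, send_fn, threshold):
        self.unit_id = unit_id
        self.location = location    # known deployment location on the plant
        self.read_fn = read_fn      # callable returning the current sensed value
        self.send_fn = send_fn      # callable transmitting a report dictionary
        self.threshold = threshold  # predefined change that triggers a report
        self._last_sent = None

    def poll(self):
        value = self.read_fn()
        if self._last_sent is None or abs(value - self._last_sent) >= self.threshold:
            self.send_fn({
                "unit_id": self.unit_id,
                "location": self.location,  # location is sent with every report
                "value": value,
                "timestamp": time.time(),
            })
            self._last_sent = value
```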
  • a method of autonomous agricultural treatment may include: creating, by a modeling system, a model of one or more plants in the treated area; receiving, by a processor, from one or more sensing units deployed on the one or more plants, an indication regarding compliance of a treatment to a treatment protocol; and controlling, by the processor, one or more operation parameters of a treatment tool, to perform a treatment, wherein the controlling of the one or more operation parameters of the treatment tool may be according to at least one of: the model and the indications received from the one or more sensing units.
  • the model may be created based on, inter alia, location information received from one or more sensing units, and/or based on scans of the treated area received from one or more of: a LiDAR, a RADAR, and a camera.
  • the one or more sensing units may be deployed at known locations on plants in the treated area and the data sent from each sensing unit to the processor may include information regarding the location of the sensing unit sending the data.
  • the treatment tool according to some aspects of the method may be selected from a list consisting of: a spraying tool, a pollination tool, and a shaking tool.
  • the treatment tool may be a spraying tool that may include a plurality of spraying nozzles, each spraying nozzle having a different spraying direction, and wherein the operation parameters comprise one or more of: spraying direction, spraying pressure, spraying pattern and spraying frequency.
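The three method elements above can be read as a simple feedback loop. The Python sketch below is one hedged interpretation of that loop; the objects and their methods (create_model, latest_report, derive_parameters, apply) are hypothetical interfaces introduced only for illustration.

```python
def run_treatment(modeling_system, sensing_units, treatment_tool, protocol):
    """Illustrative loop: (1) build a model, (2) collect compliance indications
    from sensing units, (3) adjust the treatment tool's operation parameters."""
    model = modeling_system.create_model()        # create a 2D/3D model of the plants
    treatment_tool.start(model)                   # begin treatment guided by the model
    while not treatment_tool.done():
        indications = [unit.latest_report() for unit in sensing_units]   # compliance feedback
        parameters = protocol.derive_parameters(model, indications)      # e.g., direction, pressure
        treatment_tool.apply(parameters)          # control the tool in real time
```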
  • FIG. 1 shows a block diagram of an agriculture treatment system according to embodiments of the present invention.
  • FIG. 2 shows a schematic illustration of a sensing unit according to embodiments of the present invention.
  • FIG. 3 is a schematic illustration of an autonomous treatment tool according to embodiments of the present invention.
  • FIG. 4 is a flowchart of a method of autonomous agricultural treatment according to embodiments of the present invention.
  • FIG. 5 is a block diagram, depicting a computing device which may be included in a system for agricultural treatment, according to some embodiments of the present invention.
  • the terms “plurality” and “a plurality” as used herein may include, for example, “multiple” or “two or more”.
  • the terms “plurality” or “a plurality” may be used throughout the specification to describe two or more components, devices, elements, units, parameters, or the like.
  • the method embodiments described herein are not constrained to a particular order or sequence. Additionally, some of the described method embodiments or elements thereof can occur or be performed simultaneously, at the same point in time, or concurrently.
  • System 100 may include a modeling system 110, one or more sensor units 120 and at least one treatment tool 130.
  • Modeling system 110 and treatment tool 130 may be accommodated in a movable platform or housing 150. It should be noted that movable platform or housing 150 may not be required and each component of the system may be independent, while in communication with other components of system 100.
  • Modeling system 110 may be configured to create a model 110’ of one or more plants (e.g., trees, vines, fruits, etc.) in a treated area such as an orchard, a vineyard, an open field, and/or a plantation.
  • Modeling system 110 may include one or more sensors 112 for providing information used by modeling system 110 to create a two-dimensional (2D) or three-dimensional (3D) model 110’ of one or more plants in the treated area.
  • sensors 112 may include one or more of: optical and/or electromagnetic sensors (such as RADARs and LiDARs).
  • System 100 may further include one or more Global Positioning Systems (GPS) 111, either as part of the modeling system 110, as part of the treatment tool 130 or as a separate component in active communication with one or more of treatment tool 130 and modeling system 110.
  • system 100 may further include one or more additional sensors 113, such as barometers, light sensors, humidity sensors, pH sensors, accelerometers, and the like, which may serve to improve the accuracy of determining the location of modeling system 110 and/or the treatment tool in the treated area, and thus improve the accuracy of the created model 110’ and the performance of treatment tool 130.
  • sensors 112 and/or 113 may be integral, and/or external to modeling system 110, and may be in active communication with modeling system 110 to provide sensed data to modeling system 110.
  • Modeling system 110 may further include or be in active communication with a processor 114, and a memory device 116.
  • Memory device 116 may be configured to store executable code that when executed by processor 114, may allow processor 114 to create model 110’ of the treated area or a part thereof.
  • memory 116 may store computing instructions of a Simultaneous Localization and Mapping (SLAM) algorithm, allowing the creation of the 3D or 2D model 110’ of an area of interest (e.g., a treated area), based on input received from optical sensors 112 such as LiDARs.
  • agricultural treatment system 100 may include one or more sensing units 120.
  • Sensing units 120 may be located at known locations on one or more plants in the treated area, and may provide information to modeling system 110 for the creation of a model of one or more plants in the treated area. Combined with the 3D model that was built, data from sensing units 120 may help to build an accurate and comprehensive model of the treated area.
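One simple way to combine sensing units deployed at known locations with a 3D point model is to associate each unit with the nearest model point, so that a unit's reports can be attributed to a specific plant region. The NumPy sketch below illustrates such an association under that assumption; it is not taken from the embodiments.

```python
import numpy as np


def attach_sensing_units(model_points: np.ndarray, unit_locations: dict) -> dict:
    """Map each sensing unit (known 3D location) to the index of the nearest model point.

    model_points: (N, 3) array of 3D model coordinates.
    unit_locations: {unit_id: (x, y, z)} known deployment locations.
    Illustrative only; a real system might use a spatial index for large models.
    """
    association = {}
    for unit_id, location in unit_locations.items():
        distances = np.linalg.norm(model_points - np.asarray(location), axis=1)
        association[unit_id] = int(np.argmin(distances))
    return association
```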
  • modeling system 110 may receive information from, for example, optical sensors such as one-dimensional (single beam) or 2D (sweeping) laser rangefinders, 3D High Definition LiDAR, 3D Flash LIDAR, 2D or 3D sonar sensors and one or more 2D cameras, and based on the optical sensor(s) data, with or without additional information received from other sensors, such as sensing units 120, may create a relatively accurate model (e.g., a 3D model) of one or more plants (e.g., of an entire plantation, orchard, vineyard or other treated area).
  • Modeling system 110 may use, for example, Simultaneous Localization and Mapping (SLAM) algorithms to create a 3D model of a tree, a plurality of trees, and/or an entire treated area.
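A full SLAM pipeline is outside the scope of a short example, but the map-building half can be sketched as accumulating LiDAR scans into one world-frame point cloud, assuming the sensor poses are already estimated (by SLAM, GPS or odometry). The fragment below is that simplified sketch, not the SLAM algorithm itself.

```python
import numpy as np


def accumulate_point_cloud(scans, poses):
    """Merge per-scan LiDAR points into a single world-frame cloud.

    scans: list of (N_i, 3) arrays of points in the sensor frame.
    poses: list of (R, t) pairs, R a 3x3 rotation and t a length-3 translation,
           giving the sensor pose for each scan. A SLAM algorithm would estimate
           these poses and the map jointly; here they are assumed known.
    Returns an (M, 3) array usable as a crude 3D model of the scanned plants.
    """
    world_points = []
    for points, (R, t) in zip(scans, poses):
        world_points.append(np.asarray(points) @ np.asarray(R).T + np.asarray(t))
    return np.vstack(world_points)
```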
  • Treatment tool 130 may be any tool used for treating plants in the treated area, such as, for example, pollination tools, spraying tools, irrigation tools, harvesting tools and the like. Examples of treatment tools are described with reference to Fig. 3 below.
  • Treatment tool 130 may be in active communication or otherwise operably connected to processor 114 or another processor of treatment system 100, and may receive operation instructions therefrom.
  • Processor 114 may be the same processor used for modeling or a separate processor of treatment system 100.
  • Sensing unit 220 may include one or more sensor arrays 222, a power source 224, and a communication unit 226.
  • each of the one or more sensor arrays 222 may include at least one type of sensors.
  • sensing unit 220 may include 2, 3, 4 or 5 sensor arrays 222, each including two or more sensors from a single type or from different types. It should be appreciated that any number and combination of arrays and sensors may be used.
  • Communication unit 226 may be configured to send to communication module (140 in Fig. 1) sensed data from sensor arrays 222.
  • the sensed data received by communication module 140 may be used, according to some embodiments, to control operation parameters of treatment tool (130 in Fig. 1) so that the treatment will comply with a treatment protocol.
  • For example, when the treatment tool is a pesticide spraying tool and the spraying protocol requires that no less than a specific amount of pesticide reach each part of a plant in the treated area, sensing units 220 may send to the communication module of the treatment system (100 in Fig. 1) sensed data that indicates how much pesticide reached each sensing unit 220, and accordingly treatment system (100 in Fig. 1) may determine whether to continue spraying pesticide from the treatment tool at a specific location, as will be further detailed with respect to Figs. 3 and 4 herein.
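The decision described above can be reduced to a threshold check on the amounts reported by the sensing units. The sketch below assumes each report carries a numeric value for the pesticide amount sensed at the unit; the field names and thresholds are placeholders, not values from the disclosure.

```python
def classify_spray_compliance(reports, min_amount, max_amount):
    """Split sensing-unit reports into under-treated and over-treated units.

    reports: iterable of dicts with "unit_id" and "value" (sensed pesticide amount).
    min_amount / max_amount: protocol-defined lower and upper bounds per plant part.
    """
    under_treated = [r["unit_id"] for r in reports if r["value"] < min_amount]
    over_treated = [r["unit_id"] for r in reports if r["value"] > max_amount]
    return under_treated, over_treated
```

Units in the first list would call for continued spraying toward their locations, while units in the second list would call for stopping the nozzles directed at them.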
  • sensing units 220 may be adapted to be deployed at known locations on plants in the treated area and the data sent from each sensing unit to the processor includes information regarding the location of the sensing unit sending the data.
  • sensing unit 220 may have a housing 260 configured to at least partially accommodate the sensor arrays 222, the power source 224 and communication unit 226.
  • housing 260 may expose sensor arrays 222 on at least two sides thereof.
  • Housing 260 may have a fluid sensor 265 located at a first end of housing 260, and a cover 262 configured to cover fluid sensor 265 and prevent fluids sprayed or otherwise distributed from treatment system 100 from directly reaching fluid sensor 265.
  • cover 262 may be open from a side directed towards a second end of housing 260, opposite to the first end. Opening 264 in cover 262 may be configured to allow fluids dripping along a face of sensor unit 220 to enter an internal space in cover 262 in which fluid sensor 265 is located, thus allowing fluid sensor 265 to sense accumulating fluids in the internal space of cover 262.
  • sensor arrays 222 may include one or more sensors selected from a list consisting of: light sensors, chemical sensors, pressure sensors, geo-location sensors, pH sensors, and humidity sensors.
  • Treatment tool 130 may be mounted on a vehicle 320 to carry treatment tool 130 within the treated area, such as within an orchard, a vineyard, a plantation, and the like.
  • Treatment tool 130 may be a spraying tool, for example for spraying pesticides.
  • treatment tool 130 may be an irrigation tool, a pollination tool, a shaking tool, a harvesting tool, or any other treatment tool known in the art.
  • the treatment tool 130 may be a spraying tool comprising a plurality of spraying nozzles 333a and 333b. Each spraying nozzle may have a different spraying direction.
  • Treatment tool 130 may be in active communication or otherwise operably connected to a processor 114 or another processor of treatment system 100 (in Fig. 1) and may receive operation instructions therefrom.
  • Processor 114 (that may be the same processor used for modeling or an independent processor) may be configured to control one or more operation parameters of the treatment tool 130 at least according to one or more of: the model generated by modeling system (110 in Fig. 1) and data received from the one or more sensing units (120 in Fig. 1).
  • the operation parameters may include one or more of: spraying direction, spraying pressure, spraying pattern and spraying frequency.
  • For example, when information received from modeling system (110 in Fig. 1) indicates that treatment tool 130 is located between a large tree on one side and a small tree on the other side (as illustrated in Fig. 3), nozzles 333b may be turned on while nozzles 333a may remain inactive.
  • the pressure of spraying, as well as other spraying parameters, may be controlled according to the location and spraying direction of each nozzle, so that, for example, nozzle 333b’ may be controlled to spray at high pressure in order to reach the tree top while nozzle 333b” may be controlled to spray at low pressure in order to avoid spraying the tree trunk.
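A minimal sketch of the pressure selection just described, assuming the only input is the height of the targeted canopy region taken from the 3D model, could look as follows; the numeric bounds are arbitrary placeholders rather than calibrated values.

```python
def pressure_for_target_height(target_height_m: float,
                               low_bar: float = 2.0,
                               high_bar: float = 8.0,
                               max_height_m: float = 6.0) -> float:
    """Map a target height from the plant model to a spraying pressure:
    high targets (tree top) get high pressure, low targets (near the trunk) get low pressure."""
    fraction = min(max(target_height_m / max_height_m, 0.0), 1.0)
    return low_bar + fraction * (high_bar - low_bar)
```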
  • sensed data received from sensing units located on plants in the treated area may provide indications regarding the efficiency of the treatment and its compliance to a treatment protocol. For example, when some sensor arrays on one or more sensing units do not receive a sufficient amount of sprayed pesticide, this may indicate that the entire plant portion in proximity to that sensing unit did not receive sufficient pesticide and additional spraying may be required. In another example, when the amount of fluids sensed by the fluid sensor exceeds a predefined threshold, this may indicate that an excessive amount of pesticide was sprayed in the proximity of the sensing unit, and processor 114 may instruct the treatment tool to stop spraying from the nozzles directed toward the sensing unit that sent the excess-pesticide indication.
  • when the treatment tool is a shaking tool, each sensing unit may send an indication to processor 114 if it was not sufficiently shaken (e.g., based on data from accelerometers of the sensing units); the treatment tool may return or repeat its operation to shake specific branches or plant portions that have not been properly shaken, or may control shaking force or frequency in order to reach the required shaking of each part of a plant in the treated area.
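As a hedged illustration of how accelerometer data from a sensing unit might be turned into a "sufficiently shaken" indication, the sketch below uses the RMS deviation of the acceleration magnitude over the shaking window; the criterion and threshold are assumptions for the example, since the text above does not prescribe a specific measure.

```python
import numpy as np


def sufficiently_shaken(accel_samples: np.ndarray, rms_threshold: float) -> bool:
    """Return True if the branch carrying the sensing unit was shaken enough.

    accel_samples: (N, 3) array of accelerometer samples recorded during shaking.
    rms_threshold: protocol-defined minimum RMS deviation of the acceleration magnitude
                   (subtracting the mean magnitude approximately removes gravity).
    """
    magnitudes = np.linalg.norm(accel_samples, axis=1)
    rms = float(np.sqrt(np.mean((magnitudes - magnitudes.mean()) ** 2)))
    return rms >= rms_threshold
```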
  • FIG. 4 is a flowchart of a method according to embodiments of the present invention.
  • step 410 may include creating, by a modeling system, a model (e.g., a 2D or 3D model) of one or more plants in the treated area.
  • a modeling system may receive information from, for example, optical sensors such as one-dimensional (single beam) or 2D (sweeping) laser rangefinders, 3D High Definition LiDAR, 3D Flash LIDAR, 2D or 3D sonar sensors and one or more 2D cameras, and based on the optical sensor(s) data, with or without additional information received from other sensors, such as sensing units located in known locations on plants in the treated area, may create a relatively accurate model (e.g., a 3D model) of one or more plants (e.g., of an entire plantation, orchard, vineyard or other treated area).
  • the modeling system may use, for example, Simultaneous Localization and Mapping (SLAM) algorithms to create a 3D model of a tree, a plurality of trees, and/or an entire treated area.
  • Step 420 may include, according to some embodiments, receiving, by a processor, from one or more sensing units deployed on the one or more plants, an indication regarding compliance of a treatment to a treatment protocol (e.g., as described with reference to Fig. 3 above).
  • step 430 may include controlling, by the processor, one or more operation parameters of a treatment tool, to perform a treatment.
  • the controlling of the one or more operation parameters of the treatment tool may be according to at least one of: the 3D model and the indications received from the one or more sensing units (e.g., as described with reference to Fig. 3 above).
  • FIG. 5 is a block diagram depicting a computing device which may be included in a system for agricultural treatment, such as the systems depicted and described with reference to Figs. 1 and 3 above.
  • Computing device 1 may include a processor or controller 2 (such as processor 114 in Figs. 1 and 3) that may be, for example, a central processing unit (CPU) processor, a chip or any suitable computing or computational device, an operating system 3, a memory 4 (such as memory 116 in Fig. 1), executable code 5, a storage system 6, input devices 7 (such as sensors 112, 113, and sensing units 120, in Fig. 1) and output devices 8 (such as treatment tool 130 in Figs. 1 and 3).
  • functions and methods described herein may be performed by processor 2 or by one or more controllers or processors, possibly across multiple units or devices.
  • Operating system 3 may be or may include any code segment (e.g., one similar to executable code 5 described herein) designed and/or configured to perform tasks involving coordination, scheduling, arbitration, supervising, controlling or otherwise managing operation of computing device 1, for example, scheduling execution of software programs or tasks or enabling software programs or other modules or units to communicate.
  • Operating system 3 may be a commercial operating system. It will be noted that an operating system 3 may be an optional component, e.g., in some embodiments, a system may include a computing device that does not require or include an operating system 3.
  • Memory 4 may be or may include, for example, a Random-Access Memory (RAM), a read only memory (ROM), a Dynamic RAM (DRAM), a Synchronous DRAM (SD-RAM), a double data rate (DDR) memory chip, a Flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit, or other suitable memory units or storage units.
  • Memory 4 may be or may include a plurality of possibly different memory units.
  • Memory 4 may be a computer or processor non-transitory readable medium, or a computer non-transitory storage medium, e.g., a RAM.
  • a non-transitory storage medium such as memory 4, a hard disk drive, another storage device, etc. may store instructions or code which when executed by a processor may cause the processor to carry out methods as described herein.
  • Executable code 5 may be any executable code, e.g., an application, a program, a process, task, or script. Executable code 5 may be executed by processor or controller 2 possibly under control of operating system 3.
  • executable code 5 may be an application that may create a model (e.g., a 3D or 2D model) of one or more plants in a treated area, based on input received from sensors, and may control a treatment tool based on the created model and additional sensor data, as further described herein.
  • a system according to some embodiments of the invention may include a plurality of executable code segments similar to executable code 5 that may be loaded into memory 4 and cause processor 2 to carry out methods described herein.
  • Storage system 6 may be or may include, for example, a flash memory as known in the art, a memory that is internal to, or embedded in, a micro controller or chip as known in the art, a hard disk drive, a CD-Recordable (CD-R) drive, a Blu-ray disk (BD), a universal serial bus (USB) device or other suitable removable and/or fixed storage unit.
  • Data such as image and location data may be stored in storage system 6 and may be loaded from storage system 6 into memory 4 where it may be processed by processor or controller 2.
  • some of the components shown in Fig. 5 may be omitted.
  • memory 4 may be a non-volatile memory having the storage capacity of storage system 6. Accordingly, although shown as a separate component, storage system 6 may be embedded or included in memory 4.
  • Input devices 7 may be or may include any suitable input devices, components, or systems, e.g., a detachable keyboard or keypad, a mouse, sensor units, sensors and the like.
  • Output devices 8 may include one or more treatment tools, one or more (possibly detachable) displays or monitors, speakers and/or any other suitable output devices.
  • Any applicable input/output (I/O) devices may be connected to Computing device 1 as shown by blocks 7 and 8.
  • NIC network interface card
  • USB universal serial bus
  • any suitable number of input devices 7 and output devices 8 may be operatively connected to Computing device 1 as shown by blocks 7 and 8.
  • a system may include components such as, but not limited to, a plurality of central processing units (CPU) or any other suitable multi-purpose or specific processors or controllers (e.g., similar to element 2), a plurality of input units, a plurality of output units, a plurality of memory units, and a plurality of storage units.

Abstract

An agricultural treatment system and method according to the present invention may include a modeling system, a sensor unit and a treatment tool. The modeling system may be configured to create a 2D or 3D model of one or more plants in a treated area; the sensor unit may include: one or more sensor arrays; a power source; and a communication unit. The communication unit may be adapted to send sensed data to a processor of the treatment tool. A method according to the present invention may include: creating a model of one or more plants in the treated area; receiving, by a processor, from one or more sensing units deployed on the one or more plants, an indication regarding compliance of a treatment to a treatment protocol; and controlling, by the processor, one or more operation parameters of a treatment tool to perform a treatment.
PCT/IL2023/050503 2022-05-16 2023-05-15 Agricultural treatment system and method with real-time feedback WO2023223320A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263342208P 2022-05-16 2022-05-16
US63/342,208 2022-05-16

Publications (1)

Publication Number Publication Date
WO2023223320A1 (fr)

Family

ID=88834770

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2023/050503 WO2023223320A1 (fr) 2022-05-16 2023-05-15 Agricultural treatment system and method with real-time feedback

Country Status (1)

Country Link
WO (1) WO2023223320A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106950573A (zh) * 2017-02-23 2017-07-14 北京农业信息技术研究中心 Method and system for assessing maize waterlogging disaster based on unmanned aerial vehicle LiDAR
US10534086B2 * 2015-07-13 2020-01-14 Agerpoint, Inc. Systems and methods for determining crop yields with high resolution geo-referenced sensors
EP3871481A1 * 2020-02-28 2021-09-01 CNH Industrial Italia S.p.A. Agricultural vehicle equipped with front and rear 3D imaging devices
US20220240494A1 * 2021-01-29 2022-08-04 Neatleaf, Inc. Aerial sensor and manipulation platform for farming and method of using same
WO2023072980A1 * 2021-10-26 2023-05-04 Basf Agro Trademarks Gmbh Monitoring the treatment of an agricultural field



Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23807175

Country of ref document: EP

Kind code of ref document: A1