WO2016123201A1 - Systems, devices, and methods for robotic remote sensing for precision agriculture - Google Patents

Systems, devices, and methods for robotic remote sensing for precision agriculture

Info

Publication number
WO2016123201A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
imaging system
resolution
multispectral
imagery
Prior art date
Application number
PCT/US2016/015093
Other languages
English (en)
Inventor
R. Vijay Kumar
Gareth Benoit CROSS
Chao Qu
Jnaneshwar DAS
Anurag MAKINENI
Yash Shailesh MULGAONKAR
Original Assignee
The Trustees Of The University Of Pennsylvania
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by The Trustees Of The University Of Pennsylvania filed Critical The Trustees Of The University Of Pennsylvania
Priority to US15/545,266
Publication of WO2016123201A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/25 Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/282 Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A40/00 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
    • Y02A40/10 Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture

Definitions

  • The subject matter disclosed herein relates to systems and methods for data-driven remote sensing for precision agriculture. Specifically, the subject matter disclosed herein relates to the development of imaging systems and deployment modalities for close-range sensing of critical properties of specialty crops, such as apples, oranges, strawberries, peaches, and pecans.
  • Remote sensing satellites and airborne sensing with winged aircraft have allowed scientists to map large farmlands and forests through acquisition of multi-spectral imagery and 3-D structural data.
  • However, data from these platforms lacks the spatio-temporal resolution necessary for precision agriculture.
  • For example, a typical remote sensing satellite image may have a pixel resolution of hundreds of meters, and airborne sensing may provide a resolution of a few meters. It is desirable, however, to obtain data for monitoring orchard or vineyard health at a centimeter scale - a resolution at which stems, leaves, and fruits can be observed.
  • Closer to the ground, unmanned ground vehicles (UGVs) have been used for monitoring agricultural products (see, e.g., U.S. Patent Application Pub. No. 2013/0325346). UGVs can carry a variety of bulky sensors, such as LiDAR for volumetric mapping, and ground penetrating radar (GPR) and electrical conductance sensors for precise soil mapping. Due to the mobility constraints of unstructured farms, however, it is infeasible to use UGVs for rapid and persistent monitoring. Moreover, ground vehicles are intrusive.
  • Aerial platforms and hand-held sensors can alleviate some of the problems with using UGVs, but the available platforms for such systems are bulky and expensive, which can be prohibitive for large-scale deployments in farms. Furthermore, the spatio-temporal resolution of such systems is considered inadequate, as discussed above.
  • Disclosed herein is a methodology for data-driven precision agriculture through close-range remote sensing with a versatile imaging system, which may be deployed onboard low-flying unmanned aerial vehicles (UAVs), mounted to ground vehicles (e.g., unmanned ground vehicles (UGVs)), and/or carried by human scouts.
  • In addition, the present technology stack may include methods for extracting actionable intelligence from the rich datasets acquired by the imaging system, as well as visualization techniques for efficient analysis of the derived data products.
  • The systems and methods discussed herein may include one or more of the following four components: an imaging system, a deployment methodology, data analysis algorithms, and/or a visualization framework that, when used together, may help specialty crop growers to save resources (e.g., less fertilizer, water, and pesticide may be needed because of better stress and disease monitoring), optimize crop yield, and reduce costs (e.g., by allowing for a better allocation of labor due to the efficient estimation of crop yield and from the lower use of resources).
  • Figures 1a and 1b are top and side views, respectively, of a lightweight, low-cost, portable, compact, and self-contained multi-spectral 3-D imaging sensor suite designed for precision agriculture according to an embodiment of the presently disclosed subject matter;
  • Figure 2 is a perspective view of a UAV for precision agriculture according to an embodiment of the presently disclosed subject matter;
  • Figure 3a is a perspective view of a UAV in flight at a vineyard according to an embodiment of the presently disclosed subject matter;
  • Figure 3b is a real-time 3-D map of the rows of grape trees imaged by the UAV shown in Figure 2;
  • Figures 4a through 4c are bottom, side, and wide-angle views, respectively, of a multi-rotor UAV with the sensor suite facing downwards according to an embodiment of the presently disclosed subject matter;
  • Figure 4d is a front view of a sensor suite harnessed on a human scout according to an embodiment of the presently disclosed subject matter;
  • Figure 4e illustrates a harnessed sensor suite according to an embodiment of the presently disclosed subject matter being used to scan a row of dwarf apple trees at an apple orchard in Biglerville, Pennsylvania;
  • Figure 5 is a flow chart illustrating a data processing pipeline of the system according to an embodiment of the presently disclosed subject matter;
  • Figure 6a is a 3-D reconstruction of a row of grape trees spanning about 70 meters at a vineyard according to an embodiment of the presently disclosed subject matter;
  • Figure 6b is a map showing canopy characteristics and scene features from the reconstruction shown in Figure 6a;
  • Figure 7a is a multi-spectral 3-D reconstruction of a row of dwarf apple trees using only laser data according to an embodiment of the presently disclosed subject matter;
  • Figure 7b is a multi-spectral 3-D reconstruction of a row of dwarf apple trees with thermal data overlaid on a 3-D point cloud according to an embodiment of the presently disclosed subject matter;
  • Figures 8a through 8d are 3-D point clouds of canopies that may be used to determine tree height, canopy volume, and leaf area according to an embodiment of the presently disclosed subject matter;
  • Figure 9 is a graph illustrating the correlation between LiDAR area index and measured true leaf area for the data points corresponding to the emulated canopy stages of two trees according to an embodiment of the presently disclosed subject matter; and
  • Figures 10a and 10b illustrate data from a thermal camera along with data from visible-range cameras according to an embodiment of the presently disclosed subject matter.
  • Systems, devices, and methods for robotic remote sensing for precision agriculture disclosed herein address the limitations of current farm monitoring practices through the development of a technology stack for high-resolution multi-spectral 3-D mapping of specialty crops.
  • The present subject matter exploits a versatile sensor suite capable of being deployed in multiple modalities (e.g., mounted onboard UAVs or UGVs, and/or carried by human scouts).
  • The target applications include, but are not limited to, yield estimation and disease monitoring for apples, oranges, strawberries, peaches, and pecans.
  • A sensor system may comprise an array of science sensors, navigation sensors, an onboard computer, a wireless communication link, and/or batteries.
  • Sensor system 100 may combine different sensing modalities on a self-contained, lightweight, and compact platform that can be implemented using any of a variety of deployment modalities.
  • The sensors onboard may be selected to monitor a range of plant physiological and morphological properties, such as canopy volume, leaf area, water stress, and crop yield (e.g., fruit count, fruit size).
  • Sensor system 100 may include one or more laser range (e.g., LiDAR) scanners 101, which may be configured to extract morphological properties (e.g., canopy volume and leaf area) of the agricultural subject.
  • LiDAR scanner 101 may be a Hokuyo UST-20LX laser scanner, which may provide high-resolution laser scans for monitoring plant morphology.
  • Sensor system 100 may comprise one or more thermal cameras 102 and/or one or more multi-spectral cameras 103 configured to provide imaging in the red and near-infrared bands. Imagery from such bands may be used to monitor plant vigor, which may in turn be used in guiding pruning management and fertilization.
  • Thermal cameras 102 and multi-spectral cameras 103 may be used together to estimate a range of plant properties related to photosynthetic efficiency and water stress.
  • Thermal camera 102 may be a FLIR A35 thermal imaging camera, which may be used to collect temperature readings (e.g., at a resolution of 320x256).
  • Multi-spectral cameras 103 may include two monochrome Matrix Vision BlueFox cameras equipped with narrow-pass filters (e.g., at about 670nm and 800nm), which may together provide the necessary bands for calculation of the Normalized Difference Vegetation Index (NDVI) - an indicator of plant vigor.
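With the two bands above, NDVI follows directly from its standard definition, NDVI = (NIR - Red) / (NIR + Red). The minimal sketch below assumes two co-registered, radiometrically comparable images from the 670nm and 800nm cameras; band alignment and radiometric calibration are outside its scope.

```python
import numpy as np

def ndvi(red_670, nir_800, eps=1e-6):
    """Per-pixel NDVI from co-registered 670 nm (red) and 800 nm (NIR) images."""
    red = red_670.astype(np.float64)
    nir = nir_800.astype(np.float64)
    # eps guards against division by zero on dark pixels.
    return (nir - red) / (nir + red + eps)  # values in [-1, 1]
```

Higher values indicate denser, more vigorous vegetation; an NDVI map over a row of trees can then be obtained by evaluating this quantity per pixel and projecting it onto the 3-D reconstruction.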
  • Sensor system 100 may further include an RGB camera 104 (e.g., an RGB BlueFox camera) to acquire true-color data. Data from the wide range of spectral bands may be used for fruit counting and monitoring crop stress and disease.
  • Sensor system 100 may also include one or more of a stereo camera rig 105 for visual odometry and reconstruction and/or a spectrometer 106.
  • One or more navigational sensors 107 (e.g., a global positioning system (GPS) sensor or other navigational sensors) may further be provided.
  • For example, navigational sensors 107 may include a Microstrain 3DM-GX4-25 IMU, a precise point positioning (PPP) GPS sensor, and/or two single-channel BlueFox cameras for stereo visual odometry.
  • Sensor system 100 may include an inertial measurement unit (IMU) 108.
  • Barometric pressure sensors and magnetic field sensors may further be included in IMU 108, although these may alternatively be provided as independent sensors (i.e., a barometer and/or magnetometer).
  • An onboard computer 109 may be used to log data from all the sensors and to facilitate communication to a base station or the deployment vehicle (e.g., through a Wi-Fi link). Power may be delivered by one or more batteries (e.g., two 2700 mAh lithium polymer batteries).
  • Sensor system 100 may be configured to be relatively light (e.g., having a total weight of 1.6 kg or less) and compact (e.g., having dimensions of about 40 cm x 13 cm x 13 cm). It may further include a strong but lightweight frame 110 (e.g., a carbon fiber frame) that supports a base plate 111 (e.g., a polycarbonate plate) on which all sensors are mounted.
  • This sensor arrangement may provide reliable operation in different deployment modes. Specifically, for example, an endurance of about an hour has been observed during the deployments.
  • The disclosed array of sensors may provide a platform for performing detailed and highly accurate environmental reconstruction. In some embodiments, it may be desirable that both the state estimation and scientific sensors be properly calibrated.
  • A complete system calibration may be performed, wherein the stereo camera rig 105 and the sensing cameras (e.g., thermal camera 102, multi-spectral camera 103, and/or RGB camera 104) are calibrated relative to IMU 108.
  • This process may include both camera-system calibration (e.g., stereo, color, and multispectral) and spatial/temporal calibration of the respective cameras and IMU 108.
  • Thermal camera 102 may be calibrated relative to stereo camera rig 105. This procedure is complicated by the fact that thermal camera 102 cannot observe visible wavelengths and produces only a low-resolution image (e.g., 320x256). Thus, a standard chessboard pattern cannot be used to calibrate thermal camera 102 directly.
  • Instead, an ordinary circle grid pattern printed on paper may be illuminated by a hot lamp, producing a pattern which is discernible in both the long-wave IR and optical regions of the spectrum. This approach allows for calibration of thermal camera 102, both on its own and jointly with the other cameras, without introducing any complicated calibration device.
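As a minimal sketch of this step, the snippet below detects a heat-illuminated circle grid in a set of thermal frames and estimates the camera intrinsics with OpenCV. The grid dimensions, spacing, and file names are illustrative assumptions, not values from the source.

```python
import cv2
import numpy as np

pattern_size = (4, 11)   # assumed asymmetric circle grid (columns, rows)
spacing = 0.02           # assumed 20 mm circle spacing, in meters
thermal_image_paths = ["thermal_000.png", "thermal_001.png"]  # assumed files

# 3-D object points for one view of the asymmetric grid.
objp = np.array(
    [[(2 * c + r % 2) * spacing, r * spacing, 0.0]
     for r in range(pattern_size[1]) for c in range(pattern_size[0])],
    dtype=np.float32)

obj_points, img_points = [], []
for path in thermal_image_paths:
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # The heated circles contrast with the paper in long-wave IR.
    found, centers = cv2.findCirclesGrid(
        img, pattern_size, flags=cv2.CALIB_CB_ASYMMETRIC_GRID)
    if found:
        obj_points.append(objp)
        img_points.append(centers)

# Intrinsics and distortion coefficients for the 320x256 thermal image.
rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, (320, 256), None, None)
```

Because the same heated grid is visible to the stereo and RGB cameras, the corresponding detections can then feed an extrinsic calibration between thermal camera 102 and stereo camera rig 105.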
  • The present subject matter may be deployed as a distributed plurality of sensor arrays.
  • For example, the system may be mounted onboard one or more UAVs, generally designated 200, for rapid sensing in unstructured farmlands.
  • UAVs 200 are well-suited for precision agriculture due to their small size, superior mobility, and hover capability, which allows them to perform both high-altitude remote sensing and close inspection (e.g., at distances less than 2m) of problematic areas.
  • UAVs 200 equipped with multiple sensors are able to navigate autonomously through complex indoor and outdoor environments. Because of this combination of flight capability and maneuverability, such UAVs 200 may be advantageously applied to precision agriculture to produce both aerial views and side views (e.g., between rows of crops rather than just above them) of the subject agricultural area. This ability to collect data from multiple perspectives allows for complex modeling of the farmlands with high resolution and without large, intrusive, expensive, and/or overly complex data collection systems.
  • Multiple UAVs 200 may be deployed with sensor system 100 onboard to acquire richer data.
  • For example, multiple UAVs 200 may be used to collect data from different parts of the farm, or from different angles, and the fused data may be used to generate 3-D maps with larger spatial coverage.
  • Exemplary UAVs 200 used with the present subject matter may be configured to be lightweight (e.g., about 10 lbs. or less) yet capable of carrying a modest payload (e.g., about 4 lbs. or more) that includes the array of sensors and onboard control systems discussed above with respect to sensor system 100.
  • Figure 2 shows one generation of such vehicles with heterogeneous sensor modalities and constrained onboard computation power, including onboard navigational sensors, cameras, a laser scanner, and a computer. Multiple such low-cost and small UAVs 200 may be deployed in a farm for rapid tree mapping.
  • Alternatively or in addition, the present systems and methods may be implemented using ground-based deployment systems 300, such as unmanned ground vehicles (UGVs), manned vehicles (e.g., a scout truck), or a wearable or hand-held configuration of sensor system 100 carried by a human scout, since human scouts periodically carry out inspections of farms.
  • The present systems and methods may be deployed on a mechanically stabilized harness 310 that may be carried by such scouts. Mechanical stabilization improves the quality of the recorded data, resulting in higher precision of 3-D reconstruction.
  • In addition, a visualization device (e.g., a display attached to harness 310 or a wearable display provided with the scout, such as augmented reality (AR) goggles or the like) may be provided to give the scout real-time feedback on the data being collected.
  • Ground-based deployment systems 300 may be used to acquire data of side views of trees.
  • The data acquired by such ground-based deployment systems 300 may be used to aid the planning of flights by UAVs 200 or of further ground-based data collection.
  • In this way, multiple deployment mechanisms may be used in cooperation to generate the high-resolution 3-D model of the subject agricultural area.
  • The data acquired by sensor systems 100 discussed above may be processed in multiple stages.
  • Figure 5 shows an exemplary data processing pipeline, generally designated 400, of the present systems and methods.
  • The pipeline may be built on a middleware platform for robotic systems, such as the Robot Operating System (ROS).
  • State-estimation and mapping algorithms may be used to generate high-resolution multi-spectral 3-D maps, from which actionable intelligence such as fruit count, fruit size, trunk size, and canopy volume may be extracted using statistical machine learning techniques.
  • Data processing pipeline 400 includes the generation of navigation data by a navigation sensor array 410 (e.g., navigational sensor 107, IMU 108, and stereo camera rig 105).
  • A mathematical model may be used to estimate the location of the robot(s) (i.e., to determine the geographic location of a robot or sensor package with high accuracy).
  • The data from navigation sensor array 410 may be used by a state estimator 415 (e.g., an extended Kalman filter (EKF)) to generate pose estimates for every science sensor on the platform.
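A minimal EKF skeleton is sketched below to make the two core steps concrete; the filter on the actual platform fuses IMU, GPS, and stereo visual odometry, whereas this example uses an assumed 1-D constant-velocity model with position fixes.

```python
import numpy as np

class SimpleEKF:
    """Minimal extended Kalman filter skeleton (predict/update only)."""

    def __init__(self, x0, P0, Q, R):
        self.x, self.P, self.Q, self.R = x0, P0, Q, R

    def predict(self, F):
        # Propagate the state and covariance through the motion model.
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, H):
        # Correct the prediction with a measurement (e.g., a GPS fix).
        y = z - H @ self.x                    # innovation
        S = H @ self.P @ H.T + self.R         # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P

# Example: state [position, velocity]; a position sensor measures the first component.
dt = 0.01
ekf = SimpleEKF(x0=np.zeros(2), P0=np.eye(2), Q=1e-4 * np.eye(2), R=np.array([[0.25]]))
ekf.predict(F=np.array([[1.0, dt], [0.0, 1.0]]))
ekf.update(z=np.array([0.1]), H=np.array([[1.0, 0.0]]))
```

Each science sensor's pose then follows by composing the filtered body pose with that sensor's calibrated extrinsic transform.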
  • Further inputs are provided from a science sensor array 420 and a pose memory 430 that stores information regarding sensor relative poses.
  • A point cloud assembler 440 may use the pose estimates received from state estimator 415, the science data from science sensor array 420, and the known relative poses between sensors from pose memory 430 to reconstruct a multi-spectral 3-D point cloud (i.e., a representation format for 3-D data of an environment).
  • The point clouds may also be converted to an octree representation for efficient storage and analysis.
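The core of that assembly step is a chain of rigid-body transforms. The sketch below, with assumed variable names, places one sensor-frame scan into the world frame by composing the estimated vehicle pose with the fixed sensor extrinsic.

```python
import numpy as np

def transform_points(points, T):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])
    return (T @ pts_h.T).T[:, :3]

def assemble_scan(scan_points, world_T_body, body_T_sensor):
    """Place one sensor-frame scan into the world frame.

    world_T_body:  4x4 vehicle pose from the state estimator (per scan).
    body_T_sensor: fixed 4x4 extrinsic calibration from the pose memory.
    """
    return transform_points(scan_points, world_T_body @ body_T_sensor)

# The multi-spectral cloud is then built by concatenating the transformed
# scans and attaching per-point channels (e.g., NDVI, temperature) obtained
# by projecting each world point into the corresponding calibrated camera.
```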
  • Machine learning techniques may include a set of techniques to determine trends in data and to learn models for predicting properties of interest from observed data correlated to the desired properties of interest (e.g., crop yield, stress, or disease). This includes techniques for dimensionality reduction, unsupervised learning, and supervised learning.
  • The present systems and methods may be used for simultaneous localization and mapping (SLAM), wherein an unknown environment is explored, the location of the robot with respect to the environment is determined, and a map of the environment is built simultaneously.
  • SLAM may be performed in 2-D or 3-D depending on the sensors used.
  • As discussed above, the sensing modalities of the sensor suite are selected to monitor a range of plant physiological and morphological properties, and actionable intelligence can be extracted from the data acquired by the system.
  • Data products that may be obtained using the systems and methods discussed above include reconstruction of plant morphology, computation of plant vigor, estimation of leaf area, and automated fruit counting or yield estimation using remotely sensed data.
  • Plant vigor, measured through NDVI, facilitates decision-making for fertilization.
  • Accurate estimation of leaf area has the potential to improve pruning and spraying management.
  • The capability to estimate yield accurately will enable growers to plan labor for harvesting and storage for harvested fruits, both of which may be facilitated by predictive models that use the acquired data to estimate a property of interest (e.g., leaf area or fruit count).
  • Figures 6a and 6b show a reconstructed point cloud of a row of grape trees from a sample vineyard.
  • Features of the environment (e.g., the canopy, trunk, and ground) can be identified in the reconstruction.
  • The data was collected with sensor system 100 facing the side of the grape trees.
  • Figures 7a through 9 show example data products obtained from a sample apple orchard.
  • Figure 7a illustrates a multi-spectral 3-D reconstruction of a row of dwarf apple trees using only laser data.
  • Figure 7b illustrates a multi-spectral 3-D reconstruction of a row of dwarf apple trees with thermal data overlaid on a 3-D point cloud.
  • The data was acquired in the afternoon, with the side of the trees facing the sun showing higher canopy temperature (blue is cooler, red is warmer).
  • NDVI may be computed using multi-spectral imagery acquired by sensor system 100.
  • An NDVI map may then be generated using multispectral data acquired by sensor system 100 (e.g., onboard UAV 200). These maps enable growers to plan fertilization and mitigation in response to stresses observed in the NDVI imagery.
  • Figures 8a through 8d illustrate 3-D point clouds of canopies that may be used to determine tree height, canopy volume, and leaf area.
  • Figures 8a and 8b show 3-D reconstructions of two apple trees from an orchard in Biglerville, Pennsylvania.
  • Figures 8c and 8d show the representation of the trees in a format called an 'octree' that makes analysis and storage simpler.
  • An occupied voxel has an occupancy probability greater than 0.5, and the volume of the tree was taken to be the bounding box of all occupied voxels.
  • Weighting of the voxels was done using the occupancy probability.
  • This metric may be referred to as the LiDAR area index.
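The following sketch illustrates the thresholding and probability weighting just described; the voxel size and the hit-count-to-probability mapping are assumptions, standing in for a proper octree occupancy update.

```python
import numpy as np
from collections import defaultdict

def lidar_area_index(points, voxel=0.05, hit_to_prob=lambda n: 1.0 - 0.5 ** n):
    """Occupancy-weighted voxel metric over a canopy point cloud.

    points: (N, 3) world-frame points of one tree's canopy.
    Returns (index, bounding_box_volume_m3)."""
    hits = defaultdict(int)
    for key in map(tuple, np.floor(points / voxel).astype(int)):
        hits[key] += 1
    probs = {k: hit_to_prob(n) for k, n in hits.items()}

    occupied = np.array([k for k, p in probs.items() if p > 0.5])
    if len(occupied) == 0:
        return 0.0, 0.0

    # Tree volume: axis-aligned bounding box of the occupied voxels.
    span = (occupied.max(axis=0) - occupied.min(axis=0) + 1) * voxel
    volume = float(np.prod(span))

    # LiDAR area index: sum of occupancy probabilities over occupied voxels.
    index = float(sum(p for p in probs.values() if p > 0.5))
    return index, volume
```

Fitting a linear model of this index against measured leaf area, as in Figure 9, then yields the predictive model used for leaf area estimation.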
  • Figure 9 shows the correlation between the LiDAR area index and the true leaf area for each of the data points.
  • This choice of input feature (i.e., the LiDAR area index) correlates well with true leaf area, and the result demonstrates the use of sensor system 100 in estimation of leaf area for rows of trees. In this way, the leaf area estimation methodology may be used to rapidly estimate the leaf area of trees in a farm, enabling precise fertilization, spraying, and pruning.
  • Data acquired from sensor system 100 may be used to generate a fruit count to provide this capability.
  • Specifically, the present systems and methods may be used with an algorithm to generate fruit counts for the rows of trees using the data acquired by sensor system 100.
  • A fruit counting approach consists of two steps: fruit detection followed by fruit tracking.
  • Fruit detection is carried out using a support vector machine (SVM) classifier that uses different color spaces to classify each pixel in an image as originating from a fruit or not.
  • To train the classifier, a series of images were labeled to annotate regions that have a fruit enclosed (e.g., an orange in a use case involving an orange orchard).
  • The training dataset of images was used to train the SVM classifier with candidate pixel color-space values as input.
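A minimal sketch of such a per-pixel color classifier is shown below, using scikit-learn; the feature layout (BGR, HSV, and Lab stacked per pixel) and the helper names are assumptions, not the source's exact design.

```python
import numpy as np
import cv2
from sklearn.svm import SVC

def pixel_features(bgr_img):
    """Per-pixel color features in several color spaces (BGR, HSV, Lab)."""
    hsv = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2HSV)
    lab = cv2.cvtColor(bgr_img, cv2.COLOR_BGR2LAB)
    return np.dstack([bgr_img, hsv, lab]).reshape(-1, 9).astype(np.float32)

def train_fruit_classifier(images, masks):
    """images: list of BGR training images; masks: matching boolean arrays,
    True inside annotated fruit regions (both assumed to exist)."""
    X = np.vstack([pixel_features(im) for im in images])
    y = np.concatenate([m.ravel() for m in masks])
    clf = SVC(kernel="rbf", gamma="scale")
    return clf.fit(X, y)

# Detection: classify every pixel of a new image and reshape into a mask.
# mask = clf.predict(pixel_features(image)).reshape(image.shape[:2])
```

Morphological cleanup and connected-component grouping of the predicted mask can then turn pixel labels into discrete fruit detections.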
  • A running count is generated as sensor system 100 carries out scans of trees. To do so, a fruit tracking algorithm may be used to track fruits detected in a stream of images.
  • The optical flow of image descriptors may be computed across successive frames to estimate camera motion.
  • The fruit tracking algorithm uses the estimated camera motion between frames to predict the locations of fruits detected in previous frames. These predictions are compared with fruits detected in the current frame to ensure previously detected fruits are not recounted.
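The sketch below shows one way to realize that prediction-and-matching step with OpenCV: KLT optical flow estimates the inter-frame motion, an affine model propagates previous detections forward, and only unmatched detections are counted as new. The function name, the similarity-motion assumption, and the matching radius are illustrative assumptions.

```python
import numpy as np
import cv2

def count_new_fruits(prev_gray, curr_gray, prev_fruits, curr_fruits, radius=15.0):
    """Return current-frame detections that are NOT re-observations.

    prev_fruits / curr_fruits: (N, 1, 2) float32 pixel centers from the
    fruit detector (names assumed for this sketch)."""
    # Track generic features between frames to estimate camera motion.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                                  qualityLevel=0.01, minDistance=7)
    moved, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.ravel() == 1

    # Approximate the inter-frame motion with a single similarity transform.
    motion, _ = cv2.estimateAffinePartial2D(pts[good], moved[good])

    # Predict where previously counted fruits should now appear.
    predicted = cv2.transform(prev_fruits, motion)

    # A detection near a predicted location is a re-observation; only
    # unmatched detections increment the running count.
    cf = curr_fruits[:, 0, :]
    pf = predicted[:, 0, :]
    d = np.linalg.norm(cf[:, None, :] - pf[None, :, :], axis=-1)
    return curr_fruits[(d > radius).all(axis=1)]
```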
  • Figures 10a and 10b illustrate data from the thermal camera along with data from the visible-range cameras, which together make detection of fruits efficient under appropriate conditions. Image processing algorithms are used to detect fruits and generate a running count as the sensor suite is deployed onboard UAVs or carried by a human scout.
  • The present subject matter differs from prior work in systems and methods for monitoring agricultural products (e.g., U.S. Patent Application Pub. No. 2013/0325346) in at least the following aspects.
  • First, the present sensing system can be achieved at low cost (e.g., about $10k or less), and it can be lightweight (e.g., about 4 lbs. or less), portable, compact (e.g., the size of a shoe box), and self-contained (e.g., it comes with onboard computing and batteries).
  • Second, it may be deployed in a distributed array, such as on affordable multi-rotor UAVs, including on a plurality of UAVs controlled as a swarm to collectively obtain the high-resolution 3-D imagery of a subject agricultural area.
  • Third, the present systems may be deployed by a human scout using a harness or by any of a variety of other mobile deployment devices.
  • Fourth, the process for generating actionable intelligence may be addressed using machine learning techniques.
  • Finally, the framework underlying the present subject matter considers data visualization as a valuable component of the technology stack.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

Disclosed herein are systems, devices, and methods for data-driven precision agriculture through close-range remote sensing using a versatile imaging system. This imaging system may be deployed onboard low-flying unmanned aerial vehicles (UAVs) and/or carried by human scouts. In addition, the present technology stack may include methods for extracting actionable intelligence from the rich datasets acquired by the imaging system, as well as visualization techniques for efficient analysis of the derived data products. In this way, the present systems and methods may help specialty crop growers reduce costs, save resources, and optimize crop yield.
PCT/US2016/015093 2015-01-27 2016-01-27 Systems, devices, and methods for robotic remote sensing for precision agriculture WO2016123201A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/545,266 US10395115B2 (en) 2015-01-27 2016-01-27 Systems, devices, and methods for robotic remote sensing for precision agriculture

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562108509P 2015-01-27 2015-01-27
US62/108,509 2015-01-27

Publications (1)

Publication Number Publication Date
WO2016123201A1 true WO2016123201A1 (fr) 2016-08-04

Family

ID=56544271

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/015093 WO2016123201A1 (fr) 2016-01-27 Systems, devices, and methods for robotic remote sensing for precision agriculture

Country Status (2)

Country Link
US (1) US10395115B2 (fr)
WO (1) WO2016123201A1 (fr)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106326865A (zh) * 2016-08-25 2017-01-11 广州地理研究所 Real-time monitoring system, method, and device for water conservancy project areas based on an unmanned aerial vehicle
US9599993B2 (en) 2012-04-30 2017-03-21 The Trustees Of The University Of Pennsylvania Three-dimensional manipulation of teams of quadrotors
WO2017083128A1 (fr) * 2015-11-10 2017-05-18 Digi-Star, Llc Agricultural drone for use in controlling the direction of tillage and applying material to a field
US9745060B2 (en) 2015-07-17 2017-08-29 Topcon Positioning Systems, Inc. Agricultural crop analysis drone
EP3306344A1 (fr) * 2016-10-07 2018-04-11 Leica Geosystems AG Flight sensor
US10037028B2 (en) 2015-07-24 2018-07-31 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for on-board sensing and control of micro aerial vehicles
WO2018138648A1 (fr) 2017-01-24 2018-08-02 Gamaya Sa Method and apparatus for recording, processing, visualizing, and applying agronomic data
WO2018191442A1 (fr) * 2017-04-11 2018-10-18 Agerpoint, Inc. Forestry management tool for assessing risk of catastrophic tree failure due to weather events
US10231441B2 (en) 2015-09-24 2019-03-19 Digi-Star, Llc Agricultural drone for use in livestock feeding
WO2019076758A1 (fr) * 2017-10-17 2019-04-25 Basf Se Unmanned aerial vehicle
US10321663B2 (en) 2015-09-24 2019-06-18 Digi-Star, Llc Agricultural drone for use in livestock monitoring
CN110210408A (zh) * 2019-06-04 2019-09-06 黑龙江省七星农场 Crop growth prediction system and method based on combined satellite and UAV remote sensing
JP2019528216A (ja) * 2016-08-18 2019-10-10 Tevel Advanced Technologies Ltd. System and method for drone fleet management for harvesting and dilution (thinning)
CN110418572A (zh) 2017-03-12 2019-11-05 株式会社尼罗沃克 Drone for crop imaging
US10719075B2 (en) 2017-09-25 2020-07-21 International Business Machines Corporation System and method for controlling multiple vehicles based on directive
US10732647B2 (en) 2013-11-27 2020-08-04 The Trustees Of The University Of Pennsylvania Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
WO2020192385A1 (fr) * 2019-03-26 2020-10-01 深圳市大疆创新科技有限公司 Determination device, camera system, and movable object
EP3571629A4 (fr) * 2017-01-23 2020-10-28 The Board of Trustees of the University of Illinois Adaptive cyber-physical system for efficient monitoring of unstructured environments
US10884430B2 (en) 2015-09-11 2021-01-05 The Trustees Of The University Of Pennsylvania Systems and methods for generating safe trajectories for multi-vehicle teams
US10891482B2 (en) 2018-07-10 2021-01-12 Adroit Robotics Systems, devices, and methods for in-field diagnosis of growth stage and crop yield estimation in a plant area
CN112507908A (zh) * 2020-12-15 2021-03-16 国网陕西省电力公司电力科学研究院 Collaborative remote sensing aerial photography system and method
WO2021089813A3 (fr) * 2019-11-08 2021-06-17 Kverneland Group Operations Norway As System for measuring and interpreting a force
EP3878741A1 (fr) * 2020-03-12 2021-09-15 Bayer AG Unmanned aerial vehicle
EP3905109A1 (fr) * 2020-04-30 2021-11-03 Kverneland Group Operations Norway AS System for controlling agricultural operations by optical means
SE2051335A1 (en) * 2020-05-27 2021-11-28 Airforestry Ab Method and system for remote or autonomous ligno transportation
JP2022024771A (ja) * 2020-07-28 2022-02-09 ヤマハ発動機株式会社 Forest measurement system, computer program, and method for generating a trunk diameter estimation model
US11553640B2 (en) 2019-06-11 2023-01-17 Cnh Industrial Canada, Ltd. Agricultural wear monitoring system
CN115880354A (zh) * 2023-03-02 2023-03-31 成都工业学院 Method for calculating tree crown volume based on adaptive point cloud slicing
US12025602B2 (en) 2020-01-08 2024-07-02 AgroScout Ltd. Autonomous crop monitoring system and method

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10165722B2 (en) * 2014-12-05 2019-01-01 Deere & Company Scouting systems
US10762982B1 (en) * 2015-10-07 2020-09-01 Trace Genomics, Inc. System and method for nucleotide analysis
US9904867B2 (en) * 2016-01-29 2018-02-27 Pointivo, Inc. Systems and methods for extracting information about objects from scene information
CN106407408B (zh) * 2016-09-22 2019-08-16 北京数字绿土科技有限公司 Spatial index construction method and device for massive point cloud data
US11609159B2 (en) * 2017-05-08 2023-03-21 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for agricultural sample collection
US10943173B2 (en) * 2017-05-12 2021-03-09 Harris Lee Cohen Computer-implemented methods, computer readable medium and systems for generating a satellite data model for a precision agriculture platform
US11073843B2 (en) * 2017-07-06 2021-07-27 Kubota Corporation Agricultural field management system, agricultural field management method, and management machine
US10262224B1 (en) * 2017-07-19 2019-04-16 The United States Of America As Represented By Secretary Of The Navy Optical flow estimation using a neural network and egomotion optimization
US11080526B2 (en) * 2017-08-15 2021-08-03 Regents Of The University Of Minnesota Satellite image classification across multiple resolutions and time using ordering constraint among instances
US10806074B2 (en) * 2017-11-13 2020-10-20 Cnh Industrial America Llc System for treatment of an agricultural field using an augmented reality visualization
US11590522B2 (en) 2018-02-13 2023-02-28 SmartApply, Inc. Spraying systems, kits, vehicles, and methods of use
US10769466B2 (en) * 2018-02-20 2020-09-08 International Business Machines Corporation Precision aware drone-based object mapping based on spatial pattern recognition
US11275941B2 (en) 2018-03-08 2022-03-15 Regents Of The University Of Minnesota Crop models and biometrics
JP6721218B2 (ja) * 2018-05-16 2020-07-08 株式会社クピド・ファーム Grape berry counting method
WO2020000043A1 (fr) * 2018-06-28 2020-01-02 University Of Southern Queensland Surveillance de caractéristique de croissance de plante
US11373399B2 (en) * 2018-07-26 2022-06-28 Hall Enterprise Llc Method of crop analysis using drone with flying and driving capability
EA202190644A1 (ru) * 2018-09-05 2021-07-08 Рубикон Ресёрч Пти Лтд Способ и система для определения стресса растений и полива на этой основе
US11108849B2 (en) 2018-12-03 2021-08-31 At&T Intellectual Property I, L.P. Global internet of things (IOT) quality of service (QOS) realization through collaborative edge gateways
US20200217830A1 (en) * 2019-01-08 2020-07-09 AgroScout Ltd. Autonomous crop monitoring system and method
US10659144B1 (en) 2019-01-31 2020-05-19 At&T Intellectual Property I, L.P. Management of massively distributed internet of things (IOT) gateways based on software-defined networking (SDN) via fly-by master drones
US11001380B2 (en) * 2019-02-11 2021-05-11 Cnh Industrial Canada, Ltd. Methods for acquiring field condition data
US11059582B2 (en) 2019-02-11 2021-07-13 Cnh Industrial Canada, Ltd. Systems for acquiring field condition data
US11440659B2 (en) * 2019-09-12 2022-09-13 National Formosa University Precision agriculture implementation method by UAV systems and artificial intelligence image processing technologies
JP6880380B2 (ja) * 2019-11-01 2021-06-02 Sz Dji Technology Co., Ltd. Image processing device, image processing method, and program
US20210207838A1 (en) * 2020-01-03 2021-07-08 AlgoLook, Inc. Air particulate classification
IT202000006982A1 (it) * 2020-04-02 2021-10-02 Metacortex Srl System for calculating the production and/or yield of fruit trees, related calculation method, and device designed to implement the method
US20210323015A1 (en) * 2020-04-17 2021-10-21 Cnh Industrial Canada, Ltd. System and method to monitor nozzle spray quality
US20220107926A1 (en) * 2020-10-06 2022-04-07 The Climate Corporation Scalable geospatial platform for an integrated data synthesis and artificial intelligence based exploration
US20220366605A1 (en) * 2021-05-13 2022-11-17 Seetree Systems Ltd. Accurate geolocation in remote-sensing imaging
CN117980959A (zh) * 2021-09-27 2024-05-03 索尼半导体解决方案公司 Information processing device and information processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6422508B1 (en) * 2000-04-05 2002-07-23 Galileo Group, Inc. System for robotic control of imaging data having a steerable gimbal mounted spectral sensor and methods
US20040264761A1 (en) * 2003-04-30 2004-12-30 Deere & Company System and method for detecting crop rows in an agricultural field
US20090256909A1 (en) * 2008-04-11 2009-10-15 Nixon Stuart Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features
US20090290811A1 (en) * 2008-05-23 2009-11-26 Samsung Electronics Co., Ltd. System and method for generating a multi-dimensional image
US20120101861A1 (en) * 2010-10-25 2012-04-26 Lindores Robert J Wide-area agricultural monitoring and prediction
US20140152839A1 (en) * 2012-11-30 2014-06-05 University Of Utah Research Foundation Multi-spectral imaging with diffractive optics

Family Cites Families (69)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2697796B1 (fr) 1992-11-10 1994-12-09 Sextant Avionique Collision avoidance device for aircraft, in particular with the ground
US6278945B1 (en) 1997-11-24 2001-08-21 American Gnc Corporation Fully-coupled positioning process and system thereof
US6308911B1 (en) 1998-10-30 2001-10-30 Lockheed Martin Corp. Method and apparatus for rapidly turning a vehicle in a fluid medium
US6876945B2 (en) 2002-03-25 2005-04-05 Nicholas Jon Emord Seamless sensory system
CN1273270C (zh) * 2002-08-09 2006-09-06 日立工机株式会社 Gas-powered nail gun
US7343232B2 (en) 2003-06-20 2008-03-11 Geneva Aerospace Vehicle control system including related methods and components
US7289906B2 (en) 2004-04-05 2007-10-30 Oregon Health & Science University Navigation system applications of sigma-point Kalman filters for nonlinear estimation and sensor fusion
US7818127B1 (en) 2004-06-18 2010-10-19 Geneva Aerospace, Inc. Collision avoidance for vehicle control systems
US7228227B2 (en) 2004-07-07 2007-06-05 The Boeing Company Bezier curve flightpath guidance using moving waypoints
US7249730B1 (en) 2004-09-23 2007-07-31 United States Of America As Represented By The Secretary Of The Army System and method for in-flight trajectory path synthesis using the time sampled output of onboard sensors
US8019544B2 (en) 2005-01-03 2011-09-13 The Boeing Company Real-time refinement method of spacecraft star tracker alignment estimates
WO2006113391A2 (fr) 2005-04-19 2006-10-26 Jaymart Sensors, Llc Miniaturized inertial measurement unit and associated methods
US20070235592A1 (en) 2005-12-05 2007-10-11 Horn Phillippe L Minimum time or thrust separation trajectory for spacecraft emergency separation
US8050863B2 (en) 2006-03-16 2011-11-01 Gray & Company, Inc. Navigation and control system for autonomous vehicles
US20080014492A1 (en) * 2006-07-14 2008-01-17 Jens Ulrick Nielsen Compression assembly, solid oxide fuel cell stack, a process for compression of the solid oxide fuel cell stack and its use
CN101109640A (zh) 2006-07-19 2008-01-23 北京航空航天大学 Vision-based autonomous landing navigation system for unmanned aerial vehicles
US7643893B2 (en) 2006-07-24 2010-01-05 The Boeing Company Closed-loop feedback control using motion capture systems
US7925049B2 (en) * 2006-08-15 2011-04-12 Sri International Stereo-based visual odometry method and system
US20080195316A1 (en) 2007-02-12 2008-08-14 Honeywell International Inc. System and method for motion estimation using vision sensors
CN101676744B (zh) 2007-10-31 2012-07-11 北京航空航天大学 High-precision tracking method for dim small targets against complex backgrounds with low signal-to-noise ratio
US20110082566A1 (en) 2008-09-04 2011-04-07 Herr Hugh M Implementing a stand-up sequence using a lower-extremity prosthesis or orthosis
US8521339B2 (en) 2008-09-09 2013-08-27 Aeryon Labs Inc. Method and system for directing unmanned vehicles
US20100114408A1 (en) 2008-10-31 2010-05-06 Honeywell International Inc. Micro aerial vehicle quality of service manager
US20120010186A1 (en) * 2009-03-23 2012-01-12 Merck Frosst Canada Ltd. Heterocyclic compounds as inhibitors of stearoyl-coenzyme a delta-9 desaturase
US8380362B2 (en) 2009-07-10 2013-02-19 The Boeing Company Systems and methods for remotely collaborative vehicles
EP2280241A3 (fr) 2009-07-30 2017-08-23 QinetiQ Limited Vehicle control
CN101655561A (zh) 2009-09-14 2010-02-24 南京莱斯信息技术股份有限公司 Method for fusing multilateration data and radar data based on federated Kalman filtering
US8577539B1 (en) 2010-01-27 2013-11-05 The United States Of America As Represented By The Secretary Of The Air Force Coded aperture aided navigation and geolocation systems
DE112011100528T5 (de) 2010-02-14 2012-12-06 Trimble Navigation Limited GNSS signal processing with regional augmentation message
US9568321B2 (en) 2010-04-19 2017-02-14 Honeywell International Inc. Systems and methods for determining inertial navigation system faults
FR2959812B1 (fr) 2010-05-05 2012-11-16 Thales Sa Method for developing a navigation phase in a navigation system involving terrain correlation
US9031809B1 (en) 2010-07-14 2015-05-12 Sri International Method and apparatus for generating three-dimensional pose using multi-modal sensor fusion
US8676498B2 (en) 2010-09-24 2014-03-18 Honeywell International Inc. Camera and inertial measurement unit integration with navigation data feedback for feature tracking
US8756001B2 (en) 2011-02-28 2014-06-17 Trusted Positioning Inc. Method and apparatus for improved navigation of a moving platform
WO2013105926A1 (fr) 2011-03-22 2013-07-18 Aerovironment Inc. Invertible aircraft
US8868323B2 (en) 2011-03-22 2014-10-21 Honeywell International Inc. Collaborative navigation using conditional updates
US20140032167A1 (en) 2011-04-01 2014-01-30 Physical Sciences, Inc. Multisensor Management and Data Fusion via Parallelized Multivariate Filters
US9035774B2 (en) 2011-04-11 2015-05-19 Lone Star Ip Holdings, Lp Interrogator and system employing the same
US10027952B2 (en) 2011-08-04 2018-07-17 Trx Systems, Inc. Mapping and tracking system with features in three-dimensional space
CA2848217C (fr) 2011-09-14 2018-09-18 Trusted Positioning Inc. Procede et appareil pour la navigation comprenant des modeles non lineaires
FR2985581B1 (fr) 2012-01-05 2014-11-28 Parrot Method for piloting a rotary-wing drone to capture images with an onboard camera while minimizing disturbing movements
US9104201B1 (en) 2012-02-13 2015-08-11 C&P Technologies, Inc. Method and apparatus for dynamic swarming of airborne drones for a reconfigurable array
US8874360B2 (en) 2012-03-09 2014-10-28 Proxy Technologies Inc. Autonomous vehicle and method for coordinating the paths of multiple autonomous vehicles
EP2845071B1 (fr) 2012-04-30 2020-03-18 The Trustees Of The University Of Pennsylvania Three-dimensional manipulation of teams of quadrotors
WO2013181558A1 (fr) 2012-06-01 2013-12-05 Agerpoint, Inc. Systèmes et procédés de surveillance de produits agricoles
US20140008496A1 (en) 2012-07-05 2014-01-09 Zhou Ye Using handheld device to control flying object
KR20140019548A (ko) * 2012-08-06 2014-02-17 주식회사 케이티 Method for providing communication services between heterogeneous networks
US9011250B2 (en) 2012-10-05 2015-04-21 Qfo Labs, Inc. Wireless communication system for game play with multiple remote-control flying craft
JP6055274B2 (ja) * 2012-10-31 2016-12-27 株式会社トプコン Aerial photogrammetry method and aerial photogrammetry system
FR3000813B1 (fr) 2013-01-04 2016-04-15 Parrot Rotary-wing drone comprising means for the autonomous determination of its position in an absolute frame of reference fixed to the ground
US9539723B2 (en) 2013-03-13 2017-01-10 Double Robotics, Inc. Accessory robot for mobile device
US9536427B2 (en) 2013-03-15 2017-01-03 Carnegie Mellon University Methods and software for managing vehicle priority in a self-organizing traffic control system
US20140312165A1 (en) * 2013-03-15 2014-10-23 Armen Mkrtchyan Methods, apparatus and systems for aerial assessment of ground surfaces
US20140263822A1 (en) * 2013-03-18 2014-09-18 Chester Charles Malveaux Vertical take off and landing autonomous/semiautonomous/remote controlled aerial agricultural sensor platform
US10063782B2 (en) * 2013-06-18 2018-08-28 Motorola Solutions, Inc. Method and apparatus for displaying an image from a camera
US20150321758A1 (en) 2013-08-31 2015-11-12 II Peter Christopher Sarna UAV deployment and control system
EP3047238A4 (fr) 2013-09-17 2017-05-24 InvenSense, Inc. Procédé et système pour navigation améliorée comprenant de multiples ensembles capteurs
DE102014211166A1 (de) 2013-11-20 2015-05-21 Continental Teves Ag & Co. Ohg Method, fusion filter, and system for fusing sensor signals with different temporal signal output delays into a fused data set
BR112016011577B1 (pt) * 2013-11-20 2021-01-12 Rowbot Systems Llc Autonomous vehicle platform, autonomous vehicle platform system, agricultural robot, and method for the autonomous navigation of an agricultural robot
CN106030430A (zh) 2013-11-27 2016-10-12 宾夕法尼亚大学理事会 Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
UA123573C2 (uk) * 2014-08-22 2021-04-28 The Climate Corporation Methods for agronomic and agricultural monitoring using unmanned aerial systems
US9129355B1 (en) * 2014-10-09 2015-09-08 State Farm Mutual Automobile Insurance Company Method and system for assessing damage to infrastructure
FR3028186A1 (fr) 2014-11-12 2016-05-13 Parrot Long-range drone remote-control equipment
WO2016076586A1 (fr) 2014-11-14 2016-05-19 Lg Electronics Inc. Terminal mobile et son procédé de commande
US20160214715A1 (en) 2014-11-21 2016-07-28 Greg Meffert Systems, Methods and Devices for Collecting Data at Remote Oil and Natural Gas Sites
US20160214713A1 (en) 2014-12-19 2016-07-28 Brandon Cragg Unmanned aerial vehicle with lights, audio and video
US9915956B2 (en) 2015-01-09 2018-03-13 Workhorse Group Inc. Package delivery by means of an automated multi-copter UAS/UAV dispatched from a conventional delivery vehicle
US10037028B2 (en) 2015-07-24 2018-07-31 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for on-board sensing and control of micro aerial vehicles
US10884430B2 (en) 2015-09-11 2021-01-05 The Trustees Of The University Of Pennsylvania Systems and methods for generating safe trajectories for multi-vehicle teams

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6422508B1 (en) * 2000-04-05 2002-07-23 Galileo Group, Inc. System for robotic control of imaging data having a steerable gimbal mounted spectral sensor and methods
US20040264761A1 (en) * 2003-04-30 2004-12-30 Deere & Company System and method for detecting crop rows in an agricultural field
US20090256909A1 (en) * 2008-04-11 2009-10-15 Nixon Stuart Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features
US20090290811A1 (en) * 2008-05-23 2009-11-26 Samsung Electronics Co., Ltd. System and method for generating a multi-dimensional image
US20120101861A1 (en) * 2010-10-25 2012-04-26 Lindores Robert J Wide-area agricultural monitoring and prediction
US20140152839A1 (en) * 2012-11-30 2014-06-05 University Of Utah Research Foundation Multi-spectral imaging with diffractive optics

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9599993B2 (en) 2012-04-30 2017-03-21 The Trustees Of The University Of Pennsylvania Three-dimensional manipulation of teams of quadrotors
US10732647B2 (en) 2013-11-27 2020-08-04 The Trustees Of The University Of Pennsylvania Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
US9745060B2 (en) 2015-07-17 2017-08-29 Topcon Positioning Systems, Inc. Agricultural crop analysis drone
US10189568B2 (en) 2015-07-17 2019-01-29 Topcon Positioning Systems, Inc. Agricultural crop analysis drone
US10037028B2 (en) 2015-07-24 2018-07-31 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for on-board sensing and control of micro aerial vehicles
US10884430B2 (en) 2015-09-11 2021-01-05 The Trustees Of The University Of Pennsylvania Systems and methods for generating safe trajectories for multi-vehicle teams
US10231441B2 (en) 2015-09-24 2019-03-19 Digi-Star, Llc Agricultural drone for use in livestock feeding
US10321663B2 (en) 2015-09-24 2019-06-18 Digi-Star, Llc Agricultural drone for use in livestock monitoring
US11627724B2 (en) 2015-09-24 2023-04-18 Digi-Star, Llc Agricultural drone for use in livestock feeding
WO2017083128A1 (fr) * 2015-11-10 2017-05-18 Digi-Star, Llc Agricultural drone for use in controlling the direction of tillage and applying material to a field
AU2017311697B2 (en) * 2016-08-18 2023-04-13 Tevel Aerobotics Technologies Ltd System and method for plantation agriculture tasks management and data collection
AU2017311696B2 (en) * 2016-08-18 2023-04-20 Tevel Aerobotics Technologies Ltd System and method for drone fleet management for harvesting and dilution
US11194348B2 (en) 2016-08-18 2021-12-07 Tevel Advanced Technologies Ltd. System and method for drone fleet management for harvesting and dilution
US11709493B2 (en) 2016-08-18 2023-07-25 Tevel Aerobotics Technologies Ltd. System and method for plantation agriculture tasks management and data collection
JP2019528216A (ja) * 2016-08-18 2019-10-10 Tevel Advanced Technologies Ltd. System and method for drone fleet management for harvesting and dilution (thinning)
EP3500087A4 (fr) * 2016-08-18 2020-04-08 Tevel Advanced Technologies Ltd System and method for plantation agriculture tasks management and data collection
US11846946B2 2016-08-18 2023-12-19 Tevel Advanced Technologies Ltd. System and method for mapping and building database for harvesting-dilution tasks using aerial drones
EP3500086A4 (fr) * 2016-08-18 2020-05-27 Tevel Advanced Technologies Ltd System and method for drone fleet management for harvesting and dilution
EP3500877A4 (fr) * 2016-08-18 2020-06-03 Tevel Advanced Technologies Ltd System and method for mapping and building a database for harvesting-dilution tasks using aerial drones
JP7026114B2 (ja) 2016-08-18 2022-02-25 Tevel Aerobotics Technologies Ltd. System and method for drone fleet management for harvesting and dilution (thinning)
CN106326865A (zh) * 2016-08-25 2017-01-11 广州地理研究所 Real-time monitoring system, method, and device for water conservancy project areas based on an unmanned aerial vehicle
CN113029117A (zh) * 2016-10-07 2021-06-25 莱卡地球系统公开股份有限公司 Flight sensor
EP3306344A1 (fr) * 2016-10-07 2018-04-11 Leica Geosystems AG Flight sensor
EP3306346A1 (fr) * 2016-10-07 2018-04-11 Leica Geosystems AG Flight sensor
CN113029117B (zh) * 2016-10-07 2023-06-02 莱卡地球系统公开股份有限公司 Flight sensor
US10640209B2 (en) 2016-10-07 2020-05-05 Leica Geosystems Ag Flying sensor
US11703855B2 (en) 2017-01-23 2023-07-18 The Board Of Trustees Of The University Of Illinois Adaptive cyber-physical system for efficient monitoring of unstructured environments
EP3571629A4 (fr) * 2017-01-23 2020-10-28 The Board of Trustees of the University of Illinois Adaptive cyber-physical system for efficient monitoring of unstructured environments
US11199838B2 (en) 2017-01-23 2021-12-14 The Board Of Trustees Of The University Of Illinois Adaptive cyber-physical system for efficient monitoring of unstructured environments
WO2018138648A1 (fr) 2017-01-24 2018-08-02 Gamaya Sa Method and apparatus for recording, processing, visualizing, and applying agronomic data
CN110418572A (zh) * 2017-03-12 2019-11-05 株式会社尼罗沃克 Drone for crop imaging
US11215597B2 (en) 2017-04-11 2022-01-04 Agerpoint, Inc. Forestry management tool for assessing risk of catastrophic tree failure due to weather events
WO2018191442A1 (fr) * 2017-04-11 2018-10-18 Agerpoint, Inc. Forestry management tool for assessing risk of catastrophic tree failure due to weather events
US10719075B2 (en) 2017-09-25 2020-07-21 International Business Machines Corporation System and method for controlling multiple vehicles based on directive
US11137775B2 (en) 2017-10-17 2021-10-05 Basf Se Unmanned aerial vehicle
WO2019076758A1 (fr) * 2017-10-17 2019-04-25 Basf Se Unmanned aerial vehicle
CN111225854A (zh) * 2017-10-17 2020-06-02 巴斯夫欧洲公司 Unmanned aerial vehicle
US10891482B2 (en) 2018-07-10 2021-01-12 Adroit Robotics Systems, devices, and methods for in-field diagnosis of growth stage and crop yield estimation in a plant area
US11580731B2 (en) 2018-07-10 2023-02-14 Adroit Robotics Systems, devices, and methods for in-field diagnosis of growth stage and crop yield estimation in a plant area
WO2020192385A1 (fr) * 2019-03-26 2020-10-01 深圳市大疆创新科技有限公司 Determination device, camera system, and movable object
CN110210408A (zh) * 2019-06-04 2019-09-06 黑龙江省七星农场 Crop growth prediction system and method based on combined satellite and UAV remote sensing
CN110210408B (zh) * 2019-06-04 2020-06-02 黑龙江省七星农场 Crop growth prediction system and method based on combined satellite and UAV remote sensing
US11553640B2 (en) 2019-06-11 2023-01-17 Cnh Industrial Canada, Ltd. Agricultural wear monitoring system
WO2021089813A3 (fr) * 2019-11-08 2021-06-17 Kverneland Group Operations Norway As System for measuring and interpreting a force
US12025602B2 (en) 2020-01-08 2024-07-02 AgroScout Ltd. Autonomous crop monitoring system and method
WO2021180475A1 (fr) * 2020-03-12 2021-09-16 Bayer Aktiengesellschaft Unmanned aerial vehicle
EP3878741A1 (fr) * 2020-03-12 2021-09-15 Bayer AG Unmanned aerial vehicle
EP3905109A1 (fr) * 2020-04-30 2021-11-03 Kverneland Group Operations Norway AS System for controlling agricultural operations by optical means
SE544874C2 (en) * 2020-05-27 2022-12-20 Airforestry Ab Method and system for remote or autonomous ligno harvesting and/or transportation
SE544809C2 (en) * 2020-05-27 2022-11-22 Airforestry Ab Method and system for remote or autonomous ligno transportation
SE2051336A1 (en) * 2020-05-27 2021-11-28 Airforestry Ab Method and system for remote or autonomous ligno harvesting and/or transportation
SE2051335A1 (en) * 2020-05-27 2021-11-28 Airforestry Ab Method and system for remote or autonomous ligno transportation
JP2022024771A (ja) * 2020-07-28 2022-02-09 ヤマハ発動機株式会社 Forest measurement system, computer program, and method for generating a trunk diameter estimation model
CN112507908A (zh) * 2020-12-15 2021-03-16 国网陕西省电力公司电力科学研究院 Collaborative remote sensing aerial photography system and method
CN112507908B (zh) * 2020-12-15 2024-05-31 国网陕西省电力公司电力科学研究院 Collaborative remote sensing aerial photography system and method
CN115880354A (zh) * 2023-03-02 2023-03-31 成都工业学院 Method for calculating tree crown volume based on adaptive point cloud slicing

Also Published As

Publication number Publication date
US10395115B2 (en) 2019-08-27
US20170372137A1 (en) 2017-12-28

Similar Documents

Publication Publication Date Title
US10395115B2 (en) Systems, devices, and methods for robotic remote sensing for precision agriculture
Das et al. Devices, systems, and methods for automated monitoring enabling precision agriculture
Mukherjee et al. A survey of unmanned aerial sensing solutions in precision agriculture
Kulbacki et al. Survey of drones for agriculture automation from planting to harvest
Reza et al. Rice yield estimation based on K-means clustering with graph-cut segmentation using low-altitude UAV images
Su et al. AI meets UAVs: A survey on AI empowered UAV perception systems for precision agriculture
Anthony et al. On crop height estimation with UAVs
Bouguettaya et al. A survey on deep learning-based identification of plant and crop diseases from UAV-based aerial images
Bender et al. A high‐resolution, multimodal data set for agricultural robotics: A Ladybird's‐eye view of Brassica
de Oca et al. The AgriQ: A low-cost unmanned aerial system for precision agriculture
Baofeng et al. Digital surface model applied to unmanned aerial vehicle based photogrammetry to assess potential biotic or abiotic effects on grapevine canopies
Andritoiu et al. Agriculture autonomous monitoring and decisional mechatronic system
Zhang et al. Opportunities of UAVs in orchard management
Bhandari et al. Towards collaboration between unmanned aerial and ground vehicles for precision agriculture
Ehsani et al. Affordable multirotor Remote sensing platform for applications in precision horticulture
Tahir et al. Application of unmanned aerial vehicles in precision agriculture
Vulpi et al. An RGB-D multi-view perspective for autonomous agricultural robots
Maslekar et al. Application of unmanned aerial vehicles (UAVs) for pest surveillance, monitoring and management
Giustarini et al. PANTHEON: SCADA for precision agriculture
Jimenez Soler et al. Validation and calibration of a high resolution sensor in unmanned aerial vehicles for producing images in the IR range utilizable in precision agriculture
Dhami et al. Crop height and plot estimation from unmanned aerial vehicles using 3D LiDAR
Mathivanan et al. Utilizing satellite and UAV data for crop yield prediction and monitoring through deep learning
Deepanshu Srivastava et al. UAVs in Agriculture
Kakarla et al. Types of Unmanned Aerial Vehicles (UAVs), Sensing Technologies, and Software for Agricultural Applications: AE565/AE565, 10/2021
Izere Plant Height Estimation Using RTK-GNSS Enabled Unmanned Aerial Vehicle (UAV) Photogrammetry

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16744022

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15545266

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16744022

Country of ref document: EP

Kind code of ref document: A1