EP3937628A1 - Dispositif de contrôle de traitement agricole - Google Patents
- Publication number
- EP3937628A1 (application EP20708129.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- weeds
- diseases
- deficiencies
- leaf symptoms
- detection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000011282 treatment Methods 0.000 title claims abstract description 143
- 241000196324 Embryophyta Species 0.000 claims abstract description 412
- 201000010099 disease Diseases 0.000 claims abstract description 324
- 208000037265 diseases, disorders, signs and symptoms Diseases 0.000 claims abstract description 324
- 230000007812 deficiency Effects 0.000 claims abstract description 316
- 208000024891 symptom Diseases 0.000 claims abstract description 313
- 238000001514 detection method Methods 0.000 claims description 307
- 238000013528 artificial neural network Methods 0.000 claims description 49
- 238000000034 method Methods 0.000 claims description 47
- 238000012545 processing Methods 0.000 claims description 38
- 238000004891 communication Methods 0.000 claims description 35
- 230000004927 fusion Effects 0.000 claims description 19
- 238000004364 calculation method Methods 0.000 claims description 16
- 238000009333 weeding Methods 0.000 claims description 15
- 238000013507 mapping Methods 0.000 claims description 13
- 238000012512 characterization method Methods 0.000 claims description 9
- 230000004807 localization Effects 0.000 claims description 9
- XLYOFNOQVPJJNP-UHFFFAOYSA-N water Substances O XLYOFNOQVPJJNP-UHFFFAOYSA-N 0.000 claims description 9
- 238000005507 spraying Methods 0.000 claims description 8
- 230000002123 temporal effect Effects 0.000 claims description 7
- 239000007921 spray Substances 0.000 claims description 6
- 230000006378 damage Effects 0.000 claims description 4
- 238000004590 computer program Methods 0.000 claims description 3
- 239000002689 soil Substances 0.000 claims description 3
- 241000894007 species Species 0.000 claims description 3
- 230000005540 biological transmission Effects 0.000 claims description 2
- 238000013527 convolutional neural network Methods 0.000 claims description 2
- 238000011084 recovery Methods 0.000 claims description 2
- 210000002569 neuron Anatomy 0.000 description 23
- 238000004422 calculation algorithm Methods 0.000 description 21
- 230000003287 optical effect Effects 0.000 description 20
- 230000003595 spectral effect Effects 0.000 description 19
- 230000008569 process Effects 0.000 description 15
- 230000006870 function Effects 0.000 description 14
- RAAUBRQLKXXMQK-UHFFFAOYSA-N 2-azaniumyl-5-chloro-4-oxopentanoate Chemical compound OC(=O)C(N)CC(=O)CCl RAAUBRQLKXXMQK-UHFFFAOYSA-N 0.000 description 11
- 239000000126 substance Substances 0.000 description 11
- 230000007480 spreading Effects 0.000 description 9
- 238000003892 spreading Methods 0.000 description 9
- 230000004913 activation Effects 0.000 description 6
- 238000005259 measurement Methods 0.000 description 6
- 241000607479 Yersinia pestis Species 0.000 description 4
- 230000001066 destructive effect Effects 0.000 description 4
- 239000000047 product Substances 0.000 description 4
- 238000012549 training Methods 0.000 description 4
- 238000010276 construction Methods 0.000 description 3
- 241001233957 eudicotyledons Species 0.000 description 3
- 230000001747 exhibiting effect Effects 0.000 description 3
- 239000004009 herbicide Substances 0.000 description 3
- 238000004519 manufacturing process Methods 0.000 description 3
- 210000004205 output neuron Anatomy 0.000 description 3
- 238000007781 pre-processing Methods 0.000 description 3
- 230000035882 stress Effects 0.000 description 3
- 235000001466 Ribes nigrum Nutrition 0.000 description 2
- 241001312569 Ribes nigrum Species 0.000 description 2
- 230000008901 benefit Effects 0.000 description 2
- 210000004027 cell Anatomy 0.000 description 2
- 230000001427 coherent effect Effects 0.000 description 2
- 239000003086 colorant Substances 0.000 description 2
- 150000001875 compounds Chemical class 0.000 description 2
- 238000007906 compression Methods 0.000 description 2
- 230000006835 compression Effects 0.000 description 2
- 210000002808 connective tissue Anatomy 0.000 description 2
- 230000001419 dependent effect Effects 0.000 description 2
- 238000001983 electron spin resonance imaging Methods 0.000 description 2
- 238000011156 evaluation Methods 0.000 description 2
- 244000037666 field crops Species 0.000 description 2
- 238000002347 injection Methods 0.000 description 2
- 239000007924 injection Substances 0.000 description 2
- JEIPFZHSYJVQDO-UHFFFAOYSA-N iron(III) oxide Inorganic materials O=[Fe]O[Fe]=O JEIPFZHSYJVQDO-UHFFFAOYSA-N 0.000 description 2
- 239000011159 matrix material Substances 0.000 description 2
- 239000000203 mixture Substances 0.000 description 2
- 238000011176 pooling Methods 0.000 description 2
- 238000003672 processing method Methods 0.000 description 2
- 230000009758 senescence Effects 0.000 description 2
- YZHUMGUJCQRKBT-UHFFFAOYSA-M sodium chlorate Chemical compound [Na+].[O-]Cl(=O)=O YZHUMGUJCQRKBT-UHFFFAOYSA-M 0.000 description 2
- 238000009331 sowing Methods 0.000 description 2
- 208000011580 syndromic disease Diseases 0.000 description 2
- 239000005562 Glyphosate Substances 0.000 description 1
- 231100000674 Phytotoxicity Toxicity 0.000 description 1
- 241000209140 Triticum Species 0.000 description 1
- 235000021307 Triticum Nutrition 0.000 description 1
- 230000009471 action Effects 0.000 description 1
- 238000012271 agricultural production Methods 0.000 description 1
- 238000004458 analytical method Methods 0.000 description 1
- 230000009172 bursting Effects 0.000 description 1
- 235000019577 caloric intake Nutrition 0.000 description 1
- 230000015556 catabolic process Effects 0.000 description 1
- 125000003636 chemical group Chemical group 0.000 description 1
- 238000000701 chemical imaging Methods 0.000 description 1
- 229930002875 chlorophyll Natural products 0.000 description 1
- 235000019804 chlorophyll Nutrition 0.000 description 1
- ATNHDLDRLWWWCB-AENOIHSZSA-M chlorophyll a Chemical compound C1([C@@H](C(=O)OC)C(=O)C2=C3C)=C2N2C3=CC(C(CC)=C3C)=[N+]4C3=CC3=C(C=C)C(C)=C5N3[Mg-2]42[N+]2=C1[C@@H](CCC(=O)OC\C=C(/C)CCC[C@H](C)CCC[C@H](C)CCCC(C)C)[C@H](C)C2=C5 ATNHDLDRLWWWCB-AENOIHSZSA-M 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 238000013170 computed tomography imaging Methods 0.000 description 1
- 239000012141 concentrate Substances 0.000 description 1
- 238000007596 consolidation process Methods 0.000 description 1
- 238000003967 crop rotation Methods 0.000 description 1
- 238000006731 degradation reaction Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 235000013399 edible fruits Nutrition 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000001704 evaporation Methods 0.000 description 1
- 230000008020 evaporation Effects 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 230000004720 fertilization Effects 0.000 description 1
- 238000009472 formulation Methods 0.000 description 1
- 230000035784 germination Effects 0.000 description 1
- XDDAORKBJWWYJS-UHFFFAOYSA-N glyphosate Chemical compound OC(=O)CNCP(O)(O)=O XDDAORKBJWWYJS-UHFFFAOYSA-N 0.000 description 1
- 229940097068 glyphosate Drugs 0.000 description 1
- 230000036449 good health Effects 0.000 description 1
- 230000036541 health Effects 0.000 description 1
- 230000002363 herbicidal effect Effects 0.000 description 1
- 238000010191 image analysis Methods 0.000 description 1
- 238000003709 image segmentation Methods 0.000 description 1
- 238000002513 implantation Methods 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 238000012423 maintenance Methods 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 239000000575 pesticide Substances 0.000 description 1
- 239000000049 pigment Substances 0.000 description 1
- 230000000069 prophylactic effect Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 239000013589 supplement Substances 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M7/00—Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
- A01M7/0089—Regulating or controlling systems
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B79/00—Methods for working soil
- A01B79/005—Precision agriculture
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M21/00—Apparatus for the destruction of unwanted vegetation, e.g. weeds
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M21/00—Apparatus for the destruction of unwanted vegetation, e.g. weeds
- A01M21/04—Apparatus for destruction by steam, chemicals, burning, or electricity
- A01M21/043—Apparatus for destruction by steam, chemicals, burning, or electricity by chemicals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B69/00—Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
- A01B69/001—Steering by means of optical assistance, e.g. television cameras
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M21/00—Apparatus for the destruction of unwanted vegetation, e.g. weeds
- A01M21/02—Apparatus for mechanical destruction
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M21/00—Apparatus for the destruction of unwanted vegetation, e.g. weeds
- A01M21/04—Apparatus for destruction by steam, chemicals, burning, or electricity
- A01M21/046—Apparatus for destruction by steam, chemicals, burning, or electricity by electricity
Definitions
- the present invention relates to an agricultural treatment control device intended to be mounted on an agricultural machine, integrating at least one controllable device for treating the plot and at least one detector for weeds or for leaf symptoms of deficiencies or diseases.
- Crop rotation was one of the first methods theorized at the start of the 20th century, as described in the document “Clyde E. Leighty, 1938 Yearbook of Agriculture”; it consists of alternating autumn and spring crops in order to break certain biological cycles of weeds;
- the curative measures are as follows: chemical weed control, as described in the document “Spraying in field crops: the keys to success”, Arvalis, avoids the emergence of weeds in the crop.
- the phytosanitary products dedicated to chemical weed control are suitable either for a pre-emergence treatment to prevent the germination of weeds present in the seed state, or for a post-emergence treatment, to destroy the weeds that have emerged in the crop.
- Chemical weed control is either selective, making it possible to treat a type of weed, or non-selective, making it possible to destroy all the plants present in the plot at the time of treatment.
- the repeated use of the same chemical group of weedkiller leads to the appearance of resistance in weeds, as well as to phytotoxicity that impacts the yield of the crop.
- Chemical weedkillers are applied to the plot by means of a sprayer;
- Mechanical weed control, as described in the document “The cultivator, a tool suitable for a wide range of soils”, Arvalis, applied in pre-emergence or post-emergence, destroys either weed seedlings or weeds at a more advanced stage. This weeding process improves soil structure and also disrupts the cycle of some pests.
- the tools used for mechanical weeding are tine harrows or rotary hoes for complete weeding or tine cultivators for inter-row or under-row treatment;
- Treatments whether chemical, mechanical or alternative, are carried out by a machine, usually attached to a motorized vehicle that moves through the crop.
- the document “Control system for agricultural spreading” describes a spreading control system comprising a set of spreading nozzles, means for mapping the plants to be treated using, in one embodiment, cameras, and means for controlling the spreading according to the mapping data produced.
- This control system requires a first pass of the system in the agricultural plot in order to produce a map of this agricultural plot used in a second pass for the application of the treatment.
- Document FR 3,063,206 includes several embodiments, but the main embodiment comprises a single processing unit, which can admittedly use images from several cameras. Although this document also mentions “several processing units”, this mention is succinct, and the only embodiment described in practice is that of a plurality of control subsystems each comprising a processing unit.
- Document CN 108 990 944 seems to describe a drone carrying a camera in the visible range and an infrared camera, the images of which are merged by a central processor.
- “Hyperspectral acquisition detection device” describes a hyperspectral acquisition device with direct detection capable of detecting the shape, texture and spectral reflectance signature of a weed, or of leaf symptoms of deficiencies or diseases, in a crop. This device is suitable for discerning weeds at early stages, including weeds of the same family. Likewise, this device is suitable for detecting leaf symptoms of deficiencies or diseases.
- the documents FR1905916 and WO2019EP85847 take up and supplement the two preceding documents. These latter four patent applications are incorporated herein by reference in their entirety for convenience.
- agricultural treatment equipment, especially spray booms, operates over a large width;
- a detection system must therefore be able to detect with great reliability the presence of certain families of weeds or leaf symptoms of deficiencies or diseases, over a large width.
- the technical problem of the invention consists in detecting the presence of weeds, or leaf symptoms of deficiencies or diseases in real time during the journey of an agricultural machine.
- the present invention proposes to respond to this technical problem by equipping an agricultural machine with a set of sensors for weeds or leaf symptoms of deficiencies or diseases; said sensors of weeds or leaf symptoms of deficiencies or diseases collaborating in the detection and control of the treatment to be applied as a function of the detections made by each of said sensors of weeds or of leaf symptoms of deficiencies or diseases.
- the invention relates to an agricultural treatment control device intended to be mounted on an agricultural machine, said agricultural machine comprising at least one controllable treatment device, the agricultural treatment control device comprising:
- the invention is characterized in that at least one system for detecting weeds or leaf symptoms of deficiencies or diseases collaborates with another system for detecting weeds or leaf symptoms of deficiencies or diseases whose detection zone partially overlaps with its own, in order to decide collaboratively on the treatment to be applied to the detection zone of said system for detecting weeds or leaf symptoms of deficiencies or diseases.
- the device comprises a communication system between said at least one detection system for weeds or leaf symptoms of deficiencies or diseases and at least one treatment device. This embodiment allows a selective chemical, thermal or mechanical treatment in an agricultural plot.
- an agricultural treatment control device is composed of at least one sensor detecting the presence and location of weeds or leaf symptoms of deficiencies or diseases in an agricultural plot, and of a collaborative process for the automated decision to apply a treatment; the treatment may be of different natures, in particular chemical, mechanical or electrical.
- said at least one system for detecting weeds or leaf symptoms of deficiencies or diseases is adapted to collaborate with another system for detecting weeds or leaf symptoms of deficiencies or diseases whose detection zone partially overlaps laterally with that of said system for detecting weeds or leaf symptoms of deficiencies or diseases.
- said at least one system for detecting weeds or leaf symptoms of deficiencies or diseases is adapted to collaborate with a system for detecting weeds or leaf symptoms of deficiencies or diseases whose detection zone overlaps in time with that of said system for detecting weeds or leaf symptoms of deficiencies or diseases.
- the location system comprises a geolocation system and/or an inertial unit.
- the device comprises at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- one, in particular each, system for detecting weeds or leaf symptoms of deficiencies or diseases is equipped with a localization system.
- one, in particular each, system for detecting weeds or leaf symptoms of deficiencies or diseases is adapted to collaborate with another, in particular the other, systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- one, in particular each, system for detecting weeds or leaf symptoms of deficiencies or diseases comprises a hyperspectral sensor.
- a system for detecting weeds or leaf symptoms of deficiency or disease is adapted to detect the presence of weeds or leaf symptoms of deficiencies or diseases from peculiarities specific to weeds or leaf symptoms of deficiencies or diseases.
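The detection from spectral "peculiarities" can be illustrated with a deliberately simplified matcher: the sketch below compares a pixel spectrum against reference signatures by cosine similarity, standing in for the hyperspectral neural-network detector actually described. All band values and class labels are invented for illustration.

```python
import math

# Illustrative reference spectral signatures (5 bands); purely assumed values,
# not taken from the patent.
REFERENCE_SIGNATURES = {
    "healthy_crop": [0.05, 0.08, 0.12, 0.45, 0.50],
    "weed":         [0.07, 0.15, 0.10, 0.30, 0.35],
    "chlorosis":    [0.10, 0.20, 0.25, 0.40, 0.42],
}

def cosine(a, b):
    """Cosine similarity between two spectra."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def classify_spectrum(spectrum):
    """Return (best-matching label, similarity score) for a pixel spectrum."""
    label = max(REFERENCE_SIGNATURES, key=lambda k: cosine(spectrum, REFERENCE_SIGNATURES[k]))
    return label, cosine(spectrum, REFERENCE_SIGNATURES[label])
```

The similarity score plays the role of the presence probability attached to each detection in the text.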
- a weed or leaf symptom of deficiency or disease detection system is adapted to detect an area containing a weed or a leaf symptom of deficiency or disease.
- a detection by a system for detecting weeds or leaf symptoms of deficiency or disease is completed with a probability of the presence of said peculiarities specific to weeds or leaf symptoms of deficiencies or diseases.
- the localization system is adapted to locate the treatment to be applied to the detection zone.
- the device comprises a communication system between said systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- a temporal overlap of said information of detections of weeds or of leaf symptoms of deficiencies or diseases is obtained.
- one, in particular each, detection system comprises a system for the direct detection of features in the hyperspectral scene integrating a deep and convolutional neural network designed to detect at least one desired feature in said hyperspectral scene for a weed or leaf symptom of deficiency or disease from at least one compressed image of the hyperspectral scene.
- one, in particular each, detection system comprises a system for detecting features in the hyperspectral scene comprising:
- a neural network configured to calculate a hyperspectral hypercube of the hyperspectral scene from at least one compressed image and one uncompressed image of the hyperspectral scene
- a characterization module to detect the weed or the leaf symptom of deficiency or disease from the hyperspectral hypercube.
- said agricultural treatment device comprises at least one spray nozzle, the flow rate or pressure of said at least one spray nozzle being controlled by the collaborative decision of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- This embodiment allows a chemical treatment of weed control of weeds or treatment of deficiencies or diseases in the plot by optimizing the quantity of phytosanitary product spread in the agricultural plot.
- said agricultural treatment device comprises at least one weed-destruction LASER, said at least one LASER being controlled by the collaborative decision of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- This embodiment allows a destructive treatment by LASER of the weeds of the plot, by optimizing the work rate by selecting only the weeds concerned by the treatment.
- said agricultural treatment device comprises at least one high-pressure water jet whose objective is the destruction of weeds, said at least one high-pressure water jet being controlled by the collaborative decision of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- This embodiment allows a destructive treatment of the weeds in the plot by high-pressure water jet, optimizing the work rate by selecting only the weeds concerned by the treatment.
- said agricultural treatment device comprises at least one mechanical weeding tool for hoeing, said at least one mechanical weeding tool for hoeing being controlled by the collaborative decision of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- This embodiment allows a destructive mechanical treatment of the weeds of the plot, by optimizing the work rate by selecting only the weeds concerned by the treatment.
- said agricultural treatment device comprises at least one electric weeding tool for destroying weeds, said at least one electric weeding tool being controlled by the collaborative decision of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- This embodiment allows a destructive electric weed control treatment of the weeds of the plot, by optimizing the work rate by selecting only the weeds concerned by the treatment.
- the agricultural treatment device is localized.
- all of said at least one system for detecting weeds or leaf symptoms of deficiencies or diseases is adapted to collaboratively construct a mapping of the agricultural plot traversed by said agricultural machine, said mapping being constructed by a geostatistical method with localized detection data representing the actual condition as measured by said at least one detection system for weeds or leaf symptoms of deficiencies or diseases.
- This embodiment allows the generation of a map of weed detections and symptoms of deficiencies or diseases in the treated agricultural plot for purposes of statistics and monitoring of agricultural plots.
- the device further comprises a control screen, and said map of the agricultural plot traversed is displayed on the control screen intended for the technician treating the agricultural plot.
- This embodiment allows the technician treating the agricultural plot to follow the application of the treatment in the agricultural plot in real time.
- a processor is adapted to produce statistics of spraying, prevalence, species, densities, or stages of weeds or leaf symptoms of deficiencies or diseases present in the agricultural plot, using the map of the agricultural plot covered. This embodiment makes it possible to monitor the treatments in the plot.
- the invention relates to a method for collaborative control of agricultural treatment implemented by a device intended to be mounted on an agricultural machine, said agricultural machine comprising at least one controllable treatment device, the agricultural treatment control method comprising:
- the method of collaborative control of the treatment device mounted on an agricultural machine on which a set of systems for detecting weeds or leaf symptoms of deficiencies or diseases is mounted comprises, for each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases, the steps of:
- said projection uses the information coming from said inertial unit of said system for detecting weeds or leaf symptoms of deficiencies or diseases in order to determine the angle of capture of the image data relative to the vector normal to the ground.
- the fusion is weighted according to the quality and the calculated distance of each detection.
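The weighted fusion step can be sketched as follows; the text only states that the fusion is weighted by quality and by the calculated distance of each detection, so the specific quality-over-distance weighting law below is an illustrative assumption.

```python
def fuse_detections(detections):
    """Fuse overlapping detection probabilities into one consolidated value.

    Each detection is a tuple (probability, quality, distance_m).
    The weight q / (1 + d) -- sensor quality attenuated by sensing
    distance -- is an assumed weighting scheme, not one from the patent.
    """
    weights = [q / (1.0 + d) for (_, q, d) in detections]
    total = sum(weights)
    return sum(w * p for w, (p, _, _) in zip(weights, detections)) / total
```

A nearby, high-quality detection thus dominates the consolidated probability over a distant one.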
- the invention is assembled on an agricultural machine comprising at least one controllable treatment device.
- the agricultural machine is such that said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases are fixed on the support of said at least one controllable treatment device and communicate with each other as well as with said at least one controllable treatment device in order, in operation, to issue the activation control command adapted to be received by each of said at least one controllable treatment device for triggering the treatment on the target plant.
- the information of roll, pitch and yaw is used; this roll, pitch and yaw information being calculated continuously and kept up to date by each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases by means of an attitude estimation algorithm using the raw information from said inertial unit on board each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- the attitude estimation algorithm making it possible to calculate the roll, pitch and yaw information, can be an extended Kalman filter, a Mahony or Madgwick algorithm.
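As a deliberately simplified stand-in for the Mahony, Madgwick or extended Kalman filters named above, the sketch below uses a complementary filter on a single axis, blending the integrated gyroscope rate with the gravity-derived angle; the blend factor is an assumption.

```python
import math

def complementary_filter(gyro_rate, accel, angle_prev, dt, alpha=0.98):
    """One roll-axis attitude update (radians).

    This is NOT the patent's algorithm: it is a simpler complementary
    filter illustrating the same principle as Mahony/Madgwick/EKF.
    gyro_rate: roll rate from the gyroscope (rad/s)
    accel:     (ay, az) accelerometer components
    """
    ay, az = accel
    angle_accel = math.atan2(ay, az)           # gravity-derived roll estimate
    angle_gyro = angle_prev + gyro_rate * dt   # integrated gyro estimate
    # Trust the gyro short-term, the accelerometer long-term.
    return alpha * angle_gyro + (1.0 - alpha) * angle_accel
```

With the machine stationary and level, the estimate decays geometrically toward the accelerometer's zero-roll reading.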
- said attitude information can be calculated from the raw information of the inertial units of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases. Said raw information from the inertial units being exchanged continuously by means of the communication system connecting said at least two detection systems, the attitude estimation algorithm executed on each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases can use all the raw information.
- the estimates of roll, pitch and yaw are consolidated by a set of measurements that are similar, coherent and covariant with one another.
- an extended Kalman filter can be used in each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases, taking the data from the inertial units of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- said attitude information can be calculated from the raw information of the inertial units, to which the geolocation data of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases are added.
- an extended Kalman filter can be used in each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases, taking the data from the inertial units as well as the geolocation data of all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- R = Rz · Ry · Rx
- - R is the matrix containing the rotations along the three axes of roll, pitch and yaw;
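The ground projection using R = Rz · Ry · Rx can be sketched as follows, assuming a flat ground plane z = 0 and a camera-frame viewing ray; the axis conventions and ray direction are illustrative assumptions.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """R = Rz(yaw) . Ry(pitch) . Rx(roll), as in the text."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    rx = [[1, 0, 0], [0, cr, -sr], [0, sr, cr]]
    ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]
    rz = [[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(matmul(rz, ry), rx)

def project_ray_to_ground(camera_pos, ray_cam, roll, pitch, yaw):
    """Intersect a camera-frame viewing ray with the ground plane z = 0."""
    R = rotation_matrix(roll, pitch, yaw)
    ray = [sum(R[i][j] * ray_cam[j] for j in range(3)) for i in range(3)]
    t = -camera_pos[2] / ray[2]  # assumes the rotated ray points downward
    return (camera_pos[0] + t * ray[0], camera_pos[1] + t * ray[1])
```

With zero attitude angles and a nadir-pointing ray, the projected point lies directly below the camera; a pitch tilt shifts it along the travel axis.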
- Said image data projected on the ground are used to detect the presence of weeds or leaf symptoms of deficiencies or diseases from the peculiarities specific to weeds or leaf symptoms of deficiencies or diseases in order to detect areas in said projected image data in which the target plants are present.
- Each of the detections of the presence of weeds or leaf symptoms of deficiencies or diseases is completed with a probability of the presence of said peculiarities specific to weeds or leaf symptoms of deficiencies or diseases. This probability information is necessary for geostatistical calculations making it possible to decide on the application of a treatment on the target plant.
- a hyperspectral sensor as described in document FR1873313, “Hyperspectral acquisition detection device”, in document FR1901202, “Device for hyperspectral detection by fusion of sensors”, or in document FR1905916, “Hyperspectral detection device”, can be used to detect the desired features of weeds or leaf symptoms of deficiencies or diseases.
- each of the projected image data is geolocated from geolocation information obtained by means of said geolocation system of said system for detecting weeds or leaf symptoms of deficiencies or diseases.
- Said geolocation information obtained corresponds to the position of said system for detecting weeds or leaf symptoms of deficiencies or diseases at the time of capturing said image datum.
- Said ground projection operation is applied to said geolocation information in order to obtain the coordinates projected on the ground of said projected image data.
- Each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases obtains continuously and by means of the communication system between the different systems for detecting weeds or leaf symptoms of deficiencies or diseases the geolocated detection information from all the other systems for detecting weeds or leaf symptoms of deficiencies or diseases. All of the information on said detections of weeds or leaf symptoms of deficiencies or diseases originating from all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases is stored in a geographical database local to each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
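The local geographical database can be sketched as a grid-indexed store in which geolocated detections from all systems accumulate, so that detections of the same plant reported by several systems land in the same cell. The cell size below is an illustrative assumption.

```python
# Coarse spatial index: detections within the same 0.25 m cell are treated as
# referring to the same location. The cell size is an assumed parameter.
CELL_SIZE_M = 0.25

def cell_of(x, y):
    """Grid cell containing the point (x, y), in plot-frame metres."""
    return (int(x // CELL_SIZE_M), int(y // CELL_SIZE_M))

class LocalGeoDatabase:
    """Per-system store of geolocated detections from all detection systems."""

    def __init__(self):
        self._cells = {}

    def add_detection(self, x, y, probability, source_id):
        self._cells.setdefault(cell_of(x, y), []).append((probability, source_id))

    def detections_at(self, x, y):
        return self._cells.get(cell_of(x, y), [])
```

Each detection system would hold one such database and feed it both its own detections and those received over the communication system.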
- Each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases calculates the real-time geostatistics of the presence of weeds or of leaf symptoms of deficiencies or diseases from all of said information of detection of weeds or leaf symptoms of deficiencies or geolocalised diseases and for which information on the probability of presence is provided.
- the computation of geostatistics uses a kriging algorithm, as described in the book "Lognormal-de Wijsian Geostatistics for Ore Evaluation", D.G. Krige, 1981, ISBN 978-0620030069; said kriging algorithm makes it possible to consolidate said information on the detection of weeds or leaf symptoms of deficiencies or diseases originating from all of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases, taking into account the respective probabilities of each of said detections.
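A minimal ordinary-kriging consolidation can be sketched as follows; the exponential variogram and its parameters are illustrative assumptions, not values from the text.

```python
import math

def variogram(h, sill=1.0, rng=2.0):
    """Exponential variogram model (assumed parameters)."""
    return sill * (1.0 - math.exp(-h / rng))

def solve(a, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, n):
            f = m[r][c] / m[c][c]
            for k in range(c, n + 1):
                m[r][k] -= f * m[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][k] * x[k] for k in range(r + 1, n))) / m[r][r]
    return x

def krige(samples, x0, y0):
    """Ordinary kriging estimate at (x0, y0).

    samples: list of (x, y, detection_probability). The unbiasedness
    constraint (weights sum to 1) is enforced via a Lagrange multiplier.
    """
    n = len(samples)
    d = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    a = [[variogram(d(samples[i], samples[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    a.append([1.0] * n + [0.0])
    b = [variogram(d(samples[i], (x0, y0))) for i in range(n)] + [1.0]
    lam = solve(a, b)[:n]
    return sum(l * s[2] for l, s in zip(lam, samples))
```

Kriging interpolates exactly at the sample points themselves, which is the property exploited when confirming a detection at an already-observed location.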
- when said information on the detection of weeds or leaf symptoms of deficiencies or diseases, consolidated by means of said geostatistical calculation, confirms the presence of the desired feature of the weed or leaf symptom of deficiency or disease, the geolocated detection information is added to the list of target plants to be treated.
- Each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases continuously calculates the instantaneous speed of movement by means of said geolocation information obtained by means of said geolocation system.
- the speed information is necessary in order to estimate the control timing of said at least one agricultural treatment device and to anticipate the treatment time according to said agricultural treatment device.
- each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases estimates at each moment, and for each of said target plants currently within reach of said at least one treatment device, which of said at least one treatment device is the most suitable for treating said target plant. For example, the spreading nozzle closest to the target plant is selected when said at least one treatment device is a spreading boom. Likewise, the treatment tool closest to the target plant can be selected. This determination uses the location data of the treatment device, expressed in the frame of reference of the plot in which the weeds or leaf symptoms of deficiencies or diseases are geolocated.
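This nozzle selection and the timing anticipation mentioned above can be sketched as follows. Function and parameter names are hypothetical; coordinates are assumed to be expressed in a local metric frame of the plot, with x along the direction of advance.

```python
def select_nozzle(nozzle_offsets_y, boom_x, target_xy, speed_mps,
                  actuation_latency_s=0.05):
    """Pick the boom nozzle closest to a target plant and compute when
    to open it, anticipating the actuation latency of the device.

    nozzle_offsets_y : transverse positions of the nozzles on the boom
    boom_x           : current boom position along the advance axis
    target_xy        : geolocated target plant, plot frame
    speed_mps        : instantaneous speed from the geolocation system
    """
    tx, ty = target_xy
    # nearest nozzle across the boom (transverse axis)
    idx = min(range(len(nozzle_offsets_y)),
              key=lambda i: abs(nozzle_offsets_y[i] - ty))
    travel = tx - boom_x                      # distance still to cover
    delay = travel / speed_mps - actuation_latency_s
    return idx, max(0.0, delay)

nozzle_idx, open_in = select_nozzle([0.0, 0.5, 1.0, 1.5], boom_x=0.0,
                                    target_xy=(2.0, 0.6), speed_mps=2.0)
# nozzle_idx == 1 (nozzle at y = 0.5), open_in == 0.95 s
```

The actuation latency value is an assumption; in practice it would come from the characteristics of the treatment device.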
- the piloting commands are transmitted to said at least one agricultural treatment device by means of the communication system between said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases and said at least one agricultural treatment device.
- the computerized methods described herein are implemented by one or more computer programs executed by a processor of a programmable machine.
- FIG. 2 a structural schematic representation of the elements of the device of FIG. 1;
- FIG. 3 a schematic front view of a device for capturing a hyperspectral image according to one embodiment of the invention
- FIG. 4 a structural schematic representation of the elements of the device of FIG. 3;
- FIG. 5 a schematic representation of the influence weights of the neural network of FIG. 4;
- FIG. 6 a schematic representation of the architecture of the neural network of FIG. 4.
- FIG. 7 a schematic front view of the elements of a capture and detection device in a hyperspectral scene according to one embodiment of the invention
- FIG. 8 a structural schematic representation of the elements of the device of FIG. 7;
- FIG. 9 an alternative structural schematic representation of the elements of the device of FIG. 7;
- FIG. 10 a schematic representation of the diffractions obtained by the acquisition device of FIG. 8;
- FIG. 11 a schematic representation of the architecture of the neural network in Fig. 8.
- FIG. 12 a schematic front view of the elements of a capture and detection device in a hyperspectral scene according to a second embodiment of the invention
- FIG. 13 a structural schematic representation of the elements of the device of FIG. 12;
- FIG. 14 a schematic representation of the architecture of the neural network in Fig. 13.
- FIG. 15 a structural schematic representation, view in projection, of the elements of the device of FIG. 1;
- FIG. 16 a graph showing a method of collaborative piloting of agricultural treatment devices.
- FIG. 17 is a schematic representation similar to Figure 15 for another embodiment.
- by “compressed” image, one refers to a two-dimensional image of a three-dimensional scene comprising spatial and spectral information of the three-dimensional scene.
- the spatial and spectral information of the three-dimensional scene is thus projected by means of an optical system onto a two-dimensional sensing surface.
- Such a “compressed” image can comprise one or more diffracted images of the three-dimensional scene, or parts thereof. In addition, it can also include part of a non-diffracted image of the scene.
- the term “compressed” is used because it allows a two-dimensional representation of three-dimensional spectral information.
- by “spectral”, it is understood that the image goes beyond a “standard” RGB image of the scene in terms of the number of frequencies detected.
- by “non-homogeneous”, we mean an image whose properties are not identical throughout the image.
- a “non-homogeneous” image can contain, at certain locations, pixels whose information essentially comprises spectral information in a certain respective wavelength band, as well as, at other locations, pixels whose information essentially comprises non-spectral information.
- Computer processing of such a “non-homogeneous” image is not possible, because the properties necessary for its processing are not identical depending on the locations in this image.
- by “feature” (peculiarity), we refer to a characteristic of the scene; this characteristic can be spatial or spectral, correspond to a shape, a color, a texture, a spectral signature, or a combination thereof, and may in particular be interpreted semantically.
- by “object”, one refers to the common meaning of this term.
- an object detection on an image corresponds to the localization and semantic interpretation of the presence of the object in the imaged scene.
- An object can be characterized by its shape, color, texture, spectral signature, or a combination of these characteristics.
- FIG. 1 illustrates a cooperative agricultural treatment control device intended to be mounted on an agricultural machine 1, said agricultural machine 1 comprising at least one controllable agricultural treatment device 3; said cooperative agricultural treatment control device comprising at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, each being mechanically adapted for attachment to the agricultural machine 1 and having its acquisition objective aimed in the direction of advance of said agricultural machine 1.
- the agricultural machine moves in the agricultural plot 5 in a direction of advance.
- the detection systems 2 can be arranged spaced apart from each other in a horizontal direction transverse to the direction of advance. They can for example be carried by a transverse beam of the agricultural machine.
- the agricultural treatment device 3 can be controlled to treat an area to be treated downstream of the area imaged by the detection system 2 of weeds or leaf symptoms of deficiencies or diseases along the movement of the agricultural machine.
- the plurality of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases is fixed on the agricultural machine so as to capture the visual information of the agricultural plot 5.
- Each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases has a detection field intersecting the detection field of at least one neighboring detection system 2 of weeds or leaf symptoms of deficiencies or diseases.
- a hyperspectral sensor such as described in document FR1873313, “Hyperspectral acquisition detection device”, or in document FR1901202, “Device for hyperspectral detection by fusion of sensors”, or in document FR1905916, “Hyperspectral detection device”, or in document WO2019EP85847, “Device for hyperspectral detection”, can be used for each of said at least two detection systems 2 of weeds or of leaf symptoms of deficiencies or diseases.
- the detection system 2 of weeds or leaf symptoms of deficiencies or diseases comprises a capture device 10 and a computerized characterization module 21.
- FIG. 3 illustrates a capture device 10 of a three-dimensional hyperspectral image 15 comprising three juxtaposed sensors 11-13.
- a first sensor 11 makes it possible to obtain a compressed image 14' of a focal plane P11' of an observed scene. As illustrated in FIG. 4, this first sensor 11 comprises a first converging lens 30 which focuses the focal plane P11' on an opening 31.
- a collimator 32 captures the rays passing through the opening 31 and transmits these rays to a diffraction grating 33.
- a second converging lens 34 focuses these rays coming from the diffraction grating 33 on a capture surface 35.
- the compressed image shows eight distinct diffractions R0-R7 obtained with two diffraction axes of the diffraction grating 33 arranged as far apart as possible from each other in a plane normal to the optical axis, that is to say substantially orthogonal to one another.
- three diffraction axes can be used on the diffraction grating 33 so as to obtain a diffracted image 14' with sixteen diffractions.
- the three diffraction axes can be evenly distributed, that is to say separated from each other by an angle of 60 °.
- the compressed image comprises 2^(R+1) diffractions if R equally distributed diffraction axes are used, i.e. separated by the same angle from each other.
- the capture surface 35 may correspond to a CCD sensor (for "charge-coupled device” in the English literature, that is to say a charge transfer device), to a CMOS sensor (for “Complementary metal-oxide-semiconductor” in the Anglo-Saxon literature, a technology for manufacturing electronic components), or to any other known sensor.
- as a variant, any other known sensor can be used; for example, the scientific publication “Practical Spectral Photography”, published in Eurographics, volume 31 (2012), number 2, proposes to associate this optical structure with a standard digital camera to capture the compressed image.
- each pixel of the compressed image 14 ’ is coded on 8 bits thus making it possible to represent 256 colors.
- a second sensor 12 makes it possible to obtain an undiffracted image 17 'of a focal plane P12' of the same observed scene, but with an offset induced by the offset between the first 11 and the second sensor 12.
- This second sensor 12 corresponds to an RGB sensor, that is to say a sensor making it possible to code the influence of the three colors Red, Green and Blue of the focal plane P12 ′. It makes it possible to account for the influence of the use of a blue filter F1, a green filter F2 and a red filter F3 on the observed scene.
- This sensor 12 can be produced by a CMOS or CCD sensor associated with a Bayer filter. As a variant, any other sensor can be used to acquire this RGB image 17 '. Preferably, each color of each pixel of the RGB image 17 'is coded on 8 bits. Thus, each pixel of the RGB image 17 'is coded on 3 times 8 bits. As a variant, a monochrome sensor could be used.
- a third sensor 13 makes it possible to obtain an infrared image 18', IR, of a third focal plane P13' of the same observed scene, also with an offset relative to the first sensor 11 and the second sensor 12. This sensor 13 makes it possible to account for the influence of the use of an infrared filter F4 on the observed scene.
- any known type of sensor can be used to acquire this IR image 18.
- each pixel of the IR image 18 is coded on 8 bits.
- only one or the other of sensor 12 and sensor 13 is used.
- the distance between the three sensors 11-13 may be less than 1 cm so as to obtain a significant overlap of the focal planes P11'-P13' by the three sensors 11-13.
- the sensors are for example aligned along the x axis.
- the topology and the number of sensors can vary without changing the invention.
- sensors 11-13 can acquire an image of the same observed scene by using semi-transparent mirrors to transmit information from the observed scene to different sensors 11-13.
- FIG. 3 illustrates a device 10 comprising three sensors 11-13.
- other sensors can be mounted on device 10 to augment the information contained in the hyperspectral image.
- the device 10 can integrate a sensor whose detected wavelengths are between 0.001 nanometers and 10 nanometers, or a sensor whose detected wavelengths are between 10,000 nanometers and 20,000 nanometers.
- the device 10 also comprises a construction module 16 of a hyperspectral image 15 from the different diffractions R0-R7 of the diffracted image 14' and of the non-diffracted images 17', 18'.
- a preprocessing step is carried out to extract a focal plane P11-P13 present on each of the images 14', 17'-18' acquired by the three sensors 11-13.
- This pre-processing consists, for each focal plane P11'-P13', in isolating the common part of the focal planes P11'-P13', then in extracting 26 this common part to form the image 14, 17-18 of each focal plane P11-P13 observed by the specific sensor 11-13.
- the part of each image 14', 17'-18' to be isolated can be defined directly in a memory of the capture device 10 according to the positioning choices of the sensors 11-13 between them, or a learning step can be used to identify the part to be isolated 25.
- the images 17'-18' from the RGB and IR sensors are aligned with each other using a two-dimensional cross-correlation.
- the extraction of the focal plane of the diffracted image 14' is calculated by interpolation of the shifts in x and y between the sensors 12-13, brought back to the position of the sensor 11 of the diffracted image, knowing the distance between each sensor 11-13.
- This preprocessing step is not always necessary, in particular when the sensors 11-13 are configured to capture the same focal plane, for example with the use of semi-transparent mirrors.
- the construction module 16 uses a neural network 20 to form a hyperspectral image 15 based on the information from these three images 14, 17-18.
- This neural network 20 aims to determine the intensity I(x, y, λ) of each voxel V(x, y, λ) of the hyperspectral image 15.
- the neural network 20 comprises an input layer 40, able to extract the information from images 14, 17-18, and an output layer 41, able to process this information so as to create the information of the voxel V(x, y, λ) considered.
- the first neuron of the input layer 40 makes it possible to extract the intensity I_IR(x, y) of the IR image 18 as a function of the x and y coordinates of the voxel V(x, y, λ) sought. For example, if the IR image 18 is coded on 8 bits, this first neuron transmits to the output layer 41 the 8-bit value of the pixel of the IR image 18 at the sought x and y coordinates.
- the second neuron of the input layer 40 performs the same task for the red color 17a of the RGB image 17.
- the desired intensity I_R(x, y) is also coded on 8 bits.
- the third neuron searches in the same way for the intensity I_G(x, y) for the green color 17b, and the fourth neuron searches for the intensity I_B(x, y) for the blue color 17c.
- the following neurons of the input layer 40 are more complex, because each of them is associated with a diffraction R0-R7 of the diffracted image 14.
- This relation between the three coordinates of the voxel V(x, y, λ) and the position in x and y on the diffracted image can be encoded in a memory during the integration of the neural network 20.
- a learning phase makes it possible to define this relation by using a known model whose parameters are sought from representations of known objects.
- An example model is defined by the following relation:
- n = floor(M × (d_t − 1) / D_MAX);
- with n between 0 and M, M being the number of diffractions of the compressed image;
- Mod represents the mathematical operator modulo.
- a learning phase therefore makes it possible to define the parameters A_sliceX, A_sliceY, x_offsetX(n), and y_offsetY(n), so that each neuron can quickly find the intensity of the corresponding pixel.
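Under this parameterisation, the mapping from a voxel to its pixel in diffraction n can be sketched as below. This is a hypothetical reconstruction for illustration: the parameter names mirror those given above, but the exact relation is defined by the learning phase.

```python
def pixel_for_voxel(x_t, y_t, d_t, M, D_MAX,
                    a_slice_x, a_slice_y, x_offset, y_offset):
    """Map voxel coordinates (x_t, y_t, d_t) to the pixel of the
    compressed image holding its intensity, in diffraction n."""
    n = (M * (d_t - 1)) // D_MAX        # which diffraction contributes
    x_img = x_t * a_slice_x + x_offset[n]
    y_img = y_t * a_slice_y + y_offset[n]
    return n, x_img, y_img

# 8 diffractions (R0-R7), 32 spectral planes, unit slices, toy offsets:
offsets = [(10 * i, 5 * i) for i in range(8)]
xs = [o[0] for o in offsets]
ys = [o[1] for o in offsets]
n, xi, yi = pixel_for_voxel(3, 4, 1, M=8, D_MAX=32,
                            a_slice_x=1, a_slice_y=1,
                            x_offset=xs, y_offset=ys)
# d_t = 1 falls in diffraction n = 0, so the pixel is at (3, 4)
```

The offset table stands in for the learned x_offsetX(n) and y_offsetY(n) values.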
- other models are possible, in particular depending on the nature of the diffraction grating 33 used.
- the information related to the intensity of the pixel I_n(x, y) sought by each neuron can be determined by a convolution product between the intensity of the pixel of the compressed image 14 and its close neighbors in the different diffractions R0-R7.
- the output of these neurons from the input layer 40 is also coded on 8 bits.
- this output neuron 41 associates a weight with each item of information as a function of the wavelength λ of the voxel sought. Following this modulation of the influence of the contributions of each image 17-18 and of each diffraction R0-R7, this output neuron 41 can sum the contributions to determine an average intensity which will form the intensity I(x, y, λ) of the voxel V(x, y, λ) sought, for example coded on 8 bits.
- This process is repeated for all the coordinates of the voxel V(x, y, λ), so as to obtain a hypercube containing all the spatial and spectral information from the non-diffracted images 17-18 and from each diffraction R0-R7.
- the output neuron 41 will use the spatial information of the non-diffracted images obtained with blue F1 and green F2 filters as well as the information on the different R0-R7 diffractions obtained as a function of the wavelength considered. It is possible to configure the neural network 20 so as not to take into account certain diffractions R0-R7 so as to limit the time for calculating the sum of the contributions. In the example of FIG.
- the weight of each contribution as a function of the wavelength λ of the voxel V(x, y, λ) sought can also be defined during the implantation of the neural network 20 or determined by a learning phase.
- the learning can be carried out by using known scenes captured by the three sensors 11-13 and by determining the weights of each contribution for each wavelength λ so that the information found by the neural network corresponds to the information contained in the known scenes. This learning can be carried out independently or simultaneously with the learning of the relationships between the three coordinates of the voxel V(x, y, λ) and the position in x and y on the diffracted image 14.
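The weighted combination performed by the output neuron can be sketched as below. The weight values are hypothetical; in the device they are set at implantation or learned as described above.

```python
import numpy as np

def output_neuron(contributions, weights):
    """Weighted average of the contributions (non-diffracted images and
    diffractions R0-R7) giving the intensity I(x, y, λ) of one voxel.
    A zero weight simply ignores a contribution, e.g. to skip some
    diffractions and shorten the computation."""
    w = np.asarray(weights, dtype=float)
    c = np.asarray(contributions, dtype=float)
    return float((w * c).sum() / w.sum())

# IR, R, G, B contributions then diffractions R0-R7, with only R0 and
# R1 retained for this wavelength (toy values):
contribs = [100, 90, 80, 70, 60, 50, 0, 0, 0, 0, 0, 0]
weights  = [1.0, 0.5, 0.5, 0.0, 2.0, 2.0, 0, 0, 0, 0, 0, 0]
# intensity = (100 + 45 + 40 + 120 + 100) / 6 = 67.5
```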
- This neural network 20 can be implemented in an on-board system so as to process in real time the images coming from the sensors 11-13, to define and store a hyperspectral image 15 between two acquisitions from the sensors 11-13.
- the on-board system can comprise a power supply for the sensors 11-13, a processor configured to perform the calculations of the neurons of the input layer 40 and of the output layer 41, and a memory integrating the weights of each neuron of the input layer 40 as a function of the wavelength λ.
- the different treatments can be carried out independently on several electronic circuits without changing the invention.
- an acquisition circuit can acquire and transmit the information coming from the neurons of the first layer 40 to a second circuit which contains the neuron of the second layer 41.
- the invention thus makes it possible to obtain a hyperspectral image 15 rapidly and with a fine discretization in the spectral dimension.
- the use of a neural network 20 makes it possible to limit the complexity of the operations to be carried out during the analysis of the diffracted image 14.
- the neural network 20 also allows the association of the information of this diffracted image 14 with that of the non-diffracted images 17-18 to improve precision in the spatial dimension.
- a computerized characterization module 21 is used downstream to determine a weed or a leaf symptom of deficiency or disease.
- the input to the computerized characterization module is the three-dimensional hyperspectral image 15.
- the computerized characterization module can for example apply a predefined treatment, characterizing the weed or the leaf symptom of deficiency or disease, to the three-dimensional hyperspectral image, and giving as output a presence or an absence of the weed or leaf symptom of deficiency or disease.
- the computerized characterization module can for example apply, as described in the article “Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress”, Amy Lowe, Nicola Harrison and Andrew P. French, Plant Methods (2017), a detection based on indices (for example the “Normalized Difference Vegetation Index” (NDVI) or the “Photochemical Reflectance Index” (PRI)), in order to pre-process the three-dimensional hyperspectral image 15 by selecting a subset of spectral bands which are assembled by means of an index.
- the resulting image makes it possible to identify the presence of plants in the image.
- the value in a pixel is compared to a predefined scale to classify the detection in that pixel.
- a pixel value between -0.2 and 0.2 indicates the presence of a healthy plant in that pixel.
- several indices are applicable, each making it possible to process the hyperspectral image and to detect the presence of either a weed, a leaf symptom of deficiency or disease, or the presence of plants. Potentially applicable indices include the following:
- NDVI: Normalized Difference Vegetation Index;
- PRI: Photochemical Reflectance Index;
- PSRI: Plant Senescence Reflectance Index, defined by the equation (Red − Green) / NIR, where Red represents the sum of the intensities of the voxels with wavelengths between 620 and 700 nm, Green represents the sum of the intensities of the voxels with wavelengths between 500 and 578 nm, and NIR represents the sum of the intensities of the voxels with wavelengths between 700 and 1000 nm, making it possible to detect the senescence of a plant, the stress of a plant or the maturity of a fruit;
- NPQI: Normalized Phaeophytinization Index;
- SIPI: Structure Independent Pigment Index;
- LRDSI: Leaf Rust Disease Severity Index, defined by the equation 6.9 × (R605 / R455) − 1.2, making it possible to detect leaf rust disease of wheat.
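An index-based pre-processing such as NDVI can be sketched on a hyperspectral cube as follows. The band limits reuse the Red and NIR ranges given for PSRI above; the function name and the toy data are illustrative assumptions.

```python
import numpy as np

def ndvi_map(cube, wavelengths):
    """NDVI = (NIR - Red) / (NIR + Red), per pixel of a (H, W, B) cube.

    wavelengths : (B,) centre wavelength of each spectral band, in nm
    """
    wl = np.asarray(wavelengths, dtype=float)
    red = cube[..., (wl >= 620) & (wl < 700)].sum(axis=-1)
    nir = cube[..., (wl >= 700) & (wl <= 1000)].sum(axis=-1)
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids 0/0

# Toy cube: one vegetation-like pixel (high NIR) and one soil-like pixel
wl = [650.0, 850.0]
cube = np.array([[[0.1, 0.8],      # vegetation: NDVI ≈ 0.78
                  [0.4, 0.4]]])    # bare soil:  NDVI ≈ 0.0
scores = ndvi_map(cube, wl)
# each pixel value is then compared to a predefined scale to classify it
```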
- the predefined equation gives a probability of the presence of the weed or the leaf symptom of deficiency or disease.
- an additional output from the computerized characterization module is a location of the weed or leaf symptom of deficiency or disease in image 17 or 18.
- the detection system described above is considered as a single detection system, even if it uses different sensors whose information is merged to detect a weed or a leaf symptom of deficiency or disease.
- the detection system 2 of weed or leaf deficiency symptom or disease comprises a capture device 202.
- FIG. 7 illustrates a device 202 for capturing a hyperspectral scene 203 comprising a sensor, or acquisition system 204, making it possible to obtain a two-dimensional compressed image 211 of a focal plane 303 of an observed scene.
- the hyperspectral scene can be located in space by means of an orthonormal coordinate system (x; y; z) not shown.
- the capture device 202 is similar to that described above. This optical structure makes it possible to obtain a compressed image 211, illustrated in FIG. 10, showing several diffractions R0-R7 of the focal plane 303 arranged around an undiffracted image of small size C.
- the capture device 202 may include a first converging lens 241 which focuses the focal plane 303 on a mask 242.
- a collimator 243 captures the rays passing through the mask 242 and transmits these rays to a prism 244.
- a second converging lens 245 focuses these rays coming from the prism 244 on a capture surface 246.
- the mask 242 defines a coding for the image 213.
- the capture surfaces 35 or 246 may correspond to the photographic acquisition device of a computer or any other portable device including a photographic acquisition arrangement, by adding the capture device 202 of the hyperspectral scene 203 in front of the photographic acquisition device.
- the acquisition system 204 may comprise a compact mechanical embodiment which can be integrated into a portable and autonomous device and the detection system is included in said portable and autonomous device.
- the sensing surfaces 35 or 246 can be a device whose sensed wavelengths are not in the visible part.
- the device 202 can integrate sensors whose detected wavelengths are between 0.001 nanometers and 10 nanometers, or a sensor whose detected wavelengths are between 10,000 nanometers and 20,000 nanometers, or a sensor whose detected wavelengths are between 300 nanometers and 2000 nanometers. It may be an infrared device.
- the detection system 2 implements a neural network 212 to detect a feature in the observed scene from the information of the compressed image 211.
- This neural network 212 aims to determine the probability of presence of the desired feature for each pixel located at the x and y coordinates of the observed hyperspectral scene 203.
- the neural network 212 comprises an input layer 230, able to extract the information from the image 211, and an output layer 231, able to process this information so as to generate an image in which the intensity of each pixel, at coordinates x and y, corresponds to the probability of presence of the feature at the x and y coordinates of the hyperspectral scene 203.
- the input layer 230 is populated from the pixels forming the compressed image.
- the input layer is a tensor of order three, having two spatial dimensions of size X_MAX and Y_MAX, and a depth dimension of size D_MAX, corresponding to the number of subsets of the compressed image copied to the input layer.
- the invention uses the nonlinear relation f(x_t, y_t, d_t) → (x_img, y_img), defined for x_t in [0..X_MAX[, y_t in [0..Y_MAX[ and d_t in [0..D_MAX[, making it possible to calculate the coordinates x_img and y_img of the pixel of the compressed image whose intensity is copied into the tensor of order three of said input layer of the neural network at coordinates (x_t, y_t, d_t).
- the input layer 230 can be populated as follows:
- n = floor(M × (d_t − 1) / D_MAX);
- with n between 0 and M, M being the number of diffractions of the compressed image;
- Mod represents the mathematical operator modulo.
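The population of the order-three input tensor can be sketched as below. This uses a hypothetical offset-only form of the relation f for illustration; the real mapping is the nonlinear relation described above.

```python
import numpy as np

def fill_input_layer(compressed, M, X_MAX, Y_MAX, D_MAX, offsets):
    """Copy, for each depth d_t, the pixels of diffraction n of the
    compressed image into the order-three input tensor."""
    t = np.zeros((X_MAX, Y_MAX, D_MAX))
    for d_t in range(D_MAX):
        n = (M * d_t) // D_MAX           # diffraction feeding this depth
        ox, oy = offsets[n]              # where that diffraction sits
        t[:, :, d_t] = compressed[ox:ox + X_MAX, oy:oy + Y_MAX]
    return t

# 2 diffractions side by side in a 2x4 compressed image, X_MAX = Y_MAX = 2
compressed = np.arange(8.0).reshape(2, 4)
t = fill_input_layer(compressed, M=2, X_MAX=2, Y_MAX=2, D_MAX=4,
                     offsets=[(0, 0), (0, 2)])
# depths 0-1 copy the left 2x2 block, depths 2-3 the right 2x2 block
```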
- the invention makes it possible to correlate the information contained in the different diffractions of the diffracted image with information contained in the central non-diffracted part of the image.
- the compressed image obtained by the optical system contains the focal plane of the non-diffracted scene at the center, as well as the projections diffracted along the axes of the different diffraction filters.
- the neural network uses, for the direct detection of the particularities sought, the following information from said at least one diffracted image:
- MASK: image of the compression mask used;
- CASSI: measured compressed image;
- Img: selected image whose pixel is copied.
- the architecture of said neural network 212, 214 is composed of a set of convolutional layers assembled linearly and alternately with decimation (pooling) or interpolation (unpooling) layers.
- a convolutional layer of depth d is defined by d convolution kernels, each of these convolution kernels being applied to the volume of the input tensor of order three, of size (x_input, y_input, d_input).
- the convolutional layer thus generates an output volume, tensor of order three, having a depth d.
- An ACT activation function is applied to the calculated values of the output volume of this convolutional layer.
- this function can be a ReLu function, defined by the following equation: ReLu(x) = max(0, x).
- a decimation layer reduces the width and height of the input third order tensor for each depth of said third order tensor. For example, a MaxPool (2,2) decimation layer selects the maximum value of a sliding tile on the surface of 2x2 values. This operation is applied to all the depths of the input tensor and generates an output tensor having the same depth and a width divided by two, as well as a height divided by two.
- An interpolation layer makes it possible to increase the width and height of the third order tensor at the input for each depth of said third order tensor.
- a MaxUnPool (2,2) interpolation layer copies the input value of a sliding point on the surface of 2x2 output values. This operation is applied to all depths of the input tensor and generates an output tensor having the same depth and width multiplied by two, as well as a height multiplied by two.
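The decimation and interpolation layers just described can be sketched with plain array operations (illustrative NumPy helpers, not the device's implementation):

```python
import numpy as np

def maxpool_2x2(t):
    """MaxPool(2, 2): keep the maximum of each 2x2 tile, per depth.
    (H, W, D) -> (H // 2, W // 2, D)."""
    h, w, d = t.shape
    return t.reshape(h // 2, 2, w // 2, 2, d).max(axis=(1, 3))

def maxunpool_2x2(t):
    """MaxUnPool(2, 2) as described above: copy each input value onto
    a 2x2 tile of outputs. (H, W, D) -> (2 * H, 2 * W, D)."""
    return t.repeat(2, axis=0).repeat(2, axis=1)

x = np.arange(16.0).reshape(4, 4, 1)
p = maxpool_2x2(x)          # shape (2, 2, 1), values 5, 7, 13, 15
u = maxunpool_2x2(p)        # back to shape (4, 4, 1)
```

As stated above, the depth of the tensor is unchanged by both operations; only the spatial dimensions are halved or doubled.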
- a neural network architecture allowing the direct detection of features in the hyperspectral scene can be as follows:
- the number of CONV(d) convolution layers and MaxPool(2, 2) decimation layers can be changed in order to facilitate the detection of features having a higher semantic complexity.
- a higher number of convolution layers makes it possible to process more complex signatures of shape, texture, or spectral characteristics of the particularity sought in the hyperspectral scene.
- the number of CONV(d) deconvolution layers and MaxUnpool(2, 2) interpolation layers can be changed to facilitate reconstruction of the output layer. For example, a higher number of deconvolution layers makes it possible to reconstruct an output with greater precision.
- the CONV(64) convolution layers may have a depth different from 64 in order to handle a different number of local features. For example, a depth of 128 makes it possible to locally process 128 different features in a complex hyperspectral scene.
- the MaxUnpool(2, 2) interpolation layers can be of a different interpolation dimension.
- a MaxUnpool(4, 4) layer can increase the processing dimension of the top layer.
- the ACT activation layers of ReLu(x) type, inserted following each convolution and deconvolution, can be of a different type.
- the MaxPool(2, 2) decimation layers can be of a different decimation size.
- a MaxPool(4, 4) layer makes it possible to reduce the spatial dimension more quickly and to concentrate the semantic analysis of the neural network on local features.
- fully connected layers can be inserted between the two central convolution layers, at line 6 of the architecture description, in order to process the detection in a higher-dimensional mathematical space.
- three fully connected layers of size 128 can be inserted.
- the dimensions of the CONV(64) convolution, MaxPool(2, 2) decimation, and MaxUnpool(2, 2) interpolation layers can be adjusted over one or more layers, in order to adapt the architecture of the neural network as closely as possible to the type of features sought in the hyperspectral scene.
- the weights of said neural network 212 are calculated by means of training. For example, learning by backpropagation of the gradient or its derivatives from training data can be used to calculate these weights.
- the neural network 212 can determine the probability of the presence of several distinct features within the same observed scene.
- the last convolutional layer will have a depth corresponding to the number of distinct features to be detected.
- the convolutional layer CONV (1) is replaced by a convolutional layer CONV (u), where u corresponds to the number of distinct features to be detected.
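The effect of such a layer stack on tensor dimensions can be checked with a small shape-tracing helper. The layer list is a hypothetical example mirroring the linear CONV / MaxPool / MaxUnpool architecture described, not the actual network.

```python
def trace_shapes(input_shape, layers):
    """Follow an (H, W, D) tensor through a linear stack of layers.
    CONV(d) changes only the depth; MaxPool(s) divides H and W by s;
    MaxUnpool(s) multiplies H and W by s."""
    h, w, d = input_shape
    shapes = [(h, w, d)]
    for kind, arg in layers:
        if kind == "CONV":
            d = arg
        elif kind == "MaxPool":
            h, w = h // arg, w // arg
        elif kind == "MaxUnpool":
            h, w = h * arg, w * arg
        shapes.append((h, w, d))
    return shapes

# A symmetric encoder-decoder ending in CONV(1) for a single feature map:
stack = [("CONV", 64), ("MaxPool", 2), ("CONV", 64),
         ("MaxUnpool", 2), ("CONV", 1)]
shapes = trace_shapes((256, 256, 32), stack)
# shapes[-1] == (256, 256, 1): a probability map the size of the scene
```

Replacing the final CONV(1) by CONV(u) yields a depth-u output, one plane per feature to be detected, as stated above.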
- FIG. 12 illustrates a device 302 for capturing a hyperspectral scene 203 comprising a set of sensors making it possible to obtain at least one two-dimensional compressed image 211 or 213 and at least one standard image 312 of a hyperspectral focal plane 303 of an observed scene.
- the capture device 302 comprises at least one acquisition device, or sensor, 301 of a compressed image as described above with reference to FIG. 8.
- the capture device 302 may further comprise a device for acquiring an uncompressed "standard" image, comprising a converging lens 331 and a capture surface 232.
- the capture device 302 may further comprise a device for acquiring a compressed image as described above with reference to FIG. 9.
- the standard image acquisition device and the compressed image acquisition device are arranged juxtaposed with parallel optical axes, and optical beams at least partially overlapping.
- a portion of the hyperspectral scene is imaged at a time by the acquisition devices.
- the focal planes of the various image acquisition sensors are offset with respect to each other transversely to the optical axes of these sensors.
- a set of partially reflecting mirrors is used so as to capture said at least one standard non-diffracted images 312 and said at least one compressed image 211, 213 of the same hyperspectral scene 203 on several sensors simultaneously.
- the sensing surface 232 can be a device whose sensed wavelengths are not in the visible part.
- the device 202 can integrate sensors with a wavelength between 0.001 nanometers and 10 nanometers, or a sensor with a wavelength between 10,000 nanometers and 20,000 nanometers, or a sensor with a wavelength between 300 nanometers and 2000 nanometers.
- the detection means uses a neural network 214 to detect a feature in the observed scene from the information of the compressed images 211 and 213, and standard image 312.
- This neural network 214 aims to determine the probability of presence of the desired feature for each pixel located at the x and y coordinates of the observed hyperspectral scene 203.
- the neural network 214 includes an encoder 251 for each compressed image and for each uncompressed image; each encoder 251 has an input layer 250, able to extract the information from the image 211, 312 or 213.
- the neural network merges the information coming from the different encoders 251 by means of convolution layers or of fully connected layers 252 (particular case shown in the figure).
- following the fusion of the information, a decoder 253 and its output layer 350 are inserted, able to process this information so as to generate an image in which the intensity of each pixel, at coordinates x and y, corresponds to the probability of the presence of the feature at the x and y coordinates of the hyperspectral scene 203.
- the input layer 250 of an encoder 251 is filled with the various diffractions of the compressed image 211 as described above.
- the population described above corresponds to the population of the first input ("Input1") of the neural network, according to the architecture presented below.
- the portion of the input layer relative to the "standard" image is populated by directly copying the "standard" image into the neural network.
- the third input "Input3" of the neural network is populated as described above for the compressed image 213.
- a neural network architecture allowing the direct detection of features in the hyperspectral scene can be as follows:
- Input1 corresponds to the portion of the input layer 250 populated from the compressed image 211.
- Input2 corresponds to the portion of the input layer 250 populated from the standard image 312, and
- Input3 corresponds to the portion of the input layer 250 populated from the compressed image 213.
- the line “CONV (64)" in the fifth line of the architecture operates the fusion of the information.
- the "CONV (64)" layer in the fifth line of the architecture, which performs the information fusion, can be replaced by a fully connected layer taking as input all the MaxPool (2, 2) outputs of the processing paths of the inputs "input1", "input2" and "input3", and producing as output an order-one tensor serving as input to the next "CONV (64)" layer presented in the sixth line of the architecture.
- the fusion layer of the neural network takes into account the offsets of the focal planes of the different image acquisition sensors, and integrates the homographic function making it possible to merge the information from the different sensors by taking into account the parallaxes of the different images.
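The fusion step can be sketched as follows. This is an illustrative stand-in (channel concatenation of the three encoder outputs followed by a 1×1 convolution standing in for the CONV(64) layer), not the exact architecture of the patent, and all dimensions are assumed:

```python
import numpy as np

def fuse_encoders(enc_outputs, w, b):
    """Merge per-encoder feature maps (each H x W x C) by channel
    concatenation followed by a 1x1 convolution with ReLU, a simple
    stand-in for the CONV(64) fusion layer of the architecture."""
    stacked = np.concatenate(enc_outputs, axis=2)   # H x W x (3*C)
    return np.maximum(np.tensordot(stacked, w, axes=([2], [0])) + b, 0.0)

rng = np.random.default_rng(1)
H, W, C = 16, 16, 8
enc1 = rng.normal(size=(H, W, C))   # encoder 251 of compressed image 211
enc2 = rng.normal(size=(H, W, C))   # encoder 251 of standard image 312
enc3 = rng.normal(size=(H, W, C))   # encoder 251 of compressed image 213
w = rng.normal(size=(3 * C, 64)) * 0.05

fused = fuse_encoders([enc1, enc2, enc3], w, np.zeros(64))
print(fused.shape)   # (16, 16, 64): merged tensor fed to the decoder 253
```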
- the weights of said neural network 214 are calculated by means of learning. For example, learning by backpropagation of the gradient or its derivatives from training data can be used to calculate these weights.
- the neural network 214 can determine the probability of the presence of several distinct features within the same observed scene.
- the last convolutional layer will have a depth corresponding to the number of distinct features to be detected.
- the convolutional layer CONV (1) is replaced by a convolutional layer CONV (u), where u corresponds to the number of distinct features to be detected.
- a separate dedicated acquisition device is not necessarily used to obtain the “standard” image 312.
- part of the compressed image 211 comprises a “standard” image of the hyperspectral scene. This is in particular the image portion C described above. In this case, it is possible to use this image portion “C” of the compressed image 211 as the “standard” input image of the neural network.
- the neural network 214 uses, for the direct detection of the particularities sought, the following information from said at least one compressed image:
- a detected feature of the hyperspectral scene is a two-dimensional image in which the value of each pixel at the coordinates x and y corresponds to the probability of the presence of a feature at the same x and y coordinates of the hyperspectral focal plane of the scene 203.
- the feature corresponds to a feature potentially indicative of the presence of a weed or of a leaf symptom of deficiency or disease in this pixel.
- Each weed, each leaf symptom of deficiency or disease can be characterized by one or more peculiarities.
- the detection system then combines the results of the detection of each feature associated with a weed or a leaf symptom of deficiency or disease to determine a probability of the presence of the weed or of the leaf symptom of deficiency or disease. If necessary, this process is repeated for all the predetermined weeds or leaf symptoms of deficiency or disease sought in the plot. It is possible, however, as a variant, to provide, according to the embodiments of the invention, for the detection of other particularities. According to one example, such a further feature can be obtained from the image produced by the neural network presented above. For this, the neural network 212, 214 can present a subsequent layer, suitable for processing the image in question and determining the desired feature.
- this subsequent layer can for example count the pixels of the image in question for which the probability is greater than a certain threshold.
- the result obtained is then an area (possibly related to a standard area of the image).
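The pixel-counting layer described above can be sketched as follows; the threshold and pixel-to-meter ratio are illustrative assumptions, not values from the patent:

```python
import numpy as np

def detected_area(prob_map, threshold=0.5, pixel2meter=0.004):
    """Count pixels whose presence probability exceeds a threshold and
    convert the count to a ground area and to a fraction of the image
    (pixel2meter is an assumed pixel-to-meter ratio)."""
    n = int(np.count_nonzero(prob_map > threshold))
    area_m2 = n * pixel2meter ** 2      # each pixel covers pixel2meter^2 m^2
    fraction = n / prob_map.size        # area relative to the whole image
    return n, area_m2, fraction

prob_map = np.zeros((10, 10))
prob_map[2:4, 2:7] = 0.9                # a 2 x 5 patch of likely weed pixels
n, area, frac = detected_area(prob_map)
print(n, frac)                          # 10 pixels, 10% of the image
```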
- the result obtained may then correspond to a concentration of the chemical compound in the imaged hyperspectral scene which may be indicative of a weed or a leaf symptom of deficiency or disease.
- this subsequent layer may for example have only one neuron whose value (real or Boolean) will indicate the presence or absence of an object or a particularity sought in the hyperspectral scene.
- This neuron will have a maximum value in the presence of the object or feature and a minimum value in the opposite case.
- This neuron will be fully connected to the previous layer, and the connection weights will be calculated through training.
- the neural network can also be architected to determine this feature without going through the determination of an image of the probability of presence of the feature in each pixel.
- the detection system described above is considered as a single detection system, even if it uses different sensors whose information is merged to detect a weed or a leaf symptom of deficiency or disease.
- each detection system 2 can include a location system, of the type comprising an inertial unit and / or a geolocation system.
- the agricultural treatment control device further comprises a communication system connecting the detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
- the communication system is suitable for exchanging data between the detection systems 2 of weeds or leaf symptoms of deficiencies or diseases such as, in particular, detection data for weeds or leaf symptoms of deficiencies or diseases, and location data from inertial units and/or geolocation systems.
- the plurality of said at least one controllable agricultural treatment device 3 is also fixed on the agricultural machine so as to be able to treat the target plants 4.
- the agricultural treatment devices 3 may be arranged spaced with respect to each other in a horizontal direction transverse to the direction of advance. They can for example be carried by a transverse beam of the agricultural machine, where appropriate by the same beam which carries the detection systems 2. In addition, they can be spaced from them in the transverse direction.
- the agricultural treatment control device further comprises a system for locating agricultural treatment devices.
- the agricultural processing control device further comprises a communication system connecting the detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
- the agricultural treatment device further comprises a communication system suitable for exchanging data between the detection systems 2 of weeds or leaf symptoms of deficiencies or diseases and the agricultural treatment devices 3.
- the number of controllable agricultural treatment devices 3 need not be the same as the number of detection systems 2 for weeds or leaf symptoms of deficiency or disease. Indeed, according to one example, the collaborative treatment decision is transmitted to the controllable agricultural treatment device 3 having the least distance from the target plant.
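Selecting the treatment device with the least distance to the target plant can be sketched as below; the coordinate frame and nozzle layout are illustrative assumptions:

```python
import math

def nearest_device(target_xy, device_positions):
    """Pick the index of the controllable treatment device 3 closest to
    the target plant 4 (coordinates in meters in the machine frame;
    the frame convention is assumed for illustration)."""
    return min(
        range(len(device_positions)),
        key=lambda i: math.dist(target_xy, device_positions[i]),
    )

# three nozzles spaced along a transverse beam, as described in the text
nozzles = [(-1.5, 0.0), (0.0, 0.0), (1.5, 0.0)]
print(nearest_device((1.2, 0.4), nozzles))   # -> 2 (right-hand nozzle)
```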
- FIG. 15 illustrates the device, provided with two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, mounted on an agricultural machine 1, in which each of the detection systems 2 of weeds or leaf symptoms of deficiencies or diseases is directed at an angle towards the ground of the agricultural plot 5, with an overlap of their respective detection zones.
- the first detection system will be characterized by the reference “.1”
- the second detection system will be characterized by the reference “.2”.
- said detection system 2.1 of weeds or leaf symptoms of deficiencies or diseases takes a snapshot 6.1 of the area of the agricultural plot 5 facing its objective; said detection system 2.2 of weeds or leaf symptoms of deficiencies or diseases takes a snapshot 6.2 of the area of the agricultural plot 5 facing its objective; said areas facing the optical objectives 9 of said detection systems 2.1 and 2.2 have a common acquisition zone.
- FIG. 16 gives an example of a collaborative processing method for the acquired data.
- the collaborative processing method is designated by the reference 8, and the steps thereof by reference signs of the form ".index".
- capturing 8.1 of the image information of the agricultural plot 5 traversed makes it possible to obtain the acquired images 6.1 and 6.2.
- the plurality of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases is composed of homogeneous systems, exhibiting the same detection properties.
- - R is the matrix containing the rotations along the three axes of roll, pitch and yaw;
- the angles α, β and γ correspond respectively to the current yaw, roll and pitch angles of the detection system 2 of weeds or leaf symptoms of deficiencies or diseases considered, as calculated from the raw data of the inertial unit on board that detection system; this roll, pitch and yaw information is continuously calculated and kept up to date by the detection system considered, by means of an attitude estimation algorithm using the raw information of said on-board inertial unit.
- the attitude estimation algorithm making it possible to calculate the roll, pitch and yaw information can be an extended Kalman filter, or a Mahony or Madgwick algorithm.
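As an illustration of the attitude-estimation idea (a deliberately simplified stand-in, not the Mahony, Madgwick or extended Kalman filters named above), a complementary filter fusing gyroscope and accelerometer data might look like this; axis conventions and rates are assumed:

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """One step of a simple complementary filter: the gyroscope
    integrates short-term motion while the accelerometer-derived angle
    corrects long-term drift."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def accel_roll(ay, az):
    """Roll angle from the gravity vector (assumed axis convention)."""
    return math.atan2(ay, az)

# static IMU tilted so that gravity reads (ay, az) = (1.0, 9.7) m/s^2
roll = 0.0
for _ in range(100):                 # 100 filter steps at 100 Hz
    roll = complementary_filter(roll, gyro_rate=0.0,
                                accel_angle=accel_roll(1.0, 9.7), dt=0.01)
print(round(roll, 3))                # converging toward atan2(1.0, 9.7) ~ 0.103
```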
- the ortho-projection 8.2 of the image information acquired from the agricultural plot 5 traversed makes it possible to obtain the acquired images 7.1 and 7.2 from the images 6.1 and 6.2.
- Said image data projected on the ground are used to detect the presence of weeds or leaf symptoms of deficiencies or diseases from the peculiarities specific to them, determined by one of the above methods, in order to detect the zones, identified at the image coordinates X_detect and Y_detect, in said projected image data in which the target plants 4 are present.
- a target plant 4 is a plant for which the detection device detects a weed or a leaf symptom of deficiency or disease.
- each of the detections 8.3 of the presence of weeds or leaf symptoms of deficiencies or diseases is completed with a probability of the presence of said peculiarities specific to weeds or leaf symptoms of deficiencies or diseases. In certain exemplary embodiments, this probability information is necessary for the geostatistical calculations making it possible to decide on the application of a treatment to the target plant.
- each of said detections of weeds or leaf symptoms of deficiencies or diseases is geolocated 8.4 at the coordinates lat and lng by means of the geolocation system embedded in each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
- rad_fract = distance / EARTH_RADIUS
- lat_target = (180 · asin(lat21 + lat22)) / π
- lng_target = (180 · (((lng + atan2(lng21, lng22) + 3π) mod 2π) − π)) / π
- ratio_pixel2meter is the ratio between one pixel in the image and one meter on the ground;
- X_detect is the x coordinate, in pixels, of the center of the detection in the image;
- Y_detect is the y coordinate, in pixels, of the center of the detection in the image;
- lat is the latitude measured by said geolocation system of said detection system 2 of weeds or leaf symptoms of deficiencies or diseases;
- lng is the longitude measured by said geolocation system of said detection system 2 of weeds or leaf symptoms of deficiencies or diseases;
- lat_target is the latitude of the target plant 4 detected in the image;
- lng_target is the longitude of the target plant 4 detected in the image.
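The relations above appear to express the standard destination-point formula on a spherical Earth (position of the target plant from the sensor position, a bearing and a ground distance); the intermediate terms lat21, lat22, lng21 and lng22 are defined outside this excerpt. A hedged sketch, assuming that interpretation:

```python
import math

EARTH_RADIUS = 6_371_000.0  # meters, spherical-Earth assumption

def destination(lat, lng, bearing, distance):
    """Destination point from (lat, lng) along a bearing over a ground
    distance. All angles in degrees, distance in meters. The final
    longitude normalization mirrors the (x + 3*pi) mod 2*pi - pi term."""
    lat1, lng1, brg = map(math.radians, (lat, lng, bearing))
    frac = distance / EARTH_RADIUS                      # rad_fract
    lat2 = math.asin(math.sin(lat1) * math.cos(frac)
                     + math.cos(lat1) * math.sin(frac) * math.cos(brg))
    lng2 = lng1 + math.atan2(math.sin(brg) * math.sin(frac) * math.cos(lat1),
                             math.cos(frac) - math.sin(lat1) * math.sin(lat2))
    # normalize longitude to [-180, 180) degrees
    return math.degrees(lat2), ((math.degrees(lng2) + 540.0) % 360.0) - 180.0

lat_t, lng_t = destination(48.8566, 2.3522, bearing=90.0, distance=1.0)
print(round(lat_t, 6), round(lng_t, 6))   # ~1 m east of the start point
```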
- Each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases continuously obtains, by means of the communication system between the different detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, the detection information geolocated by the lat_target and lng_target coordinates coming from all the other detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
- Each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases therefore continuously communicates, by means of the communication system between the different detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, the detection information geolocated by the lat_target and lng_target coordinates to all the other detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
- the GeoJSON format as described in document RFC7946, “The GeoJSON Format”, IETF 08/2016, makes it possible to transport said geolocated detection information on said communication system.
- the ESRI Shapefile format as described in the document ESRI Shapefile technical description, 07/1998, makes it possible to transport said geolocated detection information on said communication system.
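A minimal sketch of a geolocated detection encoded as a GeoJSON Feature per RFC 7946, the first of the two exchange formats proposed above (property names are illustrative, not mandated by the source):

```python
import json

def detection_feature(lat_target, lng_target, species, probability):
    """Encode one geolocated detection as a GeoJSON Feature. RFC 7946
    puts longitude first in the coordinates array."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point",
                     "coordinates": [lng_target, lat_target]},
        "properties": {"species": species, "probability": probability},
    }

msg = json.dumps({"type": "FeatureCollection",
                  "features": [detection_feature(48.85, 2.35, "weed", 0.87)]})
print(json.loads(msg)["features"][0]["geometry"]["coordinates"])  # [2.35, 48.85]
```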
- said latitude and longitude information can be calculated from the raw information of the inertial units of all of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases. Said raw information from the inertial units is exchanged continuously by means of the communication system connecting said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, so the latitude estimation algorithm executed on each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases can use all the raw information.
- the latitude and longitude information is calculated relatively in the coordinate system of the agricultural parcel being traversed.
- an extended Kalman filter can be used in each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, taking the data from the inertial units of all of said at least two detection systems of weeds or leaf symptoms of deficiencies or diseases.
- the calculation of the geolocation 8.4 of a detection of a weed or leaf symptom of deficiency or disease is based on the same relationship with the following elements:
- lat is the latitude of said detection system 2 of weeds or leaf symptoms of deficiencies or diseases, calculated in the coordinate system of the agricultural plot traversed from the data obtained from the inertial units of all of said at least two detection systems of weeds or leaf symptoms of deficiencies or diseases;
- lng is the longitude of said detection system 2 of weeds or leaf symptoms of deficiencies or diseases, calculated in the coordinate system of the agricultural plot traversed from the data obtained from the inertial units of all of said at least two detection systems of weeds or leaf symptoms of deficiencies or diseases.
- sensor_angle is the angle between the vertical and the average viewing angle of the detection system 2 of weeds or leaf symptoms of deficiencies or diseases;
- sensor_height is the height from the ground of the detection system 2 of weeds or leaf symptoms of deficiencies or diseases;
- ratio_pixel2meter is the ratio of one pixel in the image to one meter on the ground;
- X_detect is the x coordinate, in pixels, of the center of the detection in the image;
- Y_detect is the y coordinate, in pixels, of the center of the detection in the image;
- w_img is the width of the image in pixels;
- h_img is the height of the image in pixels;
- X_target is the relative longitudinal coordinate in meters of the target plant 4 detected in the image;
- Y_target is the relative coordinate in meters, facing said detection system 2 of weeds or leaf symptoms of deficiencies or diseases, of the target plant 4 detected in the image.
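A sketch of the pixel-to-ground conversion using the quantities above; the flat-ground model (offsets from the image centre scaled to meters, camera tilt shifting the footprint forward by sensor_height · tan(sensor_angle)) is an assumption, since the patent's exact relation is not reproduced in this excerpt:

```python
import math

def pixel_to_ground(x_detect, y_detect, w_img, h_img,
                    sensor_angle, sensor_height, ratio_pixel2meter):
    """Convert a pixel detection (X_detect, Y_detect) to ground-relative
    meters (X_target, Y_target) under a flat-ground assumption."""
    x_target = (x_detect - w_img / 2.0) * ratio_pixel2meter
    y_target = (sensor_height * math.tan(sensor_angle)
                + (h_img / 2.0 - y_detect) * ratio_pixel2meter)
    return x_target, y_target

# detection at the image centre, camera tilted 30 degrees, 1.2 m high
x, y = pixel_to_ground(400, 300, 800, 600, math.radians(30.0), 1.2, 0.004)
print(round(x, 3), round(y, 3))   # on the optical axis: x = 0
```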
- All the information of said detections of weeds or leaf symptoms of deficiencies or diseases coming from all of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases is stored in a geographic database local to each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases.
- Each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases has a detection zone of the peculiarities sought (weeds or leaf symptoms of deficiencies or diseases) in the agricultural plot 5 which overlaps with that of the neighboring detection systems; a lateral overlap of said detection information of weeds or leaf symptoms of deficiencies or diseases is thus obtained.
- each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases detecting at the present moment the desired peculiarities of weeds or leaf symptoms of deficiencies or diseases in the agricultural plot 5, in the detection zone within range of its optical objective, a temporal overlap of said detection information of weeds or leaf symptoms of deficiencies or diseases is obtained.
- By temporal overlap, reference is made to the fact that the detection zones at two successive distinct instants overlap if the determination frequency is sufficiently high.
- Figure 17 illustrates this embodiment, and represents in dotted lines the optical field acquired by the detection system 2 of weeds or leaf symptoms of deficiencies or diseases at a first instant t1, and in phantom lines the optical field acquired by the same detection system 2 of weeds or leaf symptoms of deficiencies or diseases at a second instant t2.
- the optical fields are geographically shifted due to the travel of the agricultural machine during the time interval.
- the photographs and images obtained at the second instant are represented with the index ".3". At all times, the detections are geolocated in a common frame of reference.
- said information for detecting weeds or leaf symptoms of deficiencies or diseases stored in said geographic database local to each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases contains the redundancy of said information of detection of weeds or leaf symptoms of deficiencies or diseases.
- the merger operation 8.5 can be a kriging operation, as described in the book "Lognormal-de Wijsian Geostatistics for Ore Evaluation", DG Krige, 1981, ISBN 978-0620030069, taking into account all of said detection information.
- the result is determined from the detection result obtained for this point by each of the detection systems.
- the result makes it possible to decide whether or not to apply a treatment at this point. For example, the result is compared with a predetermined threshold and, if the comparison is positive, the application of the treatment is ordered.
- the merger in question takes into account the quality of the detection.
- the result of the merger may include a map of the probability of the presence of the weed or of the leaf symptom of deficiency or disease, obtained from these individual maps. Therefore, inherently, each individual map carries information about the quality of detection, and the merged result takes this quality into account. For example, if, at a given location, one detection system determines a 90% probability of the presence of a leaf symptom of a certain disease, and another detection system determines a 30% probability of the presence of a leaf symptom of that same disease, then the quality of detection of at least one of the two detection systems is low, and the final result transcribes this quality of detection.
- the distance of each detection is also taken into account. Indeed, if in a given location, located close to the optical axis of a detection system, a 30% probability of the presence of a leaf symptom of a certain disease is determined, and another detection system, for which this same location is far from the optical axis, determines a probability of the presence of a leaf symptom of the same disease at 90%, a greater weight will be applied to the detection system facing the location studied during fusion.
- the merger operation 8.5 is an operation taking into account all the geolocated detection information of weeds or leaf symptoms of deficiencies or diseases, containing the detection probability information, coming from the plurality of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, as well as the lateral and temporal overlap information, in order to calculate consolidated probabilities of geolocated detections of weeds or leaf symptoms of deficiencies or diseases; said consolidation operation takes into account the probabilities of each geolocated detection of weeds or leaf symptoms of deficiencies or diseases.
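The axis-distance weighting described above can be sketched as follows; the inverse-distance weighting scheme is an assumed illustration, not the patent's formula (which may instead be a kriging operation):

```python
def fuse_detections(detections):
    """Consolidate overlapping geolocated detections of the same spot.
    Each entry is (probability, distance_to_optical_axis); detections
    made close to a sensor's optical axis get a larger weight."""
    num = den = 0.0
    for prob, axis_dist in detections:
        w = 1.0 / (1.0 + axis_dist)   # closer to the axis -> heavier weight
        num += w * prob
        den += w
    return num / den

# one sensor facing the spot (30 %), one seeing it far off-axis (90 %)
p = fuse_detections([(0.30, 0.0), (0.90, 4.0)])
print(round(p, 3))   # pulled toward the on-axis 30 % measurement
```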
- the localized detection information obtained for several spaced-apart times is merged as described above.
- This embodiment is, where appropriate, applicable to a single system for detecting weeds or leaf symptoms of deficiency or disease.
- the collaborative work is done from two detections spaced in time of the same system for detecting weeds or leaf symptoms of deficiency or disease.
- when the agricultural treatment control device comprises a single system for detecting weeds or leaf symptoms of deficiency or disease, it does not implement a communication system between detection systems for weeds or leaf symptoms of deficiency or disease. However, a communication system between the detection system for weeds or leaf symptoms of deficiency or disease and the treatment device remains necessary.
- Each of said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases continuously calculates the instantaneous speed of movement from the location information obtained by means of said location system.
- the speed information is necessary in order to estimate the control time of said at least one agricultural treatment device and to anticipate the treatment time according to said agricultural treatment device.
- the control device determines the treatment device(s) to be actuated, and the temporal characteristics (instant, duration, etc.) of this actuation.
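Estimating the instantaneous speed from two successive locations and deriving the actuation instant can be sketched as below; the function names, the fixed device latency and the geometry are illustrative assumptions:

```python
import math

def actuation_delay(pos_prev, pos_now, dt, dist_to_nozzle, latency):
    """Estimate instantaneous speed from two successive locations, then
    the time to wait before firing the treatment device so the target
    plant is in range, net of an assumed device latency."""
    speed = math.dist(pos_prev, pos_now) / dt          # m/s
    return max(dist_to_nozzle / speed - latency, 0.0)  # seconds until firing

# machine advancing 0.1 m per 0.05 s -> 2 m/s; nozzle 1 m behind the sensor
delay = actuation_delay((0.0, 0.0), (0.1, 0.0), dt=0.05,
                        dist_to_nozzle=1.0, latency=0.1)
print(delay)   # ~0.4 s
```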
- each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases estimates at each instant, and for each of said target plants 4 currently within range of said at least one treatment device 3, which of said at least one treatment device 3 is the most suitable for treating said target plant 4.
- the piloting commands are transmitted to said at least one agricultural treatment device by the means of communication between said at least two systems for detecting weeds or leaf symptoms of deficiencies or diseases and said at least one agricultural treatment device.
- the command 8.7 to be sent to each of said at least one agricultural treatment device 3 is a pressure and flow control taking into account the presence of a target plant at the present moment in the spraying zone of said spreading nozzle.
- the command 8.7 to be sent to each of said at least one agricultural treatment device 3 is a command for transverse and longitudinal offsets, and for the power of lighting taking into account the presence of a target plant at the present moment in the range of said LASER.
- the command 8.7 to be sent to each of said at least one agricultural treatment device 3 is a pressure and flow control taking into account the presence of a target plant at the present moment in the range zone of the high-pressure water injection nozzle.
- the command 8.7 to be sent to each of said at least one agricultural treatment device 3 is an activation command taking into account the presence of a target plant at the present moment in the zone of said mechanical hoeing weeding tool.
- the command 8.7 to be sent to each of said at least one agricultural treatment device 3 is an activation command taking into account the presence of a target plant at the moment present in the zone of said electric weeding tool.
- the acquired image is first projected in a given frame of reference, then the detection of weed or leaf deficiency symptom or disease is implemented for the projected image.
- provision could be made to start by producing an image of the probability of presence of a weed or of a leaf symptom of deficiency or disease from the raw acquired image, then to orthoproject it into the given frame of reference.
- the geolocation of each detection system is carried out independently, and the geolocated detections are merged in order to decide on the possible treatment.
- the geolocation of each detection system can be done collaboratively.
- said attitude information can be calculated from the raw information of the inertial units of all of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases. Said raw information from the inertial units is exchanged continuously by means of the communication system connecting said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, so the attitude estimation algorithm executed on each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases can use all the raw information.
- the estimates of roll, pitch and yaw are consolidated by a set of similar, coherent and covariant measurements.
- an extended Kalman filter can be used in each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, taking the data from the inertial units of all of said at least two detection systems of weeds or leaf symptoms of deficiencies or diseases.
- the document "Data Fusion Algorithms for Multiple Inertial Measurement Units", Jared B. Bancroft and Gérard Lachapelle, Sensors (Basel), 06/29/2011, 6771-6798, presents an alternative algorithm for fusing the raw data of a set of inertial measurement units to determine attitude information.
- said attitude information can be calculated from the raw information of the inertial units to which the geolocation data of all of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases are added. Said raw information from the inertial units, as well as the geolocation data, being exchanged by means of the communication system connecting said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, the attitude estimation algorithm can use all the raw information. For example, an extended Kalman filter can be used in each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases, taking the data from the inertial units as well as the geolocation data of all of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
- said communication system between said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases and said at least one agricultural treatment device 3 is a wired 1 Gigabit per second Ethernet network, thus allowing each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases to communicate with the other detection systems 2 of weeds or leaf symptoms of deficiencies or diseases as well as with said at least one agricultural treatment device 3.
- each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases locally builds a mapping of the peculiarities (or the presence of weeds or leaf symptoms of deficiencies or diseases) using a local geographic database.
- the geolocated detection information for the presence of weeds or leaf symptoms of deficiencies or diseases, detected by all of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases and exchanged by means of the communication system, is thus stored in each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
- each of said geographical databases stored locally in each of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases represents the real state and health of said agricultural plot 5 traversed, as measured by all of said at least two detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
- mapping information of the agricultural plot 5 traversed by said agricultural machine is transmitted by means of a communication system, and displayed on a control screen intended for the technician treating the agricultural plot 5.
- the communication system used to transmit the mapping information of the agricultural plot 5 to said control screen intended for the technician processing the agricultural plot 5, comprises a wired Gigabit Ethernet network.
- the communication system used to transmit the mapping information of the agricultural plot 5 to said control screen for the technician treating the agricultural plot 5 is a wired CAN ("Controller Area Network") bus.
- the mapping of the agricultural plot 5 finds an advantageous use in producing statistics of the spraying or treatments applied to said agricultural plot 5. Said statistics also make it possible to measure the prevalence, presence and quantity of certain species of weeds, as well as their densities and stages. The prevalence, presence and density of leaf symptoms of deficiency or diseases can also be calculated from the information contained in the mapping of the agricultural plot 5.
- each detection system communicates with neighboring detection systems, for collaborative treatment decision making.
- a central processor adapted to communicate, via the communication system, with the detection systems, to take a decision, and to communicate the processing instructions to the processing devices 3 via the communication system.
- a single detection system 2 of weeds or leaf symptoms of deficiencies or diseases makes a collaborative decision using information relating to other detection systems 2 of weeds or leaf symptoms of deficiencies or diseases.
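The items above describe a distributed architecture: each detection system keeps a local geographical database of geolocated detections, exchanges records with its peers over the wired network, and makes treatment decisions using the combined map. The following is a minimal Python sketch of that scheme, not the patented implementation; all names (`Detection`, `DetectionSystem`, `decide_treatment`), the 2 m decision radius and the density threshold are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    """One geolocated detection of a weed or a leaf symptom of deficiency/disease."""
    lat: float
    lon: float
    kind: str        # e.g. "weed:thistle" or "symptom:nitrogen-deficiency"
    density: float   # plants (or affected leaves) per square metre
    source_id: int   # which detection system produced the record

def _distance_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation: adequate at field scale.
    k = 111_320.0  # metres per degree of latitude
    dx = (lon2 - lon1) * k * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * k
    return math.hypot(dx, dy)

class DetectionSystem:
    """One of the >= 2 on-board detection systems, with its local geographical database."""
    def __init__(self, system_id):
        self.system_id = system_id
        self.database = []  # local copy of the plot map (list of Detection)

    def record(self, det):
        self.database.append(det)

    def exchange(self, peers):
        # Share own records with every peer (as over the wired Ethernet/CAN network);
        # after mutual exchange, each system holds the full map of the plot.
        for peer in peers:
            for det in self.database:
                if det not in peer.database:
                    peer.database.append(det)

    def prevalence(self, kind):
        """Fraction of stored detections matching `kind` (a simple plot statistic)."""
        if not self.database:
            return 0.0
        return sum(1 for d in self.database if d.kind == kind) / len(self.database)

    def decide_treatment(self, lat, lon, radius_m=2.0, threshold=1.0):
        """Collaborative decision: treat the position if the cumulative density of
        nearby detections, from *all* systems' records, exceeds the threshold."""
        total = sum(d.density for d in self.database
                    if _distance_m(lat, lon, d.lat, d.lon) <= radius_m)
        return total >= threshold

# Two systems each see part of the infestation; after the exchange,
# system 1 decides using system 2's detections as well as its own.
a, b = DetectionSystem(1), DetectionSystem(2)
a.record(Detection(48.00000, 2.00000, "weed:thistle", 0.8, 1))
b.record(Detection(48.00001, 2.00000, "weed:thistle", 0.6, 2))
a.exchange([b])
b.exchange([a])
treat = a.decide_treatment(48.00000, 2.00000)  # 0.8 + 0.6 >= 1.0
```

Neither detection alone crosses the threshold, so a non-collaborative system would skip the treatment; the shared database is what tips the decision.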
Abstract
Description
Claims
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1902497A FR3093614B1 (fr) | 2019-03-12 | 2019-03-12 | Dispositif collaboratif de contrôle de traitement agricole |
FR1908086A FR3093613A1 (fr) | 2019-03-12 | 2019-07-17 | Dispositif de contrôle de traitement agricole |
PCT/EP2020/056401 WO2020182840A1 (fr) | 2019-03-12 | 2020-03-10 | Dispositif de contrôle de traitement agricole |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3937628A1 true EP3937628A1 (fr) | 2022-01-19 |
Family
ID=69726596
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP20708129.0A Pending EP3937628A1 (fr) | 2019-03-12 | 2020-03-10 | Dispositif de contrôle de traitement agricole |
Country Status (3)
Country | Link |
---|---|
US (1) | US20220174934A1 (fr) |
EP (1) | EP3937628A1 (fr) |
WO (1) | WO2020182840A1 (fr) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220101554A1 (en) * | 2020-09-25 | 2022-03-31 | Blue River Technology Inc. | Extracting Feature Values from Point Clouds to Generate Plant Treatments |
US11793187B2 (en) * | 2021-01-15 | 2023-10-24 | Cnh Industrial America Llc | System and method for monitoring agricultural fluid deposition rate during a spraying operation |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2964577B1 (fr) | 2010-09-10 | 2013-12-27 | Exel Ind | Systeme de commande pour engin agricole de pulverisation |
AU2012228772A1 (en) | 2011-03-16 | 2013-10-17 | Syddansk Universitet | Spray boom for selectively spraying a herbicidal composition onto dicots |
US9609859B2 (en) * | 2013-09-13 | 2017-04-04 | Palo Alto Research Center Incorporated | Unwanted plant removal system having a stabilization system |
US10269107B2 (en) | 2017-02-23 | 2019-04-23 | Global Neighbor Inc | Selective plant detection and treatment using green luminance photometric machine vision scan with real time chromaticity operations and image parameter floors for low processing load |
FR3063206B1 (fr) | 2017-02-24 | 2021-08-13 | Bilberry Sas | Systeme de controle pour epandage agricole |
CN108990944B (zh) | 2018-06-27 | 2021-01-29 | 浙江大学 | 基于可见光热红外图像融合的无人机遥感喷药一体化方法及装置 |
2020
- 2020-03-10 WO PCT/EP2020/056401 patent/WO2020182840A1/fr unknown
- 2020-03-10 US US17/437,333 patent/US20220174934A1/en active Pending
- 2020-03-10 EP EP20708129.0A patent/EP3937628A1/fr active Pending
Also Published As
Publication number | Publication date |
---|---|
WO2020182840A1 (fr) | 2020-09-17 |
US20220174934A1 (en) | 2022-06-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10274420B2 (en) | Compact multifunctional system for imaging spectroscopy | |
Dekker et al. | Intercomparison of shallow water bathymetry, hydro‐optics, and benthos mapping techniques in Australian and Caribbean coastal environments | |
CA3083079A1 (fr) | Procede de caracterisation d'echantillons utilisant des reseaux de neurones | |
US20170154440A1 (en) | Method and system for photogrammetric processing of images | |
WO2020182840A1 (fr) | Dispositif de contrôle de traitement agricole | |
EP1828992A1 (fr) | Procede de traitement d'images mettant en oeuvre le georeferencement automatique d'images issues d'un couple d'images pris dans le meme plan focal | |
FR3069940B1 (fr) | Procede et systeme de cartographie de l’etat sanitaire de cultures | |
EP3714399A1 (fr) | Dispositif de détection hyperspectrale | |
WO2020127422A1 (fr) | Dispositif de détection hyperspectrale | |
FR3013878A1 (fr) | Analyse d'une image multispectrale | |
FR2982393A1 (fr) | Recherche d'une cible dans une image multispectrale | |
WO2019053364A1 (fr) | Dispositif de capture d'une image hyperspectrale | |
FR3093613A1 (fr) | Dispositif de contrôle de traitement agricole | |
WO2021234063A1 (fr) | Procede et systeme de controle de traitement agricole | |
Papadopoulos et al. | Weed mapping in cotton using ground-based sensors and GIS | |
EP3579186A1 (fr) | Procede et systeme pour la gestion d'une parcelle agricole | |
Okamoto et al. | Unified hyperspectral imaging methodology for agricultural sensing using software framework | |
FR3098962A1 (fr) | Système de détection d’une particularité hyperspectrale | |
WO2020165176A1 (fr) | Dispositif de microscopie holographique hyperspectrale par fusion de capteurs | |
WO2018109044A1 (fr) | Dispositif de détection d'un objet d'intérêt et procédé mettant en œuvre ce dispositif | |
Okamoto et al. | Weed detection using hyperspectral imaging | |
Pushparaj et al. | Reconstruction of hyperspectral images from RGB images | |
Mifdal | Application of optimal transport and non-local methods to hyperspectral and multispectral image fusion | |
Tardif | Proximal sensing and neural network processes to assist in diagnosis of multi-symptom grapevine diseases | |
Kåsen et al. | Band selection for hyperspectral target detection based on a multinormal mixture anomaly detection algorithm |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
17P | Request for examination filed |
Effective date: 20210913 |
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
17Q | First examination report despatched |
Effective date: 20231215 |