WO2023079063A1 - Method and system for collecting data on a field used for agriculture (Methode und System zur Datenerhebung auf einem agrarwirtschaftlich genutzten Feld)
- Publication number: WO2023079063A1 (application PCT/EP2022/080785)
- Authority: WO (WIPO PCT)
- Prior art keywords: weed, data, image analysis, flight, reference point
- Prior art date: 2021-11-08
Classifications
- G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning (G: Physics; G06: Computing; G06V: Image or video recognition or understanding)
- G06V10/82: Arrangements for image or video recognition or understanding using neural networks
- G06V20/10: Scenes; scene-specific elements; terrestrial scenes
- G06V20/17: Terrestrial scenes taken from planes or by drones
- G06V20/188: Vegetation
Definitions
- the present invention relates to a method and a system for data collection on an agricultural field by a combination of flight remote sensing and near-ground sensing, in particular for the detection of weeds.
- the present invention also includes a computer program product.
- the object is achieved by a method for data collection on an agricultural field through a combination of flight remote sensing and near-ground sensing, wherein a) in a first step, the geographic position of each reference point on the agricultural field is recorded by near-ground sensing and, for each reference point, at least one photograph of at least one weed on the agricultural field is taken, b) in a second step, flight remote sensing parameters are determined using the data of an image analysis of the photographs of the at least one weed for each reference point, and c) in a third step, at least the reference points on the agricultural field are recorded photographically by flight remote sensing, the flight remote sensing parameters being used at least partially for the flight remote sensing.
- in the method for collecting data on an agricultural field, reference data on one or more weeds are first collected by near-ground sensing.
- the reference data is used, among other things, to derive flight remote sensing parameters and thus ensure that flight remote sensing can deliver image data that is of high quality and is suitable, for example, for the creation of a weed distribution map. This procedure enables optimal acquisition of the flight remote sensing data and minimizes the collection of useless data. This leads to increases in efficiency and cost savings.
- the image analysis of the photograph in step b) is improved because exactly one weed is selected. This avoids, for example, detecting several weeds lying close together, for which the image analysis in step b) is more likely to produce incorrect results and consequential errors, for example in the determination of the flight remote sensing parameters.
- near-ground reconnaissance is carried out at at least 20 (twenty) reference points.
- the image analysis of the photographs for each reference point in the second step b) includes the determination of at least one weed and its size.
- the image analysis in the second step b) includes determining the weed species for the at least one weed.
- the flight remote sensing parameters are defined in the second step b) by first determining, from the smallest weed to be detected, the projected size of a single pixel on the ground (Ground Sampling Distance, GSD).
- the smallest weed to be detected is determined based on a size comparison of all weeds identified by the image analysis of the photographs for each reference point.
- this procedure makes it possible to identify weeds in an early stage of growth using remote aerial sensing, because the smallest weeds to be detected on the agricultural field are used as a basis for determining the remote aerial sensing parameters.
- the flight remote sensing parameters include the flight altitude and the camera characteristics and are determined based on the projected size of a single pixel on the ground (GSD) required to detect the smallest weed.
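- the relationship between GSD, flight altitude and camera characteristics is not spelled out in the text above; the following sketch therefore uses the standard photogrammetric relation GSD = sensor width × flight altitude / (focal length × image width in pixels), solved for the altitude, purely as an illustration. The function name and all numeric parameter values are assumptions, not taken from the patent.

```python
def flight_altitude_for_gsd(gsd_m: float,
                            focal_length_m: float,
                            sensor_width_m: float,
                            image_width_px: int) -> float:
    """Solve the standard photogrammetric relation
    GSD = sensor_width * altitude / (focal_length * image_width_px)
    for the flight altitude (all lengths in metres)."""
    return gsd_m * focal_length_m * image_width_px / sensor_width_m

# Hypothetical camera: 13.2 mm sensor width, 8.8 mm focal length, 5472 px image width.
# Target GSD of 1 mm/px for resolving the smallest weed to be detected.
altitude = flight_altitude_for_gsd(gsd_m=0.001,
                                   focal_length_m=0.0088,
                                   sensor_width_m=0.0132,
                                   image_width_px=5472)
print(f"Required flight altitude: {altitude:.1f} m")  # about 3.6 m for these example values
```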
- the image analysis of the photographic flight remote sensing data in the fourth step d) includes the determination of at least one weed.
- in a fourth step d), at least one weed distribution map for the agricultural field is created by means of an image analysis of the photographic flight remote sensing data.
- the accuracy of the at least one weed distribution map is determined by comparing the image analysis of the photographic flight remote sensing data and the image analysis of the ground proximity sensing data at the reference points.
- the comparison of the image analysis of the photographic flight remote sensing data and the image analysis of the near-ground sensing data at the reference points is carried out in the fourth step d) by checking whether, at the same geographic position of a reference point, at least one weed has been detected both in the image analysis of the photographic flight remote sensing data and in the image analysis of the photographic near-ground sensing data.
- a further embodiment relates to a system for data collection on an agricultural field by a combination of flight remote sensing and near-ground sensing, comprising: at least one measuring rod; a receiving unit; a computing unit; and an output unit; wherein, with the aid of the at least one measuring rod, the geographic position of individual reference points on an agricultural field is recorded by near-ground sensing and at least one photograph of at least one weed on the agricultural field is taken for each reference point, wherein the data from the reference points are made available to the computing unit via the receiving unit, wherein the computing unit is configured to perform an image analysis of the photographic data from the respective reference points and to determine at least one weed for each reference point, wherein the computing unit is configured to determine flight remote sensing parameters based on the image analysis, and wherein the output unit is configured at least to display, output or store in a data memory the information from the computing unit relating to the determined flight remote sensing parameters.
- Another embodiment relates to a computer program product for controlling the system described above, which when executed by a processor is configured to carry out the method described above.
- a further embodiment relates to a measuring rod for data collection by near-ground sensing on an agricultural field, comprising: at least one rod; a sensor for determining the geographical position of individual reference points on the agricultural field; a camera for photographing at least one weed for each reference point; and an output unit; wherein the sensor for determining the geographical position and the camera are positioned on the rod such that, at a reference point, the geographical position can be determined and the photograph taken at the same time.
- such a measuring rod makes it possible to collect the necessary data for a reference point on the agricultural field quickly and accurately. Measurement errors or inaccuracies can thus be minimized or ruled out.
- FIG. 1 schematically shows step a) of the method for data collection on an agricultural field.
- FIG. 2 schematically shows step b) of the method for data collection on an agricultural field.
- FIG. 3 schematically shows step c) of the method for data collection on an agricultural field.
- FIG. 4 schematically shows step d) of the method for data collection on an agricultural field and in particular the creation of at least one weed distribution map.
- FIG. 5 schematically shows step d) of the method for data collection on an agricultural field and in particular the determination of the accuracy of the at least one weed distribution map.
- FIG. 6 shows specific examples of determining the accuracy of the at least one weed distribution map.
- FIG. 7 schematically shows a system for data collection on an agricultural field.
- FIG. 8 shows a schematic of three possible embodiments of a measuring rod for data collection by ground probing on a field used for agriculture.
- Figures 1 to 3 show a schematic of a method 10 for data collection on an agricultural field by a combination of flight remote sensing and near-ground sensing, where a) in a first step, the geographic position of each reference point on the agricultural field is recorded by near-ground sensing and, for each reference point, at least one photograph of at least one weed on the agricultural field is taken, b) in a second step, flight remote sensing parameters are determined using the data of an image analysis of the photographs of the at least one weed for each reference point, and c) in a third step, at least the reference points on the agricultural field are photographed by flight remote sensing, wherein the flight remote sensing parameters determined in step b) are used at least partially for the flight remote sensing.
- the method of data collection on the agricultural field includes the detection of weeds by a combination of long-distance aerial and ground-level sensing.
- FIG. 1 shows a schematic of step a) of method 10.
- Data are collected at reference points 12 on the field 11 used for agriculture by means of near-ground reconnaissance.
- the agricultural field 11 is shown in Figure 1 from a bird's eye view.
- the geographic position is recorded for each reference point 12 .
- at least one photograph 14 of at least one weed 13 on the agriculturally used field 11 is made for each reference point 12 .
- a measuring rod 300 can be used for this data collection.
- the measuring rod comprises, for example, a sensor 320 for determining the geographic position of individual reference points 12 and a camera 330 for photographically capturing 14 at least one weed 13 for each reference point 12.
- for example, data are collected at twenty (20) reference points 12 on the agricultural field 11, so that the data collection comprises twenty photographs and the respective geographic position for each photograph.
- At least one reference point is chosen on which a weed is growing.
- At least one reference point is chosen on which a single weed plant is growing.
- the geographic position is determined by a positioning system.
- a known positioning system is a satellite navigation system such as NAVSTAR GPS, GLONASS, Galileo or Beidou. Since the abbreviation GPS (Global Positioning System) has established itself in everyday language as a generic term for all satellite navigation systems, the term GPS is used below as a collective term for all positioning systems.
- preferably, an RTK (Real Time Kinematic) GPS position determination system is used.
- Accuracies of 1 to 2 cm are achieved.
- the coordinates of the points can be calculated in real time after initialization.
- near-ground sensing is carried out at at least 20 (twenty), preferably 30 (thirty) and even more preferably 50 (fifty) reference points.
- At least one photographic recording of the agricultural field is made for each reference point with the same working distance and preferably with the same camera properties.
- camera characteristics relate to sensor size, sensor resolution, and/or (preferably "and") focal length.
- the geographical position is determined and the photograph is taken at a reference point at the same time.
- ground-level reconnaissance refers to surveying the agricultural field and collecting data in a low-level area, for example at a distance of no more than two meters from the ground, preferably no more than one meter.
- the term "reference point" refers to a narrowly defined area on an agricultural field.
- the reference points can be chosen at random. There just has to be at least one weed growing at a reference point.
- a geographical position can be determined for each reference point. For example, a reference point covers an area of 20 cm², preferably 10 cm² and even more preferably 5 cm².
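- to make the data collected in step a) concrete, the following minimal sketch shows one possible record structure for a single reference point. The field names and types are illustrative assumptions only; the method does not prescribe any particular data format.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ReferencePointRecord:
    """One near-ground record as collected in step a): an RTK-GPS position
    plus at least one photograph of at least one weed at that point."""
    point_id: int
    latitude: float        # degrees, e.g. from an RTK-GPS receiver
    longitude: float       # degrees
    photo_path: str        # path to the 2D RGB photograph of the weed
    captured_at: datetime  # position and photograph are captured simultaneously

# e.g. at least twenty such records are collected per field
records: list[ReferencePointRecord] = []
```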
- a “photograph” means data capture with a camera, for example, in 2D.
- the camera includes an image sensor capable of capturing individual weeds in the field with good resolution.
- a camera of a mobile phone can be used.
- the camera is configured to capture photographs in the visible wavelength range.
- the camera is configured to capture color (RGB) information.
- weeds refers to plants that occur as spontaneous companion vegetation in the agricultural field, which are not intentionally cultivated there and which develop from the seed potential of the soil, from root suckers or from the influx of seeds.
- Weeds can be monocotyledonous or dicotyledonous plants.
- FIG. 2 schematically shows step b) of the method for data collection on an agricultural field.
- Flight remote sensing parameters 16 are determined on the basis of an image analysis 15 of the photographs 14 of all reference points 12 .
- the image analysis of the photographs for each reference point in the second step b) includes the determination of at least one weed and its size.
- determining the size of the at least one weed includes determining the area and diameter of the at least one weed.
- the image analysis includes determining the weed species for the at least one weed.
- the image analysis includes determining the BBCH growth stage for the at least one weed.
- BBCH growth stage is preferably visually determined by image analysis.
- the BBCH code (or the BBCH scale) provides information about the morphological stage of development of a plant.
- the at least one weed and its properties are determined by means of instance segmentation, preferably using artificial intelligence and more preferably using a convolutional neural network and even more preferably a "Region Based Convolutional Neural Network" (R-CNN).
- Instance segmentation and the use of R-CNN to determine weeds are known to those skilled in the art; see, for example, Julien Champ et al., "Instance segmentation for the fine detection of crop and weed plants by precision agricultural robots", Applications in Plant Sciences 2020, 8(7): e11373.
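- the text names instance segmentation with an R-CNN-type network but does not fix an implementation. The following sketch shows one possible approach using torchvision's Mask R-CNN; the pretrained COCO weights and the image file name are placeholders, and a real system would be trained on annotated weed photographs so that the classes correspond to weed species.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Generic pretrained Mask R-CNN as a stand-in for a weed-specific model.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("reference_point_photo.jpg").convert("RGB")  # hypothetical file
with torch.no_grad():
    prediction = model([to_tensor(image)])[0]

# Each detected instance has a mask, a bounding box and a score; the mask area
# (in pixels) and the box extent can serve as size measures for a detected weed.
for mask, box, score in zip(prediction["masks"], prediction["boxes"], prediction["scores"]):
    if score < 0.5:
        continue
    area_px = (mask[0] > 0.5).sum().item()
    x1, y1, x2, y2 = box.tolist()
    diameter_px = max(x2 - x1, y2 - y1)
    print(f"instance: area={area_px} px, diameter={diameter_px:.0f} px, score={score.item():.2f}")
```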
- the flight remote sensing parameters are set in step b) by first determining, from the smallest weed to be detected, the projected size of a single pixel on the ground (Ground Sampling Distance, GSD).
- the size comparison of all identified weeds also takes into account the weed species and preferably also the BBCH growth stage of each weed.
- the value for the smallest weed to be detected is thresholded or adjusted to a threshold.
- a threshold value of 2 cm can be determined for certain weeds such as thistles, because the plants cannot be recognized by image analysis of flight remote sensing data if the plants are less than 2 cm in size.
- the flight remote sensing parameters include the flight altitude and the camera characteristics, and these are determined based on the projected size of a single pixel on the ground (GSD) required to detect the smallest weed.
- the camera characteristics include sensor size, sensor resolution, and/or (preferably "and") focal length.
- the flight remote sensing parameters include the geographic location of the reference points.
- flight remote sensing parameters are determined that ensure weed detection while maximizing the area performance of remote sensing.
- the following formula is used: "Plant size in mm / 2 * Correction factor for light * Correction factor for flight conditions".
- the correction factor for light takes into account, for example, the time of day/season or the weather conditions (sunny, slightly cloudy, etc.).
- the correction factor for flight conditions takes into account, for example, unsteady wind conditions that affect the camera shutter speed, the overlapping of the photographic recordings or the flight speed.
- the correction factors thus compensate for the image blur.
- the adjustment can also be made in advance immediately before the flight based on the weather forecast at the location.
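- as a minimal sketch, the formula quoted above can be written out as follows. The text does not state explicitly which quantity the formula yields; reading it as a target GSD in mm per pixel is an interpretation, and the example correction factor values (0.9 for slightly cloudy light, 0.8 for gusty wind) are purely illustrative assumptions.

```python
def target_gsd_mm(plant_size_mm: float,
                  light_correction: float = 1.0,
                  flight_correction: float = 1.0) -> float:
    """Formula quoted in the text: plant size in mm / 2 * correction factor
    for light * correction factor for flight conditions."""
    return plant_size_mm / 2 * light_correction * flight_correction

# Smallest weed to be detected: 20 mm (e.g. the 2 cm threshold mentioned for thistles),
# with assumed corrections for slightly cloudy light and gusty wind.
print(target_gsd_mm(20.0, light_correction=0.9, flight_correction=0.8))  # 7.2 mm per pixel
```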
- FIG. 3 schematically shows step c) of the method for data collection on an agricultural field. At least the reference points 12 are recorded photographically 14 on the agricultural field 11 by remote aerial survey.
- the aircraft 17 is shown by way of example as a drone with a camera, which can be used for remote aerial reconnaissance.
- the flight remote sensing parameters 16 determined in step b) are used at least partially for the flight remote sensing.
- the entire area encompassed by the reference points is captured photographically.
- the entire agricultural field is photographed by aerial remote sensing.
- the remote flight reconnaissance parameters determined in step b), such as the flight altitude and the camera properties, are used at least partially for the remote reconnaissance.
- there are further flight remote sensing parameters, such as the choice of aircraft, the flight route, etc., that must be taken into account.
- At least one unmanned aerial vehicle (UAV) is used in step c) for flight remote sensing.
- Several aircraft can also be used.
- cameras integrated in aircraft or cameras that can be attached to aircraft are used for the photographic recordings of flight remote sensing.
- the use of a high-resolution camera sensor is particularly important for this.
- the image analysis of the photographic flight remote sensing data in the fourth step d) includes the identification of at least one weed.
- the image analysis of the photographic flight remote sensing data in the fourth step d) includes determining the size of the at least one weed.
- determining the size of the at least one weed includes determining the area and diameter of the at least one weed.
- the image analysis of the photographic flight remote sensing data in the fourth step d) includes determining the weed species for the at least one weed.
- the image analysis of the photographic flight remote sensing data in the fourth step d) includes the determination of the BBCH growth stage of the at least one weed.
- the determination of the at least one weed and its properties is performed by means of instance segmentation, preferably using artificial intelligence, more preferably using a convolutional neural network and in particular an R-CNN. As described above, such methods are known to those skilled in the art.
- Figure 4 schematically shows step d) of the method for data collection on an agricultural field 11 and in particular the creation of at least one weed distribution map 18.
- the at least one weed distribution map 18 for the agricultural field 11 is created by means of an image analysis 19 of the photographic flight remote sensing data 20.
- the aircraft 17, shown as a drone in Figure 4, flies over the agricultural field 11 (see dashed line 25, which shows the flight route as an example).
- the photographic recording 20 of the entire field 11 takes place. Overlapping photographic recordings 20 are preferably made, which can be used for georeferencing and, if necessary, for orthorectification of the image data.
- the at least one weed distribution map 18 is created on the basis of georeferenced and preferably orthorectified photographic flight remote sensing data 20 and the image analysis 19, in which at least the weeds 13 on the agricultural field 11 are determined.
- the image analysis 19 preferably also determines the size of the weeds 13 and in particular also the weed species of the individually identified weeds or the BBCH growth stage.
- the weed distribution map 18 shows at least those areas 21 on the agricultural field 11 in which weeds 13 grow (shown as hatched areas 21 in FIG. 4).
- the white areas in the weed distribution map 18 in FIG. 4 show examples of regions on the agricultural field where no weeds were growing (or were too small) at the time the data was collected.
- the weed distribution map 18 can also display more detailed data, such as the distribution and occurrence of different weed species, the reference points 12 or the size or the BBCH growth stage of the individual weeds. Combinations of these data can also be displayed.
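- the text does not prescribe a data format for the weed distribution map. The following sketch rasterises georeferenced weed detections into a simple per-cell count grid as one possible representation; the use of local field coordinates in metres and a 1 m cell size are assumptions.

```python
import numpy as np

def weed_distribution_map(detections_xy: list[tuple[float, float]],
                          field_width_m: float,
                          field_height_m: float,
                          cell_size_m: float = 1.0) -> np.ndarray:
    """Count weed detections per grid cell; a cell count > 0 corresponds to a
    hatched (weed-infested) area in the schematic weed distribution map."""
    cols = int(np.ceil(field_width_m / cell_size_m))
    rows = int(np.ceil(field_height_m / cell_size_m))
    grid = np.zeros((rows, cols), dtype=int)
    for x, y in detections_xy:
        col = min(int(x // cell_size_m), cols - 1)
        row = min(int(y // cell_size_m), rows - 1)
        grid[row, col] += 1
    return grid

# Hypothetical detections on a 10 m x 10 m excerpt of the field.
print(weed_distribution_map([(1.2, 3.4), (1.4, 3.1), (8.9, 0.2)], 10.0, 10.0))
```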
- FIG. 5 schematically shows step d) of the method for data collection on an agricultural field 11 and in particular the determination of the accuracy of the at least one weed distribution map.
- the accuracy of the at least one weed distribution map is determined by comparing the image analysis of the photographic remote flight detection data 27 and the image analysis of the ground proximity detection data 26 at the reference points 12 .
- the comparison of the image analysis of the photographic flight remote sensing data 27 and the image analysis of the near-ground sensing data 26 at the reference points 12 is carried out in the fourth step d) by checking whether, at the same geographic position of a reference point 12, at least one weed 13 has been recognized both in the image analysis of the photographic flight remote sensing data 27 and in the image analysis of the near-ground sensing data 26.
- in Figure 5, reference number 28 marks a scenario in which a weed was detected in both image analyses.
- reference number 29 marks a scenario in which a weed was detected only in the image analysis of the near-ground sensing data 26, but not in the image analysis of the flight remote sensing data 27. It is also possible that weeds are detected in both image analyses (26 and 27), but that the weeds are of different weed species.
- the at least one weed distribution map for the agricultural field is used for the site-specific application of at least one weed control agent.
- All known biologically and/or chemically based herbicides can be used as weed control agents.
- the site-specific application of a weed control agent according to the weed distribution map is carried out by a tractor with a crop protection sprayer.
- the determination of the weed species in step d) can be used to determine which at least one weed control agent to use.
- different herbicides may be used for different weeds.
- the agricultural field is planted with an arable crop preferably selected from the group consisting of corn, sugar beet and soybeans.
- the accuracy of the at least one weed distribution map in the fourth step d) is sufficient if, over all reference points recorded by near-ground sensing, at least one weed has been detected both in the image analysis of the photographic flight remote sensing data and in the image analysis of the near-ground sensing data in at least 95% (preferably at least 96.5% and even more preferably 98%) of the comparisons at the geographic positions of the reference points.
- preferably, at least one weed of the same weed species is recognized in both image analyses.
- FIG. 6 shows specific examples of determining the accuracy of the at least one weed distribution map.
- the left-hand side of FIG. 6 shows an example a) in which near-ground detection data and remote flight detection data were collected at twenty reference points and at least one weed distribution map was determined using the method described.
- the comparisons of the image analysis of the photographic flight remote sensing data 27 with the image analysis of the near ground sensing data 26 for each reference point showed that at least one weed was detected at 19 reference points in both data sets.
- at one reference point, a weed was detected only in the image analysis of the near-ground sensing data 26, but not in the image analysis of the flight remote sensing data 27.
- a further example b) is shown on the right-hand side of FIG. 6, in which near-ground detection data and remote flight detection data were collected at twenty reference points and at least one weed distribution map was determined using the method described.
- the comparison of the image analysis of the photographic flight remote sensing data 27 with the image analysis of the near ground sensing data 26 for each reference point showed that at least one weed was detected at 18 reference points in both data sets.
- at two reference points, a weed was detected only in the image analysis of the near-ground sensing data 26, but not in the image analysis of the flight remote sensing data 27.
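- the accuracy check from step d) reduces to a simple ratio over the reference points. The following sketch, with the function name and return format as assumptions, reproduces the two examples: in example a) 19 of 20 matches give 95%, which meets the 95% criterion, while in example b) 18 of 20 matches give 90%, which does not.

```python
def map_accuracy(matched_points: int, total_points: int,
                 threshold: float = 0.95) -> tuple[float, bool]:
    """Fraction of reference points at which at least one weed was detected in
    both the near-ground and the flight remote sensing image analysis, and
    whether the 95% accuracy criterion is met."""
    accuracy = matched_points / total_points
    return accuracy, accuracy >= threshold

print(map_accuracy(19, 20))  # example a): (0.95, True)
print(map_accuracy(18, 20))  # example b): (0.9, False)
```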
- FIG. 7 schematically shows a system 100 for data collection on an agricultural field by a combination of long-distance flight and near-ground detection.
- the system comprises at least one measuring rod 110, a receiving unit 120, a computer unit 130 and an output unit 140.
- with the aid of the at least one measuring rod, the geographic position of individual reference points on an agricultural field is recorded by near-ground sensing, and at least one photograph of at least one weed on the agricultural field is taken for each reference point.
- the data from the reference points are made available to the computer unit via the receiving unit.
- the computer unit is configured to carry out an image analysis of the photographic data from the respective reference points and to determine at least one weed for each reference point.
- the computing unit is further configured to determine flight remote sensing parameters based on the image analysis.
- the output unit is configured at least to display, output or store in a data memory the information from the computer unit relating to the determined flight remote sensing parameters.
- the system includes the measuring rod described in more detail with reference to FIG. 8.
- the data is transmitted from the measuring rod 110 to the receiving unit 120 by various transmission techniques known per se to those skilled in the art, such as via cable or wirelessly, for example via networks such as PAN (e.g. Bluetooth), LAN (e.g. Ethernet), WAN (e.g. ISDN), GAN (e.g. the Internet), LPWAN or LPN (e.g. SigFox, LoRaWAN, etc.), cellular networks or others.
- the system includes a receiving unit, a computer unit and an output unit. It is conceivable that the units mentioned are part of a single computer system; however, it is also conceivable for the units mentioned to be components of a number of separate computer systems which are connected to one another via a network in order to transmit data and/or control signals from one unit to another unit. It is possible, for example, for the computer unit to be in the “cloud” and for the analysis steps described in this application to be carried out by this computer unit in the “cloud”.
- a "computer system” is an electronic data processing system that processes data using programmable calculation rules. Such a system typically includes a "computer,” the unit that includes a processor for performing logical operations, and peripherals.
- peripherals are all devices that are connected to the computer and are used to control the computer and/or as input and output devices. Examples of this are monitors (screens), printers, scanners, mice, keyboards, drives, cameras, microphones, loudspeakers, etc. Internal connections and expansion cards are also considered peripherals in computer technology.
- Today's computer systems are often divided into desktop PCs, portable PCs, laptops, notebooks, netbooks and tablet PCs and so-called handhelds (e.g. smartphones); all of these systems can be used for execution.
- the computer unit 130 is configured to carry out step b) of the method, which is described in detail above, including all preferred embodiments thereof.
- the system includes at least one aircraft 150.
- the aircraft preferably includes a data receiving and transmitting unit.
- the aircraft is preferably at least one unmanned aerial vehicle (UAV).
- the output unit is configured to transmit at least the information from the computer unit relating to the specific flight remote sensing parameters to the at least one aircraft using the transmission technologies described above and known to those skilled in the art.
- the at least one aircraft 150 is configured to carry out step c) of the method - including all preferred embodiments thereof - described in detail above.
- the flight remote sensing data are made available by the aircraft 150 to the computer unit 130 via the receiving unit 120 .
- the computing unit 130 is configured to perform step d) of the method - including all preferred embodiments thereof - detailed above.
- the computer unit can carry out the image analysis of the flight remote sensing data, create at least one weed distribution map of the agricultural field and/or check the accuracy of the weed distribution map.
- the weed distribution map generated by the computer unit 130 is made available by the output unit 140 to the receiving unit of a tractor with a crop protection sprayer, preferably after checking the accuracy of the weed distribution map.
- the tractor with the crop protection sprayer is configured to carry out a site-specific application of a weed control agent according to the weed distribution map on the agricultural field.
- another embodiment relates to a storage medium storing the computer program product.
- FIG. 8 schematically shows three possible embodiments a) to c) of a measuring rod 300 for collecting data by probing the ground in a field used for agriculture.
- the measuring rod 300 includes at least one rod 310; a sensor 320 for determining the geographical position of individual reference points on an agricultural field; a camera 330 for photographing at least one weed for each reference point; and an output unit 340.
- the sensor for determining the geographical position and the camera are positioned on the rod so that, at a reference point, the geographical position can be determined and the photograph taken at the same time.
- the rod 310 is a plumb rod (surveying pole).
- the sensor 320 is a positioning system and more particularly a satellite navigation system such as NAVSTAR GPS, GLONASS, Galileo or Beidou.
- an RTK (Real Time Kinematic) GPS position determination system is particularly preferred.
- camera 330 includes an image sensor capable of capturing individual weeds in the field with good resolution.
- a camera of a mobile phone can be used.
- the camera is configured to capture photographs in the visible wavelength range.
- the camera is configured to capture color (RGB) information.
- the camera captures photographs in 2D.
- the output unit 340 comprises a transmission unit.
- the transmission unit is configured to transmit the data from the sensor 320 and/or (preferably "and") the camera 330 to other devices via the transmission techniques described above, which are known per se, such as via cable or wirelessly.
- the camera is located at the bottom of the rod 310, preferably at right angles to the rod (as shown in Figure 8a)). In this position, the camera can photograph the at least one weed from above (in the downward extension of the vertical direction of the stick, nadir position).
- the measuring stick 300 includes a camera mount 350.
- the camera mount is configured to fix the camera to the stick firmly, but preferably reversibly.
- the measuring rod 300 includes a laser pointer 360 (see also Figure 8b) and Figure 8c)).
- the laser pointer is configured to illuminate the at least one weed on the agricultural field. This ensures that the geographical position and the photographic recording can be synchronized at exactly the same location.
- Sensor 320, camera 330 and laser pointer 360 are therefore preferably synchronized.
- the camera 330 is located at the bottom on the side of the rod 310.
- the recording area of the camera is shown with dashed lines.
- the laser pointer 360 is also located at the bottom of the wand. Its laser light (dashed line) is aimed at the center of the camera's 330 field of view.
- the point where the laser illuminates the at least one weed, i.e. where the geographic position is to be determined, is slightly offset from the vertical axis of the measuring rod (see 321). This difference between the geographical position of the weed to be detected and that of the sensor 320 is compensated for via a correction factor when determining the exact geographical position of the weed to be detected.
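- one simple way to apply such a correction is sketched below: the RTK position measured at the sensor 320 is shifted by the known horizontal offset between the rod axis and the laser spot. The planar small-offset approximation, the fixed offset length and the rod heading are assumptions; the text only speaks of a correction factor.

```python
import math

def corrected_weed_position(sensor_lat: float, sensor_lon: float,
                            offset_m: float, heading_deg: float) -> tuple[float, float]:
    """Shift the position measured at the sensor by a horizontal offset
    (metres) in the given compass heading (degrees) to obtain the position
    of the laser-marked weed; valid for small offsets."""
    dx = offset_m * math.sin(math.radians(heading_deg))   # east component, metres
    dy = offset_m * math.cos(math.radians(heading_deg))   # north component, metres
    dlat = dy / 111_320.0                                  # metres to degrees latitude
    dlon = dx / (111_320.0 * math.cos(math.radians(sensor_lat)))
    return sensor_lat + dlat, sensor_lon + dlon

# Hypothetical: laser spot 0.30 m in front of the rod, rod facing due north.
print(corrected_weed_position(52.0, 13.0, offset_m=0.30, heading_deg=0.0))
```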
- FIG. 8c shows an embodiment in which the camera 330 is fastened laterally (and preferably at right angles to the rod) further up on the rod 310 with a camera holder 350.
- RTK GPS surveying poles are known in the prior art (e.g. ProMark 220 GNSS Ashtech from Spectra, GeoMax Zenitz 35 pro from Geometra), but they are not suitable for photographically recording at least one weed at a reference point on an agricultural field and measuring the geographical position at the same time. With the known measuring rods, such data collection takes place sequentially at best, which can lead to measurement errors.
- the measuring rod described in this application addresses this problem and offers a solution that is significantly less error-prone.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2022381713A AU2022381713A1 (en) | 2021-11-08 | 2022-11-04 | Method and system for collecting data on a field used for agriculture |
EP22812641.3A EP4430578A1 (de) | 2021-11-08 | 2022-11-04 | Methode und system zur datenerhebung auf einem agrarwirtschaftlich genutzten feld |
CA3237567A CA3237567A1 (en) | 2021-11-08 | 2022-11-04 | Method and system for collecting data on a field used for agriculture |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21206921.5 | 2021-11-08 | ||
EP21206921 | 2021-11-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023079063A1 true WO2023079063A1 (de) | 2023-05-11 |
Family
ID=78592489
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2022/080785 WO2023079063A1 (de) | 2021-11-08 | 2022-11-04 | Methode und system zur datenerhebung auf einem agrarwirtschaftlich genutzten feld |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4430578A1 (de) |
AU (1) | AU2022381713A1 (de) |
CA (1) | CA3237567A1 (de) |
WO (1) | WO2023079063A1 (de) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200193589A1 (en) * | 2018-12-10 | 2020-06-18 | The Climate Corporation | Mapping field anomalies using digital images and machine learning models |
US20210068335A1 (en) * | 2018-05-06 | 2021-03-11 | Weedout Ltd. | Methods and systems for weed control |
WO2021062459A1 (en) * | 2019-10-04 | 2021-04-08 | Single Agriculture Pty Ltd | Weed mapping |
Non-Patent Citations (2)
Title |
---|
JULIEN CHAMP ET AL.: "Instance segmentation for the fine detection of crop and weed plants by precision agricultural robots", APPLICATIONS IN PLANT SCIENCES, vol. 8, no. 7, 2020, pages e11373
MARTIN WEIS ET AL: "Qualitative und quantitative Messung der Verunkrautung in Kulturpflanzenbeständen mittels Bildanalyse (Qualitative and quantitative measurement of weed distribution in crops using image processing)", BORNIMER AGRARTECHNISCHE BERICHTE, 1 May 2008 (2008-05-01), pages 67 - 74, XP055299413 * |
Also Published As
Publication number | Publication date |
---|---|
AU2022381713A1 (en) | 2024-05-09 |
CA3237567A1 (en) | 2023-05-11 |
EP4430578A1 (de) | 2024-09-18 |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22812641; Country of ref document: EP; Kind code of ref document: A1
WWE | Wipo information: entry into national phase | Ref document number: AU2022381713; Country of ref document: AU
ENP | Entry into the national phase | Ref document number: 3237567; Country of ref document: CA
REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112024008578; Country of ref document: BR
ENP | Entry into the national phase | Ref document number: 2022381713; Country of ref document: AU; Date of ref document: 20221104; Kind code of ref document: A
WWE | Wipo information: entry into national phase | Ref document number: 2022812641; Country of ref document: EP
ENP | Entry into the national phase | Ref document number: 2022812641; Country of ref document: EP; Effective date: 20240610
ENP | Entry into the national phase | Ref document number: 112024008578; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20240430