WO2019063255A1 - Method for recognizing a leaf edge, method for the targeted treatment of plants by means of a leaf treatment agent, and use of an event-based image sensor for the recognition of a leaf edge - Google Patents
- Publication number
- WO2019063255A1 (PCT/EP2018/073932)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- event
- plant
- polarity
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/194—Terrestrial scenes using hyperspectral data, i.e. more or other wavelengths than RGB
Definitions
- Method for detecting a leaf edge, method for the targeted treatment of plants with a foliar treatment agent, and use of an event-based image sensor for detecting a leaf edge
- The present invention relates to a method for detecting a leaf edge, a method for the targeted treatment of plants with a foliar treatment agent, a use of an event-based image sensor for detecting a leaf edge, a computer program, a machine-readable storage medium and an electronic control unit.
- In a single image with direct sunlight and shadow, the dynamic range is very high.
- NDVI Normalized Difference Vegetation Index
- The NDVI is calculated as the difference between the reflectance in the near infrared and the reflectance in the red visible range, divided by the sum of these reflectance values.
- The NDVI is the most commonly used vegetation index and was historically defined for remote sensing tasks.
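The NDVI definition above can be expressed as a short calculation. A minimal sketch in Python; the reflectance values used below are illustrative examples, not figures from the patent:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    return (nir - red) / (nir + red)

# Illustrative reflectances: vegetation reflects little red and much NIR,
# while dry soil reflects both at a similar level.
vegetation = ndvi(nir=0.50, red=0.05)
dry_soil = ndvi(nir=0.30, red=0.25)

print(round(vegetation, 3))  # 0.818 - clearly positive for vegetation
print(round(dry_soil, 3))    # 0.091 - close to zero for soil
```

Because vegetation and soil map to well-separated NDVI values, a threshold on this index distinguishes plant-covered from bare areas.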
- DVS Dynamic Vision Sensor
- Unlike classic CMOS image sensors, the Dynamic Vision Sensor does not capture images at equidistant time intervals; instead, it measures intensity differences, so-called events, at individual pixel locations and transmits them instantaneously with microsecond accuracy and millisecond latency. If the intensity at a pixel does not change, or changes only slightly, no event is triggered and no data is sent for that pixel.
- The change sensor detects an event, for example a change by a certain percentage or by a certain amount; other definitions of change are possible.
- The change sensor is configured to determine intensity changes at a pixel from a time tn1 to a time tn2 as intensity data.
- tn2 − tn1 = dt2, where dt2 ≤ 10⁻⁴ seconds.
- The time interval dt2 results from the points in time of an intensity change detected by the change sensor.
- The times tn1 and tn2 and the time interval dt2 are preferably variable and/or not necessarily constant for different intensity changes.
- The time interval dt2 is variable and results from the detected intensity changes in a cell and/or from the transmission rate of the change sensor.

Disclosure of the invention
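The change-detection principle described above can be sketched as follows: an event is emitted only when the intensity change at a pixel exceeds a threshold. The log-intensity formulation and the threshold value are common DVS conventions assumed here for illustration, not specified by the patent:

```python
import math

def detect_event(i_prev: float, i_curr: float, threshold: float = 0.15):
    """Return +1 (ON event), -1 (OFF event) or None, based on the
    log-intensity change between two sample times at one pixel."""
    delta = math.log(i_curr) - math.log(i_prev)
    if delta > threshold:
        return +1   # intensity increased: ON event
    if delta < -threshold:
        return -1   # intensity decreased: OFF event
    return None     # change too small: no event, no data sent

print(detect_event(100.0, 130.0))  # 1    (log(1.3) ≈ 0.26 > 0.15)
print(detect_event(100.0, 101.0))  # None (log(1.01) ≈ 0.01, below threshold)
print(detect_event(130.0, 100.0))  # -1
```

Pixels whose intensity stays (nearly) constant produce no output at all, which is what keeps the data rate low.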
- The leaf edge detection method comprises, as a first step, acquiring events of an image of a plant or of vegetation.
- Such an image of plant-covered soil usually has mainly two types of areas. On the one hand, there are areas in which the color green predominates, which comes from the chlorophyll of the plants.
- On the other hand, there are areas in which a brownish color from the soil predominates.
- A plant is to be understood as a plant with leaves that contain chlorophyll.
- Vegetation has a very low reflection coefficient in the visible red range.
- The reflection coefficient of dry soil changes less strongly from the red to the NIR range.
- The corresponding value of the reflection coefficient is about 25% in the red range and about 30 to 35% in the NIR range.
- In the NIR range, the reflection coefficient of the vegetation is significantly higher than the reflection coefficient of the dry soil.
- In the red range, the reflection coefficient of the dry soil is usually larger than that of the vegetation; towards the NIR range, however, the reflection coefficient of the vegetation is greater than that of the dry soil.
- The method uses a ground resolution between 0.5 and 1 mm per pixel for the determination of a leaf edge.
- The advantage of this ground resolution is that it allows the so-called leaf edge curve to be detected.
- This type of detection is very fast and allows a high dynamic range, which ensures a reliable analysis of the leaf edge curve in real time.
- a bandpass filter of a first frequency range is arranged in front of a first pixel of the image sensor and a bandpass filter of a second frequency range is arranged in front of a second pixel adjacent to the first pixel.
- the first frequency range is equal to the second frequency range. This allows the use of a filter that uses machine learning for false positive removal.
- The first frequency range is preferably the near infrared (NIR) and the second frequency range a red spectral range. With these spectral ranges, a leaf edge can be reliably detected.
- In the method, it is determined that a leaf edge has been detected if the first pixel detects at least one event of a first polarity at a first time t1 and the second pixel detects at least one event of a second polarity at a time t2, where the first and second polarity are opposite, and the at least one event of a first polarity of the first pixel and the at least one event of a second polarity of the second pixel correspond to the same place on the plant-covered soil.
- The term "the same place on the plant-covered soil" is to be understood in particular as adjacent pixels of a single image, or alternatively pixels with a spacing of up to three pixels. It is assumed that the plant-covered soil is illuminated with equal intensity in both the first and the second frequency range.
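The opposite-polarity criterion above can be sketched as a small check over the events of two neighbouring pixels. The event tuple layout `(timestamp, polarity)` is an assumption for illustration:

```python
def is_leaf_edge(events_px1, events_px2) -> bool:
    """Leaf edge criterion from the text: two neighbouring pixels
    (NIR channel / red channel) viewing the same spot report events
    of opposite polarity. Events are (timestamp, polarity) tuples
    with polarity in {+1, -1}."""
    for _, p1 in events_px1:
        for _, p2 in events_px2:
            if p1 == -p2:  # opposite polarities found
                return True
    return False

# NIR pixel sees brightness rise (+1), red pixel sees it fall (-1): edge.
print(is_leaf_edge([(0.0010, +1)], [(0.0012, -1)]))  # True
# Both channels change in the same direction: no edge by this criterion.
print(is_leaf_edge([(0.0010, +1)], [(0.0012, +1)]))  # False
```

Crossing a leaf edge swaps which channel dominates the reflectance, which is why the two channels change in opposite directions at that instant.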
- In the method, it is determined that a leaf edge has been detected if the detected events were detected at only the first or only the second pixel.
- This can be achieved in particular by suitable selection of threshold values of the object contrast.
- the thresholds for on and off events can be set separately.
- Thresholds may also be set separately for the different channels, i.e. the R channel and the NIR channel. This case occurs, among other things, when the ground is particularly dark or light.
- By suitable choice of the threshold values, it can be achieved for any brightness conditions that either the above-mentioned first case occurs, according to which adjacent pixels have opposite polarity, or the above-mentioned second case, according to which events are present at only one of the adjacent pixels.
- the same information can also be deduced from the brightness histogram of the frame.
- The method of scene reconstruction is suitable, which is described, among others, in the scientific publication entitled "Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera" by Hanme Kim, Stefan Leutenegger and Andrew J. Davison.
- The brightness histogram is bimodal in at least one of the two color channels, and for average ground brightness even in both. This allows the contrast threshold for event generation to be defined automatically, e.g. by Otsu's method.
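Otsu's method, mentioned above for automatic threshold definition, chooses the threshold that maximizes the between-class variance of a bimodal histogram. A compact sketch on a synthetic histogram (the histogram values are illustrative, not from the patent):

```python
import numpy as np

def otsu_threshold(hist: np.ndarray) -> int:
    """Return the bin index that maximizes between-class variance."""
    total = hist.sum()
    bins = np.arange(len(hist))
    best_t, best_var = 0, -1.0
    for t in range(1, len(hist)):
        w0, w1 = hist[:t].sum(), hist[t:].sum()  # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (bins[:t] * hist[:t]).sum() / w0   # class means
        mu1 = (bins[t:] * hist[t:]).sum() / w1
        var_between = (w0 / total) * (w1 / total) * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

# Synthetic bimodal histogram: dark soil peak near bin 3,
# bright vegetation peak near bin 11.
hist = np.array([1, 4, 9, 12, 8, 3, 1, 0, 1, 3, 8, 12, 9, 4, 1])
print(otsu_threshold(hist))  # 7 - the valley between the two modes
```

The valley between the soil and vegetation modes becomes the contrast threshold used for event generation.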
- intensity changes at the individual pixel positions represent events.
- A detected event is sent immediately to a computing unit with microsecond accuracy and millisecond latency. If only the two adjacent pixels are considered over the time period, the event-based image sensor provides a time-dependent function for each pixel, which indicates for each point in time whether an event was detected or not. If an event has been detected, the function additionally gives the polarity of the event.
- Whether events are considered to be simultaneous depends on a time window.
- The size of the time window depends on the ground resolution and the speed. If, for example, two pixels represent one millimeter and the speed is 1 meter per second, then events with a time difference of approximately half a millisecond are to be regarded as simultaneous. As described elsewhere, a leaf edge is then detected from the opposite polarity of the events.
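The time-window calculation above can be sketched directly; the function name is an illustrative choice:

```python
def simultaneity_window(pixel_pitch_m: float, speed_m_per_s: float) -> float:
    """Time for the scene to move by one pixel: events closer together
    in time than this window are treated as simultaneous."""
    return pixel_pitch_m / speed_m_per_s

# Example from the text: two pixels per millimetre (0.5 mm per pixel)
# at 1 m/s gives a window of about half a millisecond.
window = simultaneity_window(pixel_pitch_m=0.0005, speed_m_per_s=1.0)
print(window)  # 0.0005 (seconds)
```

At higher driving speeds the window shrinks proportionally, so the microsecond timestamping of the event sensor is what makes this pairing feasible.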
- The method described can be used to extract leaf edge information and to determine the associated plant species in real time.
- Through the use of event-based image sensors, the method generates a low data rate, which has an advantageous effect on subsequent data processing and enables efficient local calculations on the camera and field spray modules. Particularly advantageous is the high robustness of the method.
- The duration between the two times t1 and t2 for a given direction of travel depends on a first direction, which extends from the center point of the first pixel to the center point of the second pixel, and a second direction, which corresponds to a tangential direction of the leaf edge.
- the events are detected with an event-based image sensor.
- The event-based image sensor is preferably a DVS sensor. Since the detected events usually require much less memory than full frames, the amount of data delivered by a DVS camera is much smaller than that of a regular image sensor. Furthermore, the processing speed of a DVS sensor is much higher than with ordinary cameras. Since a DVS sensor evaluates individual pixels independently, a much higher dynamic range of up to 130 dB is possible. This high dynamic range prevents overexposed image areas, which in turn allows a more reliable image analysis and thus a more accurate detection of the leaf edge.
- The ground resolution is between 0.1 and 1 mm per pixel.
- finer structures can advantageously be resolved in the recorded images. This helps to differentiate between different plant species more reliably.
- A color filter array (CFA) of bandpass filters in the NIR and R spectral ranges is arranged in front of the image sensor.
- The R spectral range stands for the red spectral range.
- The two different bandpass filters can advantageously be arranged in a checkerboard pattern. This achieves that next to each pixel with an NIR bandpass filter there is a pixel with a bandpass filter in the red spectral range, which in turn has the advantage that a leaf edge detection can be performed for each pixel. According to another embodiment, the two different bandpass filters are arranged in a different pattern.
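The checkerboard arrangement of the two bandpass filters can be sketched as a boolean mask; the encoding (True = NIR filter, False = red filter) is an assumption for illustration:

```python
import numpy as np

def checkerboard_cfa(rows: int, cols: int) -> np.ndarray:
    """Boolean mask of the filter array: True = NIR bandpass,
    False = red bandpass. In a checkerboard, every pixel's
    horizontal and vertical neighbours carry the other filter."""
    r, c = np.indices((rows, cols))
    return (r + c) % 2 == 0

cfa = checkerboard_cfa(4, 4)
# Each NIR pixel has a red pixel directly beside it (and vice versa),
# so an NIR/R pair exists at every pixel location.
print(bool(cfa[0, 0]), bool(cfa[0, 1]))  # True False
```

This is the property the text relies on: every pixel can be paired with an adjacent pixel of the other channel for the polarity comparison.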
- the color filter array is NDVI optimized.
- An NDVI-optimized color filter array is understood to mean that the color filter array optimally distinguishes between the NIR and the red spectral range, so that the NDVI has high expressiveness. This is especially the case when the NIR bandpass filter of the color filter array is centered around 850 nm and the R bandpass filter is centered around 660 nm.
- the respective bandpass filters preferably have a width of 20 nm.
- Such a color filter array enables an optimal distinction between the NIR and the red spectral range.
- a bandpass filter of a third frequency range is arranged in front of a third pixel of the image sensor.
- the color filter array has a third bandpass filter, which is preferably in the green color range.
- The accuracy of the detection of the leaf edge can be further increased.
- Other indices, such as the Excess Green index, can also be used.
- The color filter array is optimized for other indices, for example for the Excess Green index.
- The soil which is overgrown with plants has equally intense illumination both in the first and in the second frequency range. If this is not the case, the intensity in the first and in the second frequency range is determined instead, which can also be done pixel by pixel. Preferably, a new image is calculated in which the intensity in the first and second frequency ranges is the same.
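The intensity equalization described above can be sketched with a simple per-image gain; a real implementation might work pixel by pixel as the text notes, and the values here are illustrative:

```python
import numpy as np

def equalize_channels(nir: np.ndarray, red: np.ndarray):
    """Rescale the red channel so both channels have the same mean
    intensity. A single per-image gain is a simple stand-in for the
    pixel-wise correction mentioned in the text."""
    gain = nir.mean() / red.mean()
    return nir, red * gain

nir = np.array([[0.50, 0.60], [0.55, 0.45]])
red = np.array([[0.20, 0.30], [0.25, 0.25]])
nir_eq, red_eq = equalize_channels(nir, red)
print(round(float(red_eq.mean()), 3))  # 0.525 - now equal to the NIR mean
```

After this normalization, a polarity difference between the channels reflects a spectral change in the scene rather than uneven illumination.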
- a passing object is measured several times. A passing object is an object that passes through a detection area of the event-based image sensor. Since the vehicle speed and direction of travel and the orientation and position of the event-based image sensor relative to the vehicle are known, an object detected by the image sensor can be tracked.
- a plant height of a plant is determined from at least one image of the soil covered with plants.
- an image is understood as an image taken with an event-based image sensor.
- The plant height is determined from a multiplicity of images of the soil covered with plants. The prerequisite for this is that an interpretable flow field can be determined from the at least one image.
- SfM Structure from motion
- The determination of the plant height has the advantage that subsequent spraying operations can be dosed more accurately and thus flexibly adapted to the particular circumstances.
- the method comprises a machine-based learning algorithm for recognizing a classification of the plant.
- In a first step of the method for the targeted treatment of plants with a foliar treatment agent, a leaf edge is recognized according to the method set out above.
- A leaf shape of the plant is detected based on the recognized leaf edge of the plant.
- A leaf edge is detected for each pixel pair in which only one pixel predominantly detects events. Accordingly, every location of the leaf edge can be recognized for the entire still image. If every location of the leaf edge is known, a leaf shape can be determined. For a moving image, a leaf shape can be determined for each point in time at which the leaf is visible.
- A classification of the plant is determined on the basis of the determined leaf shape of the plant. This has the advantage that the plant thus classified can be treated in a targeted manner with an agent suitable for this plant.
- In a further step, the classified plant is treated with an agent appropriate to its classification. This advantageously ensures that certain plants can be treated selectively or in a targeted manner. For example, weeds can be specifically treated with a weed control agent or herbicide. Alternatively or additionally, other recognized plants may be treated with other agents.
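The classification-to-agent step can be sketched as a lookup; the class names and agents below are hypothetical examples, not from the patent:

```python
# Hypothetical mapping from plant classification to treatment agent;
# the class names and agents are illustrative only.
TREATMENT = {
    "weed": "herbicide",
    "crop": None,  # no treatment: spare the crop
}

def select_agent(plant_class: str):
    """Return the agent for a classified plant, or None to skip spraying."""
    return TREATMENT.get(plant_class)

print(select_agent("weed"))  # herbicide
print(select_agent("crop"))  # None
```

Returning `None` for untreated classes is what prevents a plant from being sprayed with an inappropriate agent.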
- Such an agent is in particular a pesticide, an insecticide, a fungicide, a herbicide, a biocide, a bactericide, a virucide, an acaricide, an avicide, a molluscicide, a nematicide, an ovicide or a rodenticide.
- the targeted treatment of the plant with a suitable agent for this plant has the advantage that no resources are wasted, in particular no plants are damaged because no plant is treated with an inappropriate agent.
- the method for the targeted treatment of plants enables a rapid differentiation between the different plant species and, based on this, a targeted treatment of the respective plant species with the appropriate agent.
- The invention also relates to a use of an event-based image sensor for detecting a leaf edge using images recorded by an event-based image sensor. This use has already been disclosed in the leaf edge detection method described above.
- The computer program is set up to perform each step of the method, especially when run on an electronic control unit or computing device. This applies both to the method for detecting a leaf edge and to the method for the targeted treatment of plants with a foliar treatment agent. This allows the implementation of the method in a conventional control unit without structural changes.
- the computer program is stored on a machine-readable storage medium.
- Finally, an electronic control unit is obtained which is set up to recognize a leaf edge or to treat a plant in a targeted manner.
- FIG. 1 shows a flowchart of a method for leaf edge detection according to an exemplary embodiment of the invention.
- FIG. 1 shows a flow chart of a method 100 for leaf edge detection.
- In a first step 110 of the method 100, events of an image of plant-covered soil are detected at a ground speed relative to the ground, using an event-based image sensor.
- The ground resolution used here is 0.5 mm per pixel, and an NDVI-optimized color filter array (CFA) of bandpass filters in the NIR and R spectral ranges is arranged in front of the image sensor.
- The NIR bandpass filter is centered around 850 nm and the R bandpass filter around 660 nm.
- In a step 120, a leaf edge is detected if the events of two adjacent pixels were detected predominantly at only one of the two pixels.
- Step 120 is performed for all adjacent pixels of the captured events of the image. This also means that the method 100 repeatedly measures a passing object. One thus obtains the leaf edge for all recorded images of the plant, from which a leaf shape of the plant can be determined. In step 130 of the method 100, a plant height of a plant is determined from all images showing the same plant on the overgrown soil.
- A machine-based learning algorithm is used to classify the plant based on the recorded images of the plant's leaf edge.
- The classified plant is then treated in a targeted manner with a leaf treatment agent corresponding to the classification of the plant.
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
BR112020002571-2A BR112020002571A2 (en) | 2017-09-28 | 2018-09-06 | process for recognition of a leaf edge, process for objective treatment of plants with a leaf treatment agent and use of an event-based image sensor for recognition of a leaf edge |
EP18765629.3A EP3688660A1 (en) | 2017-09-28 | 2018-09-06 | Method for recognizing a leaf edge, method for the targeted treatment of plants by means of a leaf treatment agent, and use of an event-based image sensor for the recognition of a leaf edge |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017217275.4 | 2017-09-28 | ||
DE102017217275.4A DE102017217275A1 (en) | 2017-09-28 | 2017-09-28 | Method for detecting a sheet edge, method for targeted treatment of plants with a foliar treatment agent and use of an event-based image sensor for detecting a sheet edge |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019063255A1 true WO2019063255A1 (en) | 2019-04-04 |
Family
ID=63517892
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2018/073932 WO2019063255A1 (en) | 2017-09-28 | 2018-09-06 | Method for recognizing a leaf edge, method for the targeted treatment of plants by means of a leaf treatment agent, and use of an event-based image sensor for the recognition of a leaf edge |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP3688660A1 (en) |
BR (1) | BR112020002571A2 (en) |
DE (1) | DE102017217275A1 (en) |
WO (1) | WO2019063255A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112256951A (en) * | 2020-09-09 | 2021-01-22 | 青岛大学 | Intelligent household seedling planting system |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102020215415A1 (en) | 2020-12-07 | 2022-06-09 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method and device for detecting an interfering plant image in a camera raw image, and image processing device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012122988A1 (en) * | 2011-03-16 | 2012-09-20 | Syddansk Universitet | Spray boom for selectively spraying a herbicidal composition onto dicots |
US20150015697A1 (en) * | 2013-03-07 | 2015-01-15 | Blue River Technology, Inc. | Method for automatic phenotype measurement and selection |
US20150379702A1 (en) * | 2014-06-30 | 2015-12-31 | Trimble Navigation Limited | Active Imaging Systems for Plant Growth Monitoring |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9075008B2 (en) * | 2003-11-07 | 2015-07-07 | Kyle H. Holland | Plant treatment based on a water invariant chlorophyll index |
-
2017
- 2017-09-28 DE DE102017217275.4A patent/DE102017217275A1/en active Pending
-
2018
- 2018-09-06 BR BR112020002571-2A patent/BR112020002571A2/en not_active IP Right Cessation
- 2018-09-06 EP EP18765629.3A patent/EP3688660A1/en not_active Ceased
- 2018-09-06 WO PCT/EP2018/073932 patent/WO2019063255A1/en unknown
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012122988A1 (en) * | 2011-03-16 | 2012-09-20 | Syddansk Universitet | Spray boom for selectively spraying a herbicidal composition onto dicots |
US20150015697A1 (en) * | 2013-03-07 | 2015-01-15 | Blue River Technology, Inc. | Method for automatic phenotype measurement and selection |
US20150379702A1 (en) * | 2014-06-30 | 2015-12-31 | Trimble Navigation Limited | Active Imaging Systems for Plant Growth Monitoring |
Non-Patent Citations (3)
Title |
---|
HANME KIM; STEFAN LEUTENEGGER; ANDREW J. DAVISON: "Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera", EUROPEAN CONFERENCE ON COMPUTER VISION, 2016, pages 349 - 364, XP047355269, DOI: doi:10.1007/978-3-319-46466-4_21 |
MIDOPT: "DB660/850 Dual Bandpass Red + 850nm NIR", 23 June 2017 (2017-06-23), pages 1 - 3, XP055522253, Retrieved from the Internet <URL:https://web.archive.org/web/20170623070705/http://midopt.com/filters/db660850/> [retrieved on 20181108] * |
TOBI DELBRUCK: "Dynamic Vision Sensor (DVS) - asynchronous temporal contrast silicon retina", 14 April 2014 (2014-04-14), XP055522343, Retrieved from the Internet <URL:http://siliconretina.ini.uzh.ch/wiki/index.php> [retrieved on 20181108] *
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112256951A (en) * | 2020-09-09 | 2021-01-22 | 青岛大学 | Intelligent household seedling planting system |
Also Published As
Publication number | Publication date |
---|---|
DE102017217275A1 (en) | 2019-03-28 |
EP3688660A1 (en) | 2020-08-05 |
BR112020002571A2 (en) | 2020-08-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1267178B1 (en) | Method for processing a high definition picture | |
DE102016118227A1 (en) | Image analysis system for agricultural machines | |
DE112009000949T5 (en) | Detection of a free driving path for a vehicle | |
WO2018185083A2 (en) | Time-of-flight camera | |
DE102013019138A1 (en) | A method for detecting a hidden state of a camera, camera system and motor vehicle | |
EP1531342B1 (en) | Method of detecting pedestrians | |
DE102013212495A1 (en) | Method and device for inspecting a contoured surface, in particular the underbody of a motor vehicle | |
DE112010003752T5 (en) | DYNAMIC REAL-TIME REFERENCE GENERATION FOR A DISTANCE IMAGING SYSTEM | |
EP3615887A1 (en) | Material testing by means of angle-variable illumination | |
DE102016223185A1 (en) | Method and system for detecting a raised object located within a parking lot | |
WO2019063255A1 (en) | Method for recognizing a leaf edge, method for the targeted treatment of plants by means of a leaf treatment agent, and use of an event-based image sensor for the recognition of a leaf edge | |
AT508711B1 (en) | METHOD AND DEVICE FOR SEARCHING AND DETECTING ANIMALS HIDDEN IN AGRICULTURAL FIELDS AND MEADOWS | |
DE10148062A1 (en) | Localizing system for objects uses transmitter for pulsed emission of laser beams and receiver with sensor to pick up reflected beam pulses and to analyze them regarding their execution time | |
DE102007041333B4 (en) | Non-contact counting system | |
WO2016107722A1 (en) | Method for determining particles | |
EP3663881B1 (en) | Method for controlling an autonomous vehicle on the basis of estimated movement vectors | |
WO2016087202A1 (en) | Image processing by means of cross-correlation | |
WO2013068521A1 (en) | Device and method for mechanically thinning out blossom | |
DE102016210056A1 (en) | Camera arrangement for determining the optical flow, driver assistance system and surveillance camera with the camera arrangement | |
WO2019185184A1 (en) | Device and method for the optical position detection of transported objects | |
DE102011083745B4 (en) | A method for monocular motion stereo-based automatic automatic parking free parking, computer program product and device | |
DE102020113183B4 (en) | Camera and method for detecting moving objects | |
DE102016223180A1 (en) | Method and system for detecting a raised object located within a parking lot | |
EP3547664B1 (en) | Device and system for creating 3d image data | |
DE102019117849B4 (en) | Detection of an object in a surveillance area |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18765629 Country of ref document: EP Kind code of ref document: A1 |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112020002571 Country of ref document: BR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2018765629 Country of ref document: EP Effective date: 20200428 |
|
ENP | Entry into the national phase |
Ref document number: 112020002571 Country of ref document: BR Kind code of ref document: A2 Effective date: 20200206 |