EP3369037A1 - Procédé et système d'information pour détecter au moins une plante plantée dans un champ - Google Patents

Procédé et système d'information pour détecter au moins une plante plantée dans un champ

Info

Publication number
EP3369037A1
EP3369037A1 (application EP16785383.7A)
Authority
EP
European Patent Office
Prior art keywords
plant
information
detecting
field
image information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP16785383.7A
Other languages
German (de)
English (en)
Inventor
Roland-Kosta Tschakarow
Dejan PANGERCIC
Tobias DIPPER
Gabriel GAESSLER
Markus Hoeferlin
Tobias Mugele
Steffen Fuchs
Achim Winkler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP3369037A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/143Sensing or illuminating at different wavelengths
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10048Infrared image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30188Vegetation; Agriculture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30242Counting objects in image

Definitions

  • the invention is based on a method for detecting at least one plant planted on a field or an information system as generically defined by the independent claims.
  • the subject of the present invention is also a computer program.
  • Such methods are used in extensive series of experiments involving multiple repetitions at different locations.
  • They rely on rating schemes (German: Bonitur) in which the characteristics of a plant to be assessed (germination, sprouting, height, leaf size, flowering, fruit ripeness, and many others) are scored accordingly.
  • The data collected by persons in the field trial according to the rating scheme are evaluated afterwards.
  • The evaluation is usually a manual procedure in which the partial results are compiled in tables and then analyzed by means of statistical methods.
  • the approach presented here provides a method for detecting at least one plant planted in a field, the method comprising the following steps:
  • A field can be understood as an acreage for plants, or even a plot of such a field. A plant can be understood as a crop plant, i.e., a plant whose fruit is used agriculturally.
  • An optical detection unit can be understood as, for example, a camera. A geographical position can be understood as, for example, the position at which the plant grows on the field.
  • Plant image information can be understood as an image or a plant parameter which reproduces optically detectable features of the plant, or features of the plant which can be detected by infrared radiation.
  • The plant image information may also contain information obtained by processing the image of the plant acquired by the optical camera and/or the infrared sensor.
  • plant information can be understood as a parameter or information, such as the aforementioned shape, size, species, number of leaves, leaf structure, number of buds, bud structure or the like, which makes it possible to distinguish the plant from another plant.
  • the parameter may also only provide information about the presence of the plant itself at the geographical position.
  • a plant dataset may be understood to mean a data record or an information unit stored or to be stored in a memory in which the plant information is or will be stored with an associated geographical position of the plant.
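  • As a minimal sketch of such a plant dataset (the class and field names are illustrative assumptions, not taken from the patent), the pairing of plant information with a geographical position could look like this:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PlantRecord:
    """One entry of the plant dataset: plant information plus position."""
    lat: float        # geographical position of the plant
    lon: float
    species: str      # part of the plant information
    leaf_count: int

class PlantDataset:
    """Hypothetical in-memory store keyed by geographical position."""
    def __init__(self):
        self._records = {}

    def store(self, record: PlantRecord) -> None:
        # the position identifies the plant across repeated detection cycles
        self._records[(record.lat, record.lon)] = record

    def lookup(self, lat: float, lon: float):
        return self._records.get((lat, lon))

ds = PlantDataset()
ds.store(PlantRecord(48.7758, 9.1829, "maize", 4))
```

  • Keying records by position is what allows repeated detection cycles to find and update the same plant again.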
  • Using a detection unit to recognize the plant at a certain geographical position allows the detection to be carried out very quickly and reproducibly, since a human classification is avoided.
  • The results can easily be reused for evaluation. Assigning a geographical position to the plant or to the plant information further ensures that, in repeated detection cycles, the same plant is always detected, so that the evaluation remains consistent over time.
  • The approach proposed here offers the advantage that the assessment of the plant's growth can be carried out very quickly and automatically. This is particularly useful when monitoring the growth of a large number of plants, as is the case, for example, in plant breeding trials.
  • From the plant information and the geographical position, a conclusion can then be drawn about the influence of growth or environmental conditions on the development of the plant.
  • In the step of detecting, the plant can be detected from a plurality of viewing directions and/or with an exposure time of less than half a second, in particular with an exposure time of less than a tenth of a second.
  • Plant distance information can be, for example, information about the distance grid in which the individual plants to be examined were planted.
  • Such an embodiment of the approach proposed here has the advantage that a pre-selection of the plants actually to be examined or detected more closely can already be made, which can sometimes save considerable effort in the recording or identification of plants.
  • The plant can be identified in the plant image information, in particular using a green-component filter and image processing, in order to be able to identify individual parts or whole plants in the plant image information.
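  • The green-component filtering mentioned above is often implemented with an excess-green index (ExG = 2G − R − B); the patent only speaks of a green color component, so the index and threshold below are assumptions:

```python
def excess_green_mask(rgb_pixels, threshold=20):
    """Return a boolean mask marking likely plant pixels.

    Uses the excess-green index ExG = 2*G - R - B, a common heuristic
    for separating green vegetation from soil (an assumption here; the
    patent only mentions filtering out a green color component).
    """
    return [2 * g - r - b > threshold for (r, g, b) in rgb_pixels]

pixels = [
    (60, 140, 50),   # leaf: strongly green
    (120, 100, 80),  # soil: brownish
    (30, 35, 28),    # shadow: dark, weakly green
]
mask = excess_green_mask(pixels)
```

  • Pixels passing the mask can then be clustered into candidate plant regions for the identification step.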
  • In the step of identifying, growth information can be determined as partial information of the plant information, which represents a development status and/or a health status of the plant; in the step of storing, the growth information is stored as partial information of the plant information in the plant dataset.
  • Growth information can be, for example, information about the species, height, shape or number of leaves, buds, branches or the like, or information about a structure of the plant itself or of parts of the plant.
  • Such an embodiment of the approach proposed here offers the additional advantage that not only the presence of the plant, but already one or more further parameters relevant to its state of development, can be recorded automatically.
  • an embodiment of the approach proposed here in which, in the step of identifying, a recognition of a species of the plant takes place in order to obtain plant information, in particular wherein recognition results in a distinction of the species of the plant from a species of another plant.
  • a species of a plant may in the present case be understood as a genus of the plant.
  • Such an embodiment of the approach proposed here offers the advantage of being able to distinguish a useful or cultivated plant from a weed, which need not be taken into account, for example, for the investigation of the state of development of the plant. In this way it can also be established, for example, that at a geographical position where a crop plant was planted at an earlier time but has since been lost to drought or animal feeding, a weed now grows instead.
  • The information that a weed now grows at the current geographical position may also be stored as part of the plant information in the plant dataset.
  • Particularly advantageous for assessing the development state of a plurality of plants is an embodiment in which at least one further plant is detected by means of an optical and/or infrared detection unit in order to obtain at least one further plant image information, and at least one further geographical position is detected at which the further plant grows in the field.
  • In the step of identifying, the further plant is identified using the further plant image information in order to obtain further plant information that represents the presence of the further plant.
  • In the step of storing, the further plant information and the further geographical position are stored in the plant dataset.
  • Also advantageous is an embodiment of the approach presented here which has a step of counting the plants grown in the field using the plant information and the further plant information from the plant dataset.
  • Such an embodiment of the approach presented here offers the advantage of a fast, safe and unambiguous counting of plants grown on the field.
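  • Such a counting step over the dataset can be sketched as follows (the dict-based record layout is a hypothetical simplification, not the patent's data format):

```python
def count_plants(plant_dataset, species=None):
    """Count plants in the dataset, optionally restricted to one species.

    `plant_dataset` is assumed to be a list of records with at least a
    'species' key -- a simplified stand-in for the plant dataset 165.
    """
    if species is None:
        return len(plant_dataset)
    return sum(1 for rec in plant_dataset if rec["species"] == species)

dataset = [
    {"species": "maize", "pos": (0.0, 0.0)},
    {"species": "maize", "pos": (0.0, 0.5)},
    {"species": "weed",  "pos": (0.0, 0.7)},
]
```

  • Filtering by species lets the count ignore weeds that were stored alongside the crop plants.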
  • A time stamp can additionally be stored in the plant dataset which represents the determination time of the plant information contained therein.
  • Such an embodiment of the approach proposed herein offers the advantage of providing a very precise history of the state of development of the plant.
  • In the step of identifying, the plant can be identified using previously recorded identification information.
  • Such identification information, derived from previously recorded plant image information, can be, for example, a recognition algorithm for the plant which has been optimized in a previous training cycle specifically for the detection of the plant or of characteristics of the plant. In this way, the detection or identification can be simplified, since the plant image information needs to be compared only with reference information or a reference image, which usually requires much less numerical or circuit complexity.
  • A captured plant image may be plant image information taken in a previous execution of an embodiment of the recognition method proposed herein.
  • Such an approach offers the advantage of being able to detect a technical disturbance of the detection unit with simple means, so that as far as possible no erroneous data are stored in the plant data record and an early elimination of the disturbance at the detection unit is possible.
  • This method can be implemented, for example, in software or hardware or in a mixed form of software and hardware, for example in a control unit.
  • the approach presented here also creates an information system that is designed to implement, control or implement the steps of a variant of a method presented here in corresponding devices. Also by this embodiment of the invention in the form of a device, the object underlying the invention can be solved quickly and efficiently.
  • For this purpose, the information system may comprise at least one arithmetic unit for processing signals or data, at least one memory unit for storing signals or data, at least one interface to a sensor or an actuator for reading in sensor signals from the sensor or for outputting data or control signals to the actuator, and/or at least one communication interface for reading in or outputting data embedded in a communication protocol.
  • The arithmetic unit may be, for example, a signal processor, a microcontroller or the like, and the memory unit may be a flash memory, an EPROM or a magnetic memory unit.
  • The communication interface can be designed to read in or output data wirelessly and/or by wire.
  • an information system can be understood to mean an electrical device which processes sensor signals and outputs control and / or data signals in dependence thereon.
  • the device may have an interface, which may be formed in hardware and / or software.
  • the interfaces can be part of a so-called system ASIC, for example, which contains a wide variety of functions of the device.
  • the interfaces are their own integrated circuits or at least partially consist of discrete components.
  • the interfaces may be software modules that are present, for example, on a microcontroller in addition to other software modules.
  • In one embodiment, the information system controls an electronic recording of the plant dataset.
  • The information system may, for example, read in sensor signals such as the image data of the detection unit and access location information via an interface (for example from a satellite-based location system).
  • Output takes place via actuators, such as a write head for an electromagnetic recording of the plant information and the geographical position information in the plant dataset, or into a microelectronic memory.
  • Also advantageous is a computer program product or computer program with program code which can be stored on a machine-readable carrier or storage medium, such as a semiconductor memory, a hard disk memory or an optical memory, and which is used for carrying out, implementing and/or controlling the steps of the method according to one of the embodiments described above.
  • Fig. 1 is a representation of a use of an embodiment of an information system for detecting at least one plant planted on a field;
  • Fig. 2 is a schematic representation of an embodiment of the method; and
  • Fig. 3 is a flowchart of a method according to an exemplary embodiment.
  • FIG. 1 shows an illustration of a use of an exemplary embodiment of an information system 100 for recognizing at least one plant 110 cultivated on a field 105.
  • The plants 110 can thereby be arranged in rows, as shown in the left-hand area of FIG. 1.
  • When planting the plants 110, the geographical position of the individual plants 110 may already have been determined and stored in a corresponding file.
  • the information system 100 has a detection unit 120 for detecting a (single) plant 110a.
  • the detection unit 120 may include, for example, an optical camera 125 and / or an infrared sensor 130 (also referred to as an infrared camera) that form an optical image and / or an infrared image of the plant 110a. It is also conceivable that the detection unit 120 comprises a filter unit 135 in order to extract a color component, such as the green color component of the image captured by the optical camera 125.
  • An infrared image from the infrared sensor 130, and/or an image derived from the optical image of the camera 125 and/or from the infrared image of the infrared sensor 130, can then be output by the detection unit 120 as plant image information 140.
  • The plant image information 140 is then transmitted to a unit 145 for identification.
  • the plant information 150 represents the presence of the plant 110a.
  • The unit 145 may use an image processing algorithm for identification which evaluates color components, such as a green component, in certain image regions in order to conclude that a leaf or another component of the plant 110a is imaged in those regions. Additionally or alternatively, an edge detection algorithm can be used to outline the plant 110a and thereby recognize a spatial delimitation of the plant 110a from one or more adjacent plants.
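  • As a toy illustration of outlining a plant region (the patent does not specify the algorithm; real systems might use Canny edges or contour tracing instead), the boundary pixels of a binary plant mask can be found like this:

```python
def outline(mask):
    """Return the boundary pixels of a binary plant mask.

    A pixel belongs to the outline if it is set but has at least one
    4-neighbour that is unset (or lies outside the image). This is a
    simplified stand-in for the edge-detection step described above.
    """
    h, w = len(mask), len(mask[0])
    edge = set()
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                    edge.add((y, x))
                    break
    return edge

mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],   # a 2x2 "plant" blob
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
```

  • Disjoint outlines delimit one plant from its neighbours, which supports the spatial delimitation mentioned above.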
  • The plant information 150 is transferred to a storage unit 155, in which the plant information 150 is stored, together with position information 160 representing a geographical position, in a plant dataset 165.
  • The position information 160 is in this case provided by a position detection unit 170, which determines the position information 160 using, for example, a satellite-based location system 175.
  • This position information 160 refers to the current geographical position of which the detection unit 120, or a subunit of the detection unit 120, has created an image. For a larger spatial extent of the information system 100, an offset correction can take place in the position detection unit 170 which takes into account the distance of an antenna 180 of the position detection unit 170 from the image pick-up area of the detection unit 120.
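  • The offset correction described above can be sketched as a simple lever-arm shift in local planar coordinates (the function and parameter names are assumptions; a real implementation would work in a projected coordinate reference system):

```python
import math

def corrected_position(antenna_x, antenna_y, heading_rad, offset_forward):
    """Shift a GNSS antenna position to the camera's image pick-up area.

    `offset_forward` is the lever arm (in metres) from the antenna 180
    to the detection unit 120 along the vehicle's heading, measured in
    local planar coordinates.
    """
    return (antenna_x + offset_forward * math.cos(heading_rad),
            antenna_y + offset_forward * math.sin(heading_rad))
```

  • With the platform's heading known (e.g. from an orientation sensor), the stored position then refers to the imaged plant rather than to the antenna.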
  • In a file it can be stored at which distances, or at which geographical positions, the plants 110 were planted, so that, for example, a trigger signal 185 is sent to the detection unit 120 when a geographical position is reached at which a plant 110a is expected. In this way, the numerical effort for the determination or evaluation of the images of the detection unit 120 can be reduced.
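  • A sketch of such a trigger decision, assuming the expected planting positions are known in local coordinates (the radius value is an illustrative assumption):

```python
def should_trigger(current_pos, expected_positions, radius=0.05):
    """Decide whether to emit the trigger signal 185 to the detection
    unit 120: fire when the platform is within `radius` (metres, local
    planar coordinates) of a position where a plant is expected."""
    cx, cy = current_pos
    return any((cx - px) ** 2 + (cy - py) ** 2 <= radius ** 2
               for px, py in expected_positions)

expected = [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)]  # hypothetical planting grid
```

  • Images are then only captured and evaluated near expected plant positions, which saves computation between rows.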
  • Alternatively, the position detection unit 170 may detect only a relative position of the information system 100 with respect to a corner coordinate 187 of the field 105 and thereby determine the relative position of the plant 110a. This relative position of the plant with respect to the corner coordinate 187, stored as geographical position or as position information 160, is also sufficient here for the approach proposed.
  • the position information 160 is supplied as a signal to the unit 145 for identification.
  • The unit 145 may already use the knowledge of the current geographical position from the position information 160 to identify, for example from previous recognition cycles, a particular species of plant, plant size or the like, by comparing the current plant image information with previously identified plant information contained in the plant dataset 165. In this way, the effort for identifying the plant can be significantly reduced, since some information about the expected species of the plant 110a, the size of the plant or the like can already be assumed.
  • This growth information can then also be transmitted as part of the plant information 150 to the storage unit 155 and stored in the plant dataset 165. In this way, not only can a count of the individual plants 110 on the field 105 be made, but a plant dataset 165 that is very easy to evaluate by machine can also be created with little technical effort, allowing a precise evaluation of the growth of the plants 110 growing on the field 105.
  • the information system 100 is used to determine a history of the growth of the plants 110 grown on the field 105.
  • For this purpose, the information system 100 can be used repeatedly, in particular cyclically at intervals of days, weeks and/or months, in order to detect the plants 110a, 110b and/or 110c cultivated on the field 105.
  • In each pass, a corresponding combination of the plant information 150 with the respective position information 160 can be stored in the plant dataset 165, with time information additionally being stored for this combination which indicates the time, for example a date and an hour, at which the plant image information of the corresponding plant 110a, 110b and/or 110c was detected by the detection unit 120.
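  • Grouping such time-stamped records by position to reconstruct each plant's development history could look like this (the record layout is a hypothetical simplification of the plant dataset):

```python
from collections import defaultdict

def growth_history(records):
    """Group time-stamped plant records by position so that repeated
    detection cycles of the same plant form one chronological history."""
    history = defaultdict(list)
    for rec in records:
        history[rec["pos"]].append((rec["date"], rec["height_cm"]))
    for entries in history.values():
        entries.sort()  # ISO dates sort chronologically as strings
    return dict(history)

records = [
    {"pos": (0.0, 0.5), "date": "2016-05-01", "height_cm": 4},
    {"pos": (0.0, 0.0), "date": "2016-05-01", "height_cm": 5},
    {"pos": (0.0, 0.0), "date": "2016-05-15", "height_cm": 12},
]
```

  • Because the position identifies the plant, the history of each individual plant can be assembled without any plant-level identifier.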
  • The recognition of the plants 110 planted on the field 105 is particularly simple if the information system 100 is fastened, for example, on or to a vehicle or aircraft (not shown in the figures). As a result, detection of the plants 110 planted on the field 105 can be carried out quickly and reliably without major personnel expenditure, and a large, machine-evaluable plant dataset 165 can be set up very quickly by technical means.
  • the history of the physiological development of each individual plant measured can be taken into account in the assessment of breeding candidates, varieties, seed qualities or pesticide use.
  • the approach proposed here can be used in particular for the assessment of the field emergence stage as an important development phase of crop plants.
  • One task is the recording of the number and rate of emerging plants.
  • The field emergence rate is essential, especially with increasing seed quality (combined with high seed costs) and precise application with precision seed drills.
  • Seed is offered in different qualities and for different purposes.
  • One quality feature is the emergence rate guaranteed by the manufacturer.
  • Another quality-forming feature is the field emergence time: simultaneous emergence of the sowing results in a homogeneous stand, which is advantageous for plant care and for certain breeding measures.
  • Breeding goals are, for example, stress tolerances, resistances and fruit-specific characteristics. Yield is an integral breeding goal achieved in conjunction with several specific subgoals.
  • Varieties developed by seed producers are tested and monitored with regard to their properties and characteristics by these institutions.
  • the improved measurement and analysis technology allows testing to be performed faster, with greater accuracy, and less labor.
  • the methods according to the invention can also be used in daily agricultural operations.
  • The data on field emergence and plant development gained during cultivation can be evaluated in the same way.
  • The measurement is made by photogrammetric acquisition for the short duration of a single exposure, which excludes the influence of external factors such as wind. If several images are taken one after the other (for example while the measuring device is being moved over the stand), images of the corresponding plants are obtained in the image detail from different starting positions; evaluating these images improves the measurement results. With the help of classification methods, the distinction between crop plants and weeds is made.
  • This classification is additionally supported by a-priori knowledge (such as alignment, row and plant spacing).
  • The cyclical measurements and classifications result in a refinement of the measurement and classification results, for example by transformation (alignment) of the positionally determined position grid of the plants. Due to the history of each plant contained in the data management at its original position, a correction or improvement of earlier results is possible.
  • The data processing can take place both online (during the ongoing measurements) and offline (afterwards).
  • The online analysis also allows plausibility checks of the acquired measurement data in order to minimize errors and external influences, or to exclude them on site.
  • the use of suitable methods for data compression and reduction allows the (longer-term) storage of measurement and analysis data on the device or a (radio) transmission to a cloud-based storage system.
  • The data management also allows the collection and analysis of data collected by third-party devices or processes; corresponding interfaces for data exchange are required. From the obtained data, further insights (e.g. about pest infestation, diseases, or the influence of external factors such as weather, water and drought) can be gained.
  • Influencing factors are of crucial importance for the assessment of breeding, seed treatment or plant protection measures.
  • The (semi-)automatic functions according to the invention contribute to a holistic examination of the plants in the stand (that is, as a result, the user obtains a consistent overall picture of the plants in the stand during different stages of development).
  • The invention provides for a plausibility check on the basis of a (partial) online evaluation of the measured data. This ensures that faults in the measuring device (such as a soiled camera lens) are already detected at the beginning of the measurement and recording of the data, so that no measurement runs are wasted and resources are spared.
  • The system comprises a measuring part, in which the measurement and position assignment are realized, and a data management part that collects, archives, evaluates and/or processes the comparatively large amounts of data, for example into meaningful reports, and also helps to plan the measurements.
  • the recorded measurement data can be processed directly on the measuring instrument or transmitted to a central server.
  • the storage on suitable data carriers is possible, which can be read in the wake of the measurement.
  • A suitable mobile solution is used for the measuring device described herein.
  • This mobile solution can be designed as a (semi-)autonomous vehicle, a hand-guided or motorized vehicle, as an attachment for tractors, as an aircraft (for example a drone) or, in the simplest form, as a portable unit (carried by humans or animals).
  • For an optimum construction of the measuring device 100, a plurality of parallel measuring devices 120 may be integrated in one structure.
  • The information system 100 has a position sensor 170 (e.g. rtkGPS), optionally supplemented by an orientation sensor (e.g. an inertial measurement unit, IMU), and optionally supplemented by odometrically obtained position information.
  • It also has one or more cameras 125 with RGB and IR information channels and a suitable illumination unit for the required wavelength ranges.
  • An optional shading unit over the imaged surfaces ensures that the disturbing influence of extraneous light and shadows is minimized.
  • The shading is designed so that it does not touch the plants when crossing over them.
  • the following exemplary processes and functions are integrated into the measuring and evaluation device 100:
  • the concrete experimental design (here, for example, division of the parcels with their assignment to test objects and series) is likewise read into the measuring device 100 before the measurements are started (experiment description).
  • the measuring device 100 is guided in rows (eg in the drilling direction) over the test surfaces (that is to say here the field 105). This process can take place several times.
  • The cameras 125 and/or 130 of the detection unit 120 built into the measuring device 100 are driven according to the known boundaries of the parcels and record data as plant image information 140. It is particularly noteworthy that one or more plants 110a and 110b are captured in the image section during the crossing; thus, multiple images of the same plant 110a and 110b are made sequentially by single-shot imaging (image sequencing). This method allows the influences of changing wind and lighting conditions to be compensated.
  • The recorded image data include a) RGB images and b) infrared images.
  • The cameras 125 and 130 are calibrated (for example, to calculate the height assignment from the pictures).
  • The images captured by the cameras 125 and 130 are registered with one another in order, for example, to compute an NDVI (Normalized Difference Vegetation Index), which is formed from reflection values in the near-infrared and visible red wavelength ranges of the light spectrum.
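  • The NDVI computation itself is standard and can be sketched per pixel:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel:
    NDVI = (NIR - RED) / (NIR + RED), ranging over [-1, 1].
    High values indicate healthy green vegetation."""
    if nir + red == 0:
        return 0.0  # guard against division by zero on dark pixels
    return (nir - red) / (nir + red)
```

  • Applied to the registered RGB and infrared images, the index highlights vegetation pixels for the biomass masks described next.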
  • Biomass masks are reduced image data with clusters of green fractions that very likely represent individual plants.
  • The biomass masks are assigned positions in the global coordinate system (e.g. rtkGPS positions).
  • This assignment involves sensor fusion, e.g. of the orientation of the measuring device and the transformation between camera position and position sensors.
  • A classification method is applied to the detected green clusters. In the case of a follow-up crossing, the biomass masks are re-identified on the basis of their global position data. The classification can be done online (during the current measurements) or offline (afterwards). The classifiers used have been trained for this purpose before the measurements: data were collected on trial plots and manually examined and "labeled" (annotated), with the trainer distinguishing between crop and weeds in this process.
  • The classifiers use parameters such as geometric features of the biomass masks (such as shape or size) and statistical features (such as the number of pixels of particular colors and their distribution based on intensity values).
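  • A sketch of extracting such geometric and statistical features from a biomass mask (the specific features chosen here are illustrative, not the patent's):

```python
def mask_features(mask_pixels):
    """Compute simple geometric and statistical features of a biomass
    mask as inputs to a crop/weed classifier.

    `mask_pixels` is a list of (x, y, green_intensity) tuples, a
    hypothetical representation of one green cluster.
    """
    xs = [x for x, _, _ in mask_pixels]
    ys = [y for _, y, _ in mask_pixels]
    greens = [g for _, _, g in mask_pixels]
    n = len(mask_pixels)
    return {
        "area_px": n,                           # size of the mask
        "bbox": (max(xs) - min(xs) + 1,
                 max(ys) - min(ys) + 1),        # rough shape descriptor
        "mean_green": sum(greens) / n,          # intensity statistic
    }

features = mask_features([(0, 0, 120), (1, 0, 140), (0, 1, 100)])
```

  • A trained classifier would consume such feature vectors to decide crop versus weed per mask.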
  • The classification is based on all the image data recorded during the measurements (several images during a crossing, over several crossings). All classification results are taken into account and contribute to an overall result based on a majority-voting evaluation process. This result is updated at the current date and can be used to correct misclassifications made in the past. Overall, the method used increases the likelihood of correct classification.
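  • The majority-voting combination of per-image classification results can be sketched as:

```python
from collections import Counter

def majority_vote(labels):
    """Combine the per-image classification results for one plant,
    gathered across several images and crossings, into one overall
    label by majority voting."""
    counts = Counter(labels)
    label, _ = counts.most_common(1)[0]
    return label

votes = ["crop", "crop", "weed", "crop"]  # e.g. four images of one plant
```

  • Because the overall label is recomputed as new crossings arrive, earlier misclassifications are corrected automatically.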
  • Biomass masks classified as crops are used to calculate appropriate characteristics that contribute to the results of the trial (e.g. leaf area estimation, field emergence detection or crop loss).
  • After recording and evaluation, the data are stored, e.g. in a cloud-based environment, and maintained for visualization and further analysis on terminals. Evaluations can take different tabular, graphical and written forms and contain all measured and derived parameters (e.g. a scaled representation of the plants in their plots as a plan view, weed stock in the trial field, etc.).
  • Fig. 2 shows a schematic representation of an embodiment of the method with an operation of automated high-throughput rating and measurement in the field.
  • On a carrier platform such as the information system 100, the position detection unit 170 is first used to obtain position information 160 representing the geographical position.
  • This position information 160 can be detected, for example, via an interface such as an antenna 180 to a satellite-based location system 175, so that, for example, GPS data can be acquired and thereby the geographical position 160 of the information system 100 determined. It is also conceivable that the position information 160 is determined from a memory in conjunction with odometrically detected movement data of the information system 100.
  • The carrier platform or the information system 100 comprises at least one detection unit 120 for detecting a plant 110a, 110b or 110c by means of the optical camera 125 and/or the infrared sensor 130.
  • The detection unit 120 may, for example, comprise one or more sensor systems, such as the cameras 125 and 130, which are embodied, for example, as low-cost optical cameras that capture images of the plants 110a or 110b; these images, or information derived from them, are transmitted as plant image information 140 to the unit 145 for identification.
In the unit 145 for identification, for example, a data extraction and/or a data fusion of the plant image information 140 and/or the information derived therefrom with the position information 160 can take place. In this way, an assignment 215 of the geographical position (which can be taken from the position information 160) to the plant 110a can be made. For this purpose, the geographical position of the plant 110a (for example in the form of GPS data) known from the position detection unit 170 and/or from a memory 220 can be used and/or drawn on for a plausibility check.
A specific detection or identification 220 of the plant 110a can be provided in the form of the plant image information 140 or information derived therefrom, so that this information can be stored, for example, in the plant data record 165. As part of the plant image information 140 or the information derived therefrom, information about extracted features of the plant 110a, such as a leaf count, a leaf structure, or the like of the respective plant 110a, can be stored in the plant data record 165. Finally, in the storage unit 155, the plant information 145 is stored together with the respectively corresponding position information 160 in the plant data record 165.
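A minimal sketch of such a plant data record, with hypothetical field names chosen for illustration, could combine the identification result, the extracted features, and the corresponding position:

```python
from dataclasses import dataclass, field

@dataclass
class PlantRecord:
    """One entry of the plant data record: identification result plus
    the geographic position at which the plant grows."""
    plant_id: str
    species: str  # result of the identification step
    lat: float    # position information, e.g. from GPS
    lon: float
    features: dict = field(default_factory=dict)  # e.g. leaf count, leaf structure

records = {}

def store(record):
    """Store a plant record keyed by plant id, so the same plant can be
    re-identified and its entry updated on a later pass."""
    records[record.plant_id] = record

store(PlantRecord("p-001", "maize", 48.1234567, 9.7654321,
                  features={"leaf_count": 4}))
```

Keying the store by a stable plant identifier is what allows later passes to append to a plant's history rather than create duplicate entries.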
It is also conceivable that an exposure unit (for example as a subunit of the unit 120 for detecting) is driven in order to change, in particular to increase, the brightness in the area captured by the camera(s) 125 and 130. A corresponding drive signal 230 can be output to the unit 120 for detecting, for example by the unit 145 for identifying and/or the storage unit 155 or the control unit (not shown in Fig. 2).
The plant data record 165 can then be output, for example, to a processing unit 235, which is either part of the information system 100 or arranged, for example, in a central computer 237, as shown by way of example in Fig. 2. This processing unit 235 then carries out, for example, the processing in a 4D processing level, as illustrated by way of example in region 240 of Fig. 2.
In this 4D processing level, an evaluation of the plant parameter data set 105 preferably takes place, in particular of the geographic positions 245 stored in the plant data record 165 (present, for example, as position information 160 assigned to the plants 110a and 110b in the form of GPS data), together with data 250 relating to a history of the development of the individual plant(s) 110a or 110b or to the re-identification of the individual plant(s) 110a or 110b, for example using the water content of the soil, the weather, and/or comparable parameters.
Furthermore, an annotation 238 takes place, in which the data of the plant data record 165 stored in the storage unit 155 are changed.
For example, a correction can take place such that, upon detection of a dead plant 110a or 110b, for example due to animal feeding damage or drought, the plant information for the plant 110a or 110b receives additional information marking this plant 110a or 110b accordingly.
Subsequently, the plant data record 165 can be stored in a database 260, for example directly in the central computer 237, on a PC, or in a cloud (not shown here). Also conceivable is a modification of the plant data record 165 initiated from the database 260 by means of a corresponding signal. Information from the plant data record 165 can also be displayed visually on a display unit 265. From this, results 270 can be determined, for example in order to assess the quality of a seed, to find a particularly favorable seed-growing method, to advance the development of plant varieties, or to improve the use of plant protection products. Also conceivable here is a change of the plant data record 165 that is triggered by the display unit 265 by means of a change signal 275.
The method 300 comprises a step 310 of detecting a plant by means of an optical and/or infrared detection unit in order to obtain plant image information, and of detecting position information representing a geographical position at which the plant grows in the field. Further, the method 300 comprises a step 320 of identifying the plant using the plant image information in order to obtain plant information representing the presence of the plant. Finally, the method 300 comprises a step 330 of storing the plant information and the position information in a plant data record in order to detect the plant planted in the field.
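The three steps of method 300 can be summarised in a short pipeline sketch; the detector and identifier callables below are stand-ins for the camera-based units described above, not part of the patent:

```python
def method_300(detect, identify, dataset):
    """Sketch of method 300: detect a plant and its position (step 310),
    identify it from the image (step 320), store both (step 330)."""
    plant_image, position = detect()      # step 310: image + geographic position
    plant_info = identify(plant_image)    # step 320: presence / species of the plant
    dataset.append({"plant": plant_info,  # step 330: store in the plant data record
                    "position": position})
    return dataset

# Stand-in callables for the optical detection and identification units.
dataset = method_300(
    detect=lambda: ("raw-image-bytes", (48.1, 9.8)),
    identify=lambda img: {"species": "sugar beet", "present": True},
    dataset=[],
)
```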
If an exemplary embodiment comprises an "and/or" link between a first feature and a second feature, this is to be read as meaning that the embodiment has, according to one embodiment, both the first feature and the second feature and, according to a further embodiment, either only the first feature or only the second feature.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a method (300) for detecting at least one plant (110a, 110b) planted in a field (105). The method (300) comprises a step (310) of detecting a plant (110a, 110b) by means of an optical and/or infrared detection unit (120, 125, 130) in order to obtain plant image information (140), and of detecting position information (160) representing a geographical position at which the plant (110a, 110b) grows in the field (105). Furthermore, the method (300) comprises a step (320) of identifying the plant (110a, 110b) using the plant image information (140) in order to obtain plant information (145) representing the presence of the plant (110a, 110b). Finally, the method (300) comprises a step (330) of storing the plant information (145) and the geographical position in a plant data record (165) in order to detect the plant (110a, 110b) planted in the field (105).
EP16785383.7A 2015-10-28 2016-10-07 Procédé et système d'information pour détecter au moins une plante plantée dans un champ Pending EP3369037A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015221085.5A DE102015221085A1 (de) 2015-10-28 2015-10-28 Verfahren und Informationssystem zum Erkennen zumindest einer auf einem Feld angepflanzten Pflanze
PCT/EP2016/073939 WO2017071928A1 (fr) 2015-10-28 2016-10-07 Procédé et système d'information pour détecter au moins une plante plantée dans un champ

Publications (1)

Publication Number Publication Date
EP3369037A1 true EP3369037A1 (fr) 2018-09-05

Family

ID=57199957

Family Applications (1)

Application Number Title Priority Date Filing Date
EP16785383.7A Pending EP3369037A1 (fr) 2015-10-28 2016-10-07 Procédé et système d'information pour détecter au moins une plante plantée dans un champ

Country Status (5)

Country Link
US (1) US20180308229A1 (fr)
EP (1) EP3369037A1 (fr)
CN (1) CN108140118A (fr)
DE (1) DE102015221085A1 (fr)
WO (1) WO2017071928A1 (fr)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017217258A1 (de) * 2017-09-28 2019-03-28 Robert Bosch Gmbh Verfahren zum Klassifizieren von Pflanzen
US10492374B2 (en) 2017-12-28 2019-12-03 X Development Llc Capture of ground truthed labels of plant traits method and system
CN109214831B (zh) * 2018-08-09 2021-08-03 云智前沿科技发展(深圳)有限公司 基于位置信息与DNA信息的Hash指纹及构建方法与应用
US11676244B2 (en) 2018-10-19 2023-06-13 Mineral Earth Sciences Llc Crop yield prediction at field-level and pixel-level
DE102018251708A1 (de) * 2018-12-27 2020-07-02 Robert Bosch Gmbh Verfahren zur Zerstörung von zumindest einer Pflanze
US11803959B2 (en) * 2019-06-24 2023-10-31 Mineral Earth Sciences Llc Individual plant recognition and localization
US11508092B2 (en) 2019-12-16 2022-11-22 X Development Llc Edge-based crop yield prediction
CN113407755A (zh) * 2020-03-17 2021-09-17 北京百度网讯科技有限公司 植物生长状况信息获取方法、装置及电子设备
US11532080B2 (en) 2020-11-17 2022-12-20 X Development Llc Normalizing counts of plant-parts-of-interest
DE102021114996A1 (de) 2021-06-10 2022-12-15 Eto Magnetic Gmbh Vorrichtung zu einem Erkennen eines Sprießens von Aussaaten, Agrarsensorvorrichtung und Agrarüberwachungs- und/oder Agrarsteuerungsverfahren und -system
US20230274541A1 (en) * 2022-02-25 2023-08-31 X Development Llc Aggregate trait estimation for agricultural plots
EP4393293A1 (fr) 2023-01-02 2024-07-03 KWS SAAT SE & Co. KGaA Procédé d'analyse de cultures hors sol dans un champ agricole
CN117530164B (zh) * 2024-01-09 2024-04-02 四川省农业机械科学研究院 一种基于机器视觉的智能决策立体循环育秧方法和系统

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3906215A1 (de) * 1989-02-28 1990-08-30 Robert Prof Dr Ing Massen Automatische klassifikation von pflaenzlingen
DE4413739C2 (de) * 1994-04-20 1996-07-18 Deutsche Forsch Luft Raumfahrt Einrichtung zum Erkennen und Unterscheiden von Pflanzen und Bodenbereichen sowie zum Unterscheiden von Kultur- und Wildpflanzen
US7141773B2 (en) * 2001-08-06 2006-11-28 Bioview Ltd. Image focusing in fluorescent imaging
DE10211706A1 (de) * 2002-03-16 2003-09-25 Deere & Co Austrageinrichtung einer landwirtschaftlichen Erntemaschine
US7412330B2 (en) * 2005-08-01 2008-08-12 Pioneer Hi-Bred International, Inc. Sensor system, method, and computer program product for plant phenotype measurement in agricultural environments
EP2685813A4 (fr) * 2011-03-16 2014-12-03 Univ Syddansk Rampe de pulvérisation pour pulvériser de manière sélective une composition désherbante sur des dicotylédones
CN102208099A (zh) * 2011-05-30 2011-10-05 华中科技大学 一种抗光照变化的作物彩色图像分割方法
WO2013030965A1 (fr) * 2011-08-30 2013-03-07 富士通株式会社 Dispositif d'imagerie, programme de support d'imagerie, procédé de fourniture d'informations et programme de fourniture d'informations
JP5673833B2 (ja) * 2011-08-30 2015-02-18 富士通株式会社 撮影装置、作業支援プログラム、および情報提供プログラム
BR112014009255A8 (pt) * 2011-10-20 2017-06-20 Monsanto Technology Llc contador de plantas
US9251420B2 (en) * 2013-01-22 2016-02-02 Vale S.A. System for mapping and identification of plants using digital image processing and route generation
JP5950166B2 (ja) * 2013-03-25 2016-07-13 ソニー株式会社 情報処理システム、および情報処理システムの情報処理方法、撮像装置および撮像方法、並びにプログラム
WO2015134886A1 (fr) * 2014-03-06 2015-09-11 Raven Industries, Inc. Système et procédé de détection d'un bord
US10402835B2 (en) * 2014-07-16 2019-09-03 Raytheon Company Agricultural situational awareness tool
US9652690B2 (en) * 2015-02-27 2017-05-16 Lexmark International, Inc. Automatically capturing and cropping image of check from video sequence for banking or other computing application

Also Published As

Publication number Publication date
US20180308229A1 (en) 2018-10-25
DE102015221085A1 (de) 2017-05-04
CN108140118A (zh) 2018-06-08
WO2017071928A1 (fr) 2017-05-04

Similar Documents

Publication Publication Date Title
WO2017071928A1 (fr) Procédé et système d'information pour détecter au moins une plante plantée dans un champ
EP3700320B1 (fr) Production de cartes numériques de traitement
CN111582055B (zh) 一种无人机的航空施药航线生成方法及系统
WO2018036909A1 (fr) Lutte contre des organismes nuisibles basée sur la prévision de risques de contamination
DE102015216080A1 (de) Anordnung zur Sammlung von Bilddaten
EP3782467A1 (fr) Procédé d'identification des mauvaises herbes dans un rang défini de plantes d'une surface agricole
EP3701449A1 (fr) Estimation du rendement de production de plantes cultivées
DE102015221092A1 (de) Verfahren und Informationssystem zum Erfassen zumindest eines Pflanzenparameterdatensatzes einer auf einem Feld wachsenden Pflanze
DE102011078290A1 (de) Verfahren und Vorrichtung zum Klassifizieren eines Umgebungsbereiches eines Fahrzeuges
EP3516580B1 (fr) Lutte contre des organismes nuisibles
DE102018216476A1 (de) Verfahren zum Sanieren entwicklungsverzögerter Pflanzen
Qiao et al. AI, sensors and robotics in plant phenotyping and precision agriculture
DE102019218189A1 (de) Verfahren zum Generieren einer Vielzahl von annotierten Bildern
DE102017217258A1 (de) Verfahren zum Klassifizieren von Pflanzen
WO2021198731A1 (fr) Procédé de diagnostic de santé et d'évaluation du développement de caractéristiques physiques de plantes agricoles et horticoles basé sur l'intelligence artificielle
Kim et al. Case study: Cost-effective weed patch detection by multi-spectral camera mounted on unmanned aerial vehicle in the buckwheat field
EP3668311A1 (fr) Utilisation de données provenant d'essais sur le terrain dans la protection de plantes pour l'étalonnage et l'optimisation de modèles de prévision
DE202022102591U1 (de) System zur Überwachung des Gesundheitszustands von Pflanzen in der Präzisionslandwirtschaft mittels Bildverarbeitung und Faltungsneuronalem Netz
EP3468339B1 (fr) Procede de determination de caracteristiques d'une plante utile
Butler Making the replant decision: utilization of an aerial platform to guide replant decisions in Tennessee cotton
DE202024100734U1 (de) Biomasse-Überwachungssystem für landwirtschaftliche Flächen
Maldaner Sugarcane plant detection and mapping for site-specific management
DE102016203850B4 (de) Verfahren zum Klassifizieren von Pflanzen
CN116758330A (zh) 一种植物病虫害图像识别系统
EP4064818A1 (fr) Procédé de traitement de plantes dans un champ

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20180528

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: ROBERT BOSCH GMBH

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210426

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS