DE102011078290A1 - Method for classifying surrounding region of vehicle e.g. agricultural tractor, involves comparing detected pattern on recorded image of the surrounding area with patterns stored in database - Google Patents

Method for classifying surrounding region of vehicle e.g. agricultural tractor, involves comparing detected pattern on recorded image of the surrounding area with patterns stored in database

Info

Publication number
DE102011078290A1
Authority
DE
Germany
Prior art keywords
vehicle
pattern
surrounding area
ub
d1
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
DE102011078290A
Other languages
German (de)
Inventor
Michael Dorna
Peter Biber
Ulrich Weiss
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH
Priority to DE102011078290A
Publication of DE102011078290A1
Legal status: Pending

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B - SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B69/00 - Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
    • A01B69/001 - Steering by means of optical assistance, e.g. television cameras
    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B - SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B79/00 - Methods for working soil
    • A01B79/005 - Precision agriculture

Abstract

The present invention relates to a method and a device for classifying a surrounding area (UB) of a vehicle (10), the method comprising the following steps: taking an image of the surrounding area (UB) of the vehicle (10) by means of an imaging sensor device (23); recognizing a pattern in the recorded image of the surrounding area (UB); and classifying the surrounding area (UB) of the vehicle (10) by comparing the recognized pattern with patterns stored in a pattern database (D1).

Description

  • The invention relates to a method and a device for classifying a surrounding area of a vehicle.
  • State of the art
  • An agricultural utility vehicle with a processing device whose position and/or orientation can be adjusted relative to the vehicle is described in document DE 297 24 884 U1. The agricultural utility vehicle described therein has at least one agricultural processing device that can be adjusted in its position and orientation relative to the utility vehicle and is equipped with a GPS satellite navigation receiving unit.
  • From DE 44 31 824 C1, a method is known in which location coordinates are determined by means of a satellite navigation system. In the method described there, operating data of a combine harvester are combined with the respective local coordinates of the combine harvester, and setpoint or limit operating data for working a field are derived from them. The absolute and/or relative location coordinates of the combine harvester are continuously recorded and assigned to the acquired operating data; furthermore, area-specific yield measurement data of the field are collected and processed, in particular as a yield data cadastre that represents, spatially resolved, the harvest of the field worked by the harvester and is stored in the combine harvester for further use as a historical yield data cadastre.
  • Disclosure of the invention
  • The invention provides a method for classifying a surrounding area of a vehicle with the features of patent claim 1 and an apparatus for classifying a surrounding area of a vehicle with the features of patent claim 11.
  • The present invention is based on the recognition that a classification of the surroundings of a vehicle can be carried out by means of a pattern database, by comparison with patterns derived from the measured data of environment-sensing imaging sensors.
  • Advantages of the invention
  • The method according to the invention simultaneously enables a self-localization of the vehicle and an environmental identification of the vehicle. This allows navigation on previously unknown terrain. The method also serves as a fault detection mechanism during navigation and thus contributes to greater safety of the vehicle.
  • Advantageous embodiments and further developments will become apparent from the dependent claims and from the description with reference to the figures of the drawings.
  • According to an advantageous development of the method, classifying the surrounding area identifies the surrounding area of the vehicle with regard to different environmental classes, the patterns stored in the pattern database describing the different environmental classes. An advantage of identifying the surrounding area is that the vehicle can take certain actions depending on the identified environmental class.
  • According to a further advantageous development of the method, classifying the surrounding area determines a relative position of the vehicle in an agriculturally used field, the patterns stored in the pattern database describing possible relative positions of the vehicle. This allows an advantageously simple method for determining the relative position of the vehicle during harvesting, with only low susceptibility to error.
  • According to a further advantageous embodiment of the method, the relative position of the vehicle is determined as a position on a headland, at a row end or at the center of the agricultural field.
  • According to a further advantageous development of the method, the comparison of the recognized pattern with the patterns stored in the pattern database is carried out by correlating the recognized pattern with the stored patterns. This advantageously enables an autonomous environment classification by the vehicle, without the need to establish and maintain interference-prone data links to central servers or satellite navigation systems.
  • According to a further advantageous development of the method, the imaging sensor device has a plurality of sensors and a combination of the sensors is used to record the image of the surrounding area. An advantage of using a combination of the sensors to capture the image is that the system has increased reliability in the event of failure of one sensor or one sensor system.
  • According to a further advantageous development of the method, the patterns stored in the pattern database are generated by calculation.
  • According to a further advantageous development of the method, the patterns stored in the pattern database are generated by measuring the patterns.
  • According to a further advantageous development of the method, the recognized pattern is stored as a further pattern in the pattern database. This allows for improved pattern recognition.
  • According to a further advantageous embodiment of the method, the recognition of the pattern is performed with a variable image recognition algorithm. This advantageously leads to an increased rate of pattern recognition.
  • According to an advantageous development of the device, the imaging sensor device is designed as a camera, an infrared camera, a TOF camera, a 3D camera, an imaging vegetation sensor, a 3D laser scanner or as an imaging ultrasound sensor device.
  • According to a further advantageous development of the device, the imaging sensor device has a plurality of sensors.
  • According to a further advantageous development of the device, the patterns stored in the pattern database are stored in pattern classes.
  • According to a further advantageous development of the device, the control device is further configured to determine a relative position of the vehicle in an agricultural field by comparing the recognized pattern with patterns stored in the pattern database.
  • Brief description of the drawings
  • Further features and advantages of the present invention will be explained below with reference to the figures.
  • The figures show:
  • Fig. 1: a schematic representation of an apparatus for classifying a surrounding area of a vehicle according to a possible embodiment of the device;
  • Fig. 2: a flowchart illustrating an embodiment of the method for classifying a surrounding area of a vehicle according to a possible embodiment of the invention;
  • Fig. 3: a table for explaining a pattern database; and
  • Fig. 4: a schematic representation for explaining an image recognition transformation of a recorded environment pattern.
  • Embodiments of the invention
  • In the figures, like reference numerals designate the same or functionally identical elements.
  • Fig. 1 shows a schematic representation of an apparatus for classifying a surrounding area of a vehicle according to a possible embodiment of the device.
  • A vehicle 10 has a device for classifying a surrounding area UB of the vehicle 10. The device includes an imaging sensor device 23 arranged on the vehicle 10 or on the body 20 of the vehicle, which is designed to record an image of a surrounding area UB of the vehicle 10, the image of the surrounding area UB being limited by visual field boundaries SFG.
  • The vehicle 10 is designed, for example, as a motor vehicle, a truck or a tractor. Furthermore, the vehicle 10 can be designed as an agricultural towing vehicle, as a combine harvester, as a seed drill or sowing machine, as a planter or as a fertilizer spreader.
  • The device further comprises a memory device 21 which is adapted to store patterns in a pattern database D1. Furthermore, the device comprises a control device 22, which is designed to recognize a pattern in the recorded image of the surrounding area UB and to classify the surrounding area UB of the vehicle 10 by comparing the recognized pattern with patterns stored in the pattern database D1.
  • The pattern is given, for example, by a row arrangement on an agricultural field LF, the vehicle 10 being used to harvest the agricultural field LF.
  • The agricultural field LF is, for example, a field that has been worked with a plow, for example, and thus has a characteristic pattern.
  • The use of the pattern database D1 allows a simple and fast way to load other environment classes and to adapt the classification to the respective task. The pattern database D1 is embodied, for example, as a hierarchical, network-type or relational database system for the electronic data management of the patterns.
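A pattern database organized into loadable environment classes, as described above, can be sketched as a simple in-memory structure. The class names, pattern shapes, and the `PatternDatabase` type below are illustrative assumptions, not part of the patent:

```python
import numpy as np

class PatternDatabase:
    """Minimal sketch of a pattern database D1: environment classes,
    each holding one or more reference patterns (2D arrays)."""

    def __init__(self):
        self.classes = {}  # environment class name -> list of reference patterns

    def add_pattern(self, env_class, pattern):
        # Store a further pattern under its environment class.
        self.classes.setdefault(env_class, []).append(np.asarray(pattern, dtype=float))

    def patterns(self):
        # Yield (class name, pattern) pairs for later comparison.
        for env_class, pats in self.classes.items():
            for p in pats:
                yield env_class, p

# Two hypothetical environment classes with one pattern each.
db = PatternDatabase()
db.add_pattern("single_row", np.eye(4))
db.add_pattern("open_field", np.zeros((4, 4)))
```

Swapping in a different set of classes and patterns then corresponds to loading another database for another task.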
  • The method and the apparatus are not bound to a specific sensor type of the imaging sensor device 23; the imaging sensor device 23 can be designed, for example, as a camera, an infrared camera, an imaging vegetation sensor or as an imaging ultrasonic sensor device.
  • The imaging sensor device 23 generates an image from measured variables of the surrounding area UB, wherein the measured variable or information derived therefrom can be visualized spatially resolved and coded via brightness values or colors.
  • For example, the sensor device 23 can be designed as a TOF camera or another 3D camera system, in which distances are measured via the time of flight (TOF). For this purpose, the surrounding area UB is illuminated by a light pulse emitted by the sensor device 23, and the TOF camera measures, for each pixel, the time the light needs to travel to the object and back.
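The time-of-flight principle mentioned here reduces to a simple relation: the measured distance is half the round-trip travel time multiplied by the speed of light. A minimal sketch (the pixel time is a made-up value):

```python
# Speed of light in m/s.
C = 299_792_458.0

def tof_distance(round_trip_time_s):
    """Distance to the object for one pixel: the light travels out and
    back, so the one-way distance is c * t / 2."""
    return C * round_trip_time_s / 2.0

# A round trip of roughly 66.7 ns corresponds to a distance of about 10 m.
d = tof_distance(66.7e-9)
```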
  • The control device 22 derives, for example, an environment-describing pattern from the sensor data of the sensor device 23 and classifies the surrounding area UB of the vehicle 10 by comparing the recognized pattern with patterns stored in the pattern database D1.
  • For example, classifying the surrounding area UB identifies the surrounding area UB of the vehicle 10 with regard to different environmental classes, the patterns stored in the pattern database D1 describing the different environmental classes.
  • The identification of the surrounding area UB comprises, for example, the recognition of the cultivated plants cultivated on the agricultural field LF or the recognition of other characteristic features of the surrounding area UB.
  • Furthermore, the identification of the surrounding area UB can be used to determine a degree of ripeness of the crops grown on the agricultural field LF and, depending on the recognized degree of ripeness, to make certain settings on the vehicle 10.
  • By classifying the surrounding area UB, a relative position of the vehicle 10 on the agricultural field LF can also be determined, the patterns stored in the pattern database D1 describing possible relative positions of the vehicle 10 on the agricultural field LF.
  • The localization of the vehicle is performed, for example, not in metric values, but with respect to the environment or with respect to the environmental classes, thus allowing a localization of the vehicle with respect to the environment without the use of expensive, highly accurate GPS or other satellite navigation devices.
  • The relative position may be given in relation to a preferred direction of the lanes of the agricultural field LF, or with respect to the agricultural field LF itself, for example the edge of the field, the middle of the field, the headland or the end of the field.
  • This allows a determination as to whether the user or the vehicle 10 is in the middle of the field, on the headland or at the end of an agricultural field. With this information, in particular the position of the vehicle 10, topological maps of the field can be generated.
  • The control device 22 is designed, for example, as a digital computer with image analysis software for controlling the device, for image storage and for image evaluation.
  • As the storage device 21, for example, a data memory, a semiconductor memory, a magnetic storage medium or an optical data carrier can be used.
  • The environmental classes are described by patterns which are derived from the sensor data of the imaging sensor device 23. For each expected environment class, patterns are stored in the pattern database D1, with which the measured patterns are compared. The pattern database D1 consists of different environment classes, for each of which one or more patterns are stored.
  • The correlation of the measured patterns with the patterns stored in the pattern database D1 is calculated, for example, in the control device 22, whereby the captured image of the surrounding area UB can be assigned to a specific environment class. Different pattern databases are used for different applications.
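The correlation-based assignment to an environment class can be sketched as follows. The normalized correlation coefficient stands in for whatever similarity measure the control device 22 actually computes, and all patterns are assumed to share one shape; both the class names and the toy patterns are made up:

```python
import numpy as np

def correlate(a, b):
    """Normalized correlation coefficient of two equally shaped patterns."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def classify(measured, database):
    """Assign the measured pattern to the environment class whose stored
    pattern correlates best with it; `database` maps class -> patterns."""
    best_class, best_score = None, -np.inf
    for env_class, patterns in database.items():
        for ref in patterns:
            score = correlate(measured, ref)
            if score > best_score:
                best_class, best_score = env_class, score
    return best_class, best_score

db = {
    "single_row": [np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], float)],
    "open_field": [np.zeros((3, 3))],
}
measured = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], float)
label, score = classify(measured, db)
```

Because the measured pattern matches the stored single-row pattern exactly, the correlation score is close to 1 and that class is selected.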
  • Fig. 2 is a flowchart illustrating an embodiment of the method for classifying a surrounding area of a vehicle according to a possible embodiment of the invention.
  • In a step E1, the imaging sensor device 23 is interrogated, whereby the image of the surrounding area UB of a crop planting of the agricultural field LF is generated.
  • In a step E2, a pattern is generated based on an evaluation of the generated image of the surrounding area UB.
  • In an alternative method, the pattern can also be generated, in a step E3, by entering data by hand, for example with regard to a height distribution of the crop planting or of the field soil of the agricultural field LF within the surrounding area.
  • In a step E6, the control device 22 performs an image recognition, wherein the image recognition can be performed with a variable image recognition algorithm. This makes it possible for the control device 22 to learn the recognition of patterns based on already recorded patterns.
  • In a step E4, the stored patterns can be arbitrarily grouped into advantageous pattern classes, the grouping of the patterns depending, for example, on the agricultural field LF being worked, the season at the time of working the field, or the type of vehicle 10.
  • The method for classifying the surrounding area UB of the vehicle 10 further comprises the steps of: taking an image of the surrounding area UB of the vehicle 10 by means of an imaging sensor device 23; recognizing a pattern in the recorded image of the surrounding area UB; and classifying the surrounding area UB of the vehicle 10 by comparing the recognized pattern with patterns stored in a pattern database D1.
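The three method steps restated above (capture, recognize, classify) can be strung together as a minimal pipeline. Every function name and the toy sensor below are hypothetical placeholders, not APIs from the patent:

```python
def classify_surroundings(sensor, recognize, classify, database):
    """Capture an image of the surrounding area UB, recognize a pattern
    in it, and classify UB by comparison with the pattern database D1."""
    image = sensor()                     # step 1: take an image of UB
    pattern = recognize(image)           # step 2: recognize a pattern
    return classify(pattern, database)   # step 3: compare with D1

# Stub sensor plus trivial recognizer/classifier for illustration:
# the recognized pattern is looked up directly in the database.
result = classify_surroundings(
    sensor=lambda: [[0, 1], [0, 1]],
    recognize=lambda img: tuple(map(tuple, img)),
    classify=lambda pat, db: db.get(pat, "unknown"),
    database={((0, 1), (0, 1)): "single_row"},
)
```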
  • Fig. 3 shows a table for explaining a pattern database. Columns 1 to 5 and rows A and B of the table show patterns that are characterized by the tramlines shown in black/white contrast.
  • The patterns of the crops cultivated on agricultural fields LF have, for example, already been created during sowing or during intermediate processing, the patterns forming, for example, the lanes of the agricultural field LF. The table shows possible patterns as thumbnails.
  • The tramline is the part of the agricultural field LF which, during the growth of the crop, is repeatedly driven over by the vehicle 10 or other vehicles for crop maintenance.
  • The field is divided into environment classes, and for each class patterns characterizing the environment are stored. Fig. 3 furthermore schematically shows the generation of the pattern database for row crops. From patterns generated by means of the imaging sensor device 23 and the control device 22, further patterns can be created for the database.
  • Alternatively, however, patterns can also be generated using other methods, for example by manually entering data or by using stored terrain profiles from topographical maps or from other digital terrain profiles. The patterns thus obtained are stored in the pattern database D1 by means of a pattern learning process implemented in the control device 22.
  • Row A shows patterns for single-row tramlines, row B patterns for double-row tramlines. Column 1 shows patterns for an open field, column 2 patterns for a single row, column 3 patterns for a row start, column 4 patterns for a row end, and column 5 patterns for a row gap.
  • Fig. 4 shows a schematic representation for explaining an image recognition transformation of a recorded environment pattern.
  • By way of example, the use of a 3D laser scanner as the imaging sensor device 23 for surveying the surroundings is described; the application considered is agricultural fields LF with pattern-generating row crops or tramlines.
  • This process either gets the pattern class as input or uses the existing database for mapping to an environment class.
  • For the environment classification, the image of the surrounding area UB is captured by a 3D laser scanner, which is used as the imaging sensor device 23, as shown in a perspective 3-dimensional view D1 in Fig. 4, with the agricultural field LF in the x/y plane of the coordinate system used and the z-axis of the coordinate system normal to the agricultural field LF, where the z-coordinate can indicate the height information of the crop planting of the agricultural field LF.
  • For recognizing a pattern within the captured image of the surrounding area UB, the control device 22 uses, for example, a digital terrain model or a digital elevation model of the agricultural field LF as a digital numerical store of the height information of the field surface.
  • For example, a uniform grid, such as a grid of lines intersecting at equal distances, is laid over the terrain or elevation model, as represented in a 2-dimensional x/z diagram D2a in Fig. 4. Each grid point is assigned a height value, or multiple height values within a respective raster square, as shown in a 2-dimensional bar graph D2b for an optional height section of the field surface in Fig. 4.
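Laying a uniform grid over the elevation model and assigning each raster square a height value can be sketched like this. The point cloud, the grid resolution, and the choice of keeping the maximum height per cell are illustrative assumptions:

```python
import numpy as np

def height_grid(points, cell_size, nx, ny):
    """Rasterize 3D points (x, y, z) into an nx-by-ny grid of height
    values, keeping the maximum z seen in each raster square."""
    grid = np.full((nx, ny), np.nan)  # nan marks cells without any point
    for x, y, z in points:
        i, j = int(x // cell_size), int(y // cell_size)
        if 0 <= i < nx and 0 <= j < ny:
            if np.isnan(grid[i, j]) or z > grid[i, j]:
                grid[i, j] = z
    return grid

# Toy scan of a crop row: plants at y = 1.5 stand higher than bare soil.
pts = [(0.5, 0.5, 0.0), (0.5, 1.5, 0.8), (1.5, 0.5, 0.1), (1.5, 1.5, 0.9)]
g = height_grid(pts, cell_size=1.0, nx=2, ny=2)
```

Scanning the field with further such height sections stacks these grids into the height pattern that is then compared with the database.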
  • By scanning the entire field surface with further height sections, the control device 22 calculates a pattern D3 of the height information of the field surface.
  • The obtained pattern D3 is compared by the control device 22 with the patterns from the pattern database D1 by means of a correlation. For the obtained pattern, for example, a similarity measure is calculated with respect to each pattern stored in the pattern database D1.
  • Depending on the determined similarity measures, each environment class is assigned a point or probability value. These values can also be used by the control device 22 for a classification of the patterns.
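Turning per-class similarity measures into probability values can be sketched with a softmax. Both the similarity scores and the softmax choice are assumptions for illustration; the patent does not specify how the point or probability values are computed:

```python
import math

def class_probabilities(similarities):
    """Map per-class similarity measures to probability values that sum
    to 1, via a softmax over the scores."""
    exps = {c: math.exp(s) for c, s in similarities.items()}
    total = sum(exps.values())
    return {c: e / total for c, e in exps.items()}

# Hypothetical similarity measures for three relative-position classes.
probs = class_probabilities({"headland": 0.9, "row_end": 0.4, "field_center": 0.1})
best = max(probs, key=probs.get)
```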
  • Although the present invention has been described above with reference to preferred embodiments, it is not limited thereto, but can be modified in a variety of ways.
  • DOCUMENTS CITED IN THE DESCRIPTION
  • This list of the documents listed by the applicant has been generated automatically and is included solely for the better information of the reader. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • DE 29724884 U1 [0002]
    • DE 4431824 C1 [0003]

Claims (15)

  1. Method for classifying a surrounding area (UB) of a vehicle (10), the method comprising the following steps: - taking an image of the surrounding area (UB) of the vehicle (10) by means of an imaging sensor device (23); - recognizing a pattern in the recorded image of the surrounding area (UB); and - classifying the surrounding area (UB) of the vehicle (10) by comparing the recognized pattern with patterns stored in a pattern database (D1).
  2. Method according to claim 1, wherein classifying the surrounding area (UB) identifies the surrounding area (UB) of the vehicle (10) with respect to different environmental classes, the patterns stored in the pattern database (D1) describing the different environmental classes.
  3. The method of claim 1, wherein classifying the surrounding area (UB) comprises determining a relative position of the vehicle (10) on an agricultural field (LF), the patterns stored in the pattern database (D1) describing possible relative positions of the vehicle (10).
  4. Method according to claim 3, wherein the relative position of the vehicle (10) is determined as a position on a headland, at a row end or at a center of the agricultural field (LF).
  5. Method according to one of claims 1 to 4, wherein the comparison of the recognized pattern with the patterns stored in the pattern database (D1) is effected by correlating the recognized pattern with the stored patterns.
  6. Method according to one of claims 1 to 5, wherein the imaging sensor device (23) has a plurality of sensors and a combination of the sensors is used for capturing the image of the surrounding area (UB).
  7. Method according to one of the preceding claims 1 to 6, wherein the patterns stored in the pattern database (D1) are generated by calculation.
  8. Method according to one of the preceding claims 1 to 6, wherein the patterns stored in the pattern database (D1) are generated by measurement.
  9. Method according to one of the preceding claims 1 to 8, wherein the recognized pattern is stored as a further pattern in the pattern database (D1).
  10. Method according to one of the preceding claims 1 to 9, wherein the recognition of the pattern is performed with a variable image recognition algorithm.
  11. Device for classifying a surrounding area (UB) of a vehicle (10), preferably for carrying out the method according to one of the preceding claims 1 to 10, wherein the device comprises: - an imaging sensor device (23) arranged on the vehicle (10), which is adapted to record an image of a surrounding area (UB) of the vehicle (10); - a memory device (21) which is adapted to store patterns in a pattern database (D1); and - a control device (22) which is designed to recognize a pattern in the recorded image of the surrounding area (UB) and to classify the surrounding area (UB) of the vehicle (10) by comparing the recognized pattern with patterns stored in the pattern database (D1).
  12. Apparatus according to claim 11, wherein the imaging sensor device (23) is embodied as a camera, an infrared camera, a TOF camera, a 3D camera, an imaging vegetation sensor, a 3D laser scanner or as an imaging ultrasonic sensor device.
  13. Device according to one of claims 11 or 12, wherein the imaging sensor device (23) has a plurality of sensors.
  14. Device according to one of claims 11 to 13, wherein the patterns stored in the pattern database (D1) are stored in pattern classes.
  15. Device according to one of claims 11 to 14, wherein the control device (22) is further adapted to determine a relative position of the vehicle (10) on an agricultural field (LF) by comparing the recognized pattern with patterns stored in the pattern database (D1).
DE102011078290A 2011-06-29 2011-06-29 Method for classifying surrounding region of vehicle e.g. agricultural tractor, involves comparing detected pattern on recorded image of the surrounding area with patterns stored in database Pending DE102011078290A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102011078290A DE102011078290A1 (en) 2011-06-29 2011-06-29 Method for classifying surrounding region of vehicle e.g. agricultural tractor, involves comparing detected pattern on recorded image of the surrounding area with patterns stored in database

Publications (1)

Publication Number Publication Date
DE102011078290A1 (en) 2013-01-03

Family

ID=47354974

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102011078290A Pending DE102011078290A1 (en) 2011-06-29 2011-06-29 Method for classifying surrounding region of vehicle e.g. agricultural tractor, involves comparing detected pattern on recorded image of the surrounding area with patterns stored in database

Country Status (1)

Country Link
DE (1) DE102011078290A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9282693B2 (en) 2013-02-20 2016-03-15 Deere & Company Data encoding with planting attributes

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0219171A2 (en) * 1985-10-15 1987-04-22 Philips Electronics N.V. Biplane phased array transducer for ultrasonic medical imaging
EP0634628A1 (en) * 1993-07-13 1995-01-18 Daimler-Benz Aerospace Aktiengesellschaft Method and arrangement for earth observation
DE4431824C1 (en) 1994-09-07 1996-05-02 Claas Ohg Mähdrescherbetrieb with operating data register
GB2342242A (en) * 1998-09-25 2000-04-05 Environment Agency Environmental data collection system
DE10328395A1 (en) * 2003-06-18 2005-03-10 Poettinger Gmbh Geb Control method for agricultural machine such as tedder, compares image from image pick-up with reference image and generates steering control commands accordingly
DE29724884U1 (en) 1996-11-16 2005-05-04 Claas Kgaa Mbh Agricultural vehicle with satellite navigation receiver - calculates absolute position of defined reference point of working tool for corresponding adjustment to match ground contour.

Similar Documents

Publication Publication Date Title
Busemeyer et al. BreedVision—A multi-sensor platform for non-destructive field-based phenotyping in plant breeding
Barawid Jr et al. Development of an autonomous navigation system using a two-dimensional laser scanner in an orchard application
US7058197B1 (en) Multi-variable model for identifying crop response zones in a field
Pedersen et al. Agricultural robots—system analysis and economic feasibility
US7248968B2 (en) Obstacle detection using stereo vision
Åstrand et al. A vision based row-following system for agricultural field machinery
AU2010255803B2 (en) Device and method for recording a plant
JP2011129126A (en) Automatic tagging for landmark identification
US10175362B2 (en) Plant treatment based on morphological and physiological measurements
US8204654B2 (en) System and method for generation of an inner boundary of a work area
Åstrand et al. An agricultural mobile robot with vision-based perception for mechanical weed control
Bakker et al. A vision based row detection system for sugar beet
Ollis et al. First results in vision-based crop line tracking
Rovira-Más et al. Hough-transform-based vision algorithm for crop row detection of an automated agricultural vehicle
US20040264763A1 (en) System and method for detecting and analyzing features in an agricultural field for vehicle guidance
Montalvo et al. Automatic detection of crop rows in maize fields with high weeds pressure
EP1777486B1 (en) Sensor system, method, and computer program product for plant phenotype measurement in agricultural environments
Lottes et al. UAV-based crop and weed classification for smart farming
Pérez-Ortiz et al. Selecting patterns and features for between-and within-crop-row weed mapping using UAV-imagery
Marchant et al. Real-time tracking of plant rows using a Hough transform
Mousazadeh A technical review on navigation systems of agricultural autonomous off-road vehicles
Chen et al. Counting apples and oranges with deep learning: A data-driven approach
Hiremath et al. Laser range finder model for autonomous navigation of a robot in a maize field using a particle filter
Bai et al. A multi-sensor system for high throughput field phenotyping in soybean and wheat breeding
US9076105B2 (en) Automated plant problem resolution

Legal Events

Date Code Title Description
R163 Identified publications notified
R012 Request for examination validly filed