WO2019091725A1 - Method and display device for guiding a work machine - Google Patents

Method and display device for guiding a work machine

Info

Publication number
WO2019091725A1
WO2019091725A1 · PCT/EP2018/078323 · EP2018078323W
Authority
WO
WIPO (PCT)
Prior art keywords
image information
working
work machine
route
machine
Prior art date
Application number
PCT/EP2018/078323
Other languages
German (de)
English (en)
Inventor
Marcus Hiemer
Mark Mohr
Mauro Cesar ZANELLA
Original Assignee
Zf Friedrichshafen Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zf Friedrichshafen Ag filed Critical Zf Friedrichshafen Ag
Publication of WO2019091725A1 publication Critical patent/WO2019091725A1/fr

Links

Classifications

    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B: SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B69/00: Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
    • A01B69/001: Steering by means of optical assistance, e.g. television cameras
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations

Definitions

  • Known methods for guiding a work machine along a route include controlling a movement of the work machine along the route.
  • Guiding a work machine may be a user-guided form of machine guidance.
  • the operation of the work machine may also include or be understood as controlling or regulating the work machine by a user.
  • a work machine can be any vehicle that can be used for a work, for example an agricultural machine.
  • a route can also be defined as a path or a route along which the work machine is to be moved.
  • the route may include a two- or three-dimensional trajectory with or without altitude information.
  • a movement of the work machine can also be referred to as a drive with the work machine.
  • Known display devices for guiding a work machine along a route are configured for controlling a movement of the work machine along the route.
  • a view of such a known display device is shown schematically.
  • the display device can show in perspective a route 101 to be traveled on a model of a work site 102.
  • the route 101 to be traveled can also be referred to as an artificial route.
  • the working machine 1 and optionally a working device 103, which can be mounted on the working machine 1 are shown symbolically.
  • a terrain 104 traversed by the work machine 1 or work implement 103 according to the route 101 to be traveled can also be displayed.
  • A tracking system is known with which a vehicle is moved along a predetermined lane. To travel along the lane and perform work, the driver of the vehicle is shown the predetermined lane and the position of the vehicle. Solutions are provided for improving known methods and display devices for guiding a work machine.
  • Such a solution for a method for guiding a work machine consists in acquiring image information with at least one sensor, visualizing the image information with a display device, wherein the image information has information about the environment around the work machine and wherein the work machine is controllable with the image information.
  • Image information can be image data.
  • Capturing image information may also be defined as detecting or capturing image information.
  • Image information can basically be understood as any visualizable information that can be represented as an image in a two-dimensional image coordinate system.
  • Image information can basically be captured by photogrammetry, remote sensing or scanning.
  • Image information may also come from an external source, such as a satellite image or a drone-captured image.
  • any sensor for capturing image information can be provided.
  • a camera may be provided.
  • the camera can produce a photographic image of the environment as image information.
  • A thermal camera, which generates a thermal image as image information, or a 3D camera, which generates a depth image as image information, may also be provided as the sensor. With such cameras, areal information about the environment can be captured pixel by pixel and simultaneously.
  • Visualizing image information can also be understood as displaying, presenting or rendering the image information.
  • At least one coordinate transformation in particular a transformation from a global coordinate system into a local image coordinate system or from a camera system into an image system, may be necessary for the visualization.
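The coordinate transformation mentioned above can be illustrated with a standard pinhole-camera projection from a world (global) coordinate system into pixel coordinates. This is a minimal sketch only: the rotation, translation and intrinsic parameters (`fx`, `fy`, `cx`, `cy`) are illustrative assumptions, not values given in the application.

```python
import numpy as np

def world_to_pixel(p_world, R, t, fx, fy, cx, cy):
    """Project a 3-D world point into 2-D pixel coordinates.

    R, t: rotation matrix and translation of the world-to-camera transform.
    fx, fy: focal lengths in pixels; cx, cy: principal point.
    All parameter names are illustrative, not taken from the application.
    """
    p_cam = R @ np.asarray(p_world, dtype=float) + t  # world -> camera frame
    x, y, z = p_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    u = fx * x / z + cx  # perspective division onto the image plane
    v = fy * y / z + cy
    return u, v

# A point 10 m straight ahead of an axis-aligned camera lands
# on the principal point of the image.
u, v = world_to_pixel([0.0, 0.0, 10.0], np.eye(3), np.zeros(3),
                      800.0, 800.0, 320.0, 240.0)
```

The same chain (global system to camera system to image system) underlies any visualization of sensor data on a 2-D display.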
  • The visualization can also be referred to as display or presentation.
  • a display device may be, for example, a display, a monitor or a screen.
  • The display device may be portable, such as a tablet, or permanently connected to the work machine.
  • The display device can be installed in the work machine, in particular in an operator's cab of the work machine.
  • Information about the environment around the work machine may include information about visible light. It is possible to visualize an image of the environment around the working machine in the visible spectrum. An image can be a photo. Alternatively or additionally, an infrared image or a depth image of the surroundings of the working machine can be captured. In an infrared image, the information about the environment may include infrared radiation. In a depth image, distances from the work machine to objects in the environment can be represented pixel-wise or voxel-wise.
  • the environment around the work machine can also be defined as a scene or scenery. Furthermore, the surroundings may also be the periphery or the environment of the working machine.
  • The solutions provided are based on the insight that automated, in particular positionally accurate, guiding of a work machine along a route to be processed may not be sufficient to optimally execute the corresponding work tasks along the route.
  • A basic idea can be seen in providing real environmental information to a user visually in real time while working along a work route with a work machine, so that actual conditions of the environment are incorporated into the work to be performed.
  • Environment information can thus be used as a real-time representation of the environment around the work machine for controlling it, wherein the control can be user-managed or autonomous.
  • A further basic idea is that the movement of a work machine that can already be guided or regulated by means of satellite navigation can be adapted, by adding real-time image acquisition of the work environment, to the actual position of objects to be processed, as an additional functionality for regulating the movement behavior of the work machine. A predetermined target route can be deliberately deviated from in order to improve execution of the task in an object-specific way. For example, with an agricultural machine performing field work along a route, information about the field, plants or crop can be taken into account in real time during the work to be performed.
  • a tractor, a combine harvester or a harvester can thus also be controlled and moved taking into account actual conditions, for example an existing crop or cereal cutting edge or the position of a swath.
  • the work machine can (partially) automatically drive along the working route as a given movement trajectory.
  • the basis for the trajectory can be global coordinate data, for example GPS coordinates.
  • a route previously traveled by a human may be repeatedly traveled, and this may be referred to as a "playback function" for moving the work machine.
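The "playback function" described above can be sketched as recording GPS waypoints during a manually driven pass and replaying them later in the same order. The class and method names are illustrative, not from the application.

```python
class RoutePlayback:
    """Record a driven route as GPS waypoints and replay it later.

    A minimal sketch of a playback function: waypoints captured while a
    human drives the route are stored and can be yielded again as the
    target trajectory for repeated automated passes.
    """

    def __init__(self):
        self._waypoints = []

    def record(self, lat, lon):
        # Store one position fix (latitude, longitude in degrees).
        self._waypoints.append((lat, lon))

    def replay(self):
        # Yield the stored waypoints in the original driving order.
        yield from self._waypoints

route = RoutePlayback()
for fix in [(47.65, 9.48), (47.66, 9.49), (47.67, 9.50)]:
    route.record(*fix)

replayed = list(route.replay())
```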
  • One embodiment consists in regulating the movement of the work machine with the image information and/or with a working position of the work machine.
  • The work machine can be regulated based on the image information in its lateral dynamics and/or longitudinal dynamics.
  • The working machine can optionally be regulated in its longitudinal dynamics based on its working position and in its lateral dynamics based on the image information.
  • the image information may visualize a region of the environment detected or selected by the camera.
  • the camera can have a wide-angle lens or a fisheye lens. This has the advantage that the viewing angle and thus the size of the detected environment can be increased.
  • Another embodiment is to provide a plurality of cameras on the work machine and capture the image information with the plurality of cameras, wherein the image information acquired by the plurality of cameras is at least partially redundant.
  • the cameras can be arranged on the working machine.
  • a camera can have a field of vision oriented in the direction of movement of the working machine.
  • Another camera can have a field of view oriented backward relative to the direction of movement of the working machine.
  • Another camera may have a field of view aligned laterally to the left or right relative to the direction of movement of the working machine.
  • a field of view of a camera may have an angle of view.
  • The angle of view can be changed in particular with a lens.
  • The field of view or angle of view of the camera can also be referred to as its viewing area.
  • the information about the environment of the work machine is a panoramic image. Images taken by at least one camera may overlap, whereby the fields of view of the cameras may at least partially overlap. In corresponding overlapping areas, redundant image information may be present.
  • the panorama image may also be referred to as a surround view.
  • brightness differences or contrast differences can be compensated.
  • Distortions can also be reduced. It is also possible to remove interfering objects from the images. For example, a mirror, a chimney or an exhaust of the work machine can be removed.
  • the working machine can be controlled taking into account a complete 360 ° field of view.
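The brightness compensation in the overlap areas mentioned above can be sketched as a linear cross-fade between the two redundant image strips. The application does not prescribe a blending method; this weight-ramp approach is one common, assumed technique.

```python
import numpy as np

def blend_overlap(left_strip, right_strip):
    """Cross-fade two redundant image strips from adjacent cameras.

    left_strip, right_strip: grayscale arrays of shape (H, W) covering
    the same overlap region. The per-column weight ramps from 1 to 0
    across the width, which smooths brightness and contrast differences
    between the two cameras. Illustrative only.
    """
    h, w = left_strip.shape
    ramp = np.linspace(1.0, 0.0, w)  # weight for the left camera, per column
    return left_strip * ramp + right_strip * (1.0 - ramp)

a = np.full((2, 5), 100.0)  # overlap as seen by one camera
b = np.full((2, 5), 160.0)  # same overlap, brighter neighbouring camera
blended = blend_overlap(a, b)
```

At the strip edges the blend matches each source camera exactly, so no visible seam appears in the stitched panorama.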
  • A further embodiment consists in providing capture and/or visualization of the image information in real time.
  • Real time can be defined in particular as near real time. Capturing the image information in real time can provide a real-time image of the environment around the work machine. A real-time image can be continuously captured and visualized. The real-time image can be displayed by a display device in real time. In order to regulate the work machine, image information can thus be available in real time in order to continuously adapt the movement of the work machine to ambient conditions and thus to correct the movement in real time.
  • the image information can be visualized in different perspectives. Perspectives can also be referred to as views or real-time views.
  • the image information can represent a region of the environment in the field of view of a camera in perspective.
  • The image information can be visualized as an all-round view or in a bird's-eye view ("bird view"). It is also possible to visualize a trajectory view around the work machine.
  • The perspectives can optionally be visualized separately or side by side. The displayed image information can also be visualized in changeable or changing perspectives.
  • Various perspectives have the advantage that a user can select the representation of the environment on a display device and can thus focus on specific areas of the environment to influence the movement behavior of the working machine.
  • A further embodiment provides detection of a working position of the working machine, visualization of a working route, and visualization of the working position, wherein the working machine is adjustable by comparing the working position with the working route. Regulation of the work machine based on comparing the working position with the working route may be optional or permanently supportive. The work machine can be temporarily or permanently self-adjustable with the detected working position.
  • the regulation of a movement of the working machine along the route can first be carried out in a first step as a function of a visual comparison of the working position with the working route.
  • the visual deviation of a current working position from a given working route can be taken into account in order to move the working machine further on or back to the working route.
  • The regulation of the movement of the working machine along the route can then take place in a second step as a function of the visualized image information. It may be desirable to deviate from the working route followed in the first step in order to carry out the work tasks efficiently and precisely. A track actually to be followed in order to perform the task can thus be traveled, with the working machine or a work attachment mounted on it, in deviation from the working route; the predetermined working route can be left.
  • The first control step may also be referred to as coarse control, in which the work machine is moved along a predetermined working route.
  • The second control step may also be referred to as fine control, in which the work task to be executed along the working route can be performed accurately depending on the actual environmental conditions.
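The coarse/fine split above can be sketched as two additive steering terms: a coarse term pulling the machine back toward the planned route, and a fine term following an image-detected target such as a crop edge. The gains and the simple sum are illustrative assumptions, not a controller specified in the application.

```python
def steering_command(route_offset_m, image_offset_m,
                     k_coarse=0.5, k_fine=1.0):
    """Two-stage steering sketch.

    route_offset_m: lateral offset of the machine from the planned
        working route (positive = right of the route).
    image_offset_m: lateral offset of the image-detected work target
        (e.g. a crop edge) from the machine's current track.
    Returns a combined steering command; gains are illustrative.
    """
    coarse = -k_coarse * route_offset_m  # pull back toward the route
    fine = -k_fine * image_offset_m      # deliberate local deviation
    return coarse + fine

# Machine 0.4 m right of the route, crop edge 0.1 m to the left:
cmd = steering_command(0.4, -0.1)
```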
  • the detection of the working position can be carried out by means of satellite navigation or satellite positioning.
  • a satellite navigation system which can be used for this purpose can in particular use GPS, GLONASS and / or GALILEO satellites.
  • a receiver for corresponding satellite signals may be provided on the agricultural machine and detect the working positions of the working machine.
  • The accuracy of position data or working positions can be increased by using a reference station; for example, differential GPS can be used. Fast, high-frequency provision of position information can thus be achieved in real time.
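The reference-station idea can be sketched as follows: the station compares its measured fix with its surveyed true position, and the resulting offset is subtracted from the rover's fix. This is a deliberate simplification; real DGPS/RTK corrects per-satellite pseudoranges or carrier phases, not final positions.

```python
def dgps_correct(rover_fix, station_fix, station_truth):
    """Apply a differential-GPS-style position correction.

    rover_fix: (lat, lon) measured on the work machine.
    station_fix: (lat, lon) simultaneously measured at the reference station.
    station_truth: surveyed true (lat, lon) of the reference station.
    Returns the corrected rover position. Simplified sketch only.
    """
    d_lat = station_fix[0] - station_truth[0]  # common-mode error, latitude
    d_lon = station_fix[1] - station_truth[1]  # common-mode error, longitude
    return (rover_fix[0] - d_lat, rover_fix[1] - d_lon)

corrected = dgps_correct(
    rover_fix=(47.10002, 9.20001),
    station_fix=(47.00002, 9.00001),
    station_truth=(47.0, 9.0),
)
```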
  • a multiplicity of display devices can be provided for visualizing various information on the work machine.
  • a visualization of the image information, the working route, and / or the working position is provided with a single display device.
  • The current working position of the machine can be visualized with respect to a given working route, and additional image information of the environment can be displayed.
  • The image information can be projected onto a terrain model around the working route and visualized on the display device. It is also possible to visualize the position of a work tool. A visual readjustment of the movement of the machine can be carried out based on the image information. For example, a cutter bar of a tractor can thus be precisely guided in a field.
  • In one embodiment, controlling the movement of the work machine along the route includes a deviation from the working route.
  • Regulating the movement of the working machine is thus not to be understood as following a given working route as precisely as possible, but rather as a targeted deviation from the given working route adapted to the actual environmental conditions.
  • The regulation may in particular involve a lateral deviation along the working route. This has the advantage that the machine can react flexibly to changing environmental conditions or to obstacles.
  • A solution for a display device for guiding a work machine is that the display device is designed to visualize image information acquired by a sensor, the image information having information about the environment around the work machine, and wherein the image information displayed on the display device is provided for controlling the work machine. Another solution is a work machine having such a display device.
  • the display device can also be designed to visualize a working route and a working position of the working machine detected by a sensor.
  • the display device can visualize combined data having image information, the working position and / or the working route. Such combined data may also be referred to as fused data.
  • information about the environment is also displayed as a real-time scene.
  • a real-time view of the environment may be displayed on the display device as a full-frame with combined data or as a trajectory view as shown in FIG.
  • Fig. 1 shows an indication of a route and a position of a work machine for guiding the same from the prior art.
  • FIG. 2 shows a schematic flowchart of method steps for a method for guiding a work machine.
  • Fig. 3 shows a plan view of a working machine with four cameras for capturing image information.
  • FIG. 4 shows a representation of components for visualizing image information and a working position of a work machine.
  • FIG. 2 shows method steps for guiding a work machine 1.
  • a control S3 of the work machine 1 is performed based on a visualization S22 of image information 11.
  • The image information 11 is generated in a preceding step S12 of capturing the image information 11 and forms the basis of the control S3.
  • the control S3 of the work machine 1 optionally takes place on the basis of a visualization S23 of the work position 12.
  • The work position 12 is generated in a preceding step S11 of detecting the work position 12 and forms a basis of the control S3.
  • the control S3 of the work machine 1 is optionally based on a visualization S21 of the work route 14.
  • the work route 14 is generated in a previous step of detecting the work route 14 (not shown).
  • FIG. 3 shows a tractor as an agricultural machine or work machine 1 in plan view with four cameras 2, 3, 4, 5, an electronic control unit 6 and a satellite navigation system 7.
  • The cameras 2, 3, 4, 5 are connected to the control unit 6 via connections 10, for example CAN, Ethernet or LVDS connections.
  • the connections 10 transmit raw data acquired by the cameras 2, 3, 4, 5 to the control unit 6.
  • The control unit 6 communicates with the cameras 2, 3, 4, 5 via the connections 10 or electrical connections, whereby the image acquisition is controlled and parameters of the image recording can be set.
  • the cameras 2, 3, 4 and 5 on the work machine 1 have four fields of view 20, 30, 40 and 50.
  • a first field of view 20 of the first camera 2 is aligned in the direction of travel of the working machine 1 to the rear.
  • the first camera 2 records a first image as a function of the first field of view 20.
  • a second field of view 30 of the second camera 3 is aligned laterally to the left in the direction of travel of the work machine 1.
  • the second camera 3 records a second image as a function of the second field of view 30.
  • a third field of view 40 of the third camera 4 is aligned in the direction of travel of the work machine 1 to the front.
  • The third camera 4 records a third image as a function of the third field of view 40.
  • a fourth field of view 50 of the fourth camera 5 is aligned laterally to the right in the direction of travel of the work machine 1.
  • the fourth camera 5 records a fourth image as a function of the fourth field of view 50.
  • The cameras 2, 3, 4 and 5 are arranged and aligned such that the fields of view 20, 30, 40, 50 generate pairwise overlap areas 60, 70, 80 and 90. Regions of the environment in the overlap areas 60, 70, 80 and 90 are thus each recorded in two images by two cameras.
  • An overlap area 70 results from an overlap of the third field of view 40 and the fourth field of view 50, which define the image capture with the third camera 4 and the fourth camera 5.
  • An overlap area 60 results from an overlap of the fourth field of view 50 and the first field of view 20, which define the image capture with the fourth camera 5 and the first camera 2.
  • An overlap area 90 results from an overlap of the first field of view 20 and the second field of view 30, which define the image capture with the first camera 2 and the second camera 3.
  • An overlap area 80 results from an overlap of the second field of view 30 and the third field of view 40, which define the image capture with the second camera 3 and the third camera 4. Based on the individual images recorded in the fields of view 20, 30, 40, 50, a panoramic view is calculated.
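The pairwise overlaps of the four fields of view 20, 30, 40, 50 follow from simple angular geometry: with cameras facing front, back, left and right, neighbouring sectors overlap as soon as each angle of view exceeds 90°. The following check is an illustrative sketch; the yaw angles and the 120° lens are assumptions, not values from the application.

```python
def sectors_overlap(yaw_a, yaw_b, fov_deg):
    """Check whether two horizontal camera sectors overlap.

    yaw_a, yaw_b: viewing directions in degrees; fov_deg: angle of view
    of each camera. Two sectors overlap when their angular separation is
    smaller than one full field of view (the two half-FOVs meet).
    """
    # Wrap the separation into [0, 180] degrees.
    diff = abs((yaw_a - yaw_b + 180.0) % 360.0 - 180.0)
    return diff < fov_deg

# Four cameras at 0/90/180/270 degrees with 120-degree lenses:
# neighbouring sectors overlap, opposite sectors do not.
yaws = [0.0, 90.0, 180.0, 270.0]
neighbour = sectors_overlap(yaws[0], yaws[1], 120.0)
opposite = sectors_overlap(yaws[0], yaws[2], 120.0)
```

This is why the arrangement in Fig. 3 yields exactly four overlap areas, one between each pair of adjacent cameras.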
  • FIG. 4 shows the components for visualizing image information and a working position of the working machine with a display device 8.
  • Image information 11 is transmitted from the controller 6, which is connected to the cameras 2, 3, 4 and 5, to the display device 8.
  • the control unit 6 has optional N inputs for N cameras.
  • the work machine 1 has a satellite navigation system 7.
  • The satellite navigation system 7 has an antenna (not shown) which receives satellite signals; for example, a GPS receiver is arranged on the work machine 1.
  • the satellite navigation system 7 additionally uses a terrestrial positioning system 9, for example a RTK (Real Time Kinematic) positioning system, to generate position data.
  • the working position 12 is generated by a satellite navigation system 7 and a related optional positioning system 9.
  • the working position 12 is transmitted to the display device 8 via a line, for example CAN or Ethernet.
  • Combined data 13 can be stored on a storage medium 15, for example a USB stick or an SD card, to be evaluated offline later by the user or operator of the work machine 1.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Soil Sciences (AREA)
  • Environmental Sciences (AREA)
  • Guiding Agricultural Machines (AREA)

Abstract

Method and display device for guiding a work machine along a route, comprising regulating the movement of the machine along the route, including capturing image information with at least one sensor and visualizing the image information with the display device, the image information comprising information about the environment around the machine, and the machine being controllable with the image information.
PCT/EP2018/078323 2017-11-10 2018-10-17 Procédé et appareil d'affichage pour guider un engin WO2019091725A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017220005.7 2017-11-10
DE102017220005.7A DE102017220005A1 (de) 2017-11-10 2017-11-10 Verfahren und Anzeigegerät zum Führen einer Arbeitsmaschine

Publications (1)

Publication Number Publication Date
WO2019091725A1 true WO2019091725A1 (fr) 2019-05-16

Family

ID=63896179

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/078323 WO2019091725A1 (fr) 2017-11-10 2018-10-17 Procédé et appareil d'affichage pour guider un engin

Country Status (2)

Country Link
DE (1) DE102017220005A1 (fr)
WO (1) WO2019091725A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11846947B2 (en) 2021-05-11 2023-12-19 Cnh Industrial Canada, Ltd. Systems and methods for an implement imaging system

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102019204251A1 (de) * 2019-03-27 2020-10-01 Zf Friedrichshafen Ag Verfahren und System zur Linienerfassung auf einer landwirtschaftlichen Nutzfläche
DE102019209526A1 (de) * 2019-06-28 2020-12-31 Zf Friedrichshafen Ag Überwachen einer Anbaufläche

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010083949A1 (fr) 2009-01-21 2010-07-29 Amazonen-Werke H. Dreyer Gmbh & Co. Kg Système d'acheminement électronique
US20140324272A1 (en) * 2013-04-29 2014-10-30 Claas Agrosystems Kgaa Mbh & Co Kg Operating system for and method of operating an automatic guidance system of an agricultural vehicle
WO2016009688A1 (fr) * 2014-07-16 2016-01-21 株式会社リコー Système, machine, procédé de commande et programme
KR20160063324A (ko) * 2013-10-01 2016-06-03 얀마 가부시키가이샤 콤바인
EP3335944A1 (fr) * 2016-12-19 2018-06-20 Kubota Corporation Véhicule de travail

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8374790B2 (en) * 2009-10-30 2013-02-12 Teejet Technologies Illinois, Llc Method and apparatus for guiding a vehicle
DE102014201203A1 (de) * 2014-01-23 2015-07-23 Deere & Company Landwirtschaftliches Arbeitsfahrzeug mit einem Fluggerät und zugehöriger Stromversorgung

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010083949A1 (fr) 2009-01-21 2010-07-29 Amazonen-Werke H. Dreyer Gmbh & Co. Kg Système d'acheminement électronique
US20140324272A1 (en) * 2013-04-29 2014-10-30 Claas Agrosystems Kgaa Mbh & Co Kg Operating system for and method of operating an automatic guidance system of an agricultural vehicle
KR20160063324A (ko) * 2013-10-01 2016-06-03 얀마 가부시키가이샤 콤바인
WO2016009688A1 (fr) * 2014-07-16 2016-01-21 株式会社リコー Système, machine, procédé de commande et programme
EP3335944A1 (fr) * 2016-12-19 2018-06-20 Kubota Corporation Véhicule de travail

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11846947B2 (en) 2021-05-11 2023-12-19 Cnh Industrial Canada, Ltd. Systems and methods for an implement imaging system

Also Published As

Publication number Publication date
DE102017220005A1 (de) 2019-05-16

Similar Documents

Publication Publication Date Title
DE102016114535B4 (de) Wegbestimmung für automatisierte Fahrzeuge
DE102013209415A1 (de) Dynamische Hinweisüberlagerung mit Bildbeschneidung
DE102018215344A1 (de) System und verfahren für fahrzeugkonvois
EP2922384B1 (fr) Dispositif de locomotion autonome
EP3413155B1 (fr) Procédé de détection d'au moins une section d'un bord de limitation d'une surface à travailler, procédé de fonctionnement d'un robot de traitement d'espace vert autonome mobile, système de détection et système de traitement d'espace vert
DE10308525A1 (de) Vermessungssystem
DE102006059549A1 (de) Verfahren zur Interpolation von Positionsdaten, positionserkennender Sensor, und Vorrichtung zur Positionsmessung
DD228096A5 (de) Verfahren und vorrichtung zur automatischen fuehrung von fahzeugen, insbesondere von fahrerlosen elektrokarren
EP2060873B1 (fr) Procédé destiné au soutien de la navigation inertielle d'un aéronef
WO2019091725A1 (fr) Procédé et appareil d'affichage pour guider un engin
DE102006055652A1 (de) Verfahren zur Aufarbeitung dreidimensionaler Daten und Vorrichtung zur Aufarbeitung dreidimensionaler Daten
DE102012111345B4 (de) Mobiles Handgerät zur Ausrichtung eines Sensors
DE102014012831A1 (de) Selbstfahrende Baumaschine und Verfahren zum Steuern einer selbstfahrenden Baumaschine
DE102014012825A1 (de) Selbstfahrende Baumaschine und Verfahren zur Steuerung einer selbstfahrenden Baumaschine
DE102020109279A1 (de) System und verfahren zur anhängerausrichtung
DE102007043534A1 (de) Anordnung zum Erfassen einer Umgebung
DE112016000689T5 (de) Kameraparametereinstellvorrichtung
DE112012004055T5 (de) Bildaufnahmevorrichtung
EP2930466A1 (fr) Appareil d'observation mobile doté d'un compas magnétique numérique
DE102016209437A1 (de) Selbsttätiges Lenksystem zur Führung eines landwirtschaftlichen Fahrzeugs über ein Feld und entsprechendes Verfahren
DE102018200060B4 (de) Verfahren zum Betreiben einer mobilen Arbeitsmaschine und mobile Arbeitsmaschine
EP2350977B1 (fr) Procédé pour fusionner au moins deux images pour former une image panoramique
DE102017100885B4 (de) Verfahren und vorrichtung zum erzeugen und projizieren eines 3d-thermogramms samt daten zu den aufnahmebedingungen
DE102021110287A1 (de) Verfahren und System zum automatisierten Kalibrieren von Sensoren
EP2996327A2 (fr) Systeme de vue environnante pour vehicules dote d'appareils de montage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18789114

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 18789114

Country of ref document: EP

Kind code of ref document: A1