WO2010115580A1 - Method and apparatus for recognizing objects - Google Patents

Method and apparatus for recognizing objects

Info

Publication number
WO2010115580A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
environment
determined
objects
camera
Application number
PCT/EP2010/002097
Other languages
German (de)
French (fr)
Inventor
Jörg GRÜNER
Mathias Hartl
Martin Lallinger
Joachim Missel
Matthias Reichmann
Fridtjof Stein
Original Assignee
Daimler AG
Application filed by Daimler AG
Publication of WO2010115580A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/93 Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S 15/931 Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/86 Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S 2013/9327 Sensor installation details
    • G01S 2013/93272 Sensor installation details in the back of the vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30261 Obstacle

Abstract

The invention relates to a method for recognizing objects (O) in the surroundings of a vehicle (F), wherein a first image (B1) of the surroundings of the vehicle (F) is recorded at a first vehicle position (FP1) by means of at least one camera (K) arranged on the vehicle (F), a second image (B2) of the surroundings is recorded at a second vehicle position (FP2), and the two recorded images (B1, B2) are evaluated in an image processing and evaluation unit. According to the invention, a change of position of the vehicle (F) between the first vehicle position (FP1) and the second vehicle position (FP2) is determined by determining an optical flow by means of at least one characteristic point (P(FP1), P(FP2)) depicted in both images (B1, B2) and/or by means of sensors for determining a driving path of the vehicle (F). Furthermore, according to the invention, positions and dimensions of objects (O(FP1), O(FP2)) depicted in the images (B1, B2) are determined in the surroundings of the vehicle (F) using the determined change of position, and a three-dimensional map of the recorded surroundings of the vehicle (F) having a height profile is generated therefrom. The invention further relates to an apparatus for performing said method.

Description

Method and device for object recognition
The invention relates to a method for recognizing objects in the surroundings of a vehicle according to the preamble of claim 1, and to a device for carrying out the method.
A system for a vehicle for identifying and monitoring a blind spot is known from the prior art, as described in US 2002/0005778 A1. An arrangement for determining information about objects in the surroundings of a vehicle, for example in blind spots from the driver's point of view, comprises one or more light-emitting components arranged on the vehicle, which emit infrared light into the surroundings of the vehicle, and receivers arranged on the vehicle for receiving infrared light from the surroundings of the vehicle. The information about the objects is determined by evaluating the received infrared light by means of a processor. For example, a distance between the vehicle and an object and a speed of the object are determined, and the object is identified. Pattern recognition techniques are preferably used to obtain the desired information. On the basis of detected objects and their position and speed, functions of the vehicle can be influenced, for example an acoustic or optical warning device or a steering wheel control device.
US 2006/0055776 A1 describes a method and a device for determining the motion of a mobile unit, as well as a navigation system. In a device for determining the motion of a mobile body, one subunit determines corresponding points between images captured by a camera. A first motion-determination subunit determines a first motion of the mobile body using the corresponding points, assuming a given plane in the images. A second motion-determination subunit determines a second motion using the first motion and the corresponding points.
An obstacle detection system is known from US 2007/0206833 A1. It improves the detection of a three-dimensional object by means of a monocular camera by eliminating detection errors caused, for example, by a speed sensor or a steering angle sensor. A bird's-eye view of a first image, in which a road surface is depicted, is generated. This first image was captured by a camera arranged on a vehicle. A bird's-eye view of a second image is then generated; the second image is captured at a time different from the capture time of the first image. The two bird's-eye views are linked to one another by means of a characteristic feature on the road surface. In the overlap area of the two bird's-eye views, regions that differ between the two representations are identified as obstacles.
US 2007/0285217 A1 describes a surroundings detection device, a method for surroundings detection, and a program for surroundings detection. A surroundings detection device comprises a first camera for capturing the area ahead of the vehicle and a second camera for capturing a road surface. The optical axis of the second camera is inclined downwards in order to detect a characteristic point in successively captured images, to determine an optical flow from it, and to capture structural information about the road. Three-dimensional information about obstacles in the surroundings of the vehicle can thus be determined.
The object of the invention is to specify an improved method for recognizing objects in the surroundings of a vehicle and a device for carrying out the method.
According to the invention, this object is achieved by a method for recognizing objects in the surroundings of a vehicle having the features of claim 1. With regard to the device for carrying out the method, the object is achieved by the features specified in claim 6. Preferred refinements and developments of the invention are specified in the dependent claims.
In a method for recognizing objects in the surroundings of a vehicle, a first image of the surroundings is captured at a first vehicle position by means of at least one camera arranged on the vehicle, and a second image of the surroundings is captured at a second vehicle position. The two captured images are evaluated in an image processing and evaluation unit.
According to the invention, a change in the position of the vehicle between the first vehicle position and the second vehicle position is determined by computing an optical flow from at least one characteristic point depicted in both images and/or by means of sensors that measure the path traveled by the vehicle. Using the determined change in position, the positions and dimensions of objects depicted in the images are then determined in the surroundings of the vehicle, and a three-dimensional map of the captured surroundings with a height profile is generated from them.
By evaluating image information captured in this way, a driver can be warned of obstacles that could damage the vehicle. Cameras already present in the vehicle can be used for this purpose. The method also makes it possible to optimize automatic or semi-automatic parking systems, since the surroundings of the vehicle are captured much more accurately. In particular, the distance between the vehicle tires and a curb can be reduced significantly in automatic or semi-automatic parking systems by using the method, without increasing the risk of damage to the vehicle. Combining the method with further distance sensors arranged on the vehicle, such as ultrasonic sensors, increases reliability further, since physically induced detection inaccuracies can be compensated.
Exemplary embodiments of the invention are explained in more detail below with reference to drawings.
In the drawings:
Fig. 1a shows a vehicle equipped with a camera in a first vehicle position,
Fig. 1b shows a vehicle equipped with a camera in a second vehicle position,
Fig. 2a shows a first image of an object captured by the camera,
Fig. 2b shows a second image of an object captured by the camera, and
Fig. 3 shows the evaluation of the two images in an image processing and evaluation unit.
Corresponding parts are provided with the same reference signs in all figures.
Figures 1a and 1b show a vehicle F equipped with a camera K in a first vehicle position FP1 and a second vehicle position FP2 in front of an object O. The camera K, with which the surroundings of the vehicle F can be monitored, is a reversing camera in the exemplary embodiment shown here. Such cameras K, as well as further cameras K at other positions on the vehicle F, for example in the front area of the vehicle F or in its exterior mirrors, are already in widespread use. Additional cameras K therefore do not necessarily have to be installed to carry out the method; cameras K already installed on the vehicle F can be used, as a result of which the method can be implemented cost-effectively in vehicles F. In addition, the vehicle F shown has distance sensors A, for example ultrasonic sensors in a rear bumper of the vehicle F.
In the prior art, the currently captured image of the reversing camera is displayed on an optical output unit. The dimensions of the objects O shown are, however, very difficult to estimate from this image alone. In addition, optical, acoustic and/or haptic warnings can be issued for objects O which the vehicle F approaches and which are detected by the distance sensors A. Distance sensors A based on ultrasound, however, have a limited detection range E, so that, for example, when approaching the object O shown here, which represents a curb, detection of the object O by the distance sensors A fails once the distance falls below a certain value. When the vehicle F is in the first vehicle position FP1, as shown in Figure 1a, a warning is still issued, since the object O is still within the detection range E of the distance sensors. On further approach, the object O disappears from the detection range E of the distance sensors A, for example when the vehicle F is in the second vehicle position FP2 as shown in Figure 1b, so that the warning ceases. The driver therefore cannot judge how high the curb is, i.e. whether it can be driven over without damaging the vehicle F.
By means of the method for recognizing objects O in the surroundings of the vehicle F, the positions and dimensions of objects O depicted in the images B1, B2 captured by the camera K are determined. In this way a three-dimensional map with a height profile of the captured surroundings of the vehicle F can be generated, evaluated internally and/or output on the optical output unit, so that when the vehicle F approaches objects O that deviate from predetermined shapes and/or dimensions, an optical, acoustic and/or haptic warning is issued. In the example shown here, the object O is the curb.
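One way to picture such a map with a height profile is as a vehicle-centred grid that stores the maximum observed height per cell. The following Python sketch illustrates this under assumptions not taken from the patent: grid size, resolution, the coordinate frame, and the triangulated_points input from the image evaluation are all illustrative.

    import numpy as np

    # Height-profile map: a vehicle-centred 2D grid storing the maximum
    # observed height per cell (assumed: 10 m x 10 m at 5 cm resolution).
    RESOLUTION = 0.05                      # metres per cell
    SIZE = int(10.0 / RESOLUTION)          # 200 x 200 cells
    height_map = np.full((SIZE, SIZE), np.nan)

    def insert_points(triangulated_points):
        """Accumulate 3D points (x, y in the driving plane FE, z = height
        above it) produced by the image evaluation into the height map."""
        for x, y, z in triangulated_points:
            i = int((x + 5.0) / RESOLUTION)
            j = int((y + 5.0) / RESOLUTION)
            if 0 <= i < SIZE and 0 <= j < SIZE:
                # keep the highest point seen in each cell, ignoring NaN
                height_map[i, j] = np.nanmax([height_map[i, j], z])
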
By means of the camera K, a first image B1 of the surroundings of the vehicle F is captured at the first vehicle position FP1 shown in Figure 1a, as shown in Figure 2a. In it, the object O(FP1) is depicted from the first vehicle position FP1. At a later time, after the vehicle F has moved on, thereby changing its vehicle position FP1 and consequently also the position of the camera K fixed to the vehicle F, and the vehicle F is in the second vehicle position FP2 shown in Figure 1b, a second image B2 of the surroundings of the vehicle F is captured by means of the camera K, as shown in Figure 2b. In it, the object O(FP2) is depicted from the second vehicle position FP2. The two captured images B1, B2 are evaluated in an image processing and evaluation unit, as shown in Figure 3.
In order to create a three-dimensional map including a height profile of the surroundings of the vehicle F from the two captured images B1, B2, the change in the position of the vehicle F between the first vehicle position FP1 and the second vehicle position FP2 must first be determined. This change in position, i.e. both a direction and a distance traveled, can be determined, for example, by means of sensors that measure the path traveled by the vehicle F and that are already installed in the vehicle F as part of driver assistance systems such as an electronic stability program. Such sensors determine, for example, wheel speeds, the vehicle speed, the steering wheel angle, the yaw angle, the yaw rate, and the longitudinal and/or lateral acceleration of the vehicle F, from which a change in the position of the vehicle F, and thus of the camera K, can be determined.
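The patent does not fix an odometry model; a common planar dead-reckoning update from vehicle speed and yaw rate, as such sensor clusters provide, could look like the following sketch (the unicycle model and the sample values are illustrative assumptions):

    import math

    def dead_reckon(x, y, heading, v, yaw_rate, dt):
        """One planar dead-reckoning step (illustrative unicycle model):
        v [m/s] from the wheel speeds, yaw_rate [rad/s] from the ESP
        sensor cluster, dt [s] = time between images B1 and B2."""
        heading += yaw_rate * dt
        x += v * dt * math.cos(heading)
        y += v * dt * math.sin(heading)
        return x, y, heading

    # Pose change of the camera K between the two images, e.g. while
    # reversing at 0.5 m/s with a slight steering input (sample values);
    # starting from (0, 0, 0), the result equals the position change.
    dx, dy, dheading = dead_reckon(0.0, 0.0, 0.0, v=-0.5, yaw_rate=0.05, dt=0.5)
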
A further possibility, which can be used alternatively or in addition, for example for reasons of redundancy, is to determine the change in position by image evaluation of the two images B1, B2 captured by the camera K in the image processing and evaluation unit and, as likewise shown in Figure 3, by the resulting determination of an optical flow of at least one characteristic point P, which is depicted in the first image B1 as characteristic point P(FP1) from the first vehicle position FP1 and in the second image B2 as characteristic point P(FP2) from the second vehicle position FP2. In the example shown here, such a characteristic point P is, for example, an upper corner point of the curb.
By comparing the two images B1, B2, a motion vector BV can be drawn between the characteristic point P(FP1) depicted in the first image B1 and the corresponding characteristic point P(FP2) depicted in the second image B2. The displacement of the characteristic point P(FP1), P(FP2) is caused by the displacement of the imaging area of the camera K, which is fixed to the vehicle F. That is, the displacement represented by the motion vector BV between the depicted corresponding characteristic points P(FP1), P(FP2) corresponds to the change in position of the vehicle F. To achieve sufficient accuracy of the vehicle position determined in this way, the optical flow should advantageously be determined from a plurality of such characteristic points P.
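In practice, such motion vectors BV between corresponding characteristic points can be obtained with sparse optical flow. The sketch below uses OpenCV's pyramidal Lucas-Kanade tracker as one possible implementation; the library choice and the parameter values are assumptions, since the patent names no specific algorithm:

    import cv2

    def motion_vectors(img_b1_gray, img_b2_gray):
        """Track characteristic points (e.g. curb corners) from image B1
        into image B2 and return one motion vector BV per tracked point."""
        # detect characteristic points P(FP1) in the first image
        p_fp1 = cv2.goodFeaturesToTrack(img_b1_gray, maxCorners=200,
                                        qualityLevel=0.01, minDistance=7)
        # track them into the second image (pyramidal Lucas-Kanade)
        p_fp2, status, _err = cv2.calcOpticalFlowPyrLK(img_b1_gray,
                                                       img_b2_gray,
                                                       p_fp1, None)
        found = status.ravel() == 1
        return p_fp2[found] - p_fp1[found]   # motion vectors BV
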
Once the change in position of the vehicle F, and thus of the camera K, between the two captured images B1, B2 has been determined, this change in position is used in the image processing and evaluation unit as the basis for the image evaluation of the two images B1, B2 and for creating the three-dimensional map from them. Since the two images B1, B2 were captured at different positions FP1, FP2 of the camera K, and thus from different perspectives, the dimensions and proportions of the objects O(FP1), O(FP2) depicted in the images B1, B2 can be compared and, taking the determined change in position as a basis, the real dimensions, shapes and positions of the detected objects O can be derived from this comparison. The three-dimensional map and the height profile of the captured surroundings of the vehicle F can thus be created.
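Recovering real positions from two views whose relative pose is known amounts to two-view triangulation. A minimal sketch, assuming calibrated camera intrinsics and the camera motion (R, t) taken from the determined change in position; the patent does not detail this step:

    import numpy as np
    import cv2

    def triangulate(intrinsics, R, t, pts_b1, pts_b2):
        """Triangulate corresponding image points (Nx2 arrays) from B1 and
        B2 into 3D. intrinsics: 3x3 camera matrix; (R, t): camera motion
        between FP1 and FP2 from odometry and/or optical flow."""
        P1 = intrinsics @ np.hstack((np.eye(3), np.zeros((3, 1))))  # camera at FP1
        P2 = intrinsics @ np.hstack((R, t.reshape(3, 1)))           # camera at FP2
        pts_h = cv2.triangulatePoints(P1, P2, pts_b1.T, pts_b2.T)
        return (pts_h[:3] / pts_h[3]).T   # metric 3D points, e.g. curb corners
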
In this way, in the example shown here, the height and shape of the object O in the surroundings of the vehicle F, i.e. of the curb, are determined. The curb shown has a height of, for example, 8 cm and drops off vertically towards the driving plane FE of the vehicle F. The detected object O therefore constitutes an obstacle that cannot be driven over without a risk of damaging the vehicle F. When the vehicle approaches the object O, i.e. the curb, an optical, acoustic and/or haptic warning is therefore generated. This can be, for example, a warning tone, a vibration warning in the driver's seat, the successive lighting-up of a number of lamps as the vehicle comes closer and/or, for example, a colored marking of the object O in the camera image displayed on the optical output unit.
If the detected object O were, for example, a lowered curb, the method would determine a lower, safely traversable height of the object O, so that no warning is issued. If the object O were, for example, a beveled curb, the method would determine that the curb edge slopes obliquely towards the driving plane FE of the vehicle F, i.e. has a gradient of, for example, 45°, which the vehicle F can drive over safely despite the determined curb height, without damaging, for example, the tires or rims of the vehicle F. In this case, too, no warning would be generated.
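The traversability decision described here reduces to comparing the determined height and slope against thresholds. A minimal sketch follows; the 6 cm and 50° thresholds are illustrative assumptions, as the patent only gives 8 cm and 45° as examples:

    def curb_warning_needed(height_m, slope_deg,
                            max_height_m=0.06, max_slope_deg=50.0):
        """Warn only for obstacles that are neither low enough (lowered
        curb) nor gently sloped enough (beveled curb) to drive over;
        thresholds are illustrative assumptions."""
        if height_m <= max_height_m:
            return False     # lowered curb: safely traversable
        if slope_deg <= max_slope_deg:
            return False     # beveled curb, e.g. 45 degrees: traversable
        return True          # e.g. vertical 8 cm curb: issue a warning

    curb_warning_needed(0.08, 90.0)   # True  - vertical curb
    curb_warning_needed(0.08, 45.0)   # False - beveled curb
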
In a further embodiment, this determination of the positions and dimensions of objects O in the surroundings of the vehicle F is combined with sensor data from the distance sensors A, so that redundancy can be achieved, physically induced inaccuracies, such as the non-detection of objects O outside the detection range E of the distance sensors A, can be compensated, and/or such a combination enables complete detection of the surroundings all around the vehicle F.
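The patent leaves the fusion scheme open; one naive possibility is to prefer the ultrasonic reading inside its detection range E and fall back to the camera-derived distance outside it, as sketched below (the range limits and the equal weighting are assumptions):

    def fused_distance(camera_dist, ultrasonic_dist, e_min=0.25, e_max=4.0):
        """Naive redundancy sketch: trust the ultrasonic sensor A inside
        its detection range E, fall back to the camera-derived distance
        outside it, and average where both measurements are available."""
        in_range = (ultrasonic_dist is not None
                    and e_min <= ultrasonic_dist <= e_max)
        if in_range and camera_dist is not None:
            return 0.5 * (camera_dist + ultrasonic_dist)
        if in_range:
            return ultrasonic_dist
        return camera_dist   # e.g. a curb closer than E: camera only
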
In a further embodiment, the determined three-dimensional map with the determined dimensions and positions of the objects O in the surroundings of the vehicle F, preferably combined with the sensor data of the distance sensors A, can also be used for a semi-automatic or fully automatic parking operation of the vehicle F, in that a control unit actuates a drive train, a steering system and/or a braking system of the vehicle F on the basis of the determined surroundings data. Such a semi-automatic or fully automatic parking operation can be carried out much more precisely by using surroundings data determined by means of the method.
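As an illustration of how the map could feed such a parking function, a clearance check over the map cells that a planned path would cross might look like this (the planner interface and the threshold are assumptions, reusing the grid sketch above):

    def path_is_clear(path_cell_heights, max_traversable_m=0.06):
        """Check the height profile along a planned parking path: every
        observed cell the tires would cross must be traversable (NaN
        cells, i.e. unobserved ones, are skipped here for simplicity)."""
        return all(h <= max_traversable_m
                   for h in path_cell_heights
                   if h == h)   # h == h is False for NaN
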
List of reference signs
A distance sensors
B1, B2 images
BV motion vector
E detection range
F vehicle
FE driving plane
FP1, FP2 vehicle positions
K camera
O, O(FP1), O(FP2) object
P, P(FP1), P(FP2) characteristic point

Claims

1. A method for recognizing objects (O) in the surroundings of a vehicle (F), wherein a first image (B1) of the surroundings of the vehicle (F) is captured at a first vehicle position (FP1) by means of at least one camera (K) arranged on the vehicle (F), a second image (B2) of the surroundings of the vehicle (F) is captured at a second vehicle position (FP2), and the two captured images (B1, B2) are evaluated in an image processing and evaluation unit, characterized in that a change in the position of the vehicle (F) between the first vehicle position (FP1) and the second vehicle position (FP2) is determined by determining an optical flow on the basis of at least one characteristic point (P(FP1), P(FP2)) depicted in both images (B1, B2) and/or by means of sensors for determining a path traveled by the vehicle (F), and in that, using the determined change in position, positions and dimensions of objects (O(FP1), O(FP2)) depicted in the images (B1, B2) are determined in the surroundings of the vehicle (F), and a three-dimensional map of the captured surroundings of the vehicle (F) with a height profile is generated therefrom.
2. The method according to claim 1, characterized in that, when the vehicle (F) approaches objects (O) that deviate from predetermined shapes and/or dimensions, an optical, acoustic and/or haptic warning is issued.
3. The method according to claim 1 or 2, characterized in that the map of the surroundings is displayed on an optical output unit in the vehicle (F).
4. The method according to one of claims 1 to 3, characterized in that the determined positions and dimensions of the objects (O) in the surroundings of the vehicle (F) are used for a semi-automatic or fully automatic parking operation.
5. The method according to one of claims 1 to 4, characterized in that the positions and dimensions of the objects (O) determined by means of the at least one camera (K) are combined with sensor data from distance sensors (A).
6. A device for carrying out the method according to one of claims 1 to 5, comprising at least one camera (K) and an image processing and evaluation unit.
7. The device according to claim 6, comprising distance sensors (A).
8. The device according to claim 6 or 7, comprising sensors for determining a path traveled by the vehicle (F).
9. The device according to one of claims 6 to 8, comprising optical, acoustic and/or haptic output units.
10. The device according to one of claims 6 to 9, comprising a control unit for actuating a drive train, a steering system and/or a braking system of the vehicle (F).
PCT/EP2010/002097 2009-04-06 2010-04-01 Method and apparatus for recognizing objects WO2010115580A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102009016562A DE102009016562A1 (en) 2009-04-06 2009-04-06 Method for identifying objects in periphery of vehicle, involves recording image of periphery of vehicle at vehicle position, and another image of periphery of vehicle at another vehicle position by camera arranged on vehicle
DE102009016562.2 2009-04-06

Publications (1)

Publication Number Publication Date
WO2010115580A1

Family

ID=41180610

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/002097 WO2010115580A1 (en) 2009-04-06 2010-04-01 Method and apparatus for recognizing objects

Country Status (2)

Country Link
DE (1) DE102009016562A1 (en)
WO (1) WO2010115580A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10565714B2 (en) 2018-05-25 2020-02-18 Denso Corporation Feature tracking for visual odometry

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009046158A1 (en) * 2009-10-29 2011-05-05 Robert Bosch Gmbh Method for detecting objects with low height
DE102010013093A1 (en) * 2010-03-29 2011-09-29 Volkswagen Ag Method for creating model of surrounding area of motor vehicle i.e. car, involves determining whether card cells are loaded with object represented by three- dimensional structures
DE102010013815A1 (en) * 2010-04-03 2011-10-06 Volkswagen Aktiengesellschaft Method for determining and tracking position of objects in environment of motor car, involves detecting spacing between feature points related to object, and determining object distance from image based on measurement of size of object
DE102010031040A1 (en) * 2010-07-07 2012-01-12 Robert Bosch Gmbh Method for assisting driver of motor car during driving maneuver, involves presenting non-detected objects or non-detected parts of objects different to detected objects or detected parts of objects
DE102010063742A1 (en) * 2010-12-21 2012-06-21 Deniz Yilmaz motor vehicle
DE102011014699B4 (en) * 2011-03-22 2015-10-29 Audi Ag Method for operating a driver assistance system for protecting a motor vehicle against damage and motor vehicle
DE102011077555A1 (en) * 2011-06-15 2012-12-20 Robert Bosch Gmbh Retrofit kit for park guidance
DE102011108468A1 (en) 2011-07-23 2013-01-24 Volkswagen Aktiengesellschaft Method of generating three-dimensional environment information of motor vehicle, involves increasing occupancy probability of voxels of voxel based environment map, such that voxels are contacted or cut lines of sight
DE102011109569A1 (en) 2011-08-05 2013-02-07 Conti Temic Microelectronic Gmbh Lane detection method using a camera
DE102011087894A1 (en) * 2011-12-07 2013-06-13 Robert Bosch Gmbh Method and vehicle assistance system for active warning and / or navigation aid for avoiding a collision of a vehicle body part and / or a vehicle wheel with an object
DE102013104256A1 (en) 2013-04-26 2014-10-30 Conti Temic Microelectronic Gmbh Method and device for estimating the number of lanes
DE102014006547A1 (en) * 2014-05-06 2015-11-12 Audi Ag Driver assistance system for a motor vehicle and method for issuing a warning
DE102014019078A1 (en) 2014-12-18 2015-06-18 Daimler Ag Calibration method and method for adjusting a camera mounted on a vehicle
DE102014226439A1 (en) * 2014-12-18 2016-06-23 Conti Temic Microelectronic Gmbh Driver assistance system
DE102015000425A1 (en) * 2015-01-13 2016-07-14 Audi Ag Method for operating a driver assistance system of a motor vehicle and motor vehicle
DE102016111079A1 (en) * 2016-06-17 2017-12-21 Valeo Schalter Und Sensoren Gmbh Method for object height detection of an object in the environment of a motor vehicle and driver assistance system
DE102021209575B3 (en) 2021-08-31 2023-01-12 Volkswagen Aktiengesellschaft Method and assistance device for supporting vehicle functions in a parking space and motor vehicle
DE102022126322A1 (en) 2022-10-11 2024-04-11 Valeo Schalter Und Sensoren Gmbh Method for operating a driver assistance system for a motor vehicle, driver assistance system for a motor vehicle and motor vehicle with a driver assistance system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1030188A1 (en) * 1999-02-16 2000-08-23 Mitsubishi Denki Kabushiki Kaisha Situation awareness system
US20020005778A1 (en) 2000-05-08 2002-01-17 Breed David S. Vehicular blind spot identification and monitoring system
US20060055776A1 (en) 2003-10-17 2006-03-16 Matsushita Electric Industrial Co., Ltd. Mobile unit motion calculating method, apparatus and navigation system
DE102004046101A1 (en) * 2004-09-23 2006-09-21 Daimlerchrysler Ag Method for early detection of motor vehicle collision involves estimation of collision danger on the basis of determined time to collision and introducing collision reducing measures with exceeding of predetermined collision danger
US20070206833A1 (en) 2006-03-02 2007-09-06 Hitachi, Ltd. Obstacle detection system
US20070285217A1 (en) 2006-04-27 2007-12-13 Denso Corporation Field recognition apparatus, method for field recognition and program for the same

Also Published As

Publication number Publication date
DE102009016562A1 (en) 2009-11-19

Similar Documents

Publication Publication Date Title
WO2010115580A1 (en) Method and apparatus for recognizing objects
DE102009006335B4 (en) Method for assisting the driver of a motor vehicle
EP1928687B1 (en) Method and driver assistance system for sensor-based driving off control of a motor vehicle
EP2888604B1 (en) Method for determining the course of a lane for a vehicle
EP3356203B1 (en) Method for determining a parking surface for parking a motor vehicle, driver assistance system, and motor vehicle
DE102013012324A1 (en) Method and device for finding a route
WO2006087002A1 (en) Device for bringing a motor vehicle to a target position
DE102005015463A1 (en) Vehicle with distance control system
WO2016020347A1 (en) Method for detecting an object in a surrounding region of a motor vehicle using an ultrasonic sensor, driver assistance system, and motor vehicle
EP3687881B1 (en) Method for parking a motor vehicle kerbside, apparatus, and motor vehicle
EP2788968A1 (en) Method and vehicle assistance system for active warning and/or for navigation aid for preventing a collision of a vehicle body part and/or a vehicle wheel with an object
DE102016117712A1 (en) Method for at least semi-autonomous maneuvering of a motor vehicle taking into account a detection range of a sensor, driver assistance system and motor vehicle
DE102021002377A1 (en) Process for predictive, camera-based parking lot detection and vehicle
DE102010049216A1 (en) Method for operating camera i.e. stereo camera, arranged at car, involves detecting distance of object via image evaluation, and calibrating camera when distance determined by sensing unit deviates from distance determined by evaluation
DE102010013093A1 (en) Method for creating model of surrounding area of motor vehicle i.e. car, involves determining whether card cells are loaded with object represented by three- dimensional structures
EP2579228A1 (en) Method and system for digital imaging of the vicinity of a vehicle
DE102011080720A1 (en) Method for predictive monitoring of track in driving assistance system of vehicle, involves detecting environmental data concerning to track section by environment sensor system of vehicle before detecting gradient change
EP3520020B1 (en) Road sign classification in a surrounding area of a motor vehicle
DE102015117904A1 (en) Method for determining an area occupied by a trailer connected to a motor vehicle, driver assistance system and motor vehicle
WO2019137864A1 (en) Method for preventing a critical situation for a motor vehicle, wherein a distance between a motor vehicle contour and an object contour is determined, driver assistance system and motor vehicle
DE102016109850B4 (en) Method for detecting an inclination in a roadway of a motor vehicle, driver assistance system and motor vehicle
EP3178727B1 (en) Method for detecting a longitudinal parking space for parking a motor vehicle on the basis of a road marking, driver assistance system and motor vehicle
DE102019006243A1 (en) Method for operating a turning assistance system, turning assistance system and motor vehicle with such a turning assistance system
DE102015005999B3 (en) Method for changing a position of an exterior mirror of a motor vehicle and motor vehicle
WO2013178224A1 (en) Method and device for detecting objects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10716473

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10716473

Country of ref document: EP

Kind code of ref document: A1