WO2010115580A1 - Procédé et dispositif de reconnaissance d'objets (Method and device for object recognition) - Google Patents
- Publication number
- WO2010115580A1 (application PCT/EP2010/002097)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- environment
- determined
- objects
- camera
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/86—Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93272—Sensor installation details in the back of the vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the invention relates to a method for detecting objects in an environment of a vehicle according to the features of the preamble of claim 1 and an apparatus for performing the method.
- An arrangement for determining information about objects in an environment of a vehicle comprises one or more vehicle-arranged light-emitting components which emit infrared light into the surroundings of the vehicle and receivers arranged on the vehicle for reception of infrared light from the surroundings of the vehicle.
- the information about the objects is determined by an evaluation of the received infrared light by means of a processor. For example, a distance between the vehicle and the object and a speed of the object are determined and the object is identified.
- pattern recognition techniques are used to obtain the desired information.
- On the basis of detected objects and their position and speed, functions of the vehicle can be influenced, for example an audible or visual warning device or a steering wheel control device.
- US 2006/0055776 A1 describes a method and a device for motion detection of a mobile unit and a navigation system.
- A first unit determines corresponding points between images captured by a camera.
- A first motion-determination unit determines a first movement of a mobile body using the corresponding points, assuming a given plane in the images.
- A second motion-determination unit determines a second movement using the first movement and the corresponding points.
- A bird's-eye view of a first image showing a road surface is created. This first image was captured by a camera arranged on a vehicle. Thereafter, a bird's-eye view of a second image is generated, which is acquired at a time different from that of the first image.
- The two bird's-eye views are aligned by a characteristic feature on the road surface. In the overlap area of the two bird's-eye views, regions that differ between the two representations are identified as obstacles.
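The differencing step of this prior-art approach can be illustrated with a minimal sketch. The function name, array layout and threshold are assumptions for illustration; the two inputs are taken to be already warped to bird's-eye views, aligned only up to the shift of the road-surface feature:

```python
import numpy as np

def obstacle_mask(birdseye1, birdseye2, shift_px, threshold=30):
    """Compare two bird's-eye views aligned by a road-surface feature.

    birdseye1, birdseye2: 2D grayscale arrays (already warped to bird's-eye).
    shift_px: (rows, cols) displacement of the road feature between the views.
    Ground-plane texture aligns after the shift; anything with height does not,
    so large differences in the overlap region mark obstacles.
    """
    dr, dc = shift_px
    h, w = birdseye1.shape
    # Overlap region: pixel a[r, c] of view 1 corresponds to b[r, c] of view 2
    a = birdseye1[max(dr, 0):h + min(dr, 0), max(dc, 0):w + min(dc, 0)]
    b = birdseye2[max(-dr, 0):h + min(-dr, 0), max(-dc, 0):w + min(-dc, 0)]
    diff = np.abs(a.astype(int) - b.astype(int))
    return diff > threshold
```

Regions where the mask is true differ between the two aligned views and are flagged as obstacle candidates.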
- US 2007/0285217 A1 describes an environment detection device, a method for environment detection and a program sequence for environment detection.
- An environment detecting device includes a first camera for detecting a forward area and a second camera for detecting a road surface. The optical axis of the second camera is inclined downwards to detect a characteristic point in consecutively acquired images, to determine an optical flow therefrom and to detect structural information of the road. Thus, three-dimensional information about obstacles in an environment of the vehicle can be determined.
- The invention has the object of providing an improved method for detecting objects in an environment of a vehicle and an apparatus for performing the method.
- the object is achieved by a method for detecting objects in an environment of a vehicle having the features of claim 1.
- the object is achieved by the features specified in claim 6.
- Preferred embodiments and further developments of the invention are specified in the dependent claims.
- A first image of the surroundings of the vehicle is captured by means of at least one camera arranged on the vehicle at a first vehicle position, and a second image of the surroundings of the vehicle is captured at a second vehicle position.
- the two acquired images are evaluated in an image processing and evaluation unit.
- A change in position of the vehicle between the first vehicle position and the second vehicle position is determined by determining an optical flow based on at least one characteristic point shown in both images and/or by means of sensors for determining a travel path of the vehicle.
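As an illustration of the optical-flow branch, a minimal sketch, assuming a calibrated, ground-plane-rectified image with a known metres-per-pixel scale; the function name and scale parameter are hypothetical, not part of the claimed method:

```python
def position_change_from_flow(p1, p2, metres_per_pixel):
    """Estimate the vehicle's position change between two images from the
    optical flow of one characteristic ground-plane point.

    p1, p2: pixel coordinates (x, y) of the same characteristic point in the
            first and second image (e.g. a curb corner).
    metres_per_pixel: ground-plane scale of the (assumed calibrated,
            bird's-eye-rectified) camera image.
    Returns (dx, dy) vehicle travel in metres.
    """
    flow_x = p2[0] - p1[0]
    flow_y = p2[1] - p1[1]
    # The scene point is static; for a rigidly mounted camera, the camera
    # (and thus the vehicle) moved opposite to the observed flow.
    return (-flow_x * metres_per_pixel, -flow_y * metres_per_pixel)
```

For example, a point that shifts 40 pixels down the image at a scale of 1 cm/pixel implies the vehicle moved 0.4 m towards it.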
- According to the invention, positions and dimensions of objects shown in the images in the surroundings of the vehicle are furthermore determined, from which a three-dimensional environment map of the detected environment of the vehicle with a height profile is generated.
- the method makes it possible to optimize automatic or semi-automatic parking systems, since an environment of the vehicle is detected much more accurately.
- a distance from vehicle tires to a curb can be significantly reduced in automatic or semi-automatic parking systems by using the method, without increasing the risk of damage to the vehicle.
- a combination with further distance sensors arranged on the vehicle, such as ultrasound sensors, for example, further increases reliability, since physically induced detection inaccuracies can be compensated.
- Fig. 1a a vehicle equipped with a camera in a first vehicle position,
- Fig. 1b a vehicle equipped with a camera in a second vehicle position,
- Fig. 2a a first image captured at the first vehicle position,
- Fig. 2b a second image captured at the second vehicle position,
- Fig. 3 an evaluation of the two images in an image processing and evaluation unit.
- FIGS. 1a and 1b show a vehicle F equipped with a camera K in a first vehicle position FP1 and a second vehicle position FP2 in front of an object O.
- The camera K, with which an environment of the vehicle F can be monitored, is a reversing camera in the exemplary embodiment illustrated here.
- Such cameras K as well as other cameras K at other positions of the vehicle F, for example in a front area of the vehicle F or in outside mirrors of the vehicle F, are already in widespread use.
- Additional cameras K do not necessarily have to be installed; cameras K already installed on the vehicle F can be used, as a result of which the method can be implemented cost-effectively in vehicles F.
- the illustrated vehicle F has distance sensors A, for example, ultrasonic sensors in a rear bumper of the vehicle F.
- a currently acquired image of the reversing camera is displayed on an optical output unit.
- dimensions of objects O shown are very difficult to estimate.
- Distance sensors A, for example ultrasound-based sensors, have a limited detection range E. When approaching the object O shown here, which represents a curb, detection of the object O by the distance sensors A therefore fails below a certain distance. If the vehicle F is in the first vehicle position FP1, as shown in Fig. 1a, a warning is initially issued, since the object O is still within the detection range E of the distance sensors A.
- The object O disappears from the detection range E of the distance sensors A, for example, as shown in Fig. 1b, when the vehicle F is in the second vehicle position FP2, so that the warning ceases.
- A vehicle driver therefore cannot judge how high the curb is, i.e. whether it can be driven over without damaging the vehicle F.
- Positions and dimensions of objects O in the environment of the vehicle F that are represented in images B1, B2 captured by the camera K are determined.
- From these, a three-dimensional environment map with a height profile of the detected environment of the vehicle F can be generated and evaluated internally and/or output on the optical output unit, so that on an approach of the vehicle F to objects O that deviate from predetermined shapes and/or dimensions, a visual, audible and/or haptic warning is issued.
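A height profile of this kind can be sketched as a 2D grid storing the maximum object height per cell. Cell size, extent and the 5 cm traversability threshold are illustrative assumptions, not values from the description:

```python
import numpy as np

def height_profile_map(points, cell=0.25, extent=10.0):
    """Accumulate 3D points (x, y, z in metres, vehicle-centred) into a 2D
    grid that stores the maximum height per cell: a simple height profile."""
    n = int(2 * extent / cell)
    grid = np.zeros((n, n))
    for x, y, z in points:
        i = int((x + extent) / cell)
        j = int((y + extent) / cell)
        if 0 <= i < n and 0 <= j < n:
            grid[i, j] = max(grid[i, j], z)
    return grid

def cells_needing_warning(grid, max_traversable=0.05):
    """Indices of cells whose height exceeds a safely traversable height."""
    return np.argwhere(grid > max_traversable)
```

Cells above the threshold would trigger the visual, audible and/or haptic warning on approach; lower cells are ignored.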
- the object O is the curb.
- A first image B1 of the environment of the vehicle F is captured at the first vehicle position FP1 shown in Fig. 1a, as shown in Fig. 2a.
- the object O (FP1) is shown from the first vehicle position FP1.
- At the second vehicle position FP2, a second image B2 of the environment of the vehicle F is captured by means of the camera K, as shown in Fig. 2b.
- the object O (FP2) is shown from the second vehicle position FP2.
- the two captured images B1, B2 are evaluated in an image processing and evaluation unit, as shown in FIG.
- a change in the position of the vehicle F between the first vehicle position FP1 and the second vehicle position FP2 must first be determined.
- The position change, i.e. both a direction and a distance travelled, can be determined, for example, by means of sensors for determining a travel path of the vehicle F, which are already installed in the vehicle F according to the prior art as part of driver assistance systems such as an electronic stability program.
- Wheel speeds, a speed, a steering wheel angle, a yaw angle, a yaw rate and/or a longitudinal and/or lateral acceleration of the vehicle F are determined, for example, so that a change in position of the vehicle F, and thus of the camera K, can be determined therefrom.
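The travel-path determination from such sensor data can be sketched as simple dead reckoning; the function and the Euler integration step are an illustrative assumption, not the patented procedure itself:

```python
import math

def dead_reckoning(pose, speed, yaw_rate, dt):
    """Update the vehicle pose (x, y, heading) from odometry.

    speed: vehicle speed in m/s (e.g. from wheel speeds),
    yaw_rate: in rad/s, dt: time step in s.
    Simple Euler integration of the travel path, using signals that
    ESP-type driver assistance systems already provide.
    """
    x, y, heading = pose
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += yaw_rate * dt
    return (x, y, heading)
```

Integrating these increments between the two image captures yields the direction and distance travelled, and hence the camera's position change.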
- a characteristic point P is, for example, an upper corner point of the curb.
- A motion vector BV can be displayed between the characteristic point P(FP1) shown in the first image B1 and the corresponding characteristic point P(FP2) shown in the second image B2.
- A displacement of the characteristic points P(FP1), P(FP2) is caused by the displacement of the imaging area of the camera K, which is fixedly arranged on the vehicle F. That is, the displacement represented by the motion vector BV between the illustrated corresponding characteristic points P(FP1), P(FP2) corresponds to the position change of the vehicle F.
- This position change is used in the image processing and evaluation unit for the image analysis of the two images B1, B2 and for the creation of the three-dimensional environment map from these two images B1, B2. Since the two images B1, B2 have been acquired at different positions FP1, FP2 of the camera K, and thus from different perspectives, dimensions and proportions of the objects O(FP1), O(FP2) shown in the images B1, B2 can be compared, and on the basis of the determined position change, real dimensions, shapes and positions of the detected objects O can be determined therefrom; thus the three-dimensional environment map and the height profile of the detected environment of the vehicle F can be created.
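One way to sketch the recovery of real dimensions from the two perspectives is the classic motion-stereo relation, treating the measured position change as a stereo baseline. This is a simplification: it assumes a pinhole camera and a translation roughly perpendicular to the viewing ray, as in a stereo rig; all parameter names are hypothetical:

```python
def motion_stereo_point(u1, u2, baseline_m, focal_px, v_px, cy_px, cam_height_m):
    """Recover a scene point's distance and height from two images taken at
    camera positions a known baseline apart (the measured position change).

    u1, u2: horizontal pixel coordinates of the characteristic point in the
            two images; their difference is the disparity.
    focal_px: focal length in pixels; v_px, cy_px: the point's image row and
            the principal-point row; cam_height_m: camera mounting height.
    """
    disparity = abs(u1 - u2)
    depth = focal_px * baseline_m / disparity        # classic stereo relation
    # Ray geometry: rows below the principal point look downwards, so the
    # point's height follows from depth, row offset and camera height.
    height_above_ground = cam_height_m - depth * (v_px - cy_px) / focal_px
    return depth, height_above_ground
```

Applying this to each matched characteristic point yields the 3D positions from which the environment map and height profile are assembled.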
- The curb edge shown has a height of, for example, 8 cm and drops vertically towards the driving plane FE of the vehicle F.
- The detected object O is an obstacle which cannot be driven over without a risk of damaging the vehicle F. Therefore, when approaching the object O, i.e. the curb, an optical, audible and/or haptic warning is generated.
- This may be, for example, a warning sound, a vibration warning in a driver's seat, a successive illumination of a plurality of lamps as the distance decreases and/or, for example, a colour marking of the object O on the camera image displayed on the optical output unit.
- If the detected object O is, for example, a lowered curb, a lower, safely traversable height of the object O is determined, so that no warning is issued. The same would apply if the object O were a bevelled curb.
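The traversability decision described here reduces to comparing a determined object height against a threshold; the 5 cm value below is an illustrative assumption, not a figure from the description:

```python
def classify_curb(height_m, max_traversable_m=0.05):
    """Decide whether a detected curb-like object triggers a warning.

    Heights at or below the threshold (e.g. a lowered or bevelled curb)
    are considered safely traversable, so no warning is issued.
    """
    return "warn" if height_m > max_traversable_m else "traversable"
```

An 8 cm vertical curb would thus warn, while a lowered curb of 3 cm would not.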
- This determination of positions and dimensions of objects O in the vicinity of the vehicle F is combined with sensor data of the distance sensors A, so that redundancy is achieved, physical inaccuracies such as the non-detection of objects O outside the detection range E of the distance sensors A can be compensated, and/or a complete detection of the environment around the vehicle F is made possible by such a combination.
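The combination with the ultrasonic distance sensors can be sketched as a simple fusion rule: inside the ultrasonic detection range the two measurements are redundant, outside it the camera estimate fills the blind zone. The range limits and the function itself are assumptions for illustration:

```python
def fused_obstacle_distance(camera_dist_m, ultrasonic_dist_m,
                            ultrasonic_range=(0.3, 2.5)):
    """Combine a camera-derived obstacle distance with an ultrasonic reading.

    Inside the ultrasonic detection range, both sensors see the obstacle and
    the more conservative (smaller) value is used; outside it (or with no
    echo), the camera estimate compensates for the ultrasonic blind zone.
    """
    lo, hi = ultrasonic_range
    if ultrasonic_dist_m is not None and lo <= ultrasonic_dist_m <= hi:
        return min(camera_dist_m, ultrasonic_dist_m)
    return camera_dist_m
```

For instance, once the curb drops below the 0.3 m ultrasonic minimum, the fused distance falls back to the camera-based estimate instead of losing the object.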
- The determined three-dimensional environment map with the determined dimensions and positions of the objects O in the environment of the vehicle F, preferably combined with the sensor data of the distance sensors A, can be used for a partially or fully automatic parking operation of the vehicle F, in that a drive train, a steering system and/or a braking system of the vehicle F are controlled by means of a control unit using the determined environment data.
- a partially or fully automatic parking process can be performed much more accurately by using environment data determined by means of the method.
Abstract
The present invention relates to a method for recognising objects (O) in the surroundings of a vehicle (F), in which a camera (K) is used to capture a first image (B1) of the surroundings of the vehicle (F) at a first vehicle position (FP1) and a second image (B2) of the surroundings of the vehicle (F) at a second vehicle position (FP2), after which the two captured images (B1, B2) are evaluated by means of an image processing and evaluation unit. According to the invention, a change in the position of the vehicle (F) between the first vehicle position (FP1) and the second vehicle position is determined by determining an optical flow on the basis of at least one characteristic point (P(FP1), P(FP2)) represented in both images (B1, B2) and/or by means of sensors capable of determining a travel path of the vehicle (F). Using the determined position changes, the positions and dimensions of the objects (O(FP1), O(FP2)) represented in the images (B1, B2) are then determined in the surroundings of the vehicle, from which a three-dimensional environment map, i.e. one with a height profile, of the captured surroundings of the vehicle (F) can be produced. The invention also relates to a device for carrying out the method.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102009016562A DE102009016562A1 (de) | 2009-04-06 | 2009-04-06 | Verfahren und Vorrichtung zur Objekterkennung |
DE102009016562.2 | 2009-04-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010115580A1 true WO2010115580A1 (fr) | 2010-10-14 |
Family
ID=41180610
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2010/002097 WO2010115580A1 (fr) | 2009-04-06 | 2010-04-01 | Procédé et dispositif de reconnaissance d'objets |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102009016562A1 (fr) |
WO (1) | WO2010115580A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10565714B2 (en) | 2018-05-25 | 2020-02-18 | Denso Corporation | Feature tracking for visual odometry |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102009046158A1 (de) | 2009-10-29 | 2011-05-05 | Robert Bosch Gmbh | Verfahren zur Erkennung von Objekten mit geringer Höhe |
DE102010013093A1 (de) * | 2010-03-29 | 2011-09-29 | Volkswagen Ag | Verfahren und System zur Erstellung eines Modells eines Umfelds eines Fahrzeugs |
DE102010013815A1 (de) * | 2010-04-03 | 2011-10-06 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zur Erkennung und Verfolgung von Objekten im Umfeld eines Kraftfahrzeugs |
DE102010031040A1 (de) * | 2010-07-07 | 2012-01-12 | Robert Bosch Gmbh | Verfahren zur Unterstützung eines Fahrers eines Kraftfahrzeugs |
DE102010063742A1 (de) * | 2010-12-21 | 2012-06-21 | Deniz Yilmaz | Kraftfahrzeug |
DE102011014699B4 (de) * | 2011-03-22 | 2015-10-29 | Audi Ag | Verfahren zum Betrieb eines Fahrerassistenzsystems zum Schutz eines Kraftfahrzeuges vor Beschädigungen und Kraftfahrzeug |
DE102011077555A1 (de) * | 2011-06-15 | 2012-12-20 | Robert Bosch Gmbh | Nachrüstsatz für Parkführung |
DE102011108468A1 (de) | 2011-07-23 | 2013-01-24 | Volkswagen Aktiengesellschaft | Verfahren und Vorrichtung zur Generierung von dreidimensionalen Umfeldinformationen |
DE102011109569A1 (de) | 2011-08-05 | 2013-02-07 | Conti Temic Microelectronic Gmbh | Verfahren zur Fahrspurerkennung mittels einer Kamera |
DE102011087894A1 (de) * | 2011-12-07 | 2013-06-13 | Robert Bosch Gmbh | Verfahren und Fahrzeugassistenzsystem zur aktiven Warnung und/oder zur Navigationshilfe zur Vermeidung einer Kollision eines Fahrzeugkarosserieteils und/oder eines Fahrzeugrades mit einem Objekt |
DE102013104256A1 (de) | 2013-04-26 | 2014-10-30 | Conti Temic Microelectronic Gmbh | Verfahren und Vorrichtung zur Schätzung der Anzahl von Fahrspuren |
DE102014006547A1 (de) * | 2014-05-06 | 2015-11-12 | Audi Ag | Fahrerassistenzsystem für ein Kraftfahrzeug und Verfahren zum Ausgeben einer Warnung |
DE102014019078A1 (de) | 2014-12-18 | 2015-06-18 | Daimler Ag | Verfahren zur Kalibrierung und Verfahren zur Justierung einer an einem Fahrzeug angeordneten Kamera |
DE102014226439A1 (de) * | 2014-12-18 | 2016-06-23 | Conti Temic Microelectronic Gmbh | Fahrerassistenzsystem |
DE102015000425A1 (de) * | 2015-01-13 | 2016-07-14 | Audi Ag | Verfahren zum Betrieb eines Fahrerassistenzsystems eines Kraftfahrzeugs und Kraftfahrzeug |
DE102016111079A1 (de) * | 2016-06-17 | 2017-12-21 | Valeo Schalter Und Sensoren Gmbh | Verfahren zur Objekthöhenerkennung eines Objektes in der Umgebung eines Kraftfahrzeugs sowie Fahrerassistenzsystem |
DE102021209575B3 (de) | 2021-08-31 | 2023-01-12 | Volkswagen Aktiengesellschaft | Verfahren und Assistenzeinrichtung zum Unterstützen von Fahrzeugfunktionen in einem Parkraum und Kraftfahrzeug |
DE102022126322A1 (de) | 2022-10-11 | 2024-04-11 | Valeo Schalter Und Sensoren Gmbh | Verfahren zum Betreiben eines Fahrerassistenzsystems für ein Kraftfahrzeug, Fahrerassistenzsystem für ein Kraftfahrzeug sowie Kraftfahrzeug mit einem Fahrerassistenzsystem |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1030188A1 (fr) * | 1999-02-16 | 2000-08-23 | Mitsubishi Denki Kabushiki Kaisha | Système de présentation de situations |
US20020005778A1 (en) | 2000-05-08 | 2002-01-17 | Breed David S. | Vehicular blind spot identification and monitoring system |
US20060055776A1 (en) | 2003-10-17 | 2006-03-16 | Matsushita Electric Industrial Co., Ltd. | Mobile unit motion calculating method, apparatus and navigation system |
DE102004046101A1 (de) * | 2004-09-23 | 2006-09-21 | Daimlerchrysler Ag | Verfahren, Sicherheitsvorrichtung und Verwendung der Sicherheitsvorrichtung zur Früherkennung von Kraftfahrzeugkollisionen |
US20070206833A1 (en) | 2006-03-02 | 2007-09-06 | Hitachi, Ltd. | Obstacle detection system |
US20070285217A1 (en) | 2006-04-27 | 2007-12-13 | Denso Corporation | Field recognition apparatus, method for field recognition and program for the same |
-
2009
- 2009-04-06 DE DE102009016562A patent/DE102009016562A1/de not_active Withdrawn
-
2010
- 2010-04-01 WO PCT/EP2010/002097 patent/WO2010115580A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
DE102009016562A1 (de) | 2009-11-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010115580A1 (fr) | Procédé et dispositif de reconnaissance d'objets | |
DE102009006335B4 (de) | Verfahren zur Unterstützung des Fahrers eines Kraftfahrzeugs | |
EP1928687B1 (fr) | Procede et systeme d'aide a la conduite pour la commande de demarrage d'un vehicule automobile basee sur un capteur | |
EP2888604B1 (fr) | Procédé de détermination du tracé de la voie d'un véhicule | |
EP3356203B1 (fr) | Procédé de détermination d'une zone de stationnement pour stationner un véhicule automobile, système d'assistance à la conduite et véhicule automobile | |
DE102013012324A1 (de) | Verfahren und Vorrichtung zur Fahrwegfindung | |
EP1848626A1 (fr) | Dispositif permettant d'amener un vehicule en un emplacement cible | |
DE102005015463A1 (de) | Fahrzeug mit Distanzkontrollsystem | |
EP3687881B1 (fr) | Procédé pour effectuer un stationnement en bord de trottoir d'un véhicule automobile, dispositif et véhicule automobile | |
EP2788968A1 (fr) | Procédé et système d'aide à la conduite destiné à émettre une alerte active et/ou à assister le guidage du véhicule pour éviter la collision d'une partie de carrosserie et/ou d'une roue de véhicule avec un objet | |
DE102016117712A1 (de) | Verfahren zum zumindest semi-autonomen Manövrieren eines Kraftfahrzeugs unter Berücksichtigung eines Erfassungsbereichs eines Sensors, Fahrerassistenzsystem sowie Kraftfahrzeug | |
DE102021002377A1 (de) | Verfahren zur vorausschauenden, kamerabasierten Parkplatzdetektion und Fahrzeug | |
DE102010049216A1 (de) | Verfahren zum Betrieb einer an einem Fahrzeug angeordneten Kamera | |
DE102010013093A1 (de) | Verfahren und System zur Erstellung eines Modells eines Umfelds eines Fahrzeugs | |
DE102011080720A1 (de) | Visualisierung einer Rampenabfahrt | |
EP3520020B1 (fr) | Classification de signaux routiers dans une zone environnante d'un véhicule automobile | |
EP3165420A1 (fr) | Procédé de détermination d'une zone occupée par une remorque attelée à un véhicule automobile, système d'assistance au conducteur et véhicule automobile | |
WO2019137864A1 (fr) | Procédé pour la prévention d'une situation critique pour un véhicule à moteur, dans lequel une distance entre un contour de véhicule à moteur et un contour d'objet est déterminée, système d'assistance au conducteur ainsi que véhicule | |
DE102016109850B4 (de) | Verfahren zum Erkennen einer Neigung in einer Fahrbahn eines Kraftfahrzeugs, Fahrerassistenzsystem sowie Kraftfahrzeug | |
EP3178727B1 (fr) | Procédé de reconnaissance de créneau pour stationnement d'un véhicule automobile à l'aide de marquage au sol, système d'assistance au conducteur et véhicule automobile | |
DE102019006243A1 (de) | Verfahren zum Betrieb eines Abbiegeassistenzsystems, Abbiegeassistenzsystem und Kraftfahrzeug mit einem derartigen Abbiegeassistenzsystem | |
DE102015005999B3 (de) | Verfahren zum Verändern einer Position eines Außenspiegels eines Kraftfahrzeugs und Kraftfahrzeug | |
EP2856205A1 (fr) | Procédé et dispositif de détection d'objets | |
DE102015016738A1 (de) | Vorrichtung und Verfahren zur Fahrspurerkennung | |
DE102020005136A1 (de) | Verfahren zum automatischen Steuern eines Fahrzeug-Anhänger-Gespanns mittels zwei Rückfahrkameras, sowie elektronisches Fahrzeuganhängerassistenzsystem |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10716473 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 10716473 Country of ref document: EP Kind code of ref document: A1 |