WO2008132130A1 - Lane detection with cameras having different focal lengths - Google Patents


Info

Publication number
WO2008132130A1
Authority
WO
WIPO (PCT)
Prior art keywords
image information
camera
lane
vehicle
focal length
Prior art date
Application number
PCT/EP2008/055001
Other languages
German (de)
English (en)
Inventor
Gregory Baratoff
Ludwig Ertl
Original Assignee
Continental Automotive Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Gmbh filed Critical Continental Automotive Gmbh
Publication of WO2008132130A1 publication Critical patent/WO2008132130A1/fr


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147Details of sensors, e.g. sensor lenses
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60TVEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
    • B60T2201/00Particular use of vehicle brake systems; Special systems using also the brakes; Special software modules within the brake system controller
    • B60T2201/08Lane monitoring; Lane Keeping Systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60TVEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
    • B60T2201/00Particular use of vehicle brake systems; Special systems using also the brakes; Special software modules within the brake system controller
    • B60T2201/08Lane monitoring; Lane Keeping Systems
    • B60T2201/089Lane monitoring; Lane Keeping Systems using optical detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10Path keeping

Definitions

  • the invention relates to an evaluation device for a driver assistance system for a vehicle.
  • The term driver assistance systems covers functions that support the driver of a motor vehicle.
  • The aim of driver assistance systems is often to increase safety by avoiding dangerous situations before they arise and by assisting the driver in avoiding accidents in critical situations. A further goal is to increase comfort.
  • Examples of driver assistance functions are brake and traction control systems such as ABS (anti-lock braking system), ASR (electronic traction control), ESP (electronic stability program) and EDS (electronic differential lock), as well as adaptive headlights, automatic high-beam assist, night vision systems, cruise control, parking assistance, brake assist, adaptive cruise control, distance warning, turn assistant, traffic jam assistant, lane detection, lane departure warning, lane keeping assistance, lane change assistant, ISA (intelligent speed adaptation), automatic emergency braking, curve assistant, tire pressure monitoring, driver condition detection, traffic sign recognition, and platooning.
  • The invention is based on the object of providing an efficient evaluation device for a driver assistance system, as well as a corresponding driver assistance system, a method for operating a driver assistance system, and a computer program product.
  • The evaluation device comprises an input for receiving first image information, the first image information having been acquired by a first camera aligned in the forward direction of the vehicle and having a first focal length, and an input for receiving second image information, the second image information having been acquired by a second camera aligned in the forward direction of the vehicle and having a second focal length that is greater than the first focal length. Furthermore, a determination component is provided for determining a course of a traffic lane using the first image information and the second image information.
  • The evaluation device receives first and second image information.
  • This image information can in each case be the data originally captured by the respective camera, or an already processed version of the originally acquired data.
  • The two cameras face forward. This means that they capture, at least in sections, the surroundings lying ahead of the vehicle when it travels in the forward direction, or behind the vehicle when it travels in the rearward direction.
  • Both the first and the second image information depict parts of the environment ahead of the vehicle.
  • Each of the two cameras has a specific focal length, the focal length of the second camera being greater than that of the first camera. If the focal length of one or both cameras is variable, it holds at least for the first and the second image information that the focal length used for the second image information is greater than that used for the first image information.
  • The focal length of a camera determines the dimensions of the section of the environment that the camera captures.
  • The focal length determines the opening angle and the radius of the captured circular sector. In accordance with the ratio of the focal lengths, the first image information corresponds to the closer vehicle environment and the second image information to a more distant environment of the vehicle.
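The inverse relationship between focal length and opening angle can be sketched with the simple pinhole-camera model. The focal lengths and sensor width below are illustrative assumptions only; the patent does not specify concrete values for the cameras K1 and K2.

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal opening angle of a pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Illustrative values (not from the patent): a short focal length gives the
# wide near-field view of camera K1, a long focal length the narrow
# far-field view of camera K2, assuming the same sensor width for both.
fov_k1 = horizontal_fov_deg(focal_length_mm=6.0, sensor_width_mm=4.8)
fov_k2 = horizontal_fov_deg(focal_length_mm=25.0, sensor_width_mm=4.8)
assert fov_k1 > fov_k2  # larger focal length -> smaller opening angle
```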
  • The evaluation device uses both data of the first camera and data of the second camera to determine the course of a traffic lane.
  • The lane whose course is determined may be the lane the vehicle is currently traveling on, and/or, for example, an adjacent lane.
  • The course determined by the evaluation device can be output via a suitable output. The course is preferably indicated relative to the vehicle; alternatively, an indication of the absolute position of the course of the lane is possible.
  • The determined course may be, for example, the course of the center of the respective lane, or of its right or left boundary.
  • The first camera detects radiation in the wavelength range of visible light. It is possible that the first camera is sensitive only in the wavelength range of visible light, or mainly in this wavelength range, or, inter alia, in this wavelength range. Additionally or alternatively, the first camera can detect radiation in the wavelength range of the near infrared. In this case it is possible, in particular, for the first camera to be sensitive mainly in the wavelength range of visible light and, to a lesser extent, in the wavelength range of the near infrared. It is understood that the first image information depicts the radiation emitted by the vehicle environment in the respective wavelength range that is detectable by the first camera.
  • The second camera detects radiation in the wavelength range of the near infrared. It is possible that the second camera is sensitive only in the near-infrared wavelength range, or mainly in this wavelength range, or, inter alia, in this wavelength range. It is understood that the second image information depicts the radiation emitted by the vehicle environment in the respective wavelength range that is detectable by the second camera.
  • The evaluation device is designed to determine the course of the traffic lane on the basis of at least one lane marking and/or at least one lane edge.
  • Lane markings are also referred to as road markings or ground markings; they are signs applied to the road surface. Examples are continuous or broken center lines, or two or more parallel lines.
  • The lane edge can be, for example, a strip of grass and/or earth, a guardrail, or a wall.
  • Using a suitable image evaluation algorithm, it is possible to search the image information for features that are characteristic of a roadway marking and/or a roadway edge.
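As a minimal sketch of such a feature search (one of many possible algorithms; the patent does not prescribe any particular one), a single grayscale image row can be scanned for the dark-bright-dark intensity pattern typical of a painted marking. The thresholds and the scan-line representation are illustrative assumptions:

```python
def marking_candidates(row, min_contrast=40, max_width=30):
    """Find bright stripes on a darker road surface in one image row.

    A lane marking typically appears as a run of pixels clearly brighter
    than its left and right neighborhood (a rising edge followed by a
    falling edge).  Thresholds are illustrative, not from the patent.
    """
    candidates = []
    start = None
    for x in range(1, len(row)):
        rising = row[x] - row[x - 1] >= min_contrast
        falling = row[x - 1] - row[x] >= min_contrast
        if rising and start is None:
            start = x
        elif falling and start is not None:
            if x - start <= max_width:
                candidates.append((start, x))  # (left edge, right edge)
            start = None
    return candidates

row = [20] * 10 + [200] * 5 + [20] * 10  # dark road, bright stripe, dark road
print(marking_candidates(row))  # -> [(10, 15)]
```

Running the detector over many rows and fitting a curve through the candidate positions yields a lane-course estimate per camera.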
  • The first image information can depict different parts of the at least one roadway marking and/or of the at least one roadway edge than the second image information.
  • In particular, the first image information can depict first parts of the at least one lane marking and/or the at least one lane edge that are not depicted by the second image information, and the second image information can depict second parts of the at least one lane marking and/or the at least one lane edge that are farther away from the vehicle than the first parts.
  • The respective image information containing the roadway marking and/or the roadway edge can then be used to determine the course of the traffic lane.
  • The evaluation device can be designed to determine the course of a lane within the first image information and the course of a lane within the second image information, and to combine the courses determined using the first image information and the second image information.
  • The courses may, for example, relate to different sections of the route. If courses are determined at matching positions, averaging can be performed, for example.
  • The driver assistance system according to the invention comprises an evaluation device of the kind explained, the first and the second camera, and an application unit for using the course of the traffic lane determined by the evaluation device in a driver assistance function, in particular within the context of a lane departure warning or a lane keeping assistance function.
  • The vehicle according to the invention comprises a driver assistance system of the type explained, as well as headlights whose radiation is adapted to a wavelength range detectable by the first camera and/or the second camera. For example, infrared lights may be present if the second camera is an infrared camera.
  • In the method according to the invention, first image information is acquired by a first camera aligned in the forward direction of the vehicle with a first focal length, second image information is acquired by a second camera aligned in the forward direction of the vehicle with a second focal length greater than the first focal length, and a course of a traffic lane is determined using the first and the second image information.
  • The computer program product for a driver assistance system provides the functionality of an input for receiving first image information acquired by a first camera aligned in the forward direction of the vehicle with a first focal length, an input for receiving second image information acquired by a second camera aligned in the forward direction of the vehicle with a second focal length greater than the first focal length, and a determination component for determining a course of a traffic lane using the first image information and the second image information.
  • A computer program product is understood to mean, in addition to the actual computer program (with its technical effect beyond the normal physical interaction between program and computing unit), in particular a record carrier for the computer program, a file collection, a configured computing unit, but also, for example, a storage device or a server on which files belonging to the computer program are stored.
  • The computer program product according to the invention and the method according to the invention are particularly suitable for the evaluation device according to the invention, and this may also apply to its refinements and developments. For this purpose, they may comprise further suitable means or steps.
  • Figure 1 shows a motor vehicle with a driver assistance system.
  • Figure 2 shows a section of a driver assistance system.
  • The motor vehicle 1 shown in Figure 1 is currently traveling on the right-hand lane SP1, toward the right in Figure 1.
  • On the right edge of the right lane SP1 is the road boundary MR. This may be, for example, a lane marking in the form of a line, or the edge of a strip of grass and/or earth.
  • Between the right lane SP1 and the left lane SP2 is the lane marking MM.
  • The vehicle 1 has a video camera K1, e.g. a CMOS camera.
  • The camera K1 records images mainly in the wavelength range of the visible spectrum. This can be achieved by placing a suitable filter in front of the sensor of the camera K1, or by the sensor being sensitive only in said wavelength range.
  • The camera K1 can be a black-and-white camera, which is less expensive than a color camera. Moreover, for a given number of pixels, the resolution of a black-and-white camera is higher than that of a color camera. For the driver of the vehicle 1, the use of a black-and-white camera is not a disadvantage, since the image captured by the camera K1 is not displayed to him.
  • The camera K1 is directed in the direction of travel and thus captures images of the environment lying in front of the vehicle 1.
  • For this purpose, the camera K1 may be mounted, for example, near the roof and the windshield of the vehicle 1.
  • B1 denotes the schematically represented area detected by the camera K1, which is determined by the focal length of the camera K1. The focal length is such that the detection area B1 extends to about 50 meters in front of the vehicle 1.
  • The determined course of the lane SP1 may be used, for example, for the driver assistance functions lane departure warning (Lane Departure Warning) or lane keeping assistance (Lane Keep Assist).
  • Stable lane detection using the camera K1 is possible, in good visibility conditions, up to about 50 meters in front of the vehicle 1 during the day and up to about 35 meters at night.
  • The range limitation at night is due in particular to the limited illumination of the vehicle surroundings by the low beam of the vehicle 1.
  • If the course of the lane beyond the detection area B1 is extrapolated, the extrapolated lane course may differ significantly from the actual one.
  • In particular, the extrapolated lane curvature may even be opposite to the actual curvature if the inflection point of the lane lies outside the area B1 detected by the camera K1.
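This failure mode can be reproduced numerically. The sketch below (illustrative numbers, not from the patent) fits a quadratic lane model to samples from the first 50 m of an S-curve whose inflection point lies at 60 m, i.e. just beyond B1; the extrapolated curvature then has the opposite sign to the true curvature farther ahead:

```python
import numpy as np

# True lane center line: an S-curve with its inflection point at x = 60 m,
# i.e. outside the ~50 m detection area B1 of camera K1.
# (Shape and coefficients are illustrative, not from the patent.)
a = 1e-5
true_y = lambda x: a * (x - 60.0) ** 3  # lateral offset in m vs. distance in m

x_near = np.linspace(0.0, 50.0, 26)             # what K1 can see
coeffs = np.polyfit(x_near, true_y(x_near), 2)  # quadratic lane model

curv_fit = 2.0 * coeffs[0]                   # constant curvature of the fit
curv_true_at_100 = 6.0 * a * (100.0 - 60.0)  # y''(100) of the true lane

# Beyond the inflection point, the extrapolated curvature has the
# wrong sign compared with the actual lane:
assert curv_fit < 0 < curv_true_at_100
```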
  • The vehicle 1 has another video camera K2.
  • The camera K2 captures images mainly in the near-infrared (NIR) wavelength range, i.e. from 0.7 to 1.4 µm. This can be achieved by placing a suitable filter in front of the sensor of the camera K2, or by the sensor being sensitive only in said wavelength range.
  • The camera K2 is directed in the direction of travel and thus captures images of the environment lying in front of the vehicle 1. For this purpose, it may, for example, be mounted near the roof and the windshield of the vehicle 1.
  • The camera K2 may serve, for example, as part of the driver assistance function Night Vision, to increase driving safety at night or in fog.
  • The infrared images taken by the camera K2 can be displayed to the driver of the vehicle 1 on a display. This exploits the fact that the range of the near-infrared portion of the radiation emitted by an incandescent lamp of the vehicle headlights is greater than that of the visible portion.
  • Additionally, infrared headlights S can be provided, which illuminate the environment with infrared radiation.
  • B2 denotes the schematically represented area detected by the camera K2, which is determined by the focal length of the camera K2. The focal length is such that the detection area B2 extends to about 200 meters in front of the vehicle 1.
  • The focal length of the camera K2 is greater than that of the camera K1. Accordingly, the field of view of the camera K2 is narrower than that of the camera K1, but extends farther to the front.
  • The greater focal length of the camera K2 is particularly suitable for the aforementioned driver assistance function Night Vision, since in this case the area not illuminated by the low beam of the vehicle 1 is to be made visible to the driver.
  • Figure 2 shows a section of the driver assistance system of the vehicle 1.
  • The evaluation component S is provided with the information I-K1 by the camera K1 and with the information I-K2 by the camera K2.
  • The information I-K1 and I-K2 can be the image data captured by the respective camera K1 or K2, or data resulting from processing of this image data.
  • The evaluation component S evaluates the information I-K1 and I-K2 and, based on this evaluation, determines the course of the lane SP1 relative to the own vehicle 1.
  • The evaluation component S outputs to the application unit A the information I-S, from which the course of the lane SP1 can be seen.
  • The application unit A uses the information I-S for the respective driver assistance function, for example, in a lane departure warning function, to detect an unintentional lane change of the vehicle 1.
  • The application unit A is connected to an interface I.
  • This can be a human-machine interface and/or an interface for the autonomous control of the vehicle 1. Via a human-machine interface, information or a warning can be output to the driver acoustically, optically, or haptically, for example because the lane SP1 is being left.
  • An interface for the autonomous control of the vehicle can be used to intervene in the driving behavior of the vehicle 1 without requiring the driver's involvement, for example by braking or steering the vehicle 1.
  • The evaluation component S determines, using the information I-K1, the course of the lane SP1 within the detection area B1 of the camera K1. Likewise, the evaluation component S determines, using the information I-K2, the course of the lane SP1 within the detection area B2 of the camera K2.
  • The determination of the course takes place in each case on the basis of a detection of the road boundary MR and/or the road marking MM.
  • The detection of the road boundary MR and/or the road marking MM is possible both within the images of the visible wavelength range from the camera K1 and within the images of the near-infrared range from the camera K2, because these images are very similar.
  • In particular, bright or dark objects also appear bright or dark, respectively, in the near infrared.
  • The evaluation component S merges the lane courses obtained from the two pieces of image information I-K1 and I-K2, so that the entire course of the lane SP1 within the detection areas B1 and B2 results. This can be done by joining the courses in non-overlapping areas. In overlapping areas, either the course derived from the image information I-K1, the course derived from the image information I-K2, or a combination of the two can be used.
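The merging step described above can be sketched as follows, with each lane course represented as sampled lateral offsets over distance. The representation and the sample values are illustrative assumptions, and averaging in the overlap is just one of the combinations the text mentions:

```python
def fuse_courses(course_k1, course_k2):
    """Fuse two lane courses given as {distance_m: lateral_offset_m} dicts.

    Where only one camera provides an estimate, it is taken over directly;
    where both detection areas overlap, the two estimates are averaged
    (one simple choice among those mentioned in the text).
    """
    fused = {}
    for d in sorted(set(course_k1) | set(course_k2)):
        if d in course_k1 and d in course_k2:
            fused[d] = (course_k1[d] + course_k2[d]) / 2.0
        else:
            fused[d] = course_k1.get(d, course_k2.get(d))
    return fused

# Illustrative sample points: K1 covers the near field, K2 the far field,
# with some overlap around 40-50 m.
near = {10: 0.0, 20: 0.1, 30: 0.2, 40: 0.4, 50: 0.6}
far = {40: 0.5, 50: 0.7, 100: 1.5, 200: 3.0}
fused = fuse_courses(near, far)
# 40 m -> average of both estimates; 200 m -> taken from K2 alone.
```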
  • In this way, the evaluation component S can determine the course of the lane SP1 from the own vehicle 1 up to the end of the detection area B2 of the camera K2. This is not possible using only the image information I-K1, since the detection area B1 of the camera K1 is much smaller due to the smaller focal length. Nor is it possible using only the image information I-K2, because the detection area B2 of the camera K2 is narrower due to the larger focal length, so that in the closer environment of the vehicle 1 the road boundary MR and/or the road marking MM shown in Figure 1 cannot be detected by the camera K2.
  • The infrared headlights S can be used so that the camera K2 detects the radiation of the infrared headlights S reflected from the surroundings. If the camera K1 detects not only visible light but also, at least partially, radiation from the near-infrared range, the infrared headlights S can also be used in determining the course of the lane SP1 with the camera K1. As already mentioned, the detection area B1 of the camera K1 is severely limited at night due to the low range of the low beam. This restriction can be eliminated, or at least reduced, when the infrared headlights S are switched on.


Abstract

The invention relates to an evaluation device for a driver assistance system of a vehicle (1), comprising an input for receiving first image information (I-K1) acquired by a first camera (K1) having a first focal length and directed in the forward driving direction of the vehicle, and an input for receiving second image information (I-K2) acquired by a second camera (K2) having a second focal length and directed in the forward driving direction of the vehicle. The invention also relates to a determination component for determining a course of a traffic lane (SP1) using the first image information (I-K1) and the second image information (I-K2).
PCT/EP2008/055001 2007-04-25 2008-04-24 Détection de voie de circulation avec des caméras ayant des distances focales différentes WO2008132130A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102007019531.3 2007-04-25
DE102007019531A DE102007019531A1 (de) 2007-04-25 2007-04-25 Fahrspurdetektion mit Kameras unterschiedlicher Brennweite

Publications (1)

Publication Number Publication Date
WO2008132130A1 (fr)

Family

ID=39639586

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2008/055001 WO2008132130A1 (fr) 2007-04-25 2008-04-24 Détection de voie de circulation avec des caméras ayant des distances focales différentes

Country Status (2)

Country Link
DE (1) DE102007019531A1 (fr)
WO (1) WO2008132130A1 (fr)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013002212A1 (de) 2013-02-06 2014-08-07 GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) Spurhalteassistenzsystem für ein Kraftfahrzeug
DE102020105711A1 (de) 2020-03-03 2021-09-09 Bayerische Motoren Werke Aktiengesellschaft Fahrerassistenzsystem

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5892855A (en) * 1995-09-29 1999-04-06 Aisin Seiki Kabushiki Kaisha Apparatus for detecting an object located ahead of a vehicle using plural cameras with different fields of view
DE19846664A1 (de) * 1997-10-10 1999-04-15 Hyundai Motor Co Ltd Lenksteuersystem und Lenksteuerverfahren für autonome intelligente Fahrzeuge
WO2005013025A1 (fr) * 2003-07-31 2005-02-10 Trw Limited Appareil de detection pour vehicules
WO2007002964A2 (fr) 2005-07-04 2007-01-11 Advanced Computer Vision Gmbh-Acv Procede d'identification de voies de circulation


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013235580A (ja) * 2012-05-08 2013-11-21 Axis Ab ビデオ解析
US8983131B2 (en) 2012-05-08 2015-03-17 Axis Ab Video analysis
EP2662827B1 (fr) * 2012-05-08 2016-01-13 Axis AB Analyse vidéo
EP3008708A4 (fr) * 2013-06-13 2017-02-22 Mobileye Vision Technologies Ltd. Navigation améliorée par vision
US9671243B2 (en) 2013-06-13 2017-06-06 Mobileye Vision Technologies Ltd. Vision augmented navigation
US10533869B2 (en) 2013-06-13 2020-01-14 Mobileye Vision Technologies Ltd. Vision augmented navigation
US11604076B2 (en) 2013-06-13 2023-03-14 Mobileye Vision Technologies Ltd. Vision augmented navigation
US10562439B2 (en) 2016-01-19 2020-02-18 Harman International Industries, Incorporated Techniques for optimizing vehicle headlights based on situational awareness
CN109178040A (zh) * 2018-11-01 2019-01-11 同方威视技术股份有限公司 列车识别系统及其方法、列车安全检查系统及其方法
US11952027B2 (en) 2018-11-01 2024-04-09 Nuctech Company Limited Train identification system and method, and train safety inspection system and method

Also Published As

Publication number Publication date
DE102007019531A1 (de) 2008-11-13


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08749707

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08749707

Country of ref document: EP

Kind code of ref document: A1