WO2014032664A1 - Method for determining a course of a traffic lane for a vehicle - Google Patents

Method for determining a course of a traffic lane for a vehicle

Info

Publication number
WO2014032664A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
lane
dark
detected
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/DE2013/200115
Other languages
German (de)
English (en)
French (fr)
Inventor
Matthias Strauss
Matthias Komar
Dirk Waldbauer
Wolfgang Günther
Stefan LÜKE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Teves AG and Co OHG
Original Assignee
Continental Teves AG and Co OHG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Teves AG and Co OHG filed Critical Continental Teves AG and Co OHG
Priority to US14/409,051 (US9360332B2)
Priority to DE112013004196.0T (DE112013004196A5)
Priority to EP13765623.7A (EP2888604B1)
Priority to JP2015528874A (JP6353448B2)
Publication of WO2014032664A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/50Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V10/507Summing image-intensity values; Histogram projection analysis
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/102Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 360 degree surveillance camera system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/804Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for lane monitoring
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Definitions

  • The invention relates to a method for determining a lane course for a vehicle according to the preamble of claim 1.
  • Such a method is known, for example, from DE 10 2007 013 023 A1, in which the vehicle environment is detected by means of an environment sensor and, for the detection of objects in the vehicle environment, is divided into grid cells.
  • Each of these grid cells is assigned a value that indicates the occupancy probability, i.e. the probability that an object is present in that grid cell.
  • A free grid cell is assigned the value 0 or a low probability value close to 0, while for an occupied grid cell a high value, for example between 0.5 and 1, is used.
  • Each grid cell is assigned a value that depends on the distance of the free grid cell from the vehicle, i.e. the farther away the free grid cell is located, the higher the occupancy probability that is chosen.
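For illustration, a minimal sketch of how such distance-dependent occupancy values might be assigned to grid cells (grid size, cell size, the 0.9 value for occupied cells and the distance weighting are assumptions for this example, not taken from the cited document):

```python
import numpy as np

# Hypothetical map: 200 x 200 cells of 0.25 m each, vehicle near the grid centre.
GRID_SIZE, CELL_M = 200, 0.25
occupancy = np.full((GRID_SIZE, GRID_SIZE), 0.5)  # 0.5 = unknown

def update_cell(occupancy, ix, iy, object_detected, dist_m):
    """Assign an occupancy probability to one grid cell.

    Occupied cells receive a high value between 0.5 and 1; free cells receive
    a value near 0 that grows with their distance from the vehicle, since
    far-away free-space measurements are less certain.
    """
    if object_detected:
        occupancy[iy, ix] = 0.9
    else:
        occupancy[iy, ix] = min(0.4, 0.05 + 0.01 * dist_m)
```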
  • The coordinate system of the grid-based environment map generated by this known method according to DE 10 2007 013 023 A1 is fixed, without rotation, to the global coordinate system, so that when the vehicle moves, the vehicle is moved across the two-dimensional grid structure of the environment map.
  • This grid-based environment map generated in accordance with DE 10 2007 013 023 A1 is used to detect a roadway, a driving corridor and/or roadway boundaries.
  • For this purpose, an area on the grid-based map is first determined in which the occupancy probabilities lie below a predetermined value, for example 0.1.
  • In this area, a center line running in the longitudinal direction of the vehicle is determined and divided into several sub-lines. These sub-lines are then shifted perpendicular to the direction of the center line, on both sides of the vehicle, until they reach grid cells whose occupancy probabilities exceed a certain value, for example 0.5.
  • In this way, road markings or boundaries of a roadway, such as crash barriers and the like, are detected.
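A rough sketch of the sub-line search described above, shifting outwards from the centre line until occupied cells are reached (the row-wise search, the threshold of 0.5 and the data layout are assumptions for illustration):

```python
import numpy as np

def find_lane_edges(occupancy, center_col, rows, occ_threshold=0.5):
    """For each sub-line (modelled here as one grid row ahead of the vehicle),
    shift left and right from the centre line until a cell's occupancy
    probability exceeds the threshold; returns (left, right) column indices."""
    edges = []
    for r in rows:
        left = right = center_col
        while left > 0 and occupancy[r, left] < occ_threshold:
            left -= 1
        while right < occupancy.shape[1] - 1 and occupancy[r, right] < occ_threshold:
            right += 1
        edges.append((left, right))
    return edges
```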
  • The image processing algorithms used detect markings mainly on the basis of the dark-light and light-dark transitions between the road surface and the lane markings. Furthermore, the images are searched for structures with the highest contrast, as these are mostly generated by the aforementioned transitions.
  • The object of the invention is to provide a method for determining a lane course for a vehicle that is improved compared with the prior art and that, in particular, makes it possible to track the lane course even at low speeds and in areas hidden from the environment sensor.
  • In such a method for determining a lane course for a vehicle, structures delimiting a drivable space are detected by at least one image acquisition unit and these structures are registered in an environment map which divides the vehicle environment into a two-dimensional grid structure of cells. The method is characterized in that:
  • the position of the vehicle in the grid structure of the environment map is determined by means of odometric data of the vehicle and is updated continuously,
  • the distance and the direction of the vehicle to the cells of the grid structure of the environment map that contain the structures delimiting the roadway and/or the lane are determined,
  • light-dark and dark-light transitions are detected in the image data generated by the image acquisition unit and are entered into the cells of the grid structure of the environment map, and
  • the lane course is determined from the cells with the detected light-dark and dark-light transitions.
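The four characterizing steps could be organized roughly as in the following sketch (class and method names are invented for illustration; the helpers `detect_transitions`, `cells_with_boundaries`, `insert_transitions` and `fit_lane_course` are assumed, not defined by the patent):

```python
class LaneCourseEstimator:
    """Hypothetical outline of the claimed processing loop."""

    def __init__(self, grid_map, odometry, camera):
        self.grid_map = grid_map      # stationary 2-D grid of the vehicle environment
        self.odometry = odometry
        self.camera = camera
        self.pose = (0.0, 0.0, 0.0)   # x [m], y [m], heading [rad] in the map frame

    def step(self):
        # 1) update the vehicle position in the grid from odometric data
        self.pose = self.odometry.integrate(self.pose)
        # 2) distance/direction from the vehicle to cells holding delimiting structures
        targets = self.grid_map.cells_with_boundaries()
        ranges = [self.grid_map.range_bearing(self.pose, cell) for cell in targets]
        # 3) detect light-dark / dark-light transitions and enter them into the grid
        transitions = detect_transitions(self.camera.grab())
        self.grid_map.insert_transitions(transitions, self.pose)
        # 4) derive the lane course from the cells carrying transitions
        return self.grid_map.fit_lane_course(), ranges
```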
  • Structures delimiting a drivable space are understood to mean structures defining a lane as well as, for example, structures bounding a parking space: lane boundaries such as curbs and green strips, lane markings or line markings at the lane center or lane edge, etc., and traffic signs, including guide posts, etc.
  • In this way, lane information is obtained even at very low speeds and also at short distances to the vehicle in front, by which, for example, line markings may be obscured.
  • In addition, the lane markings can be followed ("tracked"), wherein advantageously all forms of lane markings can be detected, i.e. also light-dark and dark-light transitions that originate from other road markings, such as speed-limit markings or crosswalks, etc.
  • The error rate in the detection of lane markings is extremely low, since the deviation of the vehicle relative to the lane center, the angular position of the vehicle and the curvature of a lane marking are updated continuously.
  • Furthermore, the lane course determined in the environment lying behind the vehicle is extrapolated into the environment ahead of the vehicle.
  • The further course of the lane or roadway estimated in this way is particularly advantageous when the vehicle is moving in urban traffic, in particular in dense traffic or in densely built-up areas.
  • A stationary coordinate system is used for the environment map in order to avoid discretization errors.
  • The method according to the invention is rendered particularly efficient by detecting light-dark and dark-light transitions which are present in a line structure. As a result, in particular roadway edges and lane boundaries in the form of line markings are detected quickly and easily.
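One conceivable way to detect light-dark and dark-light transitions that lie on a line structure, sketched with standard OpenCV calls (the gradient threshold and Hough parameters are assumptions):

```python
import cv2
import numpy as np

def detect_transitions(gray):
    """Return line segments (x1, y1, x2, y2) along strong intensity transitions.

    The horizontal Sobel gradient is positive at dark-to-light and negative at
    light-to-dark transitions; only transitions forming line structures
    (markings, kerb edges) are kept via a probabilistic Hough transform.
    """
    grad = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    strong = (np.abs(grad) > 40).astype(np.uint8) * 255
    lines = cv2.HoughLinesP(strong, 1, np.pi / 180, threshold=40,
                            minLineLength=20, maxLineGap=5)
    return [] if lines is None else [tuple(seg[0]) for seg in lines]
```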
  • the method according to the invention is particularly robust if the grid cells of the surroundings map are classified as passable or not passable.
  • the structures and objects of the environment that can be detected from the image data of the image acquisition unit are recorded and entered into the grid cells.
  • According to a further development, the odometric data are determined by means of vehicle sensors, which are usually already present in vehicles, especially motor vehicles.
  • the optical flow from the image data of the image acquisition unit is used to determine the position of the vehicle in the grid structure of the environment map in addition to the odometric data.
  • In this way, vehicle states such as wheel spin or skidding of the vehicle are also taken into account.
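A sketch of how sparse optical flow could supply an image-based motion estimate alongside the wheel odometry, so that wheel spin or skidding does not corrupt the position in the map (feature counts and the similarity-transform model are assumptions):

```python
import cv2

def ego_motion_from_flow(prev_gray, curr_gray):
    """Estimate a 2-D similarity transform between consecutive frames from
    sparse optical flow; the result can be checked against the wheel/yaw-rate
    odometry to detect slip."""
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=10)
    if p0 is None:
        return None
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
    good = status.ravel() == 1
    if good.sum() < 10:
        return None
    matrix, _ = cv2.estimateAffinePartial2D(p0[good], p1[good])
    return matrix  # 2x3 rotation/translation/scale between the two frames
```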
  • Furthermore, distances perpendicular to the direction of travel of the vehicle to vehicles moving in parallel are determined, and the determined distances to these parallel-moving vehicles are used to verify the determined lane course.
  • In this way, the lane recognition is supported in an advantageous manner.
  • FIG. 1 shows a schematic representation of a vehicle with an image acquisition unit for explaining the method according to the invention
  • FIG. 2 shows a schematic illustration of a grid-based environment map of a vehicle produced by the method according to the invention
  • FIG. 3 shows a flow chart for generating a grid-based environment map according to FIG. 2 by means of the vehicle-mounted image recording system according to FIG. 1 as an exemplary embodiment according to the invention.
  • The vehicle 10 diagrammatically illustrated in Figure 1, in particular a motor vehicle, comprises an image acquisition system 1 with a camera 2 as the image acquisition unit and an object recognition unit 3 for detecting objects from the image data generated by the camera 2, wherein a memory 4 is assigned to the object recognition unit 3.
  • The object recognition unit 3 has a classifier 3a with which, by means of a pattern recognition algorithm, recognized structures and objects, in particular objects at the roadside detected in the image data generated by the camera 2, are classified.
  • In this way, structures delimiting the roadway or the lane, such as lane markings, lane boundaries such as crash barriers and curbs, and traffic signs, but also, for example, vehicles pulling out or driving in parallel, taillights, headlights, etc., can be detected.
  • The lane course or the roadway course of the vehicle 10 is also determined by the image acquisition system 1 and is used for driver assistance functions.
  • For this purpose, a driver assistance system 6 is provided, for example, which is designed as a lane keeping assistance system, to which the information required for lane keeping with respect to the detected course of the road is supplied and which, if necessary, intervenes in the braking and/or steering system of the vehicle 10 via actuators 6a.
  • The vehicle 10 also includes a display 5, which is, for example, part of a central display and control unit of the vehicle 10 or of an instrument cluster of the vehicle 10, which shows the objects detected by the object recognition unit 3, for example traffic signs, and is therefore connected to it.
  • In addition, optical and/or acoustic warnings are output to the driver when an unintentional departure from the detected lane or roadway is detected.
  • odometric data of the vehicle 10 are also supplied to the image acquisition system 1 for its motion estimation, for which purpose vehicle sensors 7 detect, for example, the yaw rate, the steering angle and the wheel speeds of the vehicle.
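A minimal dead-reckoning sketch of how yaw rate and wheel speed from the vehicle sensors 7 could propagate the vehicle pose on the map (the simple planar motion model is an assumption, not specified in the text):

```python
import math

def propagate_pose(x, y, heading, wheel_speed, yaw_rate, dt):
    """Advance the pose by one time step from wheel speed [m/s] and yaw rate [rad/s]."""
    heading += yaw_rate * dt
    x += wheel_speed * dt * math.cos(heading)
    y += wheel_speed * dt * math.sin(heading)
    return x, y, heading
```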
  • In a first method step S1, the vehicle environment captured by the camera 2 in the form of image data is rasterized into the grid cells 21 of an environment map 20 by means of a stationary or global grid 20a with a defined mesh size.
  • The grid-based environment map 20 shown in FIG. 2 thus discretizes the surroundings detected by the camera 2, with only every tenth grid line being drawn for reasons of clarity.
  • Objects are detected from the image data of the camera 2, classified by the classifier 3a and entered into the grid cells 21 with an occupancy probability.
  • The grid cells 21 are thereby given the status "passable" (grid cells 21a) or "non-passable" (grid cells 21b). According to FIG. 2, the passable grid cells 21a are shown darker than the non-passable grid cells 21b.
  • In a next method step S3, the odometric data generated by the sensors 7 are acquired, and the position of the vehicle 10 is entered into the corresponding grid cell 21a. From the continuously generated odometric data, the motion of the vehicle 10 is estimated and its position is moved across the environment map 20 in accordance with this motion estimate.
  • In a further method step S4, the distance and the direction of the vehicle 10 to the non-passable grid cells 21b that contain the structures delimiting the roadway and/or the lane are determined.
  • In a subsequent method step S5, all light-dark and dark-light transitions present in a line structure are detected and likewise entered into the environment map 20 (see FIG. 2), and in a last method step S7 they are identified as lane markings 22a and 22b of a lane 22 and as a right-hand roadway boundary 23 and are tracked by means of a particle filter; in addition, the lane markings 22a, 22b already passed by the vehicle 10 and the roadway-delimiting structures 23 are included in the determination of the lane course of the vehicle 10. Even at low speeds, the described method thus enables tracking of the lane markings 22a, 22b and the roadway-delimiting structures 23.
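The particle filter mentioned for step S7 could, for example, track the three quantities named earlier (lateral offset to the lane center, angular position, curvature); the following is a minimal sketch with assumed noise levels and measurement model, not the patent's actual filter:

```python
import numpy as np

class LaneParticleFilter:
    """Toy particle filter over (lateral offset [m], heading angle [rad], curvature [1/m])."""

    def __init__(self, n=500):
        self.particles = np.random.randn(n, 3) * [0.5, 0.05, 0.001]
        self.weights = np.full(n, 1.0 / n)

    def predict(self, dx, dyaw):
        # shift each hypothesis by the odometric motion, then add process noise
        self.particles[:, 0] -= dx * np.sin(self.particles[:, 1] + dyaw)
        self.particles[:, 1] += dyaw
        self.particles += np.random.randn(*self.particles.shape) * [0.05, 0.01, 0.0005]

    def update(self, measured_offsets):
        # weight particles by how well they explain the detected marking offsets
        for z in measured_offsets:
            err = self.particles[:, 0] - z
            self.weights *= np.exp(-0.5 * (err / 0.3) ** 2)
        self.weights += 1e-12
        self.weights /= self.weights.sum()
        # systematic resampling when the effective sample size collapses
        if 1.0 / np.sum(self.weights ** 2) < len(self.weights) / 2:
            idx = np.random.choice(len(self.weights), len(self.weights), p=self.weights)
            self.particles = self.particles[idx]
            self.weights = np.full(len(self.weights), 1.0 / len(self.weights))

    def estimate(self):
        return np.average(self.particles, weights=self.weights, axis=0)
```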
  • The lane course 22 or roadway course 23 determined so far is then extrapolated into the environment ahead of the vehicle 10.
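One plausible realization of this extrapolation is to fit a low-order polynomial to the marking points already entered in the map behind and beside the vehicle and evaluate it ahead of the vehicle (polynomial degree and look-ahead range are assumptions):

```python
import numpy as np

def extrapolate_lane(marking_points, ahead_m=30.0, step_m=1.0, degree=2):
    """Fit y = f(x) to collected marking points (x longitudinal, y lateral, in metres)
    and return extrapolated points ahead of the vehicle."""
    pts = np.asarray(marking_points)
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], deg=degree)
    x_ahead = np.arange(0.0, ahead_m, step_m)
    return np.column_stack([x_ahead, np.polyval(coeffs, x_ahead)])
```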
  • The illustrated method can also be improved with respect to the determination of the lane course of the vehicle 10 in that vehicles moving in parallel are detected, so that it can be prevented that such a parallel-moving object is touched by the vehicle 10 at a lane narrowing.
  • The image acquisition unit used is a camera 2 that produces video images and can also be a stereo camera. Instead of such a camera 2, a laser scanner generating laser scanner data can also be used to carry out the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Processing (AREA)
  • Mechanical Engineering (AREA)
PCT/DE2013/200115 2012-08-27 2013-08-09 Verfahren zur bestimmung eines fahrspurverlaufs für ein fahrzeug Ceased WO2014032664A1 (de)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/409,051 US9360332B2 (en) 2012-08-27 2013-08-09 Method for determining a course of a traffic lane for a vehicle
DE112013004196.0T DE112013004196A5 (de) 2012-08-27 2013-08-09 Verfahren zur Bestimmung eines Fahrspurverlaufs für ein Fahrzeug
EP13765623.7A EP2888604B1 (de) 2012-08-27 2013-08-09 Verfahren zur bestimmung eines fahrspurverlaufs für ein fahrzeug
JP2015528874A JP6353448B2 (ja) 2012-08-27 2013-08-09 自動車用の走行レーン推移を測定するための方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102012107885.8A DE102012107885A1 (de) 2012-08-27 2012-08-27 Verfahren zur Bestimmung eines Fahrspurverlaufs für ein Fahrzeug
DE102012107885.8 2012-08-27

Publications (1)

Publication Number Publication Date
WO2014032664A1 true WO2014032664A1 (de) 2014-03-06

Family

ID=49225985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2013/200115 Ceased WO2014032664A1 (de) 2012-08-27 2013-08-09 Verfahren zur bestimmung eines fahrspurverlaufs für ein fahrzeug

Country Status (5)

Country Link
US (1) US9360332B2
EP (1) EP2888604B1
JP (1) JP6353448B2
DE (2) DE102012107885A1
WO (1) WO2014032664A1

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015209467A1 (de) 2015-05-22 2016-11-24 Continental Teves Ag & Co. Ohg Verfahren zur Schätzung von Fahrstreifen

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011109569A1 (de) 2011-08-05 2013-02-07 Conti Temic Microelectronic Gmbh Verfahren zur Fahrspurerkennung mittels einer Kamera
DE102012103669A1 (de) 2012-04-26 2013-10-31 Continental Teves Ag & Co. Ohg Verfahren zur Darstellung einer Fahrzeugumgebung
DE102012106932A1 (de) 2012-07-30 2014-05-15 Continental Teves Ag & Co. Ohg Verfahren zur Darstellung einer Fahrzeugumgebung mit Positionspunkten
DE102015111925B4 (de) * 2015-07-22 2021-09-23 Deutsches Zentrum für Luft- und Raumfahrt e.V. Spurhalteassistenzsystem für ein Fahrzeug
DE102016217637B4 (de) * 2016-09-15 2025-05-15 Volkswagen Aktiengesellschaft Odometrie-Verfahren zum Ermitteln einer Position eines Kraftfahrzeugs, Steuervorrichtung und Kraftfahrzeug
US11480971B2 (en) * 2018-05-01 2022-10-25 Honda Motor Co., Ltd. Systems and methods for generating instructions for navigating intersections with autonomous vehicles
KR102564021B1 (ko) * 2018-09-14 2023-08-07 현대자동차주식회사 후방 영상 표시 제어 장치 및 방법, 그리고 차량 시스템
JP7136663B2 (ja) * 2018-11-07 2022-09-13 日立Astemo株式会社 車載制御装置
DE102019102922A1 (de) * 2019-02-06 2020-08-06 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Vorrichtung zur Multi-Sensor-Datenfusion für automatisierte und autonome Fahrzeuge
US11392128B1 (en) * 2019-04-19 2022-07-19 Zoox, Inc. Vehicle control using directed graphs
DE102019112413B4 (de) * 2019-05-13 2025-10-09 Bayerische Motoren Werke Aktiengesellschaft Verfahren und vorrichtung zur multi-sensor-datenfusion für automatisierte und autonome fahrzeuge
CN111739283B (zh) * 2019-10-30 2022-05-20 腾讯科技(深圳)有限公司 一种基于聚类的路况计算方法、装置、设备及介质
DE102020106884A1 (de) 2020-03-13 2021-09-16 Valeo Schalter Und Sensoren Gmbh Verfahren zur adaptiven Fahrerassistenz und Fahrerassistenzsystem für ein Kraftfahrzeug
  • US11657625B2 (en) * 2020-12-18 2023-05-23 Toyota Research Institute, Inc. System and method for determining implicit lane boundaries
CN113449692A (zh) * 2021-07-22 2021-09-28 成都纵横自动化技术股份有限公司 一种基于无人机的地图车道信息更新方法及其系统
CN113920724B (zh) * 2021-09-29 2022-06-03 南通大学 一种基于混合道路切换控制器的改良交通流分析方法
DE102022002334B3 (de) 2022-06-28 2023-08-24 Mercedes-Benz Group AG Verfahren zur Ermittlung und Bereitstellung von Fahrspurverläufen

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005002719A1 (de) * 2005-01-20 2006-08-03 Robert Bosch Gmbh Verfahren zur Kursprädiktion in Fahrerassistenzsystemen für Kraftfahrzeuge
DE102007013023A1 (de) 2007-03-19 2008-09-25 Ibeo Automobile Sensor Gmbh Probabilistische Rasterkarte
WO2010099789A1 (de) * 2009-03-04 2010-09-10 Continental Teves Ag & Co. Ohg Verfahren zur automatischen erkennung eines fahrmanövers eines kraftfahrzeugs und ein dieses verfahren umfassendes fahrerassistenzsystem
DE102009003697A1 (de) 2009-03-30 2010-10-07 Conti Temic Microelectronic Gmbh Verfahren und Vorrichtung zur Fahrspurerkennung

Family Cites Families (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9317983D0 (en) 1993-08-28 1993-10-13 Lucas Ind Plc A driver assistance system for a vehicle
US5991427A (en) * 1996-07-31 1999-11-23 Aisin Seiki Kabushiki Kaisha Method and apparatus for detecting a lane on a road
EP1383100A3 (en) * 1996-08-28 2005-02-02 Matsushita Electric Industrial Co., Ltd. Local positioning apparatus and method therefor
US6014601A (en) 1997-01-07 2000-01-11 J. Martin Gustafson Driver alert system
DE19738764A1 (de) 1997-09-04 1999-03-11 Bayerische Motoren Werke Ag Vorrichtung zur graphischen Darstellung einer vorausliegenden Straße
JPH11250396A (ja) * 1998-02-27 1999-09-17 Hitachi Ltd 車両位置情報表示装置および方法
US6269308B1 (en) * 1998-08-20 2001-07-31 Honda Giken Kogyo Kabushiki Kaisha Safety running system for vehicle
DE19843564A1 (de) 1998-09-23 2000-03-30 Bosch Gmbh Robert Warneinrichtung für ein Kraftfahrzeug
JP2001331787A (ja) * 2000-05-19 2001-11-30 Toyota Central Res & Dev Lab Inc 道路形状推定装置
DE10102771A1 (de) 2001-01-23 2002-07-25 Bosch Gmbh Robert Einrichtung zum Bereitstellen von Signalen in einem Kraftfahrzeug
US6498972B1 (en) 2002-02-13 2002-12-24 Ford Global Technologies, Inc. Method for operating a pre-crash sensing system in a vehicle having a countermeasure system
DE10212787A1 (de) 2002-03-22 2003-10-16 Audi Ag Kraftfahrzeug
WO2003105108A1 (de) 2002-06-11 2003-12-18 Robert Bosch Gmbh Verfahren und vorrichtung zur fahrerinformation bzw. zur reaktion bei verlassen der fahrspur
DE10251357A1 (de) 2002-11-05 2004-05-13 Daimlerchrysler Ag Setzen oder Abschalten eines Fahrtrichtungsanzeigers
DE10355807A1 (de) 2002-12-20 2004-07-22 Robert Bosch Gmbh Anordnung zur Steuerung eines Kraftfahrzeugblinkers
GB0308912D0 (en) 2003-04-17 2003-05-21 Trw Ltd Signal apparatus for a vehicle
US20100013917A1 (en) 2003-08-12 2010-01-21 Keith Hanna Method and system for performing surveillance
WO2005060640A2 (en) 2003-12-15 2005-07-07 Sarnoff Corporation Method and apparatus for object tracking prior to imminent collision detection
DE102004001113A1 (de) 2004-01-07 2005-07-28 Robert Bosch Gmbh Informationssystem für Fortbewegungsmittel
DE102004018681A1 (de) 2004-04-17 2005-11-03 Daimlerchrysler Ag Verfahren zum Vermeiden von Kollisionen eines Fahrzeugs mit entgegenkommenden Fahrzeugen
DE102004019337A1 (de) 2004-04-21 2005-11-17 Siemens Ag Assistenzsystem für Kraftfahrzeuge
AU2005242076B2 (en) 2004-05-01 2009-07-23 Eliezer Jacob Digital camera with non-uniform image resolution
DE102004048009A1 (de) 2004-10-01 2006-04-06 Robert Bosch Gmbh Verfahren und Vorrichtung zur Fahrerunterstützung
JP4327062B2 (ja) * 2004-10-25 2009-09-09 三菱電機株式会社 ナビゲーション装置
DE102004062496A1 (de) 2004-12-24 2006-07-06 Daimlerchrysler Ag Verfahren zum Betreiben eines Kollisionsvermeidungs- oder Kollisionsfolgenminderungssystems eines Fahrzeugs sowie Kollisionsvermeidungs- oder Kollisionsfolgenminderungssystem
US7304300B2 (en) 2005-03-15 2007-12-04 Battelle Energy Alliance, Llc Infrared tag and track technique
DE102005036924A1 (de) 2005-08-05 2007-02-08 Bayerische Motoren Werke Ag Fahrerassistenzsystem für ein Kraftfahrzeug
DE102005046672A1 (de) 2005-09-29 2007-04-05 Robert Bosch Gmbh Nachtsichteinrichtung
US7495550B2 (en) 2005-12-28 2009-02-24 Palo Alto Research Center Incorporated Method and apparatus for rear-end collision warning and accident mitigation
US20070276600A1 (en) 2006-03-06 2007-11-29 King Timothy I Intersection collision warning system
DE102006020631A1 (de) 2006-05-04 2007-11-08 Valeo Schalter Und Sensoren Gmbh Steuereinrichtung zur Steuerung eines Fahrtrichtungsanzeigers eines Kraftfahrzeugs
US7633383B2 (en) 2006-08-16 2009-12-15 International Business Machines Corporation Systems and arrangements for providing situational awareness to an operator of a vehicle
DE102006040333A1 (de) 2006-08-29 2008-03-06 Robert Bosch Gmbh Verfahren für die Spurerfassung mit einem Fahrerassistenzsystem eines Fahrzeugs
US8072370B2 (en) 2006-10-31 2011-12-06 Valeo Radar Systems, Inc. System and method for generating an alert signal in a detection system
US7680749B1 (en) 2006-11-02 2010-03-16 Google Inc. Generating attribute models for use in adaptive navigation systems
DE102006062061B4 (de) * 2006-12-29 2010-06-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Vorrichtung, Verfahren und Computerprogramm zum Bestimmen einer Position basierend auf einem Kamerabild von einer Kamera
DE102007016868A1 (de) 2007-04-10 2008-10-16 Robert Bosch Gmbh Verfahren zur Anzeige eines Fahrbahnverlaufs und Steuereinrichtung
US7792641B2 (en) 2007-06-12 2010-09-07 Palo Alto Research Center Incorporated Using long-range dynamics and mental-state models to assess collision risk for early warning
JP2009009331A (ja) * 2007-06-27 2009-01-15 Nissan Motor Co Ltd 白線検出装置および白線検出方法
JP2009023399A (ja) 2007-07-17 2009-02-05 Toyota Motor Corp 衝突防止装置
JP4759547B2 (ja) 2007-09-27 2011-08-31 日立オートモティブシステムズ株式会社 走行支援装置
JP5012396B2 (ja) * 2007-10-16 2012-08-29 日産自動車株式会社 白線検出装置、駐停車支援装置および白線検出方法
EP2340187B1 (de) 2008-10-22 2019-03-20 Continental Teves AG & Co. OHG Verfahren und vorrichtung zur automatischen fahrtrichtungsanzeige
US8352111B2 (en) 2009-04-06 2013-01-08 GM Global Technology Operations LLC Platoon vehicle management
DE112010000146A5 (de) * 2009-05-06 2012-06-06 Conti Temic Microelectronic Gmbh Verfahren zur Auswertung von Sensordaten für ein Kraftfahrzeug
US8164543B2 (en) 2009-05-18 2012-04-24 GM Global Technology Operations LLC Night vision on full windshield head-up display
DE102009045286A1 (de) 2009-10-02 2011-04-21 Robert Bosch Gmbh Verfahren zur Abbildung des Umfelds eines Fahrzeugs
US8730059B2 (en) 2009-11-24 2014-05-20 International Business Machines Corporation Optimizing traffic speeds to minimize traffic pulses in an intelligent traffic system
JP2011118753A (ja) 2009-12-04 2011-06-16 Denso Corp 接近報知装置、および接近報知プログラム
US20110190972A1 (en) 2010-02-02 2011-08-04 Gm Global Technology Operations, Inc. Grid unlock
US8812193B2 (en) 2010-05-11 2014-08-19 Conti-Temic Microelectronic Gmbh Method for determining a virtual vehicle corridor
DE102010042440B4 (de) 2010-10-14 2021-09-02 Robert Bosch Gmbh Verfahren und Vorrichtung zum Einstellen eines Eingriffsmoments eines Lenkassistenzsystems
US9377535B2 (en) 2010-12-02 2016-06-28 Honda Motor Co., Ltd. Method for testing GNSS-based positioning systems in obstructed environments
DE102012103669A1 (de) 2012-04-26 2013-10-31 Continental Teves Ag & Co. Ohg Verfahren zur Darstellung einer Fahrzeugumgebung
US10110805B2 (en) 2012-12-06 2018-10-23 Sandisk Technologies Llc Head mountable camera system
DE102013102087A1 (de) 2013-03-04 2014-09-04 Conti Temic Microelectronic Gmbh Verfahren zum Betrieb eines Fahrerassistenzsystems eines Fahrzeugs
US9530057B2 (en) 2013-11-26 2016-12-27 Honeywell International Inc. Maintenance assistant system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005002719A1 (de) * 2005-01-20 2006-08-03 Robert Bosch Gmbh Verfahren zur Kursprädiktion in Fahrerassistenzsystemen für Kraftfahrzeuge
DE102007013023A1 (de) 2007-03-19 2008-09-25 Ibeo Automobile Sensor Gmbh Probabilistische Rasterkarte
WO2010099789A1 (de) * 2009-03-04 2010-09-10 Continental Teves Ag & Co. Ohg Verfahren zur automatischen erkennung eines fahrmanövers eines kraftfahrzeugs und ein dieses verfahren umfassendes fahrerassistenzsystem
DE102009003697A1 (de) 2009-03-30 2010-10-07 Conti Temic Microelectronic Gmbh Verfahren und Vorrichtung zur Fahrspurerkennung

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015209467A1 (de) 2015-05-22 2016-11-24 Continental Teves Ag & Co. Ohg Verfahren zur Schätzung von Fahrstreifen
WO2016188523A1 (de) 2015-05-22 2016-12-01 Continental Teves Ag & Co. Ohg Verfahren zur schätzung von fahrstreifen
JP2018517979A (ja) * 2015-05-22 2018-07-05 コンティネンタル・テーベス・アクチエンゲゼルシヤフト・ウント・コンパニー・オッフェネ・ハンデルスゲゼルシヤフト 走行レーンを推定するための方法
US10650253B2 (en) 2015-05-22 2020-05-12 Continental Teves Ag & Co. Ohg Method for estimating traffic lanes

Also Published As

Publication number Publication date
DE112013004196A5 (de) 2015-06-03
US20150149076A1 (en) 2015-05-28
JP6353448B2 (ja) 2018-07-04
EP2888604A1 (de) 2015-07-01
US9360332B2 (en) 2016-06-07
EP2888604B1 (de) 2019-07-24
DE102012107885A1 (de) 2014-02-27
JP2015534152A (ja) 2015-11-26

Similar Documents

Publication Publication Date Title
EP2888604B1 (de) Verfahren zur bestimmung eines fahrspurverlaufs für ein fahrzeug
DE102019206569B4 (de) Fahrzeug und verfahren zum steuern desselben
DE102009005505B4 (de) Verfahren und Vorrichtung zur Erzeugung eines Abbildes der Umgebung eines Kraftfahrzeugs
EP3455785B1 (de) Verfahren zur erfassung von verkehrszeichen
DE102013012324A1 (de) Verfahren und Vorrichtung zur Fahrwegfindung
EP2116958B1 (de) Verfahren und Vorrichtung zum Ermitteln des Fahrbahnverlaufs im Bereich vor einem Fahrzeug
WO2013091620A1 (de) Bestimmung eines höhenprofils einer fahrzeugumgebung mittels einer 3d-kamera
DE102009016562A1 (de) Verfahren und Vorrichtung zur Objekterkennung
DE102019112413A1 (de) Verfahren und vorrichtung zur multi-sensor-datenfusion für automatisierte und autonome fahrzeuge
DE102017118651A1 (de) Verfahren und System zur Kollisionsvermeidung eines Fahrzeugs
DE102016212326A1 (de) Verfahren zur Verarbeitung von Sensordaten für eine Position und/oder Orientierung eines Fahrzeugs
DE102016118497A1 (de) Ermittlung einer virtuellen Fahrspur für eine von einem Kraftfahrzeug befahrene Straße
DE102021127078B4 (de) Verfahren zum Plausibilisieren einer auf Basis von Schwarmdaten erzeugten Trajektorie für ein zumindest teilweise assistiert betriebenes Kraftfahrzeug, Computerprogrammprodukt sowie Assistenzsystem
DE102015116542A1 (de) Verfahren zum Bestimmen einer Parkfläche zum Parken eines Kraftfahrzeugs, Fahrerassistenzsystem sowie Kraftfahrzeug
EP3391281A1 (de) Verfahren und vorrichtung zur kamerabasierten verkehrszeichenerkennung in einem kraftfahrzeug
DE102020211971A1 (de) Fahrzeugtrajektorienvorhersage unter verwendung von strassentopologie und verkehrsteilnehmer-objektzuständen
EP3911555A1 (de) Verfahren zum trainieren einer trajektorie für ein fahrzeug, sowie elektronisches fahrzeugführungssystem
EP3475876A1 (de) Steuergerät, system mit solch einem steuergerät und verfahren zum betrieb solch eines systems
DE102008025773A1 (de) Verfahren zur Schätzung eines Orts- und Bewegungszustands eines beobachteten Objekts
DE102020007772A1 (de) Verfahren zur In-Betrieb-Kalibrierung eines Lidars und Fahrzeug
EP2715666A1 (de) Verfahren zum bestimmen einer nickbewegung einer in einem fahrzeug verbauten kamera und verfahren zur steuerung einer lichtaussendung zumindest eines frontscheinwerfers eines fahrzeugs
DE102012007127A1 (de) Verfahren zum Bestimmen einer Bewegungsbahn für ein Fahrzeug
EP2579228A1 (de) Verfahren und System zur Erstellung einer digitalen Abbildung eines Fahrzeugumfeldes
DE102010013093A1 (de) Verfahren und System zur Erstellung eines Modells eines Umfelds eines Fahrzeugs
DE102019102922A1 (de) Verfahren und Vorrichtung zur Multi-Sensor-Datenfusion für automatisierte und autonome Fahrzeuge

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13765623

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2013765623

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2015528874

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 14409051

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 1120130041960

Country of ref document: DE

Ref document number: 112013004196

Country of ref document: DE

REG Reference to national code

Ref country code: DE

Ref legal event code: R225

Ref document number: 112013004196

Country of ref document: DE

Effective date: 20150603