DE102009022277A1 - Device for detecting objects for vehicle, and for operating driver assistance system of vehicle, has two image recording units with optical axes running in different angles in horizontal manner - Google Patents
- Publication number
- DE102009022277A1 (application DE102009022277A)
- Authority
- DE
- Germany
- Prior art keywords
- vehicle
- image
- optical axes
- image acquisition
- image recording
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/31—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/107—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using stereoscopic cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
Abstract
Description
The invention relates to a device for detecting objects for a vehicle, comprising at least two horizontally juxtaposed image acquisition units and an image processing unit by means of which images recorded by the image acquisition units can be processed. The invention further relates to a use of such a device for detecting objects.
DE 10131196 A1 discloses a device for the detection of objects, persons or the like.
The invention is based on the object of specifying a device for detecting objects for a vehicle which is improved over the prior art and by means of which a detection range is enlarged in a simple manner, thus enabling improved detection of objects. The invention is further based on the object of specifying a use of the device.
According to the invention, this object is achieved by a device having the features specified in claim 1. With regard to the use, the object is achieved by the features specified in claim 5.

Advantageous embodiments of the invention are the subject matter of the dependent claims.
The device for detecting objects for a vehicle comprises at least two horizontally juxtaposed image acquisition units, each having an optical axis, and an image processing unit by means of which images recorded by the image acquisition units can be processed.

According to the invention, the image acquisition units are aligned such that their optical axes run at different angles to the horizontal, so that the detection range of the first image acquisition unit and the detection range of the second image acquisition unit overlap in the vertical direction in such a way that a lower first subregion (mono region of the first image acquisition unit), a middle second overlap subregion (stereo region of both image acquisition units) and an upper third subregion (mono region of the second image acquisition unit) are formed vertically one above the other.

This makes it possible, in a particularly advantageous manner, to enlarge the overall detection range of the device in the vertical direction. In particular, the detection range is formed such that a mono far range, a mono near range and the stereo range are produced simultaneously. Objects at widely varying distances can thus be detected, so that the acquired and processed image data can be used to operate a wide variety of driver assistance systems.
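The vertically stacked layout of the three subregions can be sketched numerically. The following Python sketch (function names and angle values are illustrative, not from the patent) classifies a vertical viewing angle into the mono near, stereo, or mono far subregion given each camera's vertical coverage interval:

```python
def classify_region(elev_deg, span_near, span_far):
    """Assign a vertical viewing angle to one of the three subregions
    described above (angles in degrees, measured from the horizontal).

    span_near: (lo, hi) interval of the downward-oriented camera (mono near)
    span_far:  (lo, hi) interval of the horizon-oriented camera (mono far)
    """
    in_near = span_near[0] <= elev_deg <= span_near[1]
    in_far = span_far[0] <= elev_deg <= span_far[1]
    if in_near and in_far:
        return "T2 (stereo)"    # both cameras see this direction
    if in_near:
        return "T1 (mono near)"
    if in_far:
        return "T3 (mono far)"
    return "outside"

# Illustrative spans: near camera covers -20..10 deg, far camera -5..25 deg
print(classify_region(-10, (-20, 10), (-5, 25)))  # T1 (mono near)
print(classify_region(0, (-20, 10), (-5, 25)))    # T2 (stereo)
print(classify_region(20, (-20, 10), (-5, 25)))   # T3 (mono far)
```

With the two intervals chosen so that they partially overlap, the three regions appear one above the other exactly as described in the paragraph above.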
Embodiments of the invention are explained in more detail below with reference to drawings.

Corresponding parts are provided with the same reference signs in all figures.
By means of the image acquisition units R, L, which are arranged horizontally next to one another on the vehicle F, image data of the environment located in front of the vehicle F in the direction of travel are acquired two-dimensionally and supplied to the image processing unit V.

The image processing unit V processes the two-dimensional image data; it comprises means, not shown in more detail, by which the two-dimensional image data are further processed both two-dimensionally and three-dimensionally, i.e. stereoscopically. For this stereoscopic processing of the image data, one or more of the numerous methods and/or devices known from the prior art are used.

The image acquisition units R, L are aligned such that their optical axes AL, AR run parallel to one another in the horizontal direction. According to the invention, the optical axes AL, AR run at different angles to the horizontal, i.e. in the vertical direction, so that the detection ranges EL, ER, which have opening angles αL and αR respectively, overlap at least in the vertical direction.

The optical axes AL, AR run at different angles to the horizontal in such a way that an overall detection range Eges, composed of the detection ranges EL, ER of both image acquisition units R, L, is formed from three subregions T1 to T3 arranged vertically one above the other.

Owing to a shift of the optical axes AL, AR by an angle αopt relative to one another, an enlarged total opening angle αges results for the stereo camera S formed from the two image acquisition units R, L. Accordingly, the minimum viewing distance of the stereo camera S is reduced and the maximum viewing distance is increased.
The total opening angle αges is given by

αges = αL + αR − αopt,

i.e. the sum of the two individual opening angles reduced by the doubly counted overlap. The angle αopt here denotes the opening angle of the overlap subregion T2 enclosed between the first subregion T1 and the third subregion T3.
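The relation between the individual opening angles, the overlap, and the total opening angle can be checked with a short numerical sketch (a Python illustration with assumed values; the function and the example angles are not from the patent):

```python
def stereo_coverage(alpha_l_deg, alpha_r_deg, tilt_l_deg, tilt_r_deg):
    """Vertical coverage of two cameras whose optical axes are tilted
    differently from the horizontal (all angles in degrees).

    Returns (alpha_ges, alpha_opt): the total opening angle of the
    combined arrangement and the opening angle of the overlap (stereo)
    region, following alpha_ges = alpha_L + alpha_R - alpha_opt.
    """
    # Vertical angular interval covered by each camera: [tilt - a/2, tilt + a/2]
    lo_l, hi_l = tilt_l_deg - alpha_l_deg / 2, tilt_l_deg + alpha_l_deg / 2
    lo_r, hi_r = tilt_r_deg - alpha_r_deg / 2, tilt_r_deg + alpha_r_deg / 2
    # Overlap region T2 is the intersection of the two intervals
    alpha_opt = max(0.0, min(hi_l, hi_r) - max(lo_l, lo_r))
    # Union of both fields of view: sum minus the doubly counted overlap
    alpha_ges = alpha_l_deg + alpha_r_deg - alpha_opt
    return alpha_ges, alpha_opt

# Illustrative values: both cameras with a 30 deg vertical opening angle,
# axes tilted 10 deg and 0 deg from the horizontal.
a_ges, a_opt = stereo_coverage(30, 30, 10, 0)
print(a_ges, a_opt)  # 40.0 total, 20.0 overlap
```

With identical tilts the overlap equals the full opening angle and nothing is gained; tilting the axes apart shrinks the stereo region while enlarging the total vertical coverage, which is exactly the trade-off the patent exploits.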
The first subregion T1 is a mono near range, which is captured solely by the right image acquisition unit R and is characterized by a short range. Because this subregion T1 is directed at the near area in front of the vehicle F and towards the road surface, image data of the first subregion T1 are particularly suitable for use as input data of a lane assistance system, since the road surface can be imaged very accurately by means of the first subregion T1.

The third subregion T3 is a mono far range, which is captured solely by the left image acquisition unit L and is characterized by a long range. Owing to the orientation and height of the third subregion T3 towards the horizon, its image data are particularly suitable as input data for automatic traffic sign recognition.
The overlap subregion T2, which arises from the overlap of the two detection ranges EL, ER, is enclosed by the two subregions T1 and T3. The overlap subregion T2 thus forms a stereo region; from its image data, in particular, so-called disparity images can be generated, the evaluation of which allows objects (not shown) located in front of the vehicle F to be detected three-dimensionally. Depending on the objects detected in this way by evaluating the image data, in particular control of the operation of the vehicle F by driver assistance systems is possible. Such driver assistance systems are, for example, an automatic distance control system or an emergency braking system, which on the basis of the detected objects automatically carry out longitudinal control of the vehicle F and/or generate warnings for the driver of the vehicle F.
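The disparity evaluation mentioned above rests on standard stereo triangulation, which the patent does not spell out. As a reminder (a generic sketch with assumed values, not part of the patent text), depth follows from disparity as Z = f·B/d:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Pinhole stereo triangulation: Z = f * B / d.

    disparity_px: horizontal pixel shift of a matched point between the
                  two rectified images
    focal_px:     focal length expressed in pixels
    baseline_m:   spacing between the two cameras in metres
    """
    if disparity_px <= 0:
        raise ValueError("zero/negative disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 700 px focal length, 0.30 m baseline, 21 px disparity
print(depth_from_disparity(21, 700.0, 0.30))  # about 10 m
```

The inverse relation between disparity and depth is why the stereo subregion T2 is what enables three-dimensional object detection, while the mono subregions T1 and T3 yield only two-dimensional image data.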
In a manner not shown in more detail, the optical axes AL, AR of the image acquisition units R, L can additionally be realigned relative to one another and/or relative to the vehicle. In particular, the optical axes AL, AR of the image acquisition units L, R can be variably aligned with respect to the vehicle longitudinal axis, vehicle transverse axis and/or vehicle vertical axis, so that, particularly advantageously, a required viewing distance and/or viewing angle can be adapted in a simple manner to the current situation of the vehicle F.
- AL: optical axis
- AR: optical axis
- EL: detection range
- ER: detection range
- Eges: overall detection range
- F: vehicle
- L: image acquisition unit
- R: image acquisition unit
- S: stereo camera
- T1: subregion
- T2: subregion
- T3: subregion
- V: image processing unit
REFERENCES CITED IN THE DESCRIPTION

This list of documents cited by the applicant was generated automatically and is included solely for the reader's information. The list is not part of the German patent or utility model application. The DPMA accepts no liability for any errors or omissions.

Cited patent literature

- DE 10131196 A1 [0002]
Claims (5)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102009022277A DE102009022277A1 (en) | 2009-05-22 | 2009-05-22 | Device for detecting objects for vehicle, and for operating driver assistance system of vehicle, has two image recording units with optical axes running in different angles in horizontal manner |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102009022277A DE102009022277A1 (en) | 2009-05-22 | 2009-05-22 | Device for detecting objects for vehicle, and for operating driver assistance system of vehicle, has two image recording units with optical axes running in different angles in horizontal manner |
Publications (1)
Publication Number | Publication Date |
---|---|
DE102009022277A1 true DE102009022277A1 (en) | 2010-01-21 |
Family
ID=41427418
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
DE102009022277A Withdrawn DE102009022277A1 (en) | 2009-05-22 | 2009-05-22 | Device for detecting objects for vehicle, and for operating driver assistance system of vehicle, has two image recording units with optical axes running in different angles in horizontal manner |
Country Status (1)
Country | Link |
---|---|
DE (1) | DE102009022277A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102015007180A1 (en) | 2015-06-03 | 2015-12-10 | Daimler Ag | Environmental detection device for a motor vehicle and method for operating such |
DE102015010441A1 (en) | 2015-08-11 | 2016-01-28 | Daimler Ag | Method and camera-assisted lane-tracking system for a motor vehicle |
DE102015010535A1 (en) | 2015-08-12 | 2016-02-18 | Daimler Ag | Camera-based environmental detection for commercial vehicles |
DE102014220558A1 (en) * | 2014-10-10 | 2016-04-14 | Conti Temic Microelectronic Gmbh | IMAGING APPARATUS FOR A VEHICLE AND METHOD |
DE102014220572A1 (en) * | 2014-10-10 | 2016-04-14 | Conti Temic Microelectronic Gmbh | IMAGING APPARATUS AND METHOD |
DE102017006176A1 (en) | 2017-06-29 | 2017-12-21 | Daimler Ag | Stereo camera for environmental detection for a vehicle |
US11140337B2 (en) | 2017-11-29 | 2021-10-05 | Denso Corporation | Camera module |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10131196A1 (en) | 2001-06-28 | 2003-01-16 | Bosch Gmbh Robert | Device for the detection of objects, people or the like |
-
2009
- 2009-05-22 DE DE102009022277A patent/DE102009022277A1/en not_active Withdrawn
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10131196A1 (en) | 2001-06-28 | 2003-01-16 | Bosch Gmbh Robert | Device for the detection of objects, people or the like |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014220558A1 (en) * | 2014-10-10 | 2016-04-14 | Conti Temic Microelectronic Gmbh | IMAGING APPARATUS FOR A VEHICLE AND METHOD |
DE102014220572A1 (en) * | 2014-10-10 | 2016-04-14 | Conti Temic Microelectronic Gmbh | IMAGING APPARATUS AND METHOD |
DE102015007180A1 (en) | 2015-06-03 | 2015-12-10 | Daimler Ag | Environmental detection device for a motor vehicle and method for operating such |
DE102015010441A1 (en) | 2015-08-11 | 2016-01-28 | Daimler Ag | Method and camera-assisted lane-tracking system for a motor vehicle |
DE102015010535A1 (en) | 2015-08-12 | 2016-02-18 | Daimler Ag | Camera-based environmental detection for commercial vehicles |
DE102017006176A1 (en) | 2017-06-29 | 2017-12-21 | Daimler Ag | Stereo camera for environmental detection for a vehicle |
US11140337B2 (en) | 2017-11-29 | 2021-10-05 | Denso Corporation | Camera module |
DE102018220427B4 (en) | 2017-11-29 | 2022-04-28 | Denso Corporation | camera module |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102009022277A1 (en) | Device for detecting objects for vehicle, and for operating driver assistance system of vehicle, has two image recording units with optical axes running in different angles in horizontal manner | |
EP2467291B1 (en) | Method and controller for a robust detection of a vehicle lane change | |
WO2013034141A1 (en) | Determination of the position of structural elements of a vehicle | |
DE102013019804A1 (en) | Method for determining a movement of an object | |
DE102015121353A1 (en) | Method for detecting a possible collision between a motor vehicle and an object taking into account a spatial uncertainty, control device, driver assistance system and motor vehicle | |
DE102015010535A1 (en) | Camera-based environmental detection for commercial vehicles | |
DE102014224762A1 (en) | Method and device for obtaining information about an object in a non-accessible, adjacent surrounding area of a motor vehicle | |
WO2020020654A1 (en) | Method for operating a driver assistance system having two detection devices | |
DE102013012778A1 (en) | Method for detecting a moving pedestrian on the basis of characteristic features and optical flow vectors of an image, camera system and motor vehicle | |
DE102009033853A1 (en) | Driver assistance system operating method for car, involves determining complexity of traffic situation, determining reliability of sensor data, and merging sensor data in dependence of determined reliability | |
DE102009007412A1 (en) | Method for tracing e.g. pedestrian in images captured by camera in car, involves determining subset based on parameters with respect to transformed image points and correspondence image points, and utilizing mapping rule to trace object | |
DE102012018471A1 (en) | Method for detecting e.g. lane markings of lane edge for motor car, involves performing classification in image region of individual images, which are detected by cameras, and in another image region of disparity images | |
DE102017005056A1 (en) | Method for operating a stereo camera system in a vehicle | |
DE102014010272A1 (en) | Method for determining a field of view of a detection unit of a vehicle | |
DE102015007180A1 (en) | Environmental detection device for a motor vehicle and method for operating such | |
DE102019211459B4 (en) | Method and device for checking a calibration of environmental sensors | |
DE102007013501B4 (en) | Driver assistance system with differently oriented cameras | |
DE102016224573A1 (en) | Radar system with dynamic object detection in a vehicle. | |
EP2565580B1 (en) | Method for determining the dimensions of an object in the vicinity of a vehicle, corresponding device and vehicle with such a device | |
DE102015010441A1 (en) | Method and camera-assisted lane-tracking system for a motor vehicle | |
DE102013210591A1 (en) | MOTION RECOGNITION OF A VEHICLE BY MULTIPLE CAMERAS | |
DE102017010307A1 (en) | Method for detecting an environment of a vehicle | |
EP3036684A2 (en) | Method for detecting errors for at least one image processing system | |
DE102018122092A1 (en) | Method for determining at least one position parameter of an object in an environment of a motor vehicle, computer program product, driver assistance system and motor vehicle | |
DE102015003963A1 (en) | Device and method for detecting traffic signs |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
OAV | Publication of unexamined application with consent of applicant | ||
R119 | Application deemed withdrawn, or ip right lapsed, due to non-payment of renewal fee | ||
R119 | Application deemed withdrawn, or ip right lapsed, due to non-payment of renewal fee |
Effective date: 20141202 |