EP2462538A1 - Verfahren zur überwachung einer umgebung eines fahrzeugs - Google Patents
Verfahren zur überwachung einer umgebung eines fahrzeugs
- Publication number
- EP2462538A1 (Application EP10737508A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- objects
- vehicle
- environment
- boundary line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
- 238000000034 method Methods 0.000 title claims abstract description 14
- 238000012544 monitoring process Methods 0.000 title claims abstract description 8
- 235000004522 Pentaglottis sempervirens Nutrition 0.000 claims abstract description 11
- 240000004050 Pentaglottis sempervirens Species 0.000 claims description 4
- 238000001514 detection method Methods 0.000 description 11
- 230000001419 dependent effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/40—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
- H04N25/41—Extracting pixel data from a plurality of image sensors simultaneously picking up an image, e.g. for increasing the field of view by combining the outputs of a plurality of sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/16—Image acquisition using multiple overlapping images; Image stitching
Definitions
- The invention relates to a method for monitoring the surroundings of a vehicle, in which the surroundings and objects present therein are captured by means of at least a first image capture unit and a second image capture unit whose coverage areas at least partially overlap and form an overlap region, an image processing unit generating from the individual images an overall image which shows the vehicle and its surroundings from a bird's-eye view.
- Vehicles with large dimensions and/or limited visibility are particularly difficult for a driver to maneuver. It is therefore advantageous to display such vehicles and their surroundings, in particular the area behind or beside them, from a bird's-eye view on a display.
- One known method uses laser scan data sampled by a laser scanner mounted on a vehicle, position and orientation data being assigned to each scan. Furthermore, at least one image sequence is captured by means of a camera also arranged on the vehicle, position and orientation data in turn being assigned to each image of the image sequence. A surface is determined from the laser scan data, and a panoramic view or an all-round view is generated for said surface from the at least one image sequence as a function of its position and of the position data assigned to each of the images.
- US 2006/0192660 A1 discloses a device for displaying the surroundings of a vehicle.
- The device comprises a first detection unit, which is arranged on one side of the vehicle and serves to capture a first image.
- A second detection unit serves to capture a second image, the second detection unit being arranged in front of the first detection unit.
- A display unit serves to display the captured images and thus the surroundings of the vehicle.
- A boundary line between the first and the second image is defined in an overall image formed from these images such that the boundary line coincides with a straight line connecting a position of the first camera and a position of the second camera.
- The object of the invention is to provide a method for monitoring the surroundings of a vehicle that is improved over the prior art.
- The object is achieved by a method in which the surroundings and objects present therein are captured by means of at least a first image capture unit and a second image capture unit whose coverage areas overlap at least partially and form an overlap region, an overall image showing the vehicle and its surroundings from a bird's-eye view being generated by an image processing unit from the individual images captured by the image capture units.
- According to the invention, in order to represent raised objects such as persons, obstacles or other objects completely, in particular also on vehicles of great length, the course of a boundary line between a first and a second individual image area, which extends from an origin to the image edge of the overall image, is specified variably such that the boundary line runs away from the objects.
- Fig. 1 schematically shows a section of an overall image according to the prior art, showing a vehicle, its surroundings and an object from a bird's-eye view.
- Fig. 2 schematically shows the section of an overall image with a variably definable boundary line between a first and a second individual image area.
- Fig. 3 schematically shows a subdivision of the overall image into several individual image areas.
- Fig. 4 schematically shows a rectangular course of the boundary line between the first and the second individual image area.
- Fig. 5 schematically shows a curved course of the boundary line between the first and the second individual image area.
- FIG. 1 shows a section of an overall image G according to the prior art, which shows a vehicle F, its surroundings U and an object O1 from a bird's-eye view.
- At the front end of the vehicle F a first image capture unit 1 is arranged and on the right side of the vehicle F a second image capture unit 2, by means of which the vehicle F and its surroundings U can be captured. The coverage areas of the image capture units 1 and 2 at least partially overlap.
- The image capture units 1 and 2 are preferably cameras, each having a large detection area. In addition to conventional cameras, these may in particular be omnidirectional cameras.
- The image capture units 1 and 2 capture individual images of the vehicle F, its surroundings U and the object O1, these individual images being converted into the overall image G by means of an image processing unit (not shown).
- The overall image G shows the vehicle F, its surroundings U and the object O1, which besides a person may also be an obstacle or another object, from a bird's-eye view as seen from a virtual observation point.
- The individual images are converted by the image processing unit on the basis of a projection of the individual images onto a virtual reference plane.
- This virtual reference plane is, in particular, a plane arranged at the height of the roadway of the vehicle F, i.e. on the ground, which forms a base.
- Since the object O1 is raised from the base, i.e. protrudes from it, and is detected by the image capture units 1 and 2 from different perspectives, an optically broken and/or at least partially incomplete representation of the object O1 can result in the overall image G if the object O1 is located in the region of the boundary line L or directly on it, the boundary line L separating a first individual image area EB1, projected onto the reference plane by means of the first image capture unit 1, from a second individual image area EB2, projected onto the reference plane by means of the second image capture unit 2.
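The projection of an individual image onto the ground-level reference plane described above can be sketched as a planar homography. The following is a minimal illustrative sketch, not the patented implementation; the intrinsics `K`, the pose `R`, `t` and all function names are assumptions made for the example.

```python
import numpy as np

def ground_to_image_homography(K, R, t):
    """Homography mapping ground-plane points (x, y) on z = 0 to image
    pixels for a pinhole camera with intrinsics K and pose R, t."""
    # The columns r1, r2 of R together with t describe how the
    # z = 0 plane maps into the image.
    return K @ np.column_stack((R[:, 0], R[:, 1], t))

def project_ground_points(H, pts_xy):
    """Project N ground points, shape (N, 2), to pixel coordinates."""
    pts_h = np.column_stack((pts_xy, np.ones(len(pts_xy))))  # homogeneous
    img = (H @ pts_h.T).T
    return img[:, :2] / img[:, 2:3]

# Example: a camera 1.2 m above the ground, looking straight down.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R = np.array([[1.0, 0, 0], [0, -1.0, 0], [0, 0, -1.0]])  # camera z points down
t = np.array([0.0, 0.0, 1.2])
H = ground_to_image_homography(K, R, t)
px = project_ground_points(H, np.array([[0.0, 0.0]]))
# The ground point directly below the camera projects to the
# principal point (320, 240).
```

Inverting `H` and resampling the camera image over a grid of ground coordinates would yield the bird's-eye individual image area.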
- In order to avoid such a faulty representation, the course of the boundary line L1 is specified such that the boundary line runs away from the object O1.
- FIG. 2 shows such courses of the boundary line L1 away from the object O1, the boundary line L1 always running in such a way that the object O1 is neither cut nor touched by it.
- For such a specification of the course of the boundary line L1, the position of the object O1 in the surroundings of the vehicle F is first determined by means of the image capture units 1 and 2 and the image processing unit. From this position of the object O1 in the surroundings of the vehicle F and the known orientation of the image capture units 1, 2, it is then determined at which position the object O1 is displayed in the overall image G.
- When the individual image areas EB1 and EB2 are generated, they are calculated by the image processing unit such that the boundary line runs away from the object O1. This variable course of the boundary line L1 is possible because the image capture units 1, 2 are aligned such that the overlap region between the detection areas is formed, so that in the overlap region the surroundings U and the object O1 located therein are detected by both image capture units 1, 2.
- The boundary line L1 is formed as a straight line which is pivoted about its origin at a front corner of the vehicle F such that it does not touch the object O1.
- A tolerance range T1 is formed between the boundary line L1 and the object O1.
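Pivoting a straight boundary line about its origin until it clears the object by a tolerance can be expressed as a small geometric computation. This is an illustrative sketch under simplifying assumptions (the object is modeled as a disc; all names are hypothetical), not the method as claimed.

```python
import math

def boundary_angle(origin, obj, obj_radius, tolerance):
    """Angle (radians) of a straight boundary line, pivoted about
    `origin`, that clears a disc-shaped object centred at `obj`
    by `obj_radius + tolerance` (the tolerance range)."""
    dx, dy = obj[0] - origin[0], obj[1] - origin[1]
    dist = math.hypot(dx, dy)
    clearance = obj_radius + tolerance
    if clearance >= dist:
        raise ValueError("object too close to the pivot to be cleared")
    # Rotate away from the object's bearing by the half-angle under
    # which the clearance circle is seen from the pivot.
    return math.atan2(dy, dx) + math.asin(clearance / dist)

# Example: pivot at a front corner of the vehicle, object 2 m ahead
# and 2 m to the side, 0.4 m object radius, 0.2 m tolerance.
angle = boundary_angle((0.0, 0.0), (2.0, 2.0), 0.4, 0.2)
```

By construction, the perpendicular distance from the object centre to the resulting line equals exactly the required clearance.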
- FIG. 3 shows the overall image G with the vehicle F, its surroundings U and two objects O1, O2, the overall image G being subdivided several times, each object being surrounded by one of the individual image areas EB1, EB2. In other words, the image alternates several times between the individual image areas EB1 and EB2 over the area of the overall image G.
- The individual image areas EB1, EB2 are separated from one another by boundary lines L1 to L3, the boundary lines L1 to L3 in turn running such that they do not touch the objects O1, O2 and such that tolerance ranges T1 to T4 are formed between the boundary lines L1 to L3 and the objects O1, O2.
- The course of the boundary lines L1 to L3 results in an optimal representation of the objects O1, O2, since each object is represented in the perspective of the respective image capture unit 1, 2 in which it is captured unambiguously, without errors and completely.
- FIG. 4 shows a further course of the boundary line L1 between its origin at the front corner of the vehicle F and the image edge of the overall image G.
- Here the boundary line L1 runs such that it represents a portion of a rectangle, the object O1 lying above the boundary line L1 in the overall image G and the tolerance range T1 being formed between the object O1 and the boundary line L1.
- Alternatively, the boundary line may also run polygonally around one or more objects, so that an optimal and complete representation of the objects in the overall image G is always possible.
- In the course shown in FIG. 5, the boundary line L1 is curved such that the first object O1 lies above the boundary line L1 in the first individual image area EB1 and the second object O2 lies below the boundary line L1 in the overall image G. Furthermore, the origin of the boundary line L1 is offset from the front corner of the vehicle F along its front toward the first image capture unit 1, so that an altered virtual observation point on the overall image G results. This achieves a further improvement of the representation of the objects O1, O2.
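Composing the overall image from the two individual image areas along a straight boundary line can be sketched as a per-pixel mask. This is again an illustrative sketch, assuming both inputs are already projected onto the common reference plane; function and variable names are invented for the example.

```python
import numpy as np

def stitch_along_boundary(img1, img2, origin, angle):
    """Compose the overall image: each pixel is taken from img1 or
    img2 depending on its side of a straight boundary line through
    `origin` (x, y) at `angle` (radians)."""
    h, w = img1.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # The signed cross product of the line direction with the vector
    # from the origin to each pixel gives the side of the line.
    side = (np.cos(angle) * (ys - origin[1])
            - np.sin(angle) * (xs - origin[0]))
    mask = side > 0
    return np.where(mask, img1, img2), mask

# Toy example: two 4x4 single-channel "projections".
img1, img2 = np.ones((4, 4)), np.zeros((4, 4))
out, mask = stitch_along_boundary(img1, img2, origin=(0.0, 0.0), angle=0.0)
# With angle 0 the line runs along row 0; rows 1..3 come from img1.
```

Re-running the stitch with an `angle` computed to avoid a detected object would realize the variable boundary-line course described above.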
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102009036200A DE102009036200A1 (de) | 2009-08-05 | 2009-08-05 | Verfahren zur Überwachung einer Umgebung eines Fahrzeugs |
PCT/EP2010/004415 WO2011015283A1 (de) | 2009-08-05 | 2010-07-20 | Verfahren zur überwachung einer umgebung eines fahrzeugs |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2462538A1 true EP2462538A1 (de) | 2012-06-13 |
Family
ID=42063172
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP10737508A Ceased EP2462538A1 (de) | 2009-08-05 | 2010-07-20 | Verfahren zur überwachung einer umgebung eines fahrzeugs |
Country Status (6)
Country | Link |
---|---|
US (1) | US8750572B2 (de) |
EP (1) | EP2462538A1 (de) |
JP (1) | JP5667629B2 (de) |
CN (1) | CN102473239B (de) |
DE (1) | DE102009036200A1 (de) |
WO (1) | WO2011015283A1 (de) |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9147260B2 (en) | 2010-12-20 | 2015-09-29 | International Business Machines Corporation | Detection and tracking of moving objects |
CN103782591B (zh) * | 2011-08-26 | 2017-02-15 | 松下知识产权经营株式会社 | 驾驶辅助装置 |
DE102011082881A1 (de) | 2011-09-16 | 2013-03-21 | Bayerische Motoren Werke Aktiengesellschaft | Darstellung der Umgebung eines Kraftfahrzeugs in einer bestimmten Ansicht unter Verwendung räumlicher Information |
DE102011088332B4 (de) | 2011-12-13 | 2021-09-02 | Robert Bosch Gmbh | Verfahren zur Verbesserung der Objektdetektion bei Multikamerasystemen |
DE102013015625B3 (de) * | 2013-09-19 | 2014-12-18 | Mekra Lang Gmbh & Co. Kg | Sichtsystem für ein Kraftfahrzeug |
JPWO2015129280A1 (ja) * | 2014-02-26 | 2017-03-30 | 京セラ株式会社 | 画像処理装置および画像処理方法 |
KR102118066B1 (ko) * | 2014-08-20 | 2020-06-03 | 현대모비스 주식회사 | 주행 안전을 위한 차량 제어방법 |
EP3029601B1 (de) * | 2014-12-04 | 2018-09-19 | Conti Temic microelectronic GmbH | Verfahren und Vorrichtung zur Erhöhung der Sichtbarkeit von Hindernissen |
DE102015204213B4 (de) | 2015-03-10 | 2023-07-06 | Robert Bosch Gmbh | Verfahren zum Zusammensetzen von zwei Bildern einer Fahrzeugumgebung eines Fahrzeuges und entsprechende Vorrichtung |
DE102015007673A1 (de) | 2015-06-16 | 2016-12-22 | Mekra Lang Gmbh & Co. Kg | Sichtsystem für ein Nutzfahrzeug zur Darstellung von gesetzlich vorgeschriebenen Sichtfeldern eines Hauptspiegels und eines Weitwinkelspiegels |
EP3142066B1 (de) * | 2015-09-10 | 2024-06-12 | KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH | Bildsynthesizer für ein umgebendes überwachungssystem |
EP3144162B1 (de) | 2015-09-17 | 2018-07-25 | KNORR-BREMSE Systeme für Nutzfahrzeuge GmbH | Vorrichtung und verfahren zur steuerung eines druckes in mindestens einem luftreifen eines fahrzeuges |
JP6819681B2 (ja) * | 2016-06-08 | 2021-01-27 | ソニー株式会社 | 撮像制御装置および方法、並びに車両 |
DE102016117518A1 (de) | 2016-09-16 | 2018-03-22 | Connaught Electronics Ltd. | Angepasstes Zusammenfügen von Einzelbildern zu einem Gesamtbild in einem Kamerasystem für ein Kraftfahrzeug |
DE102017201000A1 (de) * | 2017-01-23 | 2018-07-26 | Robert Bosch Gmbh | Verfahren zum Kombinieren einer Vielzahl von Kamerabildern |
JP6835004B2 (ja) * | 2018-02-09 | 2021-02-24 | 株式会社デンソー | 画像生成装置 |
JP2019151304A (ja) * | 2018-03-06 | 2019-09-12 | アイシン精機株式会社 | 周辺監視装置 |
CN113678433B (zh) | 2019-04-18 | 2024-05-03 | 三菱电机株式会社 | 车辆周边图像生成装置、车辆周边显示系统及车辆周边显示方法 |
KR102281609B1 (ko) | 2020-01-16 | 2021-07-29 | 현대모비스 주식회사 | 어라운드뷰 합성 시스템 및 방법 |
WO2022074848A1 (ja) * | 2020-10-09 | 2022-04-14 | 株式会社ソシオネクスト | 画像処理装置、画像処理方法、及び画像処理プログラム |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5670935A (en) * | 1993-02-26 | 1997-09-23 | Donnelly Corporation | Rearview vision system for vehicle including panoramic view |
JP3714116B2 (ja) * | 1999-08-09 | 2005-11-09 | トヨタ自動車株式会社 | 操縦安定性制御装置 |
KR20020033816A (ko) * | 2000-07-19 | 2002-05-07 | 마츠시타 덴끼 산교 가부시키가이샤 | 감시시스템 |
KR100866450B1 (ko) * | 2001-10-15 | 2008-10-31 | 파나소닉 주식회사 | 차량 주위 감시 장치 및 그 조정 방법 |
JP4639753B2 (ja) * | 2004-10-25 | 2011-02-23 | 日産自動車株式会社 | 運転支援装置 |
US7423521B2 (en) * | 2004-12-07 | 2008-09-09 | Kabushiki Kaisha Honda Lock | Vehicular visual assistance system |
JP4548129B2 (ja) * | 2005-01-26 | 2010-09-22 | トヨタ自動車株式会社 | 電動車両の駆動装置 |
US7403101B2 (en) * | 2005-02-09 | 2008-07-22 | General Motors Corporation | Collision avoidance of unattended vehicles |
EP1696669B1 (de) | 2005-02-24 | 2013-07-03 | Aisin Seiki Kabushiki Kaisha | Gerät zur Fahrzeugumgebungsbeobachtung |
JP4432801B2 (ja) * | 2005-03-02 | 2010-03-17 | 株式会社デンソー | 運転支援装置 |
JP4956915B2 (ja) * | 2005-05-20 | 2012-06-20 | 日産自動車株式会社 | 映像表示装置及び映像表示方法 |
JP4662832B2 (ja) | 2005-09-26 | 2011-03-30 | アルパイン株式会社 | 車両用画像表示装置 |
JP4883977B2 (ja) * | 2005-10-05 | 2012-02-22 | アルパイン株式会社 | 車両用画像表示装置 |
JP4606322B2 (ja) * | 2005-12-27 | 2011-01-05 | アルパイン株式会社 | 車両運転支援装置 |
DE102006003538B3 (de) * | 2006-01-24 | 2007-07-19 | Daimlerchrysler Ag | Verfahren zum Zusammenfügen mehrerer Bildaufnahmen zu einem Gesamtbild in der Vogelperspektive |
JP4254887B2 (ja) | 2006-07-06 | 2009-04-15 | 日産自動車株式会社 | 車両用画像表示システム |
JP4248570B2 (ja) * | 2006-08-21 | 2009-04-02 | 三洋電機株式会社 | 画像処理装置並びに視界支援装置及び方法 |
JP4315968B2 (ja) * | 2006-08-21 | 2009-08-19 | 三洋電機株式会社 | 画像処理装置並びに視界支援装置及び方法 |
JP2008187566A (ja) * | 2007-01-31 | 2008-08-14 | Sanyo Electric Co Ltd | カメラ校正装置及び方法並びに車両 |
JP4969269B2 (ja) * | 2007-02-21 | 2012-07-04 | アルパイン株式会社 | 画像処理装置 |
EP2158576A1 (de) | 2007-06-08 | 2010-03-03 | Tele Atlas B.V. | Verfahren und vorrichtung zum produzieren eines panoramas mit mehreren ansichtspunkten |
JP5182042B2 (ja) * | 2008-11-28 | 2013-04-10 | 富士通株式会社 | 画像処理装置、画像処理方法及びコンピュータプログラム |
TWI392366B (zh) * | 2009-12-31 | 2013-04-01 | Ind Tech Res Inst | 全周鳥瞰影像距離介面產生方法與系統 |
-
2009
- 2009-08-05 DE DE102009036200A patent/DE102009036200A1/de not_active Withdrawn
-
2010
- 2010-07-20 US US13/384,814 patent/US8750572B2/en active Active
- 2010-07-20 JP JP2012523219A patent/JP5667629B2/ja active Active
- 2010-07-20 CN CN201080034588.2A patent/CN102473239B/zh active Active
- 2010-07-20 WO PCT/EP2010/004415 patent/WO2011015283A1/de active Application Filing
- 2010-07-20 EP EP10737508A patent/EP2462538A1/de not_active Ceased
Non-Patent Citations (1)
Title |
---|
See references of WO2011015283A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2011015283A1 (de) | 2011-02-10 |
JP5667629B2 (ja) | 2015-02-12 |
US8750572B2 (en) | 2014-06-10 |
CN102473239B (zh) | 2014-09-24 |
US20120121136A1 (en) | 2012-05-17 |
DE102009036200A1 (de) | 2010-05-06 |
JP2013501280A (ja) | 2013-01-10 |
CN102473239A (zh) | 2012-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011015283A1 (de) | Verfahren zur überwachung einer umgebung eines fahrzeugs | |
DE102012001835B4 (de) | Sichtsystem für ein Nutzfahrzeug zur Darstellung von gesetzlich vorgeschriebenen Sichtfeldern eines Hauptspiegels und eines Weitwinkelspiegels | |
DE102012001950A1 (de) | Verfahren zum Betreiben einer Kameraanordnung für ein Fahrzeug und Kameraanordnung | |
EP3281178A1 (de) | Verfahren zur darstellung einer fahrzeugumgebung eines fahrzeuges | |
EP2791895A1 (de) | Verfahren zur verbesserung der objektdetektion bei multikamerasystemen | |
WO2014094941A1 (de) | Kraftfahrzeug mit kamera-monitor-system | |
WO2010025792A1 (de) | Verfahren und vorrichtung zur überwachung einer umgebung eines fahrzeuges | |
WO2018108213A1 (de) | Rundumsichtsystem für ein fahrzeug | |
DE102008035428B4 (de) | Verfahren und Vorrichtung zur Überwachung einer Umgebung eines Fahrzeuges | |
DE102017205630A1 (de) | Kameravorrichtung und Verfahren zur Erfassung eines Umgebungsbereichs eines Fahrzeugs | |
DE102014012250B4 (de) | Verfahren zur Bilderverarbeitung und -darstellung | |
DE102016124747A1 (de) | Erkennen eines erhabenen Objekts anhand perspektivischer Bilder | |
DE102007025147A1 (de) | System zur Spurverlassenswarnung und/oder Spurhaltefunktion | |
DE102008030104A1 (de) | Verfahren und Vorrichtung zur Überwachung einer Umgebung eines Fahrzeuges | |
DE102017123228A1 (de) | Verfahren zum Klassifizieren eines Objektpunkts als statisch oder dynamisch, Fahrerassistenzsystem und Kraftfahrzeug | |
DE102010042248A1 (de) | Verfahren und Vorrichtung zur optischen Darstellung einer Umgebung eines Fahrzeugs | |
DE102006037600A1 (de) | Verfahren zur auflösungsabhängigen Darstellung der Umgebung eines Kraftfahrzeugs | |
DE102015204213B4 (de) | Verfahren zum Zusammensetzen von zwei Bildern einer Fahrzeugumgebung eines Fahrzeuges und entsprechende Vorrichtung | |
EP3833576B1 (de) | Kameraüberwachungssystem | |
EP3234907A1 (de) | Kamerasystem und verfahren zum visualisieren mindestens eines fahrzeugumfeldbereiches eines fahrzeugumfeldes eines fahrzeuges | |
DE102017201000A1 (de) | Verfahren zum Kombinieren einer Vielzahl von Kamerabildern | |
DE102011116771A1 (de) | Verfahren zum Anzeigen von Bildinformationen auf einer Anzeigeeinheit eines Fahrzeugs sowie Fahrerassistenzeinrichtung zum Durchführen eines derartigen Verfahrens | |
DE102015006637A1 (de) | Verfahren zur Entzerrung von Bildern | |
DE102012025463A1 (de) | Verfahren zum Bestimmen eines Bewegungsparameters eines Kraftfahrzeugs durch Auffinden von invarianten Bildregionen in Bildern einer Kamera des Kraftfahrzeugs, Kamerasystem und Kraftfahrzeug | |
DE102012005468B4 (de) | Verfahren und Vorrichtung zur Darstellung einer Signallichtinformation für ein Fahrzeug |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20111215 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
17Q | First examination report despatched |
Effective date: 20130114 |
|
APBK | Appeal reference recorded |
Free format text: ORIGINAL CODE: EPIDOSNREFNE |
|
APBN | Date of receipt of notice of appeal recorded |
Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
|
APBR | Date of receipt of statement of grounds of appeal recorded |
Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
|
APAV | Appeal reference deleted |
Free format text: ORIGINAL CODE: EPIDOSDREFNE |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: DAIMLER AG |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
APBT | Appeal procedure closed |
Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20200224 |