EP2198390A2 - Method for calibrating an arrangement of at least one omnidirectional camera and an optical display unit - Google Patents

Method for calibrating an arrangement of at least one omnidirectional camera and an optical display unit

Info

Publication number
EP2198390A2
EP2198390A2 (application EP08802513A)
Authority
EP
European Patent Office
Prior art keywords
vehicle
image
camera
omnidirectional
display unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP08802513A
Other languages
German (de)
English (en)
Inventor
Tobias Ehlgen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Publication of EP2198390A2 publication Critical patent/EP2198390A2/fr
Legal status: Withdrawn

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/04Context-preserving transformations, e.g. by using an importance map
    • G06T3/047Fisheye or wide-angle transformations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • The invention relates to a method for calibrating an arrangement comprising at least one omnidirectional camera arranged on an object, in particular a vehicle, and an optical display unit, in which an image displayed by the display unit reproduces the perspective of a virtual camera imagined above the object, wherein an image of the virtual camera is projected into an object coordinate system and an image of the omnidirectional camera is projected into the object coordinate system.
  • The invention further relates to a method for image processing and image display by means of an arrangement of at least one omnidirectional camera arranged on an object, in particular a vehicle, and an optical display unit.
  • Omnidirectional cameras are known in principle. They can be used to capture omnidirectional images (also called bird's-eye views) whose image information contains data such as the image positions of detected objects, including the distance of an object from the omnidirectional camera.
  • For example, omnidirectional cameras are used to monitor vehicle environments in front of, beside, and/or behind a vehicle, objects being identified in the vehicle environment by means of the omnidirectional camera. In order to determine the position and/or the distance of the identified objects with sufficient accuracy, in particular the exact position and the exact orientation of the omnidirectional camera relative to the vehicle must be known.
  • Omnidirectional cameras can also be used to maneuver vehicles with poor all-round visibility, such as trucks, for example when reversing, by presenting images captured by the camera to a driver on an optical display unit.
  • For this purpose, the heavily distorted images of the omnidirectional camera are transformed in such a way that the driver is given a bird's-eye view.
  • The problem here is that the immediate vicinity of the vehicle is thus very well detectable, but more distant objects are not displayed.
  • US Pat. No. 7,161,616 B1 discloses an image processing technique in which a plurality of cameras record images of an environment of a vehicle. These images are transformed so that the environment of the vehicle is interpreted as being projected onto an inner surface of a hemisphere or bowl, so that more distant pixels are also displayed. The distortion increases from a central region of the hemisphere to the outside. The method requires a high computational effort.
  • The invention is therefore based on the object of providing an improved method for calibrating an arrangement of at least one omnidirectional camera and an optical display unit, as well as an improved method for image processing and image display by means of an arrangement of at least one omnidirectional camera arranged on an object, in particular a vehicle, and an optical display unit.
  • the object is achieved by the features specified in claim 1.
  • the object is achieved by the features specified in claim 5.
  • An arrangement of at least one omnidirectional camera arranged on an object, in particular a vehicle, and an optical display unit is calibrated with the method described below.
  • a virtual camera is defined whose image is displayed by the display unit.
  • An image of the virtual camera is projected into an object coordinate system. Resulting points in the object coordinate system are projected into the omnidirectional camera or in its coordinate system.
  • Pixels of the image of the virtual camera within a circle or ellipse imagined around the object are projected onto an imaginary plane. Pixels of the image outside the circle or ellipse are transformed by the virtual camera into the object coordinate system in such a way that they are projected onto an imaginary surface rising from the edge of the circle or ellipse, where the height of a pixel on the surface in the object coordinate system is proportional to the distance of the pixel from a center of the circle or ellipse.
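The projection surface just described can be sketched as a small height function. This is a minimal illustration, not the patent's implementation: the slope parameter is an assumption, and the height is taken to rise linearly from zero at the circle's edge, which matches a truncated-cone lateral surface.

```python
import math

def surface_height(X_w, r, slope):
    """Height of the imaginary projection surface at ground point X_w.

    Inside the circle of radius r the surface coincides with the flat
    ground plane (height 0); outside, it rises linearly with the
    distance from the circle's edge, i.e. it is the lateral surface of
    a truncated cone. `slope` is an illustrative parameter.
    """
    d = math.hypot(X_w[0], X_w[1])  # distance from the circle centre
    if d <= r:
        return 0.0
    return slope * (d - r)
```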
  • the surface has the shape of a lateral surface of a truncated cone.
  • The lateral surface of the truncated cone has a linear slope; a projection onto this surface can therefore be calculated in a particularly simple manner.
  • a lateral surface of a truncated cone as a projection surface has the advantage that, compared to surfaces with a non-linear gradient, for example in the case of an ellipsoid, significantly less distortion occurs in the image.
  • A lateral surface of a truncated cone results in a particularly intuitive and easily comprehensible image representation for the driver. Distortions occur at most in the edge region of the lateral surface of the truncated cone, whereas the representation in the center of the image is advantageously distortion-free.
  • The omnidirectional camera is preferably calibrated for intrinsic parameters with the aid of a calibration body covering the entire field of view of the camera.
  • the calibration body has in particular the shape of a barrel, the inside of which is provided with circular markings.
  • The markings are measured with subpixel accuracy, in particular by means of a method described in [T. Luhmann. Nahbereichsphotogrammetrie, Herbert Wichmann Verlag, 2000].
  • Extrinsic camera parameters, such as the translation and rotation of the camera with respect to the object coordinate system, are determined for the omnidirectional cameras mounted on the object. Rectangular markings can be used for this purpose (cf. SPIE Photonics Europe, Optical Metrology in Production Engineering).
  • Intrinsic parameters of the camera are, for example, the focal length, the pixel size (width and height), the image center with respect to the respective image coordinate system, and distortions.
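The intrinsic parameters listed above can be collected in a camera matrix. The sketch below uses a standard pinhole formulation, which is an assumption; the patent's mirror-lens cameras would use a dedicated omnidirectional model, and distortion is deliberately omitted.

```python
import numpy as np

def intrinsic_matrix(f, pixel_w, pixel_h, cx, cy):
    """Assemble a pinhole intrinsic matrix from physical parameters.

    f                : focal length (e.g. in metres)
    pixel_w, pixel_h : width and height of one sensor pixel (same unit as f)
    cx, cy           : image centre (principal point) in pixel coordinates

    Distortion is omitted here; for a mirror-lens camera a dedicated
    omnidirectional model would be used instead.
    """
    fx = f / pixel_w  # focal length in horizontal pixel units
    fy = f / pixel_h  # focal length in vertical pixel units
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])
```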
  • The arrangement calibrated in this way is then used in a method for image processing and image display in which an environment of the object is recorded with the omnidirectional camera and reproduced on the display unit in accordance with the calibration.
  • Preferably, two omnidirectional cameras are used, which can be arranged, for example, on a rear side of a vehicle in the region of a roof edge.
  • a larger number of omnidirectional cameras or only one omnidirectional camera may be provided.
  • An area detected by the omnidirectional cameras with respect to the vehicle is divided asymmetrically between the cameras. This division is described, for example, in the applicant's patent DE 102006003538 B3. It prevents further objects in the environment of the vehicle from disappearing from the field of vision due to the assumption of a planar environment, which results from the backprojection onto the imaginary plane.
  • Each point of the object coordinate system behind the vehicle and to the right of it is projected into the right-hand omnidirectional camera.
  • Each point to the left of the vehicle is projected into the left omnidirectional camera.
  • a driving movement is predicted as a function of a steering angle and superimposed on the image in the display unit.
  • In this way, the driver can estimate whether he can reach a desired position with the vehicle at the current steering angle when reversing.
  • The calculation can be based on a single-track model in which the two wheels of an axle are modeled by a single virtual wheel in the middle of the axle.
  • The figures show: FIG. 1 a rear side of a vehicle with two omnidirectional cameras,
  • FIG. 2 is a schematic representation of a vehicle and its surroundings in standard bird's-eye view, as known from the prior art
  • FIG. 3 shows a schematic illustration of a vehicle and its surroundings in an expanded bird's-eye view, by means of the method according to the invention for image processing and image display,
  • FIG. 4 shows a view of the vehicle and its surroundings in an extended bird's-eye view superimposed on an optical display unit in the vehicle with superimposed predicted driving movement when reversing
  • Fig. 5 is a single track model for predicting the driving movement.
  • FIG. 1 shows a rear side of an object 1 designed as a vehicle with two omnidirectional cameras 2.
  • the reference numeral 1 is also used below for the vehicle.
  • The omnidirectional cameras 2 capture the environment of the vehicle 1 beside and behind the vehicle 1 at a very wide angle. A processed image from the cameras is displayed on a display unit (not shown) in the vehicle 1.
  • For this purpose, the cameras 2 are calibrated with respect to the vehicle 1. Because of the fixed arrangement of the cameras 2, the calibration needs to be performed only once.
  • The aim of the calibration is first the generation of a representation of the vehicle 1 and its surroundings in bird's-eye view. For this purpose, a virtual camera above the vehicle 1 is defined.
  • A pixel x_p of the virtual camera is backprojected onto the ground plane according to X_w = λ · X_r + C_p, with λ = −C_p.y / X_r.y, X_r = R_p · X_p and X_p = (x_p.x, x_p.y, f)^T, where f is the focal length of the virtual camera and R_p and C_p are the rotation and translation of the virtual camera with respect to the object coordinate system, whose origin may lie, for example, in the middle of a bumper at the rear of the vehicle 1, with the y-axis pointing vertically upward.
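The backprojection relations above can be written out as a short sketch. Notation follows the text (R_p, C_p and f are the virtual camera's rotation, translation and focal length); the convention that the y-axis points upward and the ground plane is y = 0 comes from the surrounding description.

```python
import numpy as np

def backproject_to_ground(x_p, f, R_p, C_p):
    """Backproject a virtual-camera pixel onto the ground plane y = 0.

    x_p : (x, y) pixel coordinates in the virtual camera image
    f   : focal length of the virtual camera
    R_p : 3x3 rotation of the virtual camera in object coordinates
    C_p : camera centre in object coordinates (y-axis pointing up)
    """
    X_p = np.array([x_p[0], x_p[1], f], dtype=float)
    X_r = R_p @ X_p            # viewing ray in object coordinates
    lam = -C_p[1] / X_r[1]     # scale at which the ray meets y = 0
    return lam * X_r + C_p     # the ground point X_w
```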
  • The backprojected points X_w are projected into at least one of the omnidirectional cameras 2, that is to say into the images recorded with them. This projection is described in detail in [C. Toepfer, T. Ehlgen. A unifying omnidirectional camera model and its applications. In OMNIVIS, 2007].
  • Each point X_w of the object coordinate system behind the vehicle 1 and to the right of it is projected into the omnidirectional camera 2 arranged on the right.
  • Each point X w to the left of the vehicle 1 is projected into the left-side omnidirectional camera 2. This is described in detail in the patent DE 102006003538 B3 of the applicant.
  • FIG. 2 shows a schematic representation of the vehicle 1 and its surroundings in the standard bird's-eye view achieved with an arrangement of the two omnidirectional cameras 2 and the optical display unit calibrated in this way.
  • In this way, the rectified representation shown is achieved, as indicated by the checkerboard pattern.
  • the asymmetrical division of the field of view on the two omnidirectional cameras 2 is indicated by the different representation of the checkerboard pattern.
  • a driver of the vehicle 1 can thus determine whether there are obstacles in the vicinity of the vehicle 1.
  • the method is known from the prior art.
  • An inner region 3 of the extended bird's-eye view lies within an imaginary circle with the radius r around the vehicle 1, i.e. around the coordinate origin of the object coordinate system.
  • Each pixel x_p in the image of the virtual camera is backprojected onto the imaginary plane, so that the point X_w results in the object coordinate system.
  • An interpolation at the projected point X_u gives an intensity value for the pixel x_p.
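The interpolation step can be sketched with plain bilinear interpolation; this is an assumption, since the patent does not specify which interpolation scheme is used.

```python
def bilinear(img, x, y):
    """Bilinearly interpolate the intensity of img at non-integer (x, y).

    img is indexed as img[row][column]; x selects the column, y the row.
    (x, y) must lie inside the image so that all four neighbours exist.
    """
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    # weighted average of the four surrounding pixels
    return ((1 - dx) * (1 - dy) * img[y0][x0]
            + dx * (1 - dy) * img[y0][x0 + 1]
            + (1 - dx) * dy * img[y0 + 1][x0]
            + dx * dy * img[y0 + 1][x0 + 1])
```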
  • Points X_w outside the inner region 3 are projected not onto the imaginary plane but onto a surface rising from the edge of the circle, so that the height X_w.y of a point depends on the distance of the point X_w from the center of the circle.
  • FIG. 4 shows a view of the vehicle 1 and its surroundings, shown on the optical display unit in the vehicle 1, in an expanded bird's eye view with a superimposed predicted driving movement during reversing.
  • The driving motion is represented as a first corridor 4 for the movement of the rear bumper and a second corridor 5 for the movement of the front bumper, within which the vehicle 1 will move at the current steering angle δ_R. If the steering angle δ_R is changed, the prognosis is adjusted accordingly.
  • the line 4.1 of the corridor denotes the rear bumper.
  • the line 4.2 indicates a distance of, for example, 1 m from the bumper to the rear, which may for example correspond to the pivoting range of rear corners of the vehicle 1.
  • the line 4.3 denotes a distance of about the length of the vehicle 1.
  • the radius r of the circle, which surrounds the inner region 3, should be at least equal to the distance of the line 4.3 from the rear bumper.
  • A so-called single-track model 6 is used, in which the two wheels of each axle of the vehicle 1 are modeled by a virtual wheel 7 in the middle of the axle.
  • the front and rear virtual wheels 7 are connected by a rigid line.
  • The model is sufficient under the constraints that the vehicle 1 does not roll about its longitudinal axis and that the load on the wheels remains the same. At low speeds, such as occur when reversing and maneuvering, these assumptions are justified.
  • Based on this model assumption, one or more corridors can also be represented in the image, which gives the driver a particularly intuitive perception of the situation and enables particularly simple maneuvering of the vehicle.
  • The other parameters depend on the steering angle δ_R: r_h, the movement radius of the front virtual wheel 7; r_R, the movement radius of the rear virtual wheel 7; and r_o, the movement radius of the center of the rear bumper.
  • The model is formed by the corresponding kinematic equations.
  • The movement radii of the left and right outer points of the rear bumper can be calculated analogously to r_o.
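Since the patent's own equations are not reproduced in this text, the following sketch uses the standard kinematic single-track relations as an assumption: the instantaneous centre of rotation lies on the extension of the rear axle, which yields the movement radii of the virtual wheels and of the rear-bumper centre.

```python
import math

def single_track_radii(wheelbase, rear_overhang, steering_angle):
    """Movement radii of the kinematic single-track model.

    wheelbase      : distance between front and rear virtual wheel
    rear_overhang  : distance from the rear axle to the rear bumper
    steering_angle : steering angle of the front virtual wheel (radians)

    Standard kinematic relations (an assumption, not the patent's own
    equations): the instantaneous centre of rotation lies on the
    extension of the rear axle.
    """
    r_rear = wheelbase / math.tan(steering_angle)   # rear virtual wheel
    r_front = wheelbase / math.sin(steering_angle)  # front virtual wheel
    r_bumper = math.hypot(r_rear, rear_overhang)    # centre of rear bumper
    return r_rear, r_front, r_bumper
```

Radii of the bumper's outer corner points follow analogously by offsetting the lateral distance from r_rear before taking the hypotenuse.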
  • the omnidirectional cameras 2 are designed in particular as mirror-lens cameras.
  • the described methods can also be carried out with only one omnidirectional camera 2 or with more than two omnidirectional cameras 2.
  • the cameras 2 may be arranged in the region of a roof edge of the vehicle 1.
  • the inner region 3 can also be described by means of an ellipse instead of a circle.
  • the imaginary plane may correspond to a ground on which the vehicle 1 stands.


Abstract

The invention relates to a method for calibrating an arrangement of at least one omnidirectional camera (2), placed on a vehicle (1), and an optical display unit, in which an image displayed by the display unit reproduces the perspective of a virtual camera imagined above the object (1), an image of the virtual camera being projected into an object coordinate system and resulting points (Xw, Xw.Y) being projected into the omnidirectional camera. According to the invention, when the image of the virtual camera is projected into the object coordinate system, pixels (xp) of the virtual camera's image are projected onto an imaginary plane within a circle or ellipse imagined around the object (1), while pixels (xp) of the image outside the circle or ellipse are transformed by the virtual camera into the object coordinate system in such a way that they are projected onto an imaginary surface rising from the edge of the circle or ellipse.
EP08802513A 2007-10-16 2008-09-23 Procédé d'étalonnage d'un ensemble constitué d'au moins une caméra omnidirectionnelle et d'une unité d'affichage optique Withdrawn EP2198390A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102007049821A DE102007049821A1 (de) 2007-10-16 2007-10-16 Verfahren zum Kalibrieren einer Anordnung mit mindestens einer omnidirektionalen Kamera und einer optischen Anzeigeeinheit
PCT/EP2008/008023 WO2009049750A2 (fr) 2007-10-16 2008-09-23 Procédé d'étalonnage d'un ensemble constitué d'au moins une caméra omnidirectionnelle et d'une unité d'affichage optique

Publications (1)

Publication Number Publication Date
EP2198390A2 true EP2198390A2 (fr) 2010-06-23

Family

ID=40458801

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08802513A Withdrawn EP2198390A2 (fr) 2007-10-16 2008-09-23 Procédé d'étalonnage d'un ensemble constitué d'au moins une caméra omnidirectionnelle et d'une unité d'affichage optique

Country Status (5)

Country Link
US (1) US8599258B2 (fr)
EP (1) EP2198390A2 (fr)
JP (1) JP5077606B2 (fr)
DE (1) DE102007049821A1 (fr)
WO (1) WO2009049750A2 (fr)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9858798B2 (en) 2013-05-28 2018-01-02 Aai Corporation Cloud based command and control system integrating services across multiple platforms
JP6271917B2 (ja) * 2013-09-06 2018-01-31 キヤノン株式会社 画像記録装置及び撮像装置
US9674433B1 (en) * 2014-07-24 2017-06-06 Hoyos Vsn Corp. Image center calibration for a quadric panoramic optical device
CN104089579A (zh) * 2014-08-05 2014-10-08 吉林大学 基于球面坐标的汽车视觉检测系统的摄像机标定靶标
DE102014011915A1 (de) 2014-08-12 2016-02-18 Man Truck & Bus Ag Verfahren zur Warnung von Verkehrsteilnehmern vor möglichen Gefahrenbereichen, die durch ein Fahrzeug entstehen, das ein Fahrmanöver ausführt oder ausführen will
CN106296642A (zh) * 2015-06-09 2017-01-04 汪浩 车载环视系统摄像头内外参数一次性标定的方法
EP3337166B1 (fr) * 2015-08-10 2020-01-01 JVC Kenwood Corporation Appareil d'affichage véhiculaire, et procédé d'affichage véhiculaire
KR102597435B1 (ko) * 2016-04-20 2023-11-03 엘지이노텍 주식회사 영상 취득 장치 및 그 방법
JP2017220051A (ja) * 2016-06-08 2017-12-14 ソニー株式会社 画像処理装置、画像処理方法、および車両
JP7081265B2 (ja) * 2018-03-29 2022-06-07 株式会社富士通ゼネラル 画像処理装置
EP3561773B1 (fr) * 2018-04-26 2022-03-02 Continental Automotive GmbH Évaluation en ligne de paramètres intrinsèques de caméra

Citations (2)

Publication number Priority date Publication date Assignee Title
EP1302365A2 (fr) * 2001-10-15 2003-04-16 Matsushita Electric Industrial Co., Ltd. Appareil de surveillance de l'environnement d'un véhicule et procédé d'ajustage
EP2254334A1 (fr) * 2008-03-19 2010-11-24 SANYO Electric Co., Ltd. Dispositif et procédé de traitement d'image, système d'aide à la conduite et véhicule

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
EP2259220A3 (fr) 1998-07-31 2012-09-26 Panasonic Corporation Procédé et appareil d'affichage d'images
EP1043191B1 (fr) * 1999-03-19 2003-07-30 Yazaki Corporation Dispositif de rétrovision pour véhicule
EP1179958B1 (fr) 1999-04-16 2012-08-08 Panasonic Corporation Dispositif de traitement d'images et systeme de surveillance
JP3624769B2 (ja) 1999-09-30 2005-03-02 株式会社豊田自動織機 車両後方監視装置用画像変換装置
KR20020033816A (ko) * 2000-07-19 2002-05-07 마츠시타 덴끼 산교 가부시키가이샤 감시시스템
SE522121C2 (sv) * 2000-10-04 2004-01-13 Axis Ab Metod och anordning för digital behandling av frekvent uppdaterade bilder från en kamera
JP3871614B2 (ja) * 2002-06-12 2007-01-24 松下電器産業株式会社 運転支援装置
JP3620532B2 (ja) * 2002-11-12 2005-02-16 日産自動車株式会社 車両用報知装置
JP4105146B2 (ja) * 2004-11-30 2008-06-25 本田技研工業株式会社 エイミング装置
JP4679293B2 (ja) 2005-08-08 2011-04-27 三洋電機株式会社 車載パノラマカメラシステム
JP5110343B2 (ja) * 2005-09-30 2012-12-26 アイシン精機株式会社 運転支援装置
DE102006003538B3 (de) * 2006-01-24 2007-07-19 Daimlerchrysler Ag Verfahren zum Zusammenfügen mehrerer Bildaufnahmen zu einem Gesamtbild in der Vogelperspektive
JP4742953B2 (ja) 2006-03-31 2011-08-10 株式会社デンソー 画像処理装置,画像表示システムおよびプログラム
US9240029B2 (en) * 2007-04-30 2016-01-19 Here Global B.V. Street level video simulation display system and method

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
EP1302365A2 (fr) * 2001-10-15 2003-04-16 Matsushita Electric Industrial Co., Ltd. Appareil de surveillance de l'environnement d'un véhicule et procédé d'ajustage
EP2254334A1 (fr) * 2008-03-19 2010-11-24 SANYO Electric Co., Ltd. Dispositif et procédé de traitement d'image, système d'aide à la conduite et véhicule

Non-Patent Citations (2)

Title
JURGEN ACKERMANNI: "Brief Paper Robust Decoupling, Ideal Steering Dynamics and Yaw Stabilization of 4WS Cars*", AUTOMATICA, vol. 30, no. 11, 1 January 1994 (1994-01-01), pages 1761 - 1768, XP055176526 *
MATUSZYK L ET AL: "Stereo panoramic vision for monitoring vehicle blind-spots", INTELLIGENT VEHICLES SYMPOSIUM, 2004 IEEE PARMA, ITALY JUNE 14-17, 2004, PISCATAWAY, NJ, USA,IEEE, 14 June 2004 (2004-06-14), pages 31 - 36, XP010727438, ISBN: 978-0-7803-8310-4, DOI: 10.1109/IVS.2004.1336351 *

Also Published As

Publication number Publication date
WO2009049750A2 (fr) 2009-04-23
US8599258B2 (en) 2013-12-03
JP5077606B2 (ja) 2012-11-21
US20100214412A1 (en) 2010-08-26
DE102007049821A1 (de) 2009-04-23
JP2011500407A (ja) 2011-01-06
WO2009049750A3 (fr) 2009-09-03

Similar Documents

Publication Publication Date Title
EP2198390A2 (fr) Procédé d'étalonnage d'un ensemble constitué d'au moins une caméra omnidirectionnelle et d'une unité d'affichage optique
EP3328686B1 (fr) Méthode et dispositif pour l'affichage de la région entourant un ensemble véhicule et remorque
DE102006003538B3 (de) Verfahren zum Zusammenfügen mehrerer Bildaufnahmen zu einem Gesamtbild in der Vogelperspektive
DE102012025322B4 (de) Kraftfahrzeug mit Kamera-Monitor-System
EP1719093A1 (fr) Procede et dispositif d'alerte pour traiter de maniere graphique l'image d'une camera
WO2016005232A1 (fr) Assemblage de sous-images pour former une image d'un environnement d'un moyen de transport
EP3308361B1 (fr) Procédé de production d'une image virtuelle d'un environnement de véhicule
DE102010005638A1 (de) Verfahren zum Anzeigen zumindest eines Bildes einer Fahrzeugumgebung auf einer Anzeigeeinrichtung in einem Fahrzeug. Fahrerassistenzeinrichtung für ein Fahrzeugund Fahrzeug mit einer Fahrerassistenzeinrichtung
DE102009057996A1 (de) Verfahren zur Bestimmung einer Position einer Kamera mit einem zugehörigen Kamera-Koordinatensystem relativ zu einer Position eines Fahrzeuges oder Fahrzeuggespannes mit einem zugehörigen Fahrzeug-Koordinatensystem
DE102019110871A1 (de) Sichtsystem für ein Fahrzeug
DE102008035428B4 (de) Verfahren und Vorrichtung zur Überwachung einer Umgebung eines Fahrzeuges
DE102007016055A1 (de) Fahrzeugumgebungsüberwachungsvorrichtung und Fahrzeugumgebungsüberwachungs-Videoanzeigeverfahren
DE102011010859B4 (de) Verfahren zum Betreiben eines Kamerasystems in einem Kraftfahrzeug, Kamerasystem und Kraftfahrzeug
DE102014201409B4 (de) Parkplatz - trackinggerät und verfahren desselben
WO2016142079A1 (fr) Procédé pour assembler deux images de l'environnement d'un véhicule et dispositif correspondant
WO2018108211A1 (fr) Système de vue panoramique en trois dimensions
DE102006037600A1 (de) Verfahren zur auflösungsabhängigen Darstellung der Umgebung eines Kraftfahrzeugs
EP2603403B1 (fr) Methode d'affichage d'images sur un panneau d'affichage dans un véhicule automobile, système d'aide à la conduite et véhicule
EP1724726A1 (fr) Méthode et dispositif pour la mesure d'un espace de stationnement à l'aide d'une mono-caméra
DE102014018364A1 (de) Vorrichtung und Verfahren zur Unterstützung eines Fahrers eines Fahrzeugs, insbesondere eines Nutzfahrzeugs
EP3833576B1 (fr) Systeme de camera de surveillance
DE102007024752B4 (de) Verfahren zur Fahrerinformation in einem Kraftfahrzeug
DE102019108054A1 (de) Verfahren zum Anzeigen eines ausgewählten Anzeigebereichs eines Umgebungsbilds in Abhängigkeit eines charakterisierenden Fahrbahnwinkels, Computerprogrammprodukt, elektronische Recheneinrichtung sowie Fahrerassistenzsystem
DE102018209366B4 (de) Verfahren zur Bestimmung einer Position und/oder Orientierung einer Einrichtung
DE102015208345A1 (de) Fortbewegungsmittel, Fahrerassistenzsystem und Verfahren zur Anzeige eines aus einem ersten und einem zweiten Sensorsignal eines Fahrerassistenzsystems gefügten Bildes

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100319

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20150320

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20160401