EP2791895A1 - Method for improving the detection of an object by multi-camera systems - Google Patents

Method for improving the detection of an object by multi-camera systems

Info

Publication number
EP2791895A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
raised
area
interface
viewing areas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP12794902.2A
Other languages
German (de)
English (en)
Inventor
Tobias Ehlgen
Leo VEPA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP2791895A1 publication Critical patent/EP2791895A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • The invention relates to a method for improving object detection in multi-camera systems, in particular to the automatic displacement of seams when objects are detected in the overlapping areas of images of a multi-camera system, as well as to a device for implementing the method and a computer program for performing the method on a computer device.
  • DE 10 2009 036 200 A1 discloses a method for monitoring the surroundings of a vehicle.
  • The environment and the objects located therein, in particular raised objects, are captured in individual images by means of image capture units, and the individual images are combined into an overall image from a bird's-eye view.
  • boundary lines are defined in the overlapping areas between the individual images.
  • A boundary line is set as a function of the determined position of the objects in said image; this boundary line is variable and is placed such that it runs away from the objects.
  • Dividing lines between the individual images are defined, wherein the dividing line is designed such that raised objects can be displayed in the overlapping region.
  • In the resulting overall bird's-eye-view image, moving objects in particular are displayed reliably.
  • the images are captured from different positions and have an overlap area.
  • The image recordings are transformed into a bird's-eye view as part of the image data processing, and image sections of the transformed image recordings are combined to form an overall bird's-eye-view image.
  • The transition between two adjacent image sections takes place along a straight or curved boundary line which runs asymmetrically between the image sections in the overall image.
  • KR 2010 0113 959 A discloses a method for monitoring the surroundings of a vehicle, wherein the image areas of several cameras partially overlap.
  • the individual images are combined to form an overall image in which the vehicle and its surroundings are shown from a bird's eye view.
  • The cameras are arranged and aligned in such a way that their respective imaged areas partially overlap; accordingly, the surroundings are shown in the overall image throughout the entire overlap area.
  • DE 10 2008 029 181 A1 discloses a device for visualizing the environment to the driver and for representing obstacles in the vehicle environment. When an obstacle is detected by a detection element other than the camera, the image processing unit adapts the direction and inclination of the virtual view depending on the obstacle.
  • Multi-camera systems comprising a front camera, at least one side camera in the exterior mirror and a rear camera calculate from the images a view of the vehicle from above and thus reproduce the vehicle surroundings from a bird's-eye view (Bird's Eye View, Top View or Area View). This allows the driver, with a single glance at a display unit, such as a display in the center console, to see the entire nearby vehicle surroundings.
  • Thanks to the bird's-eye view, the driver can also see into areas that would otherwise lie in a blind spot.
  • the cameras are mounted in different positions.
  • The front camera is usually located at the front of the vehicle or in the windshield, the rear camera either above the rear window or at the level of the trunk lid, while the side cameras are usually mounted in the exterior mirrors on the driver and passenger sides of the vehicle, oriented vertically towards the road.
  • Some of these cameras have a common field of view, which is referred to below as the overlap area.
  • Today, each camera image is assigned an image area within the bird's-eye view, i.e. a representation of the vehicle surroundings from above.
  • the stitching point within the overlap area is shifted such that the at least one detected raised object always remains visible.
  • The visualization of the at least one detected raised object in this area is thus given priority over the formation of the overlap areas.
  • The at least one raised object can today be reliably detected by sensors such as ultrasound, laser scanners, radar or lidar and the like.
  • The stitching point is shifted so that the most relevant object does not lie within a non-displayable area; instead it is ensured that, upon detection of at least one raised object, the stitching point is placed such that this at least one raised object lies within the visualized area (see the seam-displacement sketch at the end of this description).
  • the term "most relevant object” means that at least one raised object, which is within the overlap region or which is in the region of a
  • The method proposed according to the invention ensures that detected raised objects which are located in said areas, i.e. either in the overlap area of two individual images or in the vicinity of a stitching point, do not disappear into a blind area.
  • The driver of such a vehicle is offered a significantly improved all-round view in which all objects are and remain constantly visible; not least, this leads to a considerable gain in safety.
  • The solution proposed according to the invention makes it possible to color the detected object when it is displayed on a display within the vehicle in order to attract the driver's attention (see the highlighting sketch at the end of this description). Furthermore, it is ensured that the driver of the vehicle sees the object at any time, since it does not disappear within an interface area, in particular not in a transition region of two adjacent viewing areas of the cameras. Furthermore, the solution proposed by the invention makes it possible to present raised objects in a much more natural manner, which results in a clear advantage with regard to perception by the driver.
  • FIG. 2 shows an image of the vehicle from above, calculated from the individual images taken by cameras mounted on the vehicle,
  • FIG. 4 shows fixedly defined stitching locations for an image of the vehicle from a bird's-eye view,
  • FIG. 5 shows a raised object located in front of a vehicle (shown here as a semi-trailer),
  • FIG. 6 shows a raised object located in the region of a joining point (stitching point),
  • FIG. 7 shows a blind wedge resulting along the interface indicated in FIG. 6,
  • FIG. 8 shows a detected raised object approaching a stitching point
  • FIG. 9 shows a displacement of the stitching point within the overlapping area in order to visualize the detected, at least one raised object for the driver of the vehicle.
  • Embodiments: The illustration according to FIG. 1 shows a vehicle which has four individual cameras. It should be noted that the vehicle may also have more or fewer cameras, as long as they share at least one common overlap area.
  • a vehicle 10 is equipped with a 4-camera system 12.
  • The 4-camera system 12 includes a camera that captures a front image area 14 and a further camera that captures a rear image area 16.
  • the front image area 14 and the rear image area 16 of the vehicle 10 are identified with respect to a direction of travel 18.
  • In addition, the 4-camera system 12 of the vehicle 10 comprises a side camera which captures a left image area 20 and a further side camera, mounted for example in an exterior mirror of the vehicle 10, which captures a right image area 22 next to the vehicle 10 (an illustrative layout sketch follows at the end of this description).
  • FIG. 2 shows that the front image area 14, the rear image area 16 and the left and right image areas 20 and 22 recorded according to FIG. 1 are transformed into a bird's-eye view 24 (see the composition sketch at the end of this description).
  • the perspective resulting from the bird's-eye view 24 from above the vehicle 10 is shown in FIG.
  • the representation according to FIG. 3 shows that the vehicle 10 comprises a front camera 28.
  • the front camera 28 of the vehicle 10 receives a viewing area 30, which is located substantially in front of the vehicle 10.
  • the vehicle 10 has a rear camera 32, the viewing area 34 of which extends substantially behind the vehicle 10.
  • the vehicle 10 includes a right side camera 36 whose viewing area 38 is located on the right side of the vehicle 10.
  • Furthermore, the vehicle 10 includes a left side camera 40, whose viewing area is identified by reference numeral 42 and which is located substantially on the left side of the vehicle 10.
  • the two side cameras 36 and 40 may be arranged, for example, in the exterior mirrors of the vehicle 10.
  • A first overlap region 44 results from an overlap of the viewing area 30 of the front camera 28 with the front part of the viewing area 38 of the right side camera 36.
  • A second overlap region 46 results from the overlap of the viewing area 30 of the front camera 28 with the viewing area 42 of the left side camera 40 of the vehicle 10.
  • A third overlap region 48 results from the overlap of the rear part of the viewing area 42 of the left side camera 40 of the vehicle with the viewing area 34 of the rear camera 32.
  • A fourth overlap area 50 is formed by overlapping the rear part of the viewing area 38 of the right side camera 36 with the viewing area 34 of the rear camera 32.
  • FIG. 4 shows the individual stitching locations along which the individual images are assembled when the vehicle 10 is viewed from above. From the positions of the seams 52, 54, 56, 58, which define a first, a second, a third and a fourth seam, it follows, in a comparison with FIG. 3, that the seams 52, 54, 56, 58 lie within the individual overlap areas 44, 46, 48, 50.
  • The viewing area 30 of the front camera 28 is transformed into a first modified viewing area 60, which appears trapezoidal compared to the viewing area 30 of the front camera 28.
  • The second overlap area 46 and the third overlap area 48 (see the illustration according to FIG. 3) adjoin a second modified viewing area 62 on the left side of the vehicle 10.
  • the second modified viewing area 62 is bounded by the second seam 54 and by the third seam 56.
  • A third modified viewing area 64 located behind the vehicle 10 is defined by the third seam 56 and the fourth seam 58.
  • a fourth modified field of vision 66 is formed on the right side of the vehicle 10, as shown in FIG. 4, bounded by the fourth seam 58 and the first seam 52.
  • the first seam 52 corresponds to the first overlapping region 44 on the right front side of the vehicle 10
  • The fourth seam 58 corresponds to the fourth overlapping region 50 on the right rear side of the vehicle 10, seen in the direction of travel.
  • FIGS. 5, 6 and 7 illustrate the display conditions in front of a vehicle, here a semi-trailer, when a blind wedge forms at the front of the vehicle.
  • FIG. 5 shows that in front of a vehicle 10, here the tractor unit of a semi-trailer, there is a raised object 68 in the form of a person.
  • The vertical distortion of the person representing the raised object 68 results from the perspective transformation that the images undergo in order to arrive at a bird's-eye view of the vehicle 10.
  • The raised object 68, illustrated here as a point, is located at a central interface 72 between the viewing areas on the front side of the vehicle 10.
  • the central interface 72 corresponds to the vehicle center 70.
  • The central interface 72 (stitching point) is located directly in front of the vehicle 10, so that raised objects 68 located there are no longer clearly visible to the driver of the vehicle 10.
  • the blind wedge resulting for the driver on the basis of the image processing with respect to the central interface 72 is designated by reference numeral 74.
  • This blind wedge 74 is located in front of the vehicle front 76 of the vehicle 10, here designed as a semitrailer.
  • The at least one raised object 68 is shown distorted along an oblique section, as illustrated in the figures.
  • This distorted representation within the blind wedge 74, which can only be shown imperfectly in FIG. 7, leads to an unfavorable reproduction of the raised object 68, be it as a distorted point 68 in FIG. 6 or as a distorted person 68 as can be seen in FIG. 5.
  • The blind wedge 74 results from the circumstance that, in the vehicle 10 according to FIGS. 5, 6 and 7, cameras are arranged in the exterior mirrors of the vehicle, whose viewing areas meet at the central interface 72 (stitching position) extending in the direction of the vehicle front 76 of the vehicle 10.
  • On the basis of FIG. 4, the illustration according to FIG. 8 shows that the modified viewing areas 60, 62, 64 and 66 are located around the vehicle 10. As in FIG. 4, the first modified viewing area 60 is bounded by the first and second stitching sites 52, 54, while the second modified viewing area 62 on the left side of the vehicle is bounded by the second interface 54 and the third seam 56. Furthermore, it can be seen from the illustration according to FIG. 8 that the third modified viewing area 64 is delimited by the third interface 56 and the fourth interface 58. The fourth modified viewing area 66 located on the right side of the vehicle 10 is bounded by the fourth interface 58 and the first interface 52. From the plan view according to FIG. 8 it follows that, within the first modified viewing area 60, a raised object 68 moves towards the first interface 52. It is also clear from FIG. 8 that a critical area 78, shown in broken lines, is defined around the first seam 52 (see the critical-area sketch at the end of this description).
  • The raised object 68, which moves in the direction of movement 80 from left to right in front of the vehicle 10, moves toward this critical area 78 of the first interface 52.
  • Reference numeral 82 designates an opening angle of this critical region 78, which in the illustration according to FIG. 8 is in the order of magnitude of 10° to 20°, for example 15°.
  • FIG. 9 shows that the at least one raised object 68 has moved in the direction of movement 80 as far as the first seam 52, and that a displacement 84 of the first seam 52 has taken place in the direction of the fourth modified visible region 66.
  • the first modified viewing area 60 and the fourth modified viewing area 66 are affected by the displacement 84 of the first interface 52.
  • The first modified viewing area 60 in front of the vehicle 10 is enlarged, whereas the fourth modified viewing area 66 lying to the right of the vehicle 10 is reduced. This is due to the visualization of the at least one raised object 68 at the first seam 52, whose original position is shown in dashed lines in FIG. 9.
  • An approach of the at least one raised object 68 toward the first seam 52 can, analogously and with a corresponding direction of movement 80 of the at least one raised object 68, also lead to a shift of the second interface 54, the third interface 56 and finally the fourth interface 58 lying behind the vehicle 10.
  • The raised object 68 shown in FIGS. 8 and 9 may be one which moves very fast or which is in the greatest proximity to the vehicle 10; in the above context, such an object is to be understood as a relevant object. From the bird's-eye view of the vehicle 10 shown in FIGS. 8 and 9, it can be seen that, in the context of image processing, the modification of the individual viewing areas 60, 62, 64 and 66 depends on the occurrence of the at least one raised object 68 moving in the direction of movement 80.
  • the invention further relates to a computer program, according to which one of the methods described above can be performed when the computer program is executed on a programmable computer device.
  • the computer program can be, for example, a module for implementing a driver assistance system or a subsystem thereof in a vehicle, or an application for driver assistance functions that can be executed on a smartphone.
  • The computer program can be stored on a machine-readable storage medium, such as a permanent or rewritable storage medium, in association with a computer device, or on a removable CD-ROM, DVD or USB stick. Additionally or alternatively, the computer program can be provided on a computer device, such as a server, for downloading, for example via a data network such as the Internet or a communication link.
  • The solution proposed by the invention advantageously avoids the blind wedge 74, in which raised objects 68 are no longer recognizable or are hardly recognizable. Due to the modification of the blind wedge 74 in front of the vehicle 10, raised objects 68 located there do not disappear, even briefly, into the blind wedge 74 as shown in FIG. 7; instead, the seams 52, 54, 56, 58 are displaced depending on the object.
  • With the solution proposed according to the invention, it can be achieved in a vehicle equipped with multiple cameras that raised objects in the region of the seams remain visible to the driver at all times.
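The following sketches are illustrative only and are not part of the original disclosure. This first sketch models the four camera viewing areas described above (front camera 28, left side camera 40, rear camera 32, right side camera 36) as horizontal fields of view in a vehicle-fixed frame and determines which cameras see a given ground point; points seen by two cameras correspond to the overlap regions 44, 46, 48, 50. Mounting positions, viewing directions, field-of-view values and all function names are assumptions chosen for illustration.

```python
# Illustrative sketch (not from the patent): the four camera viewing areas as
# angular sectors in a vehicle-fixed frame. Positions, yaw angles and FOVs are
# assumed values; x points in the direction of travel, y to the left.
import math
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    x: float        # mounting position [m]
    y: float
    yaw_deg: float  # viewing direction, 0 deg = direction of travel, CCW positive
    fov_deg: float  # horizontal field of view

    def sees(self, px: float, py: float) -> bool:
        """True if the ground point (px, py) lies inside the horizontal FOV."""
        bearing = math.degrees(math.atan2(py - self.y, px - self.x))
        diff = (bearing - self.yaw_deg + 180.0) % 360.0 - 180.0
        return abs(diff) <= self.fov_deg / 2.0

CAMERAS = [
    Camera("front camera 28",  2.0,  0.0,   0.0, 190.0),
    Camera("left camera 40",   1.5,  1.0,  90.0, 190.0),
    Camera("rear camera 32",  -2.0,  0.0, 180.0, 190.0),
    Camera("right camera 36",  1.5, -1.0, -90.0, 190.0),
]

def cameras_seeing(px: float, py: float) -> list:
    """Names of all cameras covering the point; two or more indicate an overlap region."""
    return [cam.name for cam in CAMERAS if cam.sees(px, py)]

# A point ahead and to the right of the vehicle lies in the first overlap
# region 44 (front camera 28 together with right side camera 36):
print(cameras_seeing(4.0, -3.0))
```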
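The next sketch illustrates the composition step described for FIG. 2 and FIG. 4: each camera image is warped onto a common ground-plane canvas and the warped images are combined along straight seam lines. It assumes pre-calibrated homographies, three-channel (BGR) camera images and uses OpenCV and NumPy; canvas size, centre point and seam angles are placeholder values, and a real system would anchor the seams at the vehicle corners rather than at the image centre.

```python
# Hypothetical bird's-eye composition: warp each camera image with a calibrated
# homography and fill one angular sector per camera between consecutive seams.
import numpy as np
import cv2

CANVAS = (800, 800)    # output width, height in pixels
CENTER = (400, 400)    # pixel position of the vehicle centre in the top view

def warp_to_topview(image: np.ndarray, homography: np.ndarray) -> np.ndarray:
    """Project one camera image onto the common ground-plane canvas."""
    return cv2.warpPerspective(image, homography, CANVAS)

def sector_mask(seam_from_deg: float, seam_to_deg: float) -> np.ndarray:
    """Binary mask of the angular sector between two seam directions (CCW)."""
    w, h = CANVAS
    ys, xs = np.mgrid[0:h, 0:w]
    ang = np.degrees(np.arctan2(CENTER[1] - ys, xs - CENTER[0]))  # 0 deg = right, CCW
    rel = (ang - seam_from_deg) % 360.0
    width = (seam_to_deg - seam_from_deg) % 360.0
    return (rel <= width).astype(np.uint8)

def compose_topview(warped: list, seams_deg: list) -> np.ndarray:
    """Combine the warped BGR images; camera i is shown between seams i and i+1."""
    out = np.zeros_like(warped[0])
    for i, img in enumerate(warped):
        mask = sector_mask(seams_deg[i], seams_deg[(i + 1) % len(seams_deg)])
        out = np.where(mask[..., None] == 1, img, out)
    return out
```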
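The following sketch corresponds to the critical area 78 with opening angle 82 described for FIG. 8. Raised objects are assumed to be delivered by the surroundings sensors (ultrasound, radar, lidar or the like) as ground-plane positions in the vehicle frame; the function checks whether such a detection lies inside a wedge of roughly the stated 10° to 20° opening angle around a seam. The seam anchor, the range limit and the helper name are assumptions for illustration.

```python
# Illustrative check whether a detected raised object lies inside the critical
# wedge 78 around a seam. The range limit is an added assumption.
import math

def in_critical_area(obj_xy, seam_anchor_xy, seam_dir_deg,
                     opening_deg=15.0, max_range_m=10.0):
    """True if the object is within opening_deg of the seam direction and
    closer than max_range_m to the seam anchor (e.g. a vehicle corner)."""
    dx = obj_xy[0] - seam_anchor_xy[0]
    dy = obj_xy[1] - seam_anchor_xy[1]
    dist = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - seam_dir_deg + 180.0) % 360.0 - 180.0
    return dist <= max_range_m and abs(diff) <= opening_deg / 2.0

# A person reported by ultrasound/radar/lidar about 3 m ahead-right of the
# front-right vehicle corner, close to a seam running at -45 degrees:
print(in_critical_area((2.0, -2.2), (0.0, 0.0), -45.0))   # -> True
```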
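This sketch illustrates the seam displacement 84 described for FIG. 9: when a raised object enters the critical area, the affected seam is moved away from the object so that the object remains inside a single displayed viewing area, and the displacement is limited to the overlap region of the two adjacent cameras. The margin and the angular representation of the seam are assumptions; the disclosure itself only requires that the seam be shifted so that the object stays visible.

```python
# Hypothetical seam-shift rule: move the seam away from the object's bearing,
# clamped to the angular extent of the overlap region it belongs to.
def shift_seam(seam_deg, obj_bearing_deg, overlap_deg, margin_deg=10.0):
    """Return the displaced seam direction.

    seam_deg         current seam direction (e.g. seam 52)
    obj_bearing_deg  bearing of the raised object seen from the seam anchor
    overlap_deg      (lo, hi) angular extent of the overlap region (e.g. 44)
    margin_deg       assumed clearance kept between object and seam
    """
    lo, hi = overlap_deg
    diff = (obj_bearing_deg - seam_deg + 180.0) % 360.0 - 180.0
    if abs(diff) >= margin_deg:
        return seam_deg                              # object not near the seam
    if diff >= 0.0:
        new_seam = obj_bearing_deg - margin_deg      # move seam clockwise
    else:
        new_seam = obj_bearing_deg + margin_deg      # move seam counter-clockwise
    return min(max(new_seam, lo), hi)                # stay inside the overlap region

# Object approaching seam 52 (at -45 deg) from the front area: the seam is moved
# toward the right-hand viewing area 66, enlarging the front area 60 as in FIG. 9.
print(shift_seam(-45.0, -40.0, (-70.0, -20.0)))   # -> -50.0
```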
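Finally, a sketch of the optional coloring of a detected object on the in-vehicle display mentioned above. It assumes the composed top view is a BGR image and that the object position has already been projected to canvas pixels; color, radius and function name are illustrative choices.

```python
# Illustrative highlighting of a detected raised object in the composed top view.
import cv2

def highlight_object(topview_bgr, obj_px, color=(0, 0, 255), radius=25):
    """Draw a colored ring and centre dot around the detected object to draw
    the driver's attention on the display."""
    out = topview_bgr.copy()
    cv2.circle(out, obj_px, radius, color, 3)    # ring around the object
    cv2.circle(out, obj_px, 4, color, -1)        # filled centre dot
    return out
```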

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a method for visualizing raised objects (68) which are located in a critical area (78) or which are moving toward this area (78). The critical area (78) lies at seams (52, 54, 56, 58) between several viewing areas (30, 34, 38, 42) captured by several individual cameras (28, 32, 36, 40). The method comprises the following steps: the viewing areas (30, 34, 38, 42) are each captured by means of an individual camera (28, 32, 36, 40) of a vehicle (10). Subsequently, the viewing areas (30, 34, 38, 42) are joined along the seams (52, 54, 56, 58) into a transformed bird's-eye view (24). Finally, at least one seam (52, 54, 56, 58) between adjacent viewing areas (30, 34, 38, 42) is displaced upon detection of at least one raised object (68) located within the critical area (78) at the at least one seam (52, 54, 56, 58), or moving toward this area.
EP12794902.2A 2011-12-13 2012-11-19 Procédé permettant d'améliorer la détection d'un objet par des systèmes multi-caméras Withdrawn EP2791895A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102011088332.0A DE102011088332B4 (de) 2011-12-13 2011-12-13 Verfahren zur Verbesserung der Objektdetektion bei Multikamerasystemen
PCT/EP2012/072963 WO2013087362A1 (fr) 2011-12-13 2012-11-19 Procédé permettant d'améliorer la détection d'un objet par des systèmes multi-caméras

Publications (1)

Publication Number Publication Date
EP2791895A1 true EP2791895A1 (fr) 2014-10-22

Family

ID=47278778

Family Applications (1)

Application Number Title Priority Date Filing Date
EP12794902.2A Withdrawn EP2791895A1 (fr) 2011-12-13 2012-11-19 Procédé permettant d'améliorer la détection d'un objet par des systèmes multi-caméras

Country Status (5)

Country Link
EP (1) EP2791895A1 (fr)
CN (1) CN103999122A (fr)
DE (1) DE102011088332B4 (fr)
IN (1) IN2014DN03173A (fr)
WO (1) WO2013087362A1 (fr)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013217081B4 (de) 2013-08-27 2019-11-21 Volkswagen Aktiengesellschaft Vorrichtung und Verfahren zur Positionserkennung
EP2851870B1 (fr) * 2013-09-20 2019-01-23 Application Solutions (Electronics and Vision) Limited Procédé pour l'estimation de l'ego-mouvement d'un objet
DE102014108684B4 (de) 2014-06-20 2024-02-22 Knorr-Bremse Systeme für Nutzfahrzeuge GmbH Fahrzeug mit Umfeldüberwachungseinrichtung sowie Verfahren zum Betreiben einer solchen Überwachungseinrichtung
EP3009983A1 (fr) * 2014-10-13 2016-04-20 Conti Temic microelectronic GmbH Appareil et méthode de détection d'un obstacle
DE102015204213B4 (de) * 2015-03-10 2023-07-06 Robert Bosch Gmbh Verfahren zum Zusammensetzen von zwei Bildern einer Fahrzeugumgebung eines Fahrzeuges und entsprechende Vorrichtung
DE102015010011B4 (de) 2015-08-05 2020-03-19 Wirtgen Gmbh Selbstfahrende Baumaschine und Verfahren zur Anzeige der Umgebung einer selbstfahrenden Baumaschine
DE102015010009A1 (de) 2015-08-05 2017-02-09 Wirtgen Gmbh Selbstfahrende Baumaschine und Verfahren zur Anzeige der Umgebung einer selbstfahrenden Baumaschine
JP6493143B2 (ja) * 2015-10-15 2019-04-03 株式会社デンソー 表示制御装置及び表示制御プログラム
US20170297488A1 (en) * 2016-04-19 2017-10-19 GM Global Technology Operations LLC Surround view camera system for object detection and tracking
JP6551336B2 (ja) * 2016-08-12 2019-07-31 株式会社デンソー 周辺監査装置
DE102016117518A1 (de) 2016-09-16 2018-03-22 Connaught Electronics Ltd. Angepasstes Zusammenfügen von Einzelbildern zu einem Gesamtbild in einem Kamerasystem für ein Kraftfahrzeug
CN107009968B (zh) * 2017-03-28 2020-08-28 驭势科技(北京)有限公司 车载激光雷达控制方法、装置及车载设备
CN113678433B (zh) 2019-04-18 2024-05-03 三菱电机株式会社 车辆周边图像生成装置、车辆周边显示系统及车辆周边显示方法
KR102281609B1 (ko) 2020-01-16 2021-07-29 현대모비스 주식회사 어라운드뷰 합성 시스템 및 방법

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4883977B2 (ja) 2005-10-05 2012-02-22 アルパイン株式会社 車両用画像表示装置
DE102006003538B3 (de) 2006-01-24 2007-07-19 Daimlerchrysler Ag Verfahren zum Zusammenfügen mehrerer Bildaufnahmen zu einem Gesamtbild in der Vogelperspektive
CN101035269A (zh) * 2007-02-14 2007-09-12 周竟 汽车多用途监控及定位器
CN100559420C (zh) * 2007-03-29 2009-11-11 汤一平 基于计算机视觉的停车诱导系统
CN101304515A (zh) * 2007-05-11 2008-11-12 徐世刚 全景式倒车引导系统
DE102008029181A1 (de) 2008-06-19 2009-02-19 Daimler Ag Verfahren zur Überwachung der Umgebung eines Fahrzeugs
CN101420599A (zh) * 2008-11-20 2009-04-29 符巨章 车辆全景监控系统
JP5165631B2 (ja) 2009-04-14 2013-03-21 現代自動車株式会社 車両周囲画像表示システム
DE102009036200A1 (de) 2009-08-05 2010-05-06 Daimler Ag Verfahren zur Überwachung einer Umgebung eines Fahrzeugs
EP2482557B1 (fr) * 2009-09-24 2020-03-18 Panasonic Intellectual Property Management Co., Ltd. Dispositif d'affichage d'aide à la conduite

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2013087362A1 *

Also Published As

Publication number Publication date
IN2014DN03173A (fr) 2015-05-22
DE102011088332B4 (de) 2021-09-02
WO2013087362A1 (fr) 2013-06-20
DE102011088332A1 (de) 2013-06-13
CN103999122A (zh) 2014-08-20

Similar Documents

Publication Publication Date Title
WO2013087362A1 (fr) Procédé permettant d'améliorer la détection d'un objet par des systèmes multi-caméras
EP3501897B1 (fr) Système de visualisation permettant d'apprehender l'environnement d'un véhicule
EP3028898B1 (fr) Systeme de vision
EP3157786B1 (fr) Véhicule comprenant un dispositif de surveillance de l'environnement et procédé permettant de faire fonctionner un dispositif de surveillance de l'environnement de ce type
EP2623374B1 (fr) Système de vision pour véhicules utilitaires destiné à la représentation de champs de vision règlementaires d'un rétroviseur principal et d'un rétroviseur grand angle
DE102012025322B4 (de) Kraftfahrzeug mit Kamera-Monitor-System
EP2603413B1 (fr) Procédé d'assistance à une manoeuvre de stationnement d'un véhicule à moteur, système d'aide à la conduite, et véhicule à moteur
DE102017100004A1 (de) Verfahren zum Bereitstellen von zumindest einer Information aus einem Umgebungsbereich eines Kraftfahrzeugs, Anzeigesystem für ein Kraftfahrzeug, Fahrerassistenzsystem für ein Kraftfahrzeug sowie Kraftfahrzeug
DE102011079913A1 (de) Sichtunterstützungssystem für einen Fahrer und Fahrzeug mit einem solchen System
EP3833575B1 (fr) Systeme de camera de surveillance
EP2559236A1 (fr) Procédé d'affichage d'une image sur un équipement d'affichage dans un véhicule, système d'assistance au conducteur et véhicule
DE102008046214A1 (de) Verfahren und Vorrichtung zur Überwachung einer Umgebung eines Fahrzeuges
DE102008035428B4 (de) Verfahren und Vorrichtung zur Überwachung einer Umgebung eines Fahrzeuges
DE102016114689A1 (de) Verfahren zum Unterstützen des Fahrers eines Kraftfahrzeugs beim Rangieren des Kraftfahrzeugs mit einem Anhänger, Fahrerassistenzsystem sowie Gespann
EP3106349B1 (fr) Système de vision pour véhicule utilitaire destiné à la représentation de champs de vision règlementaires d'un rétroviseur principal et d'un rétroviseur grand angle
DE102012005277B3 (de) Heckbereichssichtsystem
EP3421674A1 (fr) Procédé et dispositif de représentation des zones
EP3833576B1 (fr) Systeme de camera de surveillance
DE102019133948A1 (de) Verfahren zur Fahrerassistenz für eine Kombination aus einem Kraftfahrzeug und einem Anhänger
EP3610643A1 (fr) Système de vue panoramique pour un véhicule
DE102015208345A1 (de) Fortbewegungsmittel, Fahrerassistenzsystem und Verfahren zur Anzeige eines aus einem ersten und einem zweiten Sensorsignal eines Fahrerassistenzsystems gefügten Bildes
DE112021004648T5 (de) Anzeigesteuervorrichtung, Fahrzeug und Anzeigesteuerverfahren
DE102014107235A1 (de) Verfahren zur Darstellung einer Fahrzeugumgebung auf einer Anzeigevorrichtung; eine Anzeigevorrichtung; ein System aus einer Mehrzahl an Bilderfassungseinheiten und einer Anzeigevorrichtung; ein Computerprogramm

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140714

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20150204