EP2377006A1 - Control method and associated driving assistance system - Google Patents

Control method and associated driving assistance system

Info

Publication number
EP2377006A1
EP2377006A1 (application EP09817299A)
Authority
EP
European Patent Office
Prior art keywords
vehicle
control
scene
dimensional scene
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP09817299A
Other languages
English (en)
French (fr)
Inventor
Patrick Bonhoure
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dav SA
Original Assignee
Dav SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dav SA filed Critical Dav SA
Publication of EP2377006A1
Legal status: Withdrawn

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/211Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays producing three-dimensional [3D] effects, e.g. stereoscopic images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/20Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
    • B60K35/22Display screens
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/60Instruments characterised by their location or relative disposition in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/10Interpretation of driver requests or demands
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143Touch sensitive instrument input devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/143Touch sensitive instrument input devices
    • B60K2360/1438Touch screens

Definitions

  • the present invention relates to a control method in an on-board driving assistance system in a motor vehicle, and a driving assistance system for implementing such a method.
  • Motor vehicles are increasingly equipped with driver assistance systems, for example the generation of audible or visual alerts in the case of an imminent dangerous situation (for example: a limit being exceeded, an obstacle, etc.) or the detection of vehicles, pedestrians or obstacles in so-called blind-spot areas.
  • One known driver assistance display method for a motor vehicle makes it possible to display, on a first part of a dashboard screen of the vehicle, all the images produced simultaneously by four cameras, projected onto the ground so as to give a bird's-eye view of the immediate environment of the vehicle, the vehicle itself being materialized by its image in plan view; and to display on a large scale, on a second portion of the dashboard screen, simultaneously with the previous display, a complete view of one or more images as directly produced by the camera or cameras.
  • However, this representation is complex and can disrupt the understanding of the vehicle's environment, and thus hinder a vehicle user who is, for example, trying to park.
  • the invention therefore aims to overcome the disadvantages of the prior art by providing more intuitive driver assistance systems.
  • the subject of the invention is a control method for an on-vehicle driving assistance system, said system comprising:
  • at least one camera for capturing video streams of the vehicle environment,
  • a control unit for generating a three-dimensional scene according to a predefined viewpoint, the scene comprising at least one image obtained from the captured video streams, and
  • a screen for displaying the three-dimensional scene, characterized in that said method comprises the following steps:
  • the control method according to the invention may further comprise one or more of the following characteristics, taken separately or in combination:
  • said method comprises a step in which the control trajectory is compared with a set of predetermined control trajectories so as to determine the associated command for varying the point of view,
  • the control trajectory is associated with at least one command chosen from the group of commands comprising: a command for moving the scene in translation, a command for moving the scene in rotation, and a command for tilting the scene, the latter being possible because the generated scene is three-dimensional, which makes it possible to define different viewing angles,
  • the control trajectory is associated with a zoom command,
  • said trajectory is made with one of the user's fingers for commands to move the three-dimensional scene in rotation or in translation, and with several of the user's fingers for commands to tilt or zoom the three-dimensional scene.
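The finger-count mapping described above (one finger for rotation or translation, several fingers for tilt or zoom) could be sketched as a simple classifier. This is an illustrative assumption, not the patent's implementation: the function name, the thresholds, and the circular-vs-straight heuristic are all invented for the example.

```python
import math

def classify_gesture(strokes):
    """Map touch strokes to a scene command, following the patent's scheme:
    one finger -> rotate or translate, several fingers -> tilt or zoom.
    Each stroke is a list of (x, y) points; thresholds are illustrative."""
    if len(strokes) == 1:
        pts = strokes[0]
        # Compare net displacement to path length: a circular path ends
        # near where it started, a straight drag does not.
        path_len = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
        net = math.dist(pts[0], pts[-1])
        return "rotate" if path_len > 0 and net / path_len < 0.5 else "translate"
    # Several fingers: fingers moving apart or together -> zoom,
    # fingers moving in parallel -> tilt.
    starts = [s[0] for s in strokes]
    ends = [s[-1] for s in strokes]
    spread_before = max(math.dist(a, b) for a in starts for b in starts)
    spread_after = max(math.dist(a, b) for a in ends for b in ends)
    if abs(spread_after - spread_before) > 20:  # pixels, arbitrary threshold
        return "zoom"
    return "tilt"
```

A real system would match trajectory shapes against the predetermined set rather than rely on these two heuristics alone.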
  • the invention also relates to an on-board driving assistance system in a motor vehicle comprising: at least one camera for capturing video streams of the vehicle environment, and a control unit for generating a three-dimensional scene according to a predefined viewpoint including:
  • at least one image obtained from the captured video streams, and a display screen for the generated three-dimensional scene, characterized in that said system comprises at least one means adapted to capture a user's control trajectory on a tactile surface integrated into the display screen, so as to define a command associated with said entered trajectory to vary the point of view of said scene.
  • the driving assistance system according to the invention may further comprise one or more of the following features, taken separately or in combination:
  • said touch surface comprises a transparent or translucent film
  • said system comprises:
  • a rear camera located at the center of the rear window of the vehicle, directed towards the ground behind the vehicle and above the horizon.
  • the present invention therefore allows the user to act on the images displayed intuitively without being limited by predefined points of view.
  • FIG. 1 illustrates a vehicle equipped with a driving assistance system according to the invention
  • FIG. 2 schematically represents a driving assistance system
  • FIGS. 3a to 3c represent an example of virtual screens making it possible to generate a three-dimensional scene
  • FIG. 4 represents an example of a three-dimensional scene generated according to the invention
  • FIGS. 5a to 5f show examples of control paths in the driving assistance system of FIG. 1;
  • FIG. 6 illustrates the steps of a control method according to the invention.
  • this driver assistance system comprises:
  • a plurality of cameras 3a-3h for capturing video streams of the vehicle environment, a control unit 5 for generating a three-dimensional scene according to a predefined point of view, and a display screen 13 for displaying the generated scene.
  • the driver assistance system comprises a set of cameras 3, at the front, rear and on the sides. At the front, we can provide:
  • a camera 3f located at the rear view mirror of the vehicle directed substantially towards the front, or a front camera 3g located for example at the logo of the vehicle 1, directed substantially forwards.
  • the front cameras 3a and 3b, 3f or 3g may have an opening angle of 60°, a wide angle (for example a 110° horizontal angle of view) or a very wide angle (for example a 170° horizontal angle of view), and allow a view towards the ground in front of the vehicle 1 and above the horizon.
  • a rear camera 3c arranged for example in the trunk handle of the vehicle 1, or at the level of the license plate, making it possible to see towards the ground behind the vehicle 1 and above the horizon, and/or a rear camera 3h above the rear window or at the third brake light of the vehicle 1, allowing a view towards the ground behind the vehicle 1 and further above the horizon than the rear camera 3c.
  • the rear cameras 3c, 3h may have a large opening angle (for example with a horizontal angle of view between 110 ° and 170 °).
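As a side note on the opening angles cited above (60°, 110°, 170°): under a simple pinhole camera model, the horizontal field of view relates to the focal length in pixels by f = (W/2) / tan(FOV/2). The sketch below is a generic illustration of that relation, not something taken from the patent; the resolution value is an arbitrary assumption.

```python
import math

def focal_length_px(image_width_px, hfov_deg):
    """Pinhole-model focal length (in pixels) for a given horizontal FOV."""
    return (image_width_px / 2) / math.tan(math.radians(hfov_deg) / 2)

# A 170-degree very-wide-angle lens corresponds to a much shorter focal
# length than a 60-degree lens at the same (assumed) 1280 px resolution.
for fov in (60, 110, 170):
    print(fov, round(focal_length_px(1280, fov), 1))
```

Real wide-angle automotive cameras deviate strongly from the pinhole model (fisheye distortion), which is precisely why the captured streams must be reprojected before display.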
  • the vehicle 1 comprises one or more cameras as described above.
  • the set of cameras 3 equipping the vehicle 1 comprises the front cameras 3a, 3b, the rear camera 3c, and the side cameras 3d, 3e.
  • control unit 5 comprises at least one processing means configured to generate a three-dimensional scene according to a predefined point of view, the three-dimensional scene comprising:
  • a three-dimensional representation 9 of the vehicle (FIGS. 3a to 3c), for example in the form of a closed line representing the outline of the vehicle 1 or alternatively in the form of a parallelepiped, this three-dimensional representation being arranged according to an orthogonal reference frame, represented in FIG. 3b, comprising:
  • a transverse axis X directed from one side of the three-dimensional representation 9 to the other, a longitudinal axis Y directed from the front to the rear of the three-dimensional representation 9, and a vertical axis Z directed from the bottom to the top of the three-dimensional representation 9.
  • the control unit 5 comprises, for example, processing means configured to generate virtual screens 11a to 11e representative of the solid angles respectively observed by the cameras 3a to 3e of the vehicle.
  • These virtual screens 11a to 11e are represented by hatched lines in the figures.
  • the shape of the virtual screens 11a, 11b, 11c, associated respectively with the front cameras 3a, 3b and rear 3c, is representative of the perspective in which they will be seen by a user of the vehicle, for example the driver.
  • the control unit 5 then makes it possible to project an image obtained from the video streams captured by the cameras 3a to 3e onto the associated virtual screen 11a to 11e, so as to generate the three-dimensional scene.
  • the virtual screens 11a-11e are in the same reference frame as the three-dimensional representation 9, that is to say that if the three-dimensional representation 9 moves during the display of the three-dimensional scene, for example following a command to modify the scene, the virtual screens 11a-11e also move simultaneously with the three-dimensional representation 9.
  • the images provided by the cameras 3a-3e are thus dynamically integrated into the virtual screens 11a-11e with a deformation representative of the perspective in which they will be seen by the user, according to the predetermined point of view.
  • a point of view of the scene is defined by the orientation of the three-dimensional scene and by the virtual screens.
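Because the virtual screens share the vehicle representation's reference frame, changing the viewpoint amounts to applying one rigid transform to every scene point, so both move together. A minimal sketch of that idea, restricted to a yaw rotation about the vertical Z axis (the function name, the sample coordinates, and the yaw-only restriction are assumptions for illustration):

```python
import math

def rotate_about_z(points, angle_deg):
    """Rotate 3D points about the vertical Z axis. Applying the same
    rotation to the vehicle outline and to the virtual screens keeps
    them aligned, since they live in one common reference frame."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(c * x - s * y, s * x + c * y, z) for x, y, z in points]

# Illustrative coordinates (metres) in the X/Y/Z vehicle frame.
vehicle_outline = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
virtual_screen = [(0.0, 3.0, 1.0)]
# One command rotates the whole scene: both sets move simultaneously.
scene = rotate_about_z(vehicle_outline + virtual_screen, 90)
```

A full viewpoint change would compose rotations about several axes plus a translation and a zoom, but each is applied to the scene as a whole in the same way.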
  • a predefined viewpoint is either an initial point of view that is not selected by the user and is automatically displayed when the driver assistance system is started, or a point of view determined automatically by the driving assistance system according to a particular event, such as the detection of an obstacle in the environment of the vehicle or a maneuver of the vehicle, like parking. For example, when the user is reversing, the point of view of the environment at the rear of the vehicle is selected.
  • the driving assistance system may then include means for displaying the generated three-dimensional scene, enabling the user to visualize this scene so as to have a real sense of the environment of his vehicle, and thus to better apprehend the vehicle in its environment when performing a tricky maneuver such as parking.
  • An example of a display of such a three-dimensional scene is shown in FIG. 4.
  • the driver assistance system further comprises means adapted to capture a user's control trajectory on a touch surface 7 integrated into the display screen 13, and processing means configured to interpret the control trajectory entered on the touch surface 7 of the screen 13 so as to generate an associated command for modifying the three-dimensional scene.
  • the user then traces a control trajectory directly on the part of the three-dimensional scene on which he wishes to act, which allows intuitive use of the driver assistance system.
  • the associated control for modifying the three-dimensional scene makes it possible, for example, to vary the point of view of the three-dimensional scene.
  • Another example of an associated command for modifying the three-dimensional scene is a command for moving in translation the three-dimensional scene displayed on the display screen 13.
  • the term "tactile surface” means a film sensitive to a pressure of one or more supports.
  • the touch surface 7 may also comprise a transparent or translucent film.
  • the control trajectory can be traced by means of a stylus or by one or more of the user's fingers.
  • the tactile surface 7 comprises sensors configured to detect a user's press and, depending on the force exerted, the position of the detected press and the release of the press on the sensitive film forming a control trajectory, to trigger an associated command to vary the viewpoint of the three-dimensional scene.
  • the sensors may use pressure-sensitive resistors, known as FSR sensors ("Force Sensing Resistor").
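An FSR's resistance drops as press force increases, and it is typically read through a voltage divider. The sketch below shows the standard conversion from an ADC reading back to the sensor's resistance; the wiring, supply voltage, and fixed-resistor value are generic electronics assumptions, not details from the patent.

```python
def fsr_resistance(adc_value, adc_max=1023, vcc=5.0, r_fixed=10_000.0):
    """Infer an FSR's resistance from a voltage-divider ADC reading.
    Assumed wiring: Vcc -- FSR -- node(ADC) -- R_fixed -- GND, so the
    ADC sees Vout = Vcc * R_fixed / (R_fixed + R_fsr). Generic sketch,
    not taken from the patent."""
    vout = vcc * adc_value / adc_max
    if vout <= 0:
        return float("inf")  # no press: the FSR is effectively open
    return r_fixed * (vcc - vout) / vout
```

Harder presses raise the ADC reading and thus map to lower inferred resistance, which is how a controller can grade press force along the trajectory.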
  • each finger can move independently or similarly to the other fingers. It is also possible to provide for movements of the fingers in substantially parallel or opposite directions, or even rotational movements of the fingers in the same direction of rotation.
  • the touch surface 7 comprises multi-touch sensors configured to simultaneously detect presses and/or movements of the fingers in several places.
  • the control unit 5 is then adapted to simultaneously interpret these separate actions so as to determine the control path and thus the associated control to vary the point of view of the scene.
  • control trajectory can for example be associated with:
  • a control trajectory defined by a circular movement is associated with a command to rotate the scene about an axis of rotation substantially parallel to the vertical axis Z,
  • the angle traced by the user on the touch surface 7 is associated with a rotation angle of the three-dimensional scene,
  • the direction of the control trajectory is associated with a direction of rotational movement of the three-dimensional scene.
  • a control trajectory made by three of the user's fingers in a rectilinear movement (FIG. 5b) is associated with a command to tilt the scene with respect to an axis substantially parallel to the transverse axis X, making it possible to switch to a two-dimensional view of the scene.
  • This two-dimensional view represents, for example, a so-called bird's-eye view.
  • the length of the displacement made by the user on the touch surface 7 is associated with a displacement length of the three-dimensional scene or with a rotation angle of the three-dimensional scene,
  • the direction of the control trajectory is associated with the direction of movement of the three-dimensional scene,
  • the sense of the control trajectory is associated with a sense of movement of the three-dimensional scene or with a direction of rotation of the three-dimensional scene.
  • a control trajectory such as that of FIG. 5f is associated with a command to rotate the scene about an axis of rotation substantially parallel to the vertical axis Z through a predefined angle, for example 180°, which makes it possible to modify the point of view of the three-dimensional scene.
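The associations above (angle traced → rotation angle, trajectory direction → rotation direction, drag length → displacement length) can be sketched numerically. The 1:1 gain between finger angle and scene angle, the function names, and the fixed rotation center are assumptions for illustration; the patent only specifies the association itself.

```python
import math

def swept_angle_deg(points, center):
    """Signed angle swept by a circular control trajectory around `center`:
    positive = counter-clockwise, negative = clockwise. A 1:1 mapping
    from finger angle to scene rotation angle is an assumed gain."""
    cx, cy = center
    total = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        a0 = math.atan2(y0 - cy, x0 - cx)
        a1 = math.atan2(y1 - cy, x1 - cx)
        d = a1 - a0
        # unwrap jumps across the +/-pi branch cut of atan2
        if d > math.pi:
            d -= 2 * math.pi
        elif d < -math.pi:
            d += 2 * math.pi
        total += d
    return math.degrees(total)

def drag_to_translation(points):
    """Straight drag: the net vector gives both the displacement length
    and the direction of movement of the scene."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    return (x1 - x0, y1 - y0)
```

The sign of the swept angle captures the "sense" of the trajectory, and its magnitude the rotation angle, matching the association described in the text.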
  • Figure 6 depicts the control method for varying the point of view of the three-dimensional scene.
  • a command associated with the control trajectory is generated to vary the viewpoint of the three-dimensional scene.
  • the control trajectory thus allows the user to vary the viewpoint of the three-dimensional scene as he desires, rather than by selecting from a set of predefined viewpoints, so as to clearly visualize a precise area of the vehicle environment before making a maneuver, such as backing up or parking.
  • a step E3 can be provided in which the entered control path 15 is compared to a set of predetermined control paths.
  • the command associated with the control trajectory entered on the touch-sensitive surface 7 is then generated to modify the point of view and, in a step E5, the three-dimensional scene is generated and displayed in real time according to the modified point of view.
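The steps described above (capture the trajectory, compare it with the predetermined set, generate the associated command, redisplay the scene) could be organized as a loop like the following. The step structure follows the text, but the function names, the callback API, and the stroke-count matching rule are all assumptions; a real matcher would compare trajectory shapes.

```python
def control_loop(capture_trajectory, predetermined, render):
    """One pass of the method: capture the control trajectory (E1),
    compare it with predetermined trajectories (E3), then generate the
    associated command and redraw the scene from the modified viewpoint
    (E5). `predetermined` maps a gesture label to a viewpoint-modifying
    command (assumed API)."""
    trajectory = capture_trajectory()          # E1: touch input
    label = match(trajectory, predetermined)   # E3: comparison step
    if label is None:
        return None                            # unrecognized gesture: no-op
    command = predetermined[label]             # associated command
    render(command)                            # E5: real-time redisplay
    return label

def match(trajectory, predetermined):
    """Toy matcher: dispatch on the number of strokes in the trajectory.
    Stands in for shape comparison against the predetermined set."""
    by_strokes = {1: "rotate", 2: "zoom", 3: "tilt"}
    label = by_strokes.get(len(trajectory))
    return label if label in predetermined else None
```

Running one pass with a single-stroke trajectory dispatches the rotate command to the renderer, while an unmatched trajectory leaves the scene unchanged.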

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Automation & Control Theory (AREA)
  • Closed-Circuit Television Systems (AREA)
EP09817299A 2008-09-30 2009-09-30 Verfahren zur steuerung und assoziiertes fahrhilfsystem Withdrawn EP2377006A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR0805362A FR2936479B1 (fr) 2008-09-30 2008-09-30 Procede de commande et systeme d'aide a la conduite associe
PCT/EP2009/062715 WO2010037795A1 (fr) 2008-09-30 2009-09-30 Procédé de commande et système d'aide à la conduite associé

Publications (1)

Publication Number Publication Date
EP2377006A1 (de) 2011-10-19

Family

ID=40474695

Family Applications (1)

Application Number Title Priority Date Filing Date
EP09817299A Withdrawn EP2377006A1 (de) 2008-09-30 2009-09-30 Verfahren zur steuerung und assoziiertes fahrhilfsystem

Country Status (3)

Country Link
EP (1) EP2377006A1 (de)
FR (1) FR2936479B1 (de)
WO (1) WO2010037795A1 (de)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2970354B1 (fr) * 2010-10-21 2013-04-26 Dav Procede de commande et dispositif de commande associe.
JP2015193280A (ja) * 2014-03-31 2015-11-05 富士通テン株式会社 車両制御装置及び車両制御方法
CN108789453A (zh) * 2018-08-17 2018-11-13 成都跟驰科技有限公司 带有折叠机械臂的汽车的触屏控制系统
CN109278744B (zh) * 2018-10-24 2020-07-07 广州小鹏汽车科技有限公司 一种自动泊车方法及车辆控制系统
CN112550305A (zh) * 2020-12-18 2021-03-26 雄狮汽车科技(南京)有限公司 基于三维模型的汽车控制方法

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
EP1408693A1 (de) * 1998-04-07 2004-04-14 Matsushita Electric Industrial Co., Ltd. Bildanzeigegerät an Bord eines Fahrzeugs, Bildsendersystem, -sendegerät und -aufnahmegerät
EP2259220A3 (de) * 1998-07-31 2012-09-26 Panasonic Corporation Vorrichtung und Verfahren zur Bildanzeige
US7366595B1 (en) * 1999-06-25 2008-04-29 Seiko Epson Corporation Vehicle drive assist system
JP2003300444A (ja) * 2002-04-11 2003-10-21 Hitachi Ltd 移動体の運転支援装置
GB0302837D0 (en) * 2003-02-07 2003-03-12 Ford Global Tech Inc Vehicle steering aids
JP5210497B2 (ja) * 2006-04-12 2013-06-12 クラリオン株式会社 ナビゲーション装置

Non-Patent Citations (1)

Title
See references of WO2010037795A1 *

Also Published As

Publication number Publication date
FR2936479A1 (fr) 2010-04-02
WO2010037795A1 (fr) 2010-04-08
FR2936479B1 (fr) 2010-10-15

Similar Documents

Publication Publication Date Title
JP6806156B2 (ja) 周辺監視装置
EP2133237B1 (de) Verfahren zur Anzeige einer Einparkhilfe
EP1724153B1 (de) Vorrichtung zur Beleuchtung oder Signalisierung für Kraftfahrzeug
FR2711593A1 (fr) Dispositif de surveillance de l'espace arrière, respectivement avant d'un véhicule automobile en manÓoeuvre de stationnement.
EP2377006A1 (de) Verfahren zur steuerung und assoziiertes fahrhilfsystem
EP2094531B1 (de) Elektronische rückansichtsvorrichtung
FR3016573B1 (fr) Installation de retroviseur de vehicule
WO2019091920A1 (fr) Procede d'affichage d'une image de l'environnement d'un vehicle sur un ecran tactile equipant le vehicule
EP2729328B1 (de) Fahrhilfevorrichtung zur bereitstellung für einen fahrer unter nutzung synthetischer bilder zur darstellung einer ausgewählten fahrzeugumgebung
FR3128916A1 (fr) Dispositif de commande d’affichage et procédé de commande d’affichage
FR3047942A1 (fr) Dispositif d'eclairage interieur d'un habitacle de vehicule automobile
FR3040029A1 (fr) Systeme d’aide a la conduite pour un vehicule pourvu d’un systeme de retrovision numerique adaptable pour manoeuvre de recul
EP3700780B1 (de) Verbessertes rückblick-videosystem für ein kraftfahrzeug
FR3067669A1 (fr) Systeme de retrovision numerique pour vehicule automobile a reglage facilite
FR3051163B1 (fr) Systeme de visualisation d'au moins un flux d'images sur un ecran dans un vehicule
WO2020007624A1 (fr) Dispositif de retrovision panoramique par cameras avec affichage tete-haute
EP3682320A1 (de) Verfahren zur anzeige der umgebung eines kraftfahrzeugs auf einem bildschirm und kraftfahrzeug, in dem solch ein verfahren implementiert ist
EP2193047B1 (de) Unterstützungsvorrichtung zur ausfahrt aus einer versteckten strasse
FR2956364A1 (fr) Dispositif d'aide aux manoeuvres d'un vehicule par affichage de points de vue fonction de la position de la tete du conducteur
FR3107483A1 (fr) Vehicule avec retrovision numerique a affichage reglable
EP3110661B1 (de) Parkhilfesystemvorrichtung zur erkennung der einfahrt in eine oder der ausfahrt aus einer parkzone für einen fahrzeugführer
FR3093968A1 (fr) Système de rétrovision pour véhicule comprenant une pluralité d’appareils de prise de vue et un calculateur
FR3110511A1 (fr) Système de visualisation de véhicule comprenant un dispositif d’affichage mobile entre une position portrait et une position paysage
FR3079790A1 (fr) Dispositif de retrovision pour un vehicule
FR3083331A1 (fr) Dispositif de realite augmentee pour vehicule

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110823

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20120606

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20121017