WO2009024112A2 - Display device with image surface

Display device with image surface

Info

Publication number: WO2009024112A2
Authority: WIPO (PCT)
Prior art keywords: display device, body part, image area, image, displayed
Priority date: 2007-08-22
Application number: PCT/DE2008/001156
Filing date: 2008-07-21
Publication date: 2009-02-26
Other languages: German (de), English (en)
Other versions: WO2009024112A3 (fr)
Inventor: Johannes Angenvoort
Original Assignee: Navigon Ag
Application filed by Navigon Ag
Priority to EP08784339A (EP2181383A2)
Publication of WO2009024112A2
Publication of WO2009024112A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3664 Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3667 Display of a road map
    • G01C21/367 Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 Map spot or coordinate position indicators; Map reading aids
    • G09B29/106 Map spot or coordinate position indicators; Map reading aids using electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • The invention relates to a display device with an image surface according to the preamble of claim 1.
  • Generic display devices have an image surface on which image content can be displayed under electronic control.
  • The image surface extends in the X direction and in the Y direction.
  • Such displays are used on all types of electronic devices to present specific image content to the user.
  • In particular, navigation devices are equipped with corresponding display devices.
  • Frequently, the display device also serves as an input device.
  • For this purpose, the display device is designed in the manner of a touch screen. By touching certain areas of the display device, operating commands associated with those areas can be triggered. It is thereby possible, for example, for a control element to be displayed in a certain partial area of the image surface and for the operating command assigned to that control element to be selected and executed when this partial area is touched.
  • Only a limited number of image contents of limited size can be displayed on the image surface. This poses a problem in particular with mobile devices, for example mobile navigation devices, since the displays of such devices are kept relatively small so as not to restrict mobility. Because of the relatively small display, the image content can only be shown at a relatively small scale. Changing the display scale is possible, but it requires operating dedicated controls, which is cumbersome and limits ease of use.
  • The invention is based on the basic idea that the display device cooperates with a proximity sensor system.
  • By means of the proximity sensor system, the distance, or the change in distance, of an operating element, such as a stylus, or of a body part, such as a finger, from the image surface can be detected in the Z direction.
  • In addition, the proximity sensor system makes it possible to detect the relative position, or changes in the relative position, of the operating element or body part with respect to the image surface in the X direction and in the Y direction.
  • Depending on the position of the operating element or body part detected in this way, the image content displayed on the image surface is changed under program control.
  • The type of program-controlled modification of the displayed image content is basically arbitrary. In a preferred embodiment, a partial image area of the image surface is selected as a function of the position of the operating element or body part detected by the proximity sensor system, and this partial image area is then shown enlarged.
  • For this purpose, the image surface as a whole is divided into predefined partial image areas.
  • The partial image area with the smallest distance to the operating element or body part in the Z direction is then always the one selected and enlarged; a minimal sketch of this selection rule follows.
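Purely as an illustration (the patent prescribes no implementation, and all names below are hypothetical), the selection rule can be sketched in Python. Since the image surface is flat, the predefined partial area nearest to the detected finger position in X and Y is also the one with the smallest Z distance:

```python
from dataclasses import dataclass

@dataclass
class PartialArea:
    """A predefined partial image area (pixel rectangle)."""
    name: str
    x: int
    y: int
    w: int
    h: int

def select_partial_area(areas: list[PartialArea],
                        fx: float, fy: float) -> PartialArea:
    """Return the partial area whose centre lies nearest to the
    finger position (fx, fy) reported by the proximity sensors."""
    def dist_sq(a: PartialArea) -> float:
        cx, cy = a.x + a.w / 2.0, a.y + a.h / 2.0
        return (cx - fx) ** 2 + (cy - fy) ** 2
    return min(areas, key=dist_sq)
```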
  • The magnification factor by which the selected partial image area is enlarged can likewise be changed by the approach of the operating element or body part.
  • For this purpose, the distance of the operating element or body part from the image surface in the Z direction is detected by the proximity sensor system, and the magnification factor is derived from this distance in accordance with a predetermined control strategy.
  • Operation of the display device is particularly simple and intuitive when the magnification factor depends linearly on the distance of the operating element or body part from the image surface in the Z direction.
  • Preferably, the magnification factor is greater the smaller the distance of the operating element or body part from the image surface in the Z direction. In other words, the magnification factor can be increased by moving the operating element or body part closer to the image surface.
  • Preferably, the magnification factor is equal to 1 when a predefined limit distance is exceeded. A minimal sketch of such a control strategy follows.
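A minimal sketch of this linear control strategy, assuming an illustrative limit distance of 80 mm and an illustrative maximum factor of 3 (neither value is taken from the patent):

```python
def magnification_factor(z_mm: float,
                         limit_mm: float = 80.0,    # assumed limit distance
                         max_factor: float = 3.0    # assumed maximum factor
                         ) -> float:
    """Derive the magnification factor from the detected Z distance:
    1.0 at or beyond the limit distance, rising linearly to
    max_factor as the finger reaches the image surface."""
    if z_mm >= limit_mm:
        return 1.0
    z_mm = max(z_mm, 0.0)
    return 1.0 + (max_factor - 1.0) * (1.0 - z_mm / limit_mm)
```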
  • In one embodiment, the proximity sensor system consists of a plurality of proximity sensors which are arranged below the image surface and are each assigned to exactly one predefined partial image area.
  • Each of the individual proximity sensors is set up to detect the distance of the operating element or body part from the image surface.
  • A comparison is carried out in the controller of the display device to determine which proximity sensor has detected the smallest distance to the operating element or body part.
  • The partial image area associated with the proximity sensor reporting the smallest distance is then displayed enlarged, as in the sketch below.
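Illustratively, the comparison in the controller reduces to an argmin over the per-sensor readings (the sub-area identifiers are hypothetical):

```python
def area_to_enlarge(readings: dict[str, float]) -> str:
    """readings maps each partial image area to the Z distance
    detected by its dedicated proximity sensor; the area whose
    sensor reports the smallest distance is displayed enlarged."""
    return min(readings, key=readings.get)

# Example: area_to_enlarge({"ok_button": 42.0, "map": 18.5}) -> "map"
```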
  • Alternatively, the proximity sensor system can be formed by a stereo-optical camera system.
  • For this purpose, two cameras may be provided in the display device which together act as a stereo camera.
  • The geometry of the partial image area to be displayed enlarged is in principle arbitrary. Any shapes are possible, in particular those which correspond to the shape of a control element, such as a button. For most applications it is advantageous if the partial image area to be enlarged is rectangular or circular. There are also different variants for the type of enlargement. According to a first variant, the partial image area is enlarged in the X direction and in the Y direction with the same magnification factor. As a result, distortions in the enlarged partial image are excluded.
  • According to a second variant, the magnification factor in the X direction differs from that in the Y direction.
  • As a result, the partial image to be enlarged is stretched or compressed in one direction.
  • According to a further variant, the magnification factor depends on the position of the corresponding pixel within the partial image area, so that the magnification factor can vary across the partial image area.
  • Regarding the variation of the magnification factor within the enlarged partial image area, it is conceivable, for example, that the factor is greatest at the center of the partial image area and smallest at its edge. In this way, an enlargement can be realized which gives the user a lens-like, loupe-like impression; a sketch of such a radial profile follows.
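One way to read this lens-like variant, purely as a sketch (the patent fixes no formula), is a magnification factor that falls off linearly from the center of a circular partial image area to its edge:

```python
import math

def local_magnification(px: float, py: float,
                        cx: float, cy: float,
                        radius: float,
                        center_factor: float) -> float:
    """Location-dependent magnification inside a circular partial
    image area: center_factor at the center (cx, cy), falling
    linearly to 1.0 at the edge, and 1.0 outside the area."""
    r = math.hypot(px - cx, py - cy)
    if r >= radius:
        return 1.0
    return center_factor - (center_factor - 1.0) * (r / radius)
```

Rendering the loupe would then sample the source image at the inversely scaled position, for instance src_x = cx + (px - cx) / factor, which produces the familiar fisheye-like impression.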
  • Certain partial image areas may additionally be assigned a surrounding supplementary area.
  • This kind of preferential magnification is particularly important for frequently used controls. The supplementary area gives frequently used operating elements an enlarged catchment area for detecting the approach of the operating element or body part.
  • The display device can also be used as part of an input device.
  • The user then has the option of making certain inputs depending on the image content shown on the display device.
  • For this purpose, a plurality of contact sensors can be arranged below the image surface.
  • The contact sensors detect contact of the operating element or body part with the associated partial image areas. Depending on which partial image area is touched, the operating command assigned to that area is selected or triggered. This ultimately corresponds to the function of a classic touch screen.
  • Alternatively, the proximity sensor system itself can be used to select or trigger the operating command.
  • To this end, the position of the operating element or body part is first detected by the proximity sensor system, thereby selecting a specific partial image area.
  • The selected partial image area can preferably be displayed enlarged in order to indicate the selection to the user.
  • In addition, the distance of the operating element or body part from the image surface in the Z direction is monitored. If this distance falls below a predefined limit distance, the operating command assigned to the selected partial image area is triggered. In other words, by moving the operating element or body part closer, the user can confirm and thus trigger the operating command assigned to the selected partial image area, as sketched below.
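A minimal sketch of this confirm-by-approach behaviour, with a hypothetical command table and an assumed trigger distance:

```python
# Hypothetical mapping of partial image areas to operating commands.
COMMANDS = {"zoom_button": "zoom_in", "menu_button": "open_menu"}

def on_proximity_update(selected_area: str, z_mm: float,
                        trigger_mm: float = 10.0):  # assumed limit distance
    """Monitor the Z distance of the operating element or body part;
    once it falls below the limit distance, the command assigned to
    the currently selected partial image area is triggered."""
    if z_mm < trigger_mm:
        return COMMANDS.get(selected_area)  # command is triggered
    return None  # still hovering: keep the selection enlarged
```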
  • The device class in which the display device is integrated is basically arbitrary.
  • The display device according to the invention offers particularly great advantages in navigation systems, in particular mobile navigation devices. In these devices, a large amount of image content must be displayed on a relatively small screen, so that enlarging individual image components is of particular importance. This enlargement of individual image contents is considerably simplified by the display device according to the invention.
  • The display device is also advantageous for map display. If the map of a navigation system is shown on a display device according to the invention, the position of the operating element or body part can be detected by the proximity sensor system and the map can be shifted as a function of the detected position. This allows intuitive operation of the navigation system when displaying and panning maps.
  • Furthermore, the zoom factor of maps displayed on the navigation system can be changed by moving the operating element or body part toward or away from the image surface.
  • Preferably, the zoom factor is increased as the operating element or body part approaches the image surface and reduced as it moves away; see the sketch below.
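For the map use case, detected position and distance can be combined into a pan/zoom state roughly as follows (screen size and distance constants are assumed, not from the patent):

```python
def map_view_state(fx: float, fy: float, z_mm: float,
                   screen_w: int = 480, screen_h: int = 272,
                   limit_mm: float = 80.0, max_zoom: float = 4.0):
    """Derive a map pan/zoom state from the detected finger position:
    the map is shifted according to where the finger points, and the
    zoom factor grows as the finger approaches the image surface."""
    pan_dx = fx - screen_w / 2   # shift toward the pointed-at region
    pan_dy = fy - screen_h / 2
    z = min(max(z_mm, 0.0), limit_mm)
    zoom = 1.0 + (max_zoom - 1.0) * (1.0 - z / limit_mm)
    return pan_dx, pan_dy, zoom
```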
  • The figures show:
  • Fig. 1: a navigation device with a display device according to the invention as a body part approaches the display device;
  • Fig. 2: the image content of a display device according to the invention without an enlarged partial image area;
  • Fig. 3: the display device of Fig. 2 as a body part approaches a partial image area showing a control element;
  • Fig. 4: the display device of Fig. 2 as a body part approaches a graphics field;
  • Fig. 5: the display device of Fig. 2 as a body part approaches a text field.
  • Fig. 1 schematically shows a navigation device 01 with a display device 02 for the program-controlled electronic display of image content.
  • In the navigation device 01, a proximity sensor system, for example a stereo camera, is installed with which the position of a body part, namely an index finger 03, relative to the display device 02 can be determined.
  • On the one hand, the distance in the Z direction and, on the other hand, the relative position of the index finger 03 in the X and Y directions are detected.
  • The image surface 04 of the display device 02 is divided into a multiplicity of predefined partial image areas 05.
  • The partial image area 05 with the smallest distance in the Z direction to the index finger 03 is considered to be selected.
  • The currently selected partial image area 05 is then displayed enlarged.
  • The mode of operation of the display device 02 is explained by way of example with reference to Figs. 2 to 5.
  • Fig. 2 shows the image content on the image surface 04 of the display device 02 without the index finger 03 approaching. Control elements, text fields and graphics fields are displayed electronically in the usual way.
  • Fig. 3 shows the display device 02 as the index finger 03 approaches a control element 06 vertically.
  • The control element 06, which is normally (see Fig. 2) displayed at the same size as the other controls, is enlarged and at the same time selected as the index finger 03 approaches. By moving the index finger further toward the now enlarged control element 06, to below a predefined limit distance, the operating command associated with the control element can be triggered.
  • Alternatively, the display device 02 is designed in the manner of a touch screen, and the operating command associated with the control element 06 is triggered by touching the enlarged control element 06.
  • In Fig. 4, the point 10 with the shortest distance to the index finger 03 determines the center of a loupe-like enlarged partial image area 11, which is indicated by dashed lines.
  • The magnification factor within the partial image area 11 varies with location. The largest magnification factor applies around the center 10, whereas the magnification factor is smallest at the edge of the partial image area 11. In this way, a lens-like magnification effect is achieved.
  • Fig. 5 shows the approach of the index finger 03 to a text field 12.
  • The text field 12 for the numerical display of the geographical position of the navigation system is enlarged on the display device 02 when the index finger approaches. In this case, different magnification factors have been used in the X and Y directions, yet readability is maintained.

LIST OF REFERENCE NUMBERS

01 Navigation device
02 Display device
03 Index finger
04 Image surface
05 Partial image area
06 Control element
07 Map

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)

Abstract

The invention relates to a display device (02) having an image surface (04) on which image content (05, 06, 07, 11, 12) can be displayed under electronic control, the image surface (04) having a two-dimensional extent in the X direction and in the Y direction. The invention is characterized in that the display device (02) cooperates with a proximity sensor system; in that, by means of this proximity sensor system, the distance and/or the change in distance of an operating element or of a body part (03) from the image surface (04) in the Z direction, as well as the relative position and/or changes in the relative position of the operating element or body part (03) with respect to the image surface (04) in the X direction and in the Y direction, can be detected; and in that the image content (06, 07, 11, 12) displayed on the image surface (04) can be modified under program control as a function of the position of the operating element or body part (03) detected by the proximity sensor system.
PCT/DE2008/001156 2007-08-22 2008-07-21 Display device with image surface WO2009024112A2 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP08784339A EP2181383A2 (fr) 2007-08-22 2008-07-21 Display device with image surface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102007039669A DE102007039669A1 (de) 2007-08-22 2007-08-22 Display device with image surface (Anzeigeeinrichtung mit Bildfläche)
DE102007039669.6 2007-08-22

Publications (2)

Publication Number Publication Date
WO2009024112A2 true WO2009024112A2 (fr) 2009-02-26
WO2009024112A3 WO2009024112A3 (fr) 2009-04-30

Family

ID=39970966

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2008/001156 WO2009024112A2 (fr) 2007-08-22 2008-07-21 Dispositif d'affichage à plan d'image

Country Status (3)

Country Link
EP (1) EP2181383A2 (fr)
DE (1) DE102007039669A1 (fr)
WO (1) WO2009024112A2 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010003471A1 * 2008-07-09 2010-01-14 Volkswagen Aktiengesellschaft Method for operating a control system for a vehicle, and control system for a vehicle
WO2012031606A1 2010-09-06 2012-03-15 Valeo Schalter Und Sensoren Gmbh Method for operating a driver assistance device, driver assistance device and vehicle having a driver assistance device
GB2462171B (en) * 2008-07-31 2013-04-10 Northrop Grumman Space & Msn Image magnification system for computer interface
DE102011117289A1 2011-10-31 2013-05-02 Volkswagen Ag Method for operating a mobile device in a vehicle
DE102013013697B4 * 2013-08-16 2021-01-28 Audi Ag Device and method for entering characters in free space

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4683126B2 (ja) 2008-12-26 2011-05-11 Brother Industries, Ltd. Input device
EP2367095A3 (fr) 2010-03-19 2015-04-01 Garmin Switzerland GmbH Portable electronic navigation device
US9404767B2 (en) 2011-06-10 2016-08-02 The Boeing Company Methods and systems for performing charting tasks
EP2575007A1 * 2011-09-27 2013-04-03 Elo Touch Solutions, Inc. Scaling of gesture-based inputs
EP2575006B1 * 2011-09-27 2018-06-13 Elo Touch Solutions, Inc. Touch and non-touch user interaction with a device
DE102014216626B4 2014-08-21 2023-10-05 Volkswagen Aktiengesellschaft Method for sharing data in a vehicle, and corresponding device
EP3182250B1 * 2015-12-18 2019-10-30 Aptiv Technologies Limited System and method for monitoring the 3D space in front of an output unit for the control of the unit

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08212005A * 1995-02-07 1996-08-20 Hitachi Ltd Three-dimensional position-recognizing touch panel device
WO2004013833A2 * 2002-08-02 2004-02-12 Cirque Corporation Single-layer touchpad with touch zones
US20040230904A1 (en) * 2003-03-24 2004-11-18 Kenichiro Tada Information display apparatus and information display method
WO2005057921A2 * 2003-12-09 2005-06-23 Reactrix Systems, Inc. Self-contained interactive video display system
GB2412281A (en) * 2004-03-15 2005-09-21 Tomtom Bv Displaying dynamic travel information for whole route
WO2006003586A2 * 2004-06-29 2006-01-12 Koninklijke Philips Electronics, N.V. Three-dimensional touch interaction using finger pressure and direct action
US20060112350A1 (en) * 2004-11-22 2006-05-25 Sony Corporation Display apparatus, display method, display program, and recording medium with the display program
US20060161871A1 (en) * 2004-07-30 2006-07-20 Apple Computer, Inc. Proximity detector in handheld device
US20070109323A1 (en) * 2005-11-17 2007-05-17 Denso Corporation System and method for displaying map
DE202006003912U1 * 2006-03-09 2007-07-19 Klicktel Ag Navigation device with control surface on a touch-sensitive screen

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19958443C2 * 1999-12-03 2002-04-25 Siemens Ag Operating device
JP2001256511A * 2000-01-06 2001-09-21 Canon Inc Data processing system, printer, image recording system and image recording method
JP2004507724A * 2000-08-24 2004-03-11 Siemens Aktiengesellschaft Method for displaying target information, method for navigating within a map screen, computer program product and navigation device
JP3800984B2 * 2001-05-21 2006-07-26 Sony Corporation User input device
DE10322801A1 * 2003-05-19 2004-12-09 Gate5 Ag Method for keyboard-based, one-handed control of a digital map
DE102004023196A1 * 2004-05-11 2005-12-08 Siemens Ag Method for the visual display of a geographical map on the display of a mobile communication device, and mobile communication device for carrying out the method


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010003471A1 * 2008-07-09 2010-01-14 Volkswagen Aktiengesellschaft Method for operating a control system for a vehicle, and control system for a vehicle
US8564560B2 (en) 2008-07-09 2013-10-22 Volkswagen Ag Method for operating a control system for a vehicle and control system for a vehicle
US9041674B2 (en) 2008-07-09 2015-05-26 Volkswagen Ag Method for operating a control system for a vehicle and control system for a vehicle
GB2462171B (en) * 2008-07-31 2013-04-10 Northrop Grumman Space & Msn Image magnification system for computer interface
WO2012031606A1 2010-09-06 2012-03-15 Valeo Schalter Und Sensoren Gmbh Method for operating a driver assistance device, driver assistance device and vehicle having a driver assistance device
DE102011117289A1 2011-10-31 2013-05-02 Volkswagen Ag Method for operating a mobile device in a vehicle
WO2013064222A1 2011-10-31 2013-05-10 Volkswagen Aktiengesellschaft Method for operating a mobile device in a vehicle
DE102011117289B4 * 2011-10-31 2017-08-24 Volkswagen Ag Method for operating a mobile device in a vehicle, coupling device, vehicle and system
DE102013013697B4 * 2013-08-16 2021-01-28 Audi Ag Device and method for entering characters in free space

Also Published As

Publication number Publication date
WO2009024112A3 (fr) 2009-04-30
DE102007039669A1 (de) 2009-02-26
EP2181383A2 (fr) 2010-05-05

Similar Documents

Publication Publication Date Title
WO2009024112A2 (fr) Display device with image surface
EP3113969B1 (fr) User interface and method for signaling a 3D position of an input means during gesture detection
EP1262740B1 (fr) Vehicle computer system and method for controlling a cursor for a vehicle computer system
EP2867762B1 (fr) Method for receiving an input on a touch-sensitive panel
WO2006066742A1 (fr) Control system for a vehicle
EP2822814B1 (fr) Motor vehicle with an electronic rearview mirror
WO2013131543A1 (fr) Motor vehicle with an electronic rearview mirror
EP3114554A1 (fr) Method and device for providing a graphical user interface in a vehicle
WO2010026044A1 (fr) Method and device for displaying information, in particular in a vehicle
DE102005056458B4 (de) Operating device for a vehicle
DE102004019893A1 (de) Operating element for a motor vehicle
WO2013053529A1 (fr) Operating system and method for displaying an operating surface
DE102012020164A1 (de) Operating element for a display device in a motor vehicle
DE102005025887B4 (de) Operating device for a motor vehicle and method for operating an operating device
WO2020144016A1 (fr) Operating device for operating at least one appliance, and method for operating such an operating device
DE202010017428U1 (de) Operating unit
DE10119648B4 (de) Arrangement for operating television equipment
WO2014040807A1 (fr) Touch inputs along a threshold of a touch-sensitive surface
DE19941967B4 (de) Method and device for moving an activation element on a display unit
DE102017210599A1 (de) Method for positioning digital display content on a display device of a motor vehicle, control device, and motor vehicle having a control device
EP2331360B1 (fr) Method and device for displaying information, in particular in a vehicle
DE102009038030B4 (de) Vehicle having a selection device for selecting at least one function of the vehicle
WO2021028274A1 (fr) User system and method for operating a user system
DE102016224840A1 (de) Operating device for a motor vehicle
DE102016008049A1 (de) Method for operating an operating device, operating device and motor vehicle

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 08784339

Country of ref document: EP

Kind code of ref document: A2

REEP Request for entry into the European phase

Ref document number: 2008784339

Country of ref document: EP

WWE WIPO information: entry into national phase

Ref document number: 2008784339

Country of ref document: EP