EP2181383A2 - Dispositif d'affichage à plan d'image - Google Patents

Dispositif d'affichage à plan d'image (Display device with image surface)

Info

Publication number
EP2181383A2
Authority
EP
European Patent Office
Prior art keywords
display device
body part
image area
image
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP08784339A
Other languages
German (de)
English (en)
Inventor
Johannes Angenvoort
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Garmin Switzerland GmbH
Original Assignee
Navigon AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Navigon AG filed Critical Navigon AG
Publication of EP2181383A2 publication Critical patent/EP2181383A2/fr
Withdrawn legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3664Details of the user input interface, e.g. buttons, knobs or sliders, including those provided on a touch screen; remote controllers; input using gestures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3667Display of a road map
    • G01C21/367Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10Map spot or coordinate position indicators; Map reading aids
    • G09B29/106Map spot or coordinate position indicators; Map reading aids using electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • The invention relates to a display device with an image surface according to the preamble of claim 1.
  • Generic display devices have an image surface on which image contents can be displayed under electronic control.
  • The image surface has a two-dimensional extension in the X direction and in the Y direction.
  • Such displays are used on electronic devices of all kinds to present specific image contents to the user.
  • In particular, navigation devices are equipped with corresponding display devices.
  • In many cases, the display device also serves as an input device.
  • For this purpose, the display device is designed in the manner of a touch screen. By touching certain areas of the display device, operating commands associated with those areas can be triggered. It is thus possible, for example, for a control element to be displayed in a certain partial area of the image surface, and for the operating command assigned to that control element to be selected and executed when this partial area is touched.
  • Only a limited number of image contents of limited size can be displayed on the image surface. This poses a problem in particular with mobile devices, for example mobile navigation devices, since the displays on such devices are kept relatively small in order not to restrict mobility. Due to the relatively small display, the image contents can only be shown at a relatively small scale. A change in the display scale is possible, but it requires the operation of appropriate controls, which is cumbersome and limits ease of use.
  • The invention is based on the basic idea that the display device cooperates with a proximity sensor system.
  • The proximity sensor system detects, in the Z direction, the distance or the change in distance of an operating element, such as a stylus, or a body part, such as a finger.
  • In addition, the proximity sensor system makes it possible to detect the relative position, or change in relative position, of the operating element or body part relative to the image surface in the X direction and in the Y direction.
  • Depending on this detected position, the image content displayed on the image surface is changed under program control.
  • The type of program-controlled modification of the displayed image content is basically arbitrary. In a preferred embodiment, a partial image area of the image surface is selected as a function of the position of the operating element or body part detected by the proximity sensor system, and this partial image area is then shown enlarged.
  • For this purpose, the image surface as a whole is divided into predefined partial image areas.
  • The partial image area that has the smallest distance to the operating element or body part in the Z direction is then always selected and displayed enlarged.
  • The magnification factor by which the selected partial image area is enlarged can also be changed by the approach of the operating element or body part.
  • For this, the distance of the operating element or body part to the image surface in the Z direction is detected by the proximity sensor system, and the magnification factor is derived from this distance according to a predetermined control strategy.
  • Operation of the display device is particularly simple and intuitive when the magnification factor is linearly proportional to the distance of the operating element or body part to the image surface in the Z direction.
  • The magnification factor is then greater the smaller this distance is; in other words, the magnification factor can be increased by further approaching the operating element or body part to the image surface.
  • The magnification factor equals 1 when a predefined limit distance is exceeded.
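A minimal sketch of this distance-to-magnification mapping; the limit distance (50 mm) and maximum factor (3x) are illustrative assumptions, not values specified in the patent:

```python
def magnification_factor(z_mm: float,
                         z_limit_mm: float = 50.0,   # assumed limit distance
                         max_factor: float = 3.0) -> float:
    """Derive a magnification factor from the detected Z distance.

    Beyond the limit distance the factor is 1 (no enlargement); it then
    grows linearly as the operating element or body part approaches,
    reaching max_factor at the image surface (z = 0).
    """
    if z_mm >= z_limit_mm:
        return 1.0
    closeness = 1.0 - z_mm / z_limit_mm   # 0 at the limit, 1 at the surface
    return 1.0 + (max_factor - 1.0) * closeness
```

A smaller Z distance thus yields a larger factor, matching the linear control strategy described above.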
  • In one embodiment, the proximity sensor system consists of a plurality of proximity sensors which are arranged below the image surface and are each assigned to exactly one predefined partial image area.
  • Each individual proximity sensor is set up to detect the distance of the operating element or body part from the image surface.
  • In the control of the display device, a comparison is carried out to determine which proximity sensor has detected the smallest distance to the operating element or body part.
  • The partial image area associated with the proximity sensor that detected the smallest distance is then displayed enlarged.
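The smallest-distance comparison described above can be sketched as follows; one Z reading per sensor is assumed, and the area identifiers and values are hypothetical:

```python
def select_subarea(sensor_distances: dict) -> str:
    """Return the partial image area whose proximity sensor reported
    the smallest Z distance to the operating element or body part."""
    return min(sensor_distances, key=sensor_distances.get)

# Hypothetical readings from three sensors, one per partial image area:
readings = {"area_1": 80.0, "area_2": 12.5, "area_3": 47.0}
selected = select_subarea(readings)   # area_2 is closest and gets enlarged
```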
  • Alternatively, the proximity sensor system can be formed by a stereo-optical camera system.
  • For this purpose, two cameras may be provided in the display device, which interact as a stereo camera.
  • The geometry of the partial image area to be displayed enlarged is in principle arbitrary. Any shapes are possible, in particular those which correspond to the shape of a control element, such as a button. For most applications, it is advantageous if the partial image area to be enlarged is rectangular or circular. There are different variants for the type of enlargement. According to a first variant, the partial image area is enlarged in the X direction and in the Y direction with the same magnification factor. As a result, distortions in the enlarged partial image are excluded.
  • Alternatively, the magnification factors in the X direction and in the Y direction may differ.
  • In that case, the partial image to be enlarged is stretched or compressed in one direction.
  • The magnification factor may also depend on the position of the corresponding pixel within the partial image area, so that the magnification factor varies within the partial image area.
  • For such a varying magnification factor it is conceivable, for example, that the magnification factor is greatest at the center of the partial image area to be displayed and smallest at its edge. This makes it possible to realize an enlargement that gives the user a lens-like (loupe) image impression.
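One way the described loupe-like effect could be computed; the circular area shape and the linear fall-off from center to edge are assumptions, since the patent leaves the exact control strategy open:

```python
import math

def local_factor(x: float, y: float,
                 cx: float, cy: float, radius: float,
                 center_factor: float = 3.0,
                 edge_factor: float = 1.0) -> float:
    """Position-dependent magnification inside a circular partial image
    area: largest at the center (cx, cy), falling off linearly towards
    the edge, which produces a lens-like impression."""
    r = math.hypot(x - cx, y - cy)
    if r >= radius:
        return 1.0                       # outside the area: no enlargement
    t = r / radius                       # 0 at the center, 1 at the edge
    return center_factor + (edge_factor - center_factor) * t
```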
  • In addition, these partial image areas may each be assigned a surrounding supplementary area.
  • This type of preferential magnification is particularly important for frequently used controls: through the supplementary area, frequently used operating elements are assigned an enlarged catchment area for detecting the approach of the operating element or body part.
  • The display device can also be used as part of an input device.
  • The user then has the option of making certain inputs depending on the image content shown on the display device.
  • For this purpose, a plurality of contact sensors is arranged below the image surface.
  • The contact sensors detect the contact of the operating element or body part on associated partial image areas. Depending on the detected partial image area, an operating command assigned to that area is selected or triggered. This ultimately corresponds to the function of a classic touch screen.
  • Alternatively, it is also possible to use the proximity sensor system itself to select or trigger the operating command.
  • For this purpose, the position of the operating element or body part is first detected by the proximity sensor system, thereby selecting a specific partial image area.
  • The selected partial image area can preferably be displayed enlarged in order to indicate the selection to the user.
  • In addition, the distance of the operating element or body part to the image surface in the Z direction is monitored. If this distance falls below a predefined limit distance, an operating command assigned to the selected partial image area is triggered. In other words, by approaching the operating element or body part, the user can confirm and thus trigger the operating command assigned to the selected partial image area.
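The select-then-confirm behavior can be sketched as a small stateful controller; rectangular hit areas and the exact threshold handling are assumptions for illustration:

```python
class ProximityTrigger:
    """Select a partial image area from the detected X/Y position and
    trigger its operating command once the Z distance falls below a
    predefined limit distance."""

    def __init__(self, areas: dict, trigger_distance: float):
        self.areas = areas                    # {area_id: (x0, y0, x1, y1)}
        self.trigger_distance = trigger_distance
        self.selected = None

    def update(self, x: float, y: float, z: float):
        # Step 1: select the area under the finger (this area would be
        # shown enlarged to indicate the selection to the user).
        for area_id, (x0, y0, x1, y1) in self.areas.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                self.selected = area_id
                break
        # Step 2: falling below the limit distance confirms the selection
        # and triggers the assigned operating command.
        if self.selected is not None and z < self.trigger_distance:
            return self.selected
        return None
```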
  • The device class in which the display device is integrated is basically arbitrary.
  • However, the display device according to the invention offers particularly great advantages in navigation systems, in particular mobile navigation devices. In these devices, a large number of image contents must be displayed on a relatively small screen, so that the enlargement of individual image components is of great importance. This enlargement is considerably simplified by the display device according to the invention.
  • The display device is also advantageous for map display. If the map of a navigation system is shown on a display device according to the invention, the position of the operating element or body part can be detected by the proximity sensor system and the map shifted as a function of this detected position. This allows intuitive operation of the navigation system when displaying and moving maps.
  • In addition, the zoom magnification factor of maps displayed on the navigation system can be changed by approaching or withdrawing the operating element or body part.
  • Preferably, the zoom magnification factor is increased as the operating element or body part approaches the image surface and reduced as it is withdrawn.
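A sketch combining the described map shifting (X/Y movement) with proximity-controlled zoom (Z movement); the sign convention and the `zoom_gain` value are illustrative assumptions:

```python
def update_map_view(view: dict, dx: float, dy: float, dz: float,
                    zoom_gain: float = 0.02) -> dict:
    """Pan the map with the detected X/Y movement of the operating
    element or body part and adjust the zoom factor with its Z movement:
    approaching the image surface (dz < 0) zooms in, withdrawing
    (dz > 0) zooms out.  `view` holds {"cx", "cy", "zoom"}.
    """
    view["cx"] += dx
    view["cy"] += dy
    view["zoom"] = max(1.0, view["zoom"] * (1.0 - zoom_gain * dz))
    return view
```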
  • The figures show:
  • Fig. 1 a navigation device with a display device according to the invention as a body part approaches the display device;
  • Fig. 2 the image content of a display device according to the invention without an enlarged partial image area;
  • Fig. 3 the display device according to Fig. 2 as a body part approaches a partial image area showing a control element;
  • Fig. 4 the display device according to Fig. 2 as a body part approaches a graphics field;
  • Fig. 5 the display device according to Fig. 2 as a body part approaches a text field.
  • Fig. 1 shows schematically a navigation device 01 with a display device 02 for program-controlled electronic display of image content.
  • In the display device 02, a proximity sensor system, for example a stereo camera, is installed, with which the position of a body part, namely an index finger 03, relative to the display device 02 can be determined.
  • On the one hand, the distance in the Z direction and, on the other hand, the relative position of the index finger 03 in the X and Y directions are detected.
  • The image surface 04 of the display device 02 is divided into a multiplicity of predefined partial image areas 05.
  • The partial image area 05 that has the smallest distance in the Z direction to the index finger 03 is considered selected.
  • The respectively selected partial image area 05 is then displayed enlarged.
  • the mode of operation of the display device 02 will be explained by way of example with reference to the examples in FIGS. 2 to 5.
  • Fig. 2 shows the image content on the image surface 04 of the display device 02 without the index finger 03 approaching. Control elements, text fields and graphics fields are displayed electronically in the usual way.
  • Fig. 3 shows the display device 02 during a vertical approach of the index finger 03 to a control element 06.
  • The control element 06, which is normally (see Fig. 2) displayed at the same size as the other controls, is enlarged and simultaneously selected as the index finger 03 approaches. By further approaching the index finger to the now enlarged control element 06, to below a predefined limit distance, the control command assigned to the control element can be triggered.
  • Alternatively, the display device 02 is designed in the manner of a touch screen, and the control command assigned to the control element 06 is triggered by touching the enlarged control element 06.
  • In Fig. 4, the point 10 with the shortest distance to the index finger 03 determines the center of a loupe-like enlarged partial image area 11, which is indicated by dashed lines.
  • The magnification factor in the partial image area 11 varies depending on the location: it is largest in the area of the center 10 and smallest at the edge of the partial image area 11. In this way, a lens-like magnification effect is achieved.
  • FIG. 5 shows the approach of the index finger 03 to a text field 12.
  • The text field 12 for the numerical display of the geographical position of the navigation system is enlarged on the display device 02 as the index finger approaches. In this case, different magnification factors were used in the X and Y directions; readability is nevertheless maintained.

List of reference numbers

01 Navigation device
02 Display device
03 Index finger
04 Image surface
05 Partial image area
06 Control element
07 Map

Abstract

The invention relates to a display device (02) having an image surface (04) on which image contents (05, 06, 07, 11, 12) can be displayed under electronic control, the image surface (04) having a two-dimensional extension in the X direction and in the Y direction. The invention is characterized in that the display device (02) cooperates with a proximity sensor system; in that, by means of this proximity sensor system, the distance and/or change in distance of an operating element or body part (03) to the image surface (04) in the Z direction, and the relative position and/or changes in relative position of the operating element or body part (03) with respect to the image surface (04) in the X and Y directions, can be detected; and in that the image content (06, 07, 11, 12) displayed on the image surface (04) can be modified under program control as a function of the position of the operating element or body part (03) detected by the proximity sensor system.
EP08784339A 2007-08-22 2008-07-21 Dispositif d'affichage à plan d'image Withdrawn EP2181383A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102007039669A DE102007039669A1 (de) 2007-08-22 2007-08-22 Anzeigeeinrichtung mit Bildfläche
PCT/DE2008/001156 WO2009024112A2 (fr) 2007-08-22 2008-07-21 Dispositif d'affichage à plan d'image

Publications (1)

Publication Number Publication Date
EP2181383A2 true EP2181383A2 (fr) 2010-05-05

Family

ID=39970966

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08784339A Withdrawn EP2181383A2 (fr) 2007-08-22 2008-07-21 Dispositif d'affichage à plan d'image

Country Status (3)

Country Link
EP (1) EP2181383A2 (fr)
DE (1) DE102007039669A1 (fr)
WO (1) WO2009024112A2 (fr)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102008032377A1 (de) * 2008-07-09 2010-01-14 Volkswagen Ag Verfahren zum Betrieb eines Bediensystems für ein Fahrzeug und Bediensystem für ein Fahrzeug
US9658765B2 (en) * 2008-07-31 2017-05-23 Northrop Grumman Systems Corporation Image magnification system for computer interface
JP4683126B2 (ja) 2008-12-26 2011-05-11 ブラザー工業株式会社 入力装置
EP2367095A3 (fr) 2010-03-19 2015-04-01 Garmin Switzerland GmbH Appareil de navigation électronique portable
EP2614417B1 (fr) * 2010-09-06 2017-08-02 Valeo Schalter und Sensoren GmbH Procédé permettant de faire fonctionner un dispositif d'aide à la conduite, dispositif d'aide à la conduite et véhicule équipé d'un dispositif d'aide à la conduite
US9404767B2 (en) 2011-06-10 2016-08-02 The Boeing Company Methods and systems for performing charting tasks
EP2575007A1 (fr) * 2011-09-27 2013-04-03 Elo Touch Solutions, Inc. Mise à l'échelle d'entrées basées sur les gestes
EP2575006B1 (fr) 2011-09-27 2018-06-13 Elo Touch Solutions, Inc. Interaction utilisateur avec contact et sans contact avec un dispositif
DE102011117289B4 (de) 2011-10-31 2017-08-24 Volkswagen Ag Verfahren zum Betreiben einer mobilen Vorrichtung in einem Fahrzeug, Koppelvorrichtung, Fahrzeug sowie System
DE102013013697B4 (de) * 2013-08-16 2021-01-28 Audi Ag Vorrichtung und Verfahren zum Eingeben von Schriftzeichen im freien Raum
DE102014216626B4 (de) 2014-08-21 2023-10-05 Volkswagen Aktiengesellschaft Verfahren zum Teilen von Daten in einem Fahrzeug sowie entsprechende Vorrichtung
EP3182250B1 (fr) * 2015-12-18 2019-10-30 Aptiv Technologies Limited Système et procédé de surveillance d'espace 3d en face d'une unité de sortie pour la commande de l'unité

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08212005A (ja) * 1995-02-07 1996-08-20 Hitachi Ltd 3次元位置認識型タッチパネル装置
DE19958443C2 (de) * 1999-12-03 2002-04-25 Siemens Ag Bedieneinrichtung
JP2001256511A (ja) * 2000-01-06 2001-09-21 Canon Inc データ処理システム、プリンタ、画像記録システム及び画像記録方法
DE50014953D1 (de) * 2000-08-24 2008-03-20 Siemens Vdo Automotive Ag Verfahren und Navigationsgerät zum Abfragen von Zielinformation und zum Navigieren in einer Kartenansicht
JP3800984B2 (ja) * 2001-05-21 2006-07-26 ソニー株式会社 ユーザ入力装置
US8035612B2 (en) * 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Self-contained interactive video display system
KR20050026070A (ko) * 2002-08-02 2005-03-14 서크 코퍼레이션 터치 존을 갖는 단일층 터치패드
JP2004287168A (ja) * 2003-03-24 2004-10-14 Pioneer Electronic Corp 情報表示装置及び情報表示方法
DE10322801A1 (de) * 2003-05-19 2004-12-09 Gate5 Ag Verfahren zur tastaturgestützten, einhändigen Steuerung einer digitalen Karte
GB2445964B (en) * 2004-03-15 2008-10-08 Tomtom Bv GPS navigation device
DE102004023196A1 (de) * 2004-05-11 2005-12-08 Siemens Ag Verfahren zur visuellen Darstellung einer geographischen Karte auf einem Display eines mobilen Kommunikationsgerätes sowie mobiles Kommunikationsgerät zur Durchführung des Verfahrens
JP2008505379A (ja) * 2004-06-29 2008-02-21 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 3次元タッチ型インタラクションにおけるタッチダウン型フィードフォワード
US7653883B2 (en) * 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
JP4839603B2 (ja) * 2004-11-22 2011-12-21 ソニー株式会社 表示装置、表示方法、表示プログラム及び表示プログラムを記録した記録媒体
JP5007782B2 (ja) * 2005-11-17 2012-08-22 株式会社デンソー ナビゲーション装置および地図表示縮尺設定方法
DE202006003912U1 (de) * 2006-03-09 2007-07-19 Klicktel Ag Navigationsgerät mit Bedienfläche auf berührungsempfindlichem Bildschirm

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2009024112A3 *

Also Published As

Publication number Publication date
DE102007039669A1 (de) 2009-02-26
WO2009024112A3 (fr) 2009-04-30
WO2009024112A2 (fr) 2009-02-26

Similar Documents

Publication Publication Date Title
EP2181383A2 (fr) Dispositif d'affichage à plan d'image
EP3113969B1 (fr) Interface utilisateur et procédé de signalisation d'une position en 3d d'un moyen de saisie lors de la détection de gestes
EP1262740B1 (fr) Système d'ordinateur véhiculaire et procédé de commande d'un curseur pour système d'ordinateur véhiculaire
EP2867762B1 (fr) Procédé permettant de recevoir une entrée sur un champ tactile
DE102007025530A1 (de) Informationsvermittlungsvorrichtung und Verfahren zur Vermittlung von Informationen
WO2006066742A1 (fr) Systeme de commande pour un vehicule
EP2822814B1 (fr) Véhicule automobile avec un rétroviseur électronique
EP2822812A1 (fr) Véhicule à moteur équipé d'un rétroviseur électronique
EP3114554A1 (fr) Procédé et dispositif pour mettre à disposition une interface utilisateur graphique dans un véhicule
WO2010025781A1 (fr) Procédé et dispositif pour l'affichage d'informations dans un véhicule
DE102004019893A1 (de) Bedienelement für ein Kraftfahrzeug
WO2013053529A1 (fr) Système de commande et procédé permettant de représenter une surface de commande
DE102005056458A1 (de) Bedienvorrichtung für ein Fahrzeug
WO2014067774A1 (fr) Procédé et dispositif de fonctionnement d'un système d'entrée
DE102012020164A1 (de) Bedienelement für eine Anzeigevorrichtung in einem Kraftfahrzeug
WO2020144016A1 (fr) Dispositif d'utilisation pour utiliser au moins un appareil et procédé pour l'utilisation d'un tel dispositif d'utilisation
DE202010017428U1 (de) Bedieneinheit
DE10119648B4 (de) Anordnung zur Bedienung von fernsehtechnischen Geräten
DE102010009622A1 (de) Verfahren zum Betreiben einer Benutzerschnittstelle und Vorrichtung dazu, insbesondere in einem Fahrzeug
WO2014040807A1 (fr) Entrées par effleurement le long d'un seuil d'une surface tactile
DE19941967B4 (de) Verfahren und Vorrichtung zur Bewegung eines Aktivierungselementes auf einer Anzeigeeinheit
DE102018205616A1 (de) Portable, mobile Bedienvorrichtung, bei welcher an zumindest zwei Oberflächenbereichen eine Anzeigefläche bereitgestellt ist
EP2331360B1 (fr) Procédé et dispositif pour l'affichage d'informations, en particulier dans un véhicule
DE102015117386B4 (de) Verfahren und Vorrichtung zur Aktivierung eines Eingabebereiches auf einer kapazitiven Eingabefläche
DE102009038030B4 (de) Fahrzeug mit einer Auswahleinrichtung zum Auswählen zumindest einer Funktion des Fahrzeugs

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100126

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20120919

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: GARMIN SWITZERLAND GMBH

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20170201