WO2014108160A2 - User interface for the contactless selection of a device function - Google Patents

User interface for the contactless selection of a device function Download PDF

Info

Publication number
WO2014108160A2
WO2014108160A2 (Application PCT/EP2013/003880)
Authority
WO
WIPO (PCT)
Prior art keywords
hand
movement
function
operator
motor vehicle
Prior art date
Application number
PCT/EP2013/003880
Other languages
German (de)
English (en)
Other versions
WO2014108160A3 (fr)
Inventor
Michael SCHLITTENBAUER
Martin Roehder
Heiko Maiwand
Nathaniel COSER
Lorenz Bohrer
Alexander Sebastian Strauch
Original Assignee
Audi Ag
Volkswagen Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi Ag, Volkswagen Ag
Publication of WO2014108160A2
Publication of WO2014108160A3


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/146 Instrument input by gesture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/20 Optical features of instruments
    • B60K2360/21 Optical features of instruments using cameras

Definitions

  • the invention relates to a method for operating an operator interface of a device.
  • a position of a hand of a person is determined without contact and a function of the device is selected and activated as a function of a movement of the hand.
  • the invention also includes a motor vehicle with an operating interface for activating at least one function of the motor vehicle.
  • the invention includes a computer program product by means of which the method according to the invention can be carried out with a mobile terminal, for example a smartphone or a tablet PC, as well as a corresponding mobile terminal.
  • Gesture recognition devices have the advantage that a device can be operated without buttons. Keys are locally fixed and usually must be located by an operator before they can be pressed. In connection with the operation of a motor vehicle while driving, this is particularly dangerous for a driver, because it distracts him from the traffic. Gesture recognition reduces this loss of attention by eliminating the need to search for a switch. A disadvantage of gesture recognition, however, is that the operator has to remember the different gestures. Another drawback is that it can be difficult to ensure robust gesture recognition under varying light conditions.
  • the invention is based on the object to provide a user interface for a device, with which an operator can operate the device with little attention.
  • the method according to the invention is based on the approach described at the beginning: an optical sensor device of an operator interface contactlessly determines a position of a hand of an operator, and a function of the device is selected and activated as a function of a movement of the hand.
  • the optical sensor device thus works, in the method according to the invention, so to speak like a light barrier. This is achieved as follows.
  • the sensor device first detects whether the operator reaches with his hand into a predefined spatial area. After the hand has been detected in the spatial area, a function selection is activated. During this function selection, a wiping movement of the hand towards an edge of the spatial area is detected. The wiping movement defines a direction of movement, which is determined accordingly by the sensor device. Depending on the direction of movement, a specific function associated with that direction is then selected. It can then be activated either immediately or after confirmation by the operator.
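The detection sequence just described (hand enters a predefined spatial area, the function selection is activated, a wiping movement towards an edge selects a function) can be sketched as a small state machine. The function names and the direction-to-function mapping below are placeholders chosen for illustration, not part of the source:

```python
from enum import Enum

class SelectionState(Enum):
    IDLE = 0        # no hand in the predefined spatial area
    SELECTING = 1   # hand detected, function selection is active

# Placeholder mapping of swipe direction to device function.
FUNCTIONS = {"left": "accept_call", "right": "reject_call"}

def update(state, hand_in_region, swipe_direction):
    """Advance the selection logic by one sensor frame.

    hand_in_region: whether the tracked hand is inside the spatial area;
    swipe_direction: 'left', 'right' or None, as reported by a
    (hypothetical) motion tracker.
    Returns (new_state, selected_function_or_None).
    """
    if state is SelectionState.IDLE:
        # The function selection is activated only once the hand enters the area.
        return (SelectionState.SELECTING, None) if hand_in_region else (state, None)
    if swipe_direction in FUNCTIONS:
        # A wiping movement towards an edge selects the assigned function.
        return SelectionState.IDLE, FUNCTIONS[swipe_direction]
    if not hand_in_region:
        return SelectionState.IDLE, None  # hand withdrawn without a selection
    return state, None
```

Whether the selected function is then activated immediately or only after a confirmation, as the text allows, would be decided by the caller.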
  • pivoting the hand to the left may trigger a first function, e.g. accepting a telephone call, and pivoting the hand to the right a second function, e.g. rejecting the telephone call.
  • a current position of the hand in the spatial area can be defined as the starting position, and a wiping movement of the hand from this starting position towards an edge of the spatial area can then be detected. The starting position together with the wiping movement emanating from it defines the direction of movement particularly clearly.
  • the method according to the invention has the advantage that the operator makes a binary selection (activate the function or not) with a single wiping motion and without loss of attention. Another advantage is that the operation of the device is accelerated compared to the case where the function would first have to be selected from an operating menu of the device and then activated.
  • the actual detection of the position of the hand and its movement can be carried out in a known manner. Thus, a system used for this purpose does not need to be structurally adapted in a particular way.
  • a further advantage results when, upon activating the function selection, a display device of the operator interface shows the operator which movement, i.e. which direction of movement, selects which function.
  • at least one possible direction of movement is displayed, together with the selectable function assigned to each of the possible directions of movement.
  • the user interface can then be used without prior knowledge and is thus very intuitive.
  • a user input regarding a specific function is received by the user interface and, depending on the user input, an assignment of the function to a possible direction of movement is determined.
  • Such an assignment thus represents a shortcut for the function selection. It is also possible to assign several functions, each to a different direction of movement.
  • a further advantage results if the assignment rule, by which at least one function is assigned to a possible direction of movement, is determined depending on a current operating state of the device. Different functions are then offered while a telephone call is active than while music is playing. To make this possible, several prepared assignment rules are provided, for example one for telephony and one for music playback, and upon a change of the operating state one of them is selected as the currently valid assignment rule. In this way, a context-dependent user interface is advantageously realized.
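Such state-dependent assignment rules can be sketched as a lookup keyed by the operating state; the state names and function names here are illustrative assumptions, not taken from the source:

```python
# Hypothetical assignment rules ("Zuordnungsvorschriften" in the German
# original): one mapping of movement direction to device function per
# operating state of the device.
ASSIGNMENT_RULES = {
    "telephony": {"left": "accept_call", "right": "reject_call"},
    "music":     {"left": "previous_track", "right": "next_track"},
}

current_state = "music"  # updated whenever the operating state changes

def select_function(direction, operating_state=None):
    """Return the function assigned to a movement direction under the
    assignment rule valid for the given (or current) operating state."""
    rule = ASSIGNMENT_RULES[operating_state or current_state]
    return rule.get(direction)
```

Switching `current_state` on an operating-state change is all that is needed to make the same left/right gesture context-dependent.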
  • for example, while the telephone is ringing, an operator may reach with the hand into the spatial area and then determine whether to answer the telephone call (for example by pivoting the hand to the left in the spatial area) or to reject the call (by pivoting the hand to the right in the spatial area). If, on the other hand, the phone is not ringing but, for example, music is being played back from a music file, a corresponding other assignment rule can provide that pivoting the hand to the left triggers the previously played song and pivoting the hand to the right triggers the next song in the playlist.
  • the sensor device additionally detects a respective finger position of at least one finger of the hand after the hand is detected within the spatial area and the function selection is activated only if the at least one finger has a respective predetermined finger position.
  • the operator interface then responds only when the operator forms a predetermined sign with his hand within the spatial area. This avoids an accidental selection of functions.
  • in addition, the orientation of the entire hand can be checked to determine whether it is inclined in a predetermined manner.
  • the recognition of the finger position and of the inclination of the hand can be realized on the basis of a skeleton or hand model that is fitted to the shape of the hand recognizable in the image data. If the model conforms to the shape of the imaged hand, the parameters of the model provide a digital representation of the finger position and of the attitude of the hand.
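A minimal sketch of such a gate, assuming the fitted hand model is reduced to per-finger extension flags and an overall tilt angle (both the dictionary layout and the 20-degree tolerance are assumptions made for illustration):

```python
def pose_matches(hand_model, required_pose, max_tilt_deg=20.0):
    """Gate the function selection on a predetermined hand pose.

    hand_model: result of fitting a skeleton/hand model to the imaged
    hand, reduced here to {"fingers_extended": {...}, "tilt_deg": ...}.
    required_pose: finger name -> required extension state.
    Both the layout and the tilt tolerance are illustrative assumptions.
    """
    fingers = hand_model["fingers_extended"]
    if any(fingers.get(name) != extended for name, extended in required_pose.items()):
        return False
    # Additionally check that the whole hand is inclined as required.
    return abs(hand_model["tilt_deg"]) <= max_tilt_deg
```

Only if this check succeeds would the function selection described above be activated, which is what prevents accidental selections.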
  • the invention also includes a motor vehicle.
  • the motor vehicle according to the invention has an operating interface for activating at least one function of the motor vehicle, wherein the operating interface comprises an optical sensor device and is designed to carry out an embodiment of the method according to the invention.
  • the operating interface is part of an infotainment system of the motor vehicle.
  • the motor vehicle according to the invention has the advantage that a driver can also select and activate functions of the motor vehicle during a journey, without a great deal of attention being required from the driver.
  • the sensor device has a TOF camera for observing the hand. Variable light conditions then have no particularly disturbing influence on the motion detection.
  • the use of a simple infrared camera, by contrast, can lead to faulty detections when warm sunlight falls onto the sensor.
  • the TOF camera uses a light beam whose light intensity is modulated (e.g. visible light or infrared light), so that extraneous light sources can be reliably distinguished from the light source of the TOF camera.
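The distance measurement behind this can be made concrete with the standard phase-shift relation for continuous-wave TOF sensors; the relation itself is textbook material, and the 20 MHz modulation frequency used in the check below is an illustrative value, not taken from the text:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(phase_shift_rad, mod_freq_hz):
    """Distance from the phase shift between emitted and received
    amplitude-modulated light. The light travels to the hand and back,
    so t = phase / (2*pi*f) is the round-trip time and d = c*t/2."""
    round_trip_time = phase_shift_rad / (2.0 * math.pi * mod_freq_hz)
    return C * round_trip_time / 2.0
```

A foreign light source that is not modulated at `mod_freq_hz` produces no stable phase and can therefore be rejected, which is the robustness argument made above.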
  • the invention further includes a computer program product, i.e. a program stored on at least one storage medium, which is designed, when executed by a processor device of a mobile terminal, for example a smartphone, a tablet PC, a personal digital assistant (PDA) or a notebook, to perform an embodiment of the method according to the invention on the basis of camera data of a camera of the mobile terminal.
  • the computer program product according to the invention thus has the advantage that the mobile terminal can be operated by simple wiping movements of one hand, if they are performed in a detection range of the camera of the mobile terminal.
  • the computer program product is preferably provided in a form known from the prior art in connection with so-called “apps” (applications - application programs) for mobile terminals.
  • the invention also encompasses a mobile terminal in which an embodiment of the computer program product according to the invention is provided.
  • the mobile terminal according to the invention may in particular be a smartphone, a tablet PC, a notebook or a personal digital assistant.
  • FIG. 1 shows a block diagram of an operating interface which may be installed in an embodiment of the motor vehicle according to the invention.
  • FIG. 2 shows a sketch of an operating procedure as it is made possible for an operator on the basis of an embodiment of the method according to the invention.
  • a motor vehicle such as a passenger car
  • the optical sensor device 10 forms an operating interface.
  • the playback device 12 can be, for example, an infotainment system, an audio system, a navigation system, a television system, a telephone system, a combination instrument or a head-up display.
  • the sensor device 10 comprises a measuring device 14 and a calculation unit 16.
  • the measuring device 14 comprises an optical sensor 18, which may be, for example, a TOF camera or PMD camera.
  • the optical sensor 18 may also be, for example, a stereo camera. In the example shown in FIG. 1, it has been assumed that the optical sensor 18 is a PMD camera.
  • the optical sensor 18 may be arranged, for example, in a headliner of the motor vehicle.
  • the optical sensor 18 can be configured in a manner known per se, i.e. a light source 20 for electromagnetic radiation, for example infrared light, illuminates a detection area 22, for example a space above a center console of the motor vehicle. If an object is located in it, for example a hand 24 of the driver of the motor vehicle, the radiation emitted by the light source 20 is reflected by the hand 24 back to a sensor array 26. By means of the sensor array 26, 3D image data can then be generated which indicate 3D coordinates of individual surface elements of the hand 24. The 3D image data are transmitted from the measuring device 14 to the calculation unit 16.
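How such 3D coordinates arise from a depth image can be illustrated with a standard pinhole back-projection; the intrinsic parameters (focal lengths fx, fy and principal point cx, cy) are assumptions of the sketch, not values from the text:

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Back-project one pixel (u, v) of the sensor array into a 3D
    surface coordinate using a pinhole camera model. depth is the
    measured distance along the optical axis in metres; fx, fy are
    focal lengths in pixels and (cx, cy) is the principal point."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

Applying this per pixel of the depth image yields the 3D coordinates of the individual surface elements of the hand mentioned above.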
  • the calculation unit 16 may be, for example, a control unit of the motor vehicle.
  • the signals are evaluated there, and the evaluated data are then made available to the vehicle, for example by being transmitted to the playback device 12.
  • limbs, such as, for example, the hand 24, can be segmented from the 3D image data, whereby, for example, the position of a fingertip in the detection area 22 can be determined.
  • segmentation algorithms known per se can be used for this purpose.
  • the 3D image data of the sensor array 26 of the optical sensor 18 can also represent a sequence of successive 3D images, i.e. movements of the hand 24 can also be detected with the optical sensor 18.
  • FIG. 2 shows how an embodiment of the method according to the invention can be implemented with the optical sensor device shown in FIG. 1.
  • the optical sensor 18 is directed at an interior of the motor vehicle, so that objects located in a detection area 28 of the optical sensor 18 are detected, and 3D image data, i.e. for example coordinates describing the position in space of individual surface areas of the objects, are generated by the measuring device 14 and transferred to the calculation unit 16.
  • the detection area 28 may, for example, include a space area above a center console of the motor vehicle.
  • the calculation unit 16 recognizes from the 3D image data of the optical sensor 18 that the moving object has the shape of a hand, and follows a movement trajectory of the hand 24 in the detection area 28.
  • the calculation unit 16 checks whether the hand is located in a predetermined spatial region 30.
  • the spatial region 30 may be, for example, a cuboid volume region whose edges may each independently have dimensions in a range from, for example, 20 cm to 50 cm.
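Checking whether a tracked hand coordinate lies inside such a cuboid region reduces to three interval tests; the concrete bounds used in the check below are illustrative values within the stated 20-50 cm range:

```python
def hand_in_region(hand_pos, region_min, region_max):
    """True if the 3D hand coordinate lies inside the cuboid spatial
    region, given by its minimum and maximum corner; all coordinates
    are (x, y, z) tuples in metres."""
    return all(lo <= p <= hi
               for p, lo, hi in zip(hand_pos, region_min, region_max))
```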
  • if this is the case, a selection mode for a function selection is activated. The operator can then select a function of the vehicle by a pivoting movement of the hand 24 in one of the possible directions of movement 32, 34 (here, for example, either to the right or to the left), for example accepting a telephone call or changing a radio station, and then activate this function.
  • when the function selection is activated, the calculation unit 16 generates on a screen 36 a display 38 by means of which the selectable functions 40, 42 are shown.
  • the functions are represented in FIG. 2 by symbolic function names (F1, F2).
  • an image of a slide switch 44 is displayed, which intuitively suggests to the operator that a movement of the hand 24 in one of the possible directions of movement 32, 34 is now expected in order to select one of the functions 40, 42.
  • the selection is possible for the operator by a simple swipe with the hand 24 after the function selection has been activated.
  • the detection of the wiping movement is performed by the calculation unit 16 on the basis of the 3D image data of the optical sensor 18.
  • the current position of the hand 24 after activating the function selection is set as the starting position PO.
  • the calculation unit 16 observes a change in the coordinates of a reference point on the hand 24, e.g. the reference point whose position defined the starting position PO. As soon as the reference point is at a distance from the starting position PO that is greater than a predetermined threshold, this is interpreted as a selection operation by the operator.
  • the direction of movement described by the starting position PO and the current position of the hand 24 is compared with the possible directions of movement 32, 34. If the calculation unit detects that the hand 24 has been moved in one of the possible directions of movement 32, 34, the corresponding function 40, 42 is selected and the selection is signaled to the playback device 12. The selected function can then be activated in the playback device 12.
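The threshold-and-compare step just described can be sketched as follows; the 10 cm threshold, the projection to a 2D plane of movement and the unit vectors standing in for the possible directions 32, 34 are assumptions of the sketch:

```python
import math

def detect_swipe(start, current, directions, threshold=0.10):
    """Interpret the hand movement away from the starting position PO.

    start, current: (x, y) hand coordinates projected onto the plane of
    movement; directions: name -> unit vector of a possible movement
    direction, e.g. {"left": (-1.0, 0.0), "right": (1.0, 0.0)}.
    Returns the best-matching direction name once the hand is farther
    from PO than the threshold, else None.
    """
    dx, dy = current[0] - start[0], current[1] - start[1]
    dist = math.hypot(dx, dy)
    if dist <= threshold:
        return None  # movement too small: not yet a selection
    # Pick the possible direction with the largest cosine similarity.
    return max(directions,
               key=lambda name: (dx * directions[name][0] + dy * directions[name][1]) / dist)
```

Running this per frame against the coordinates of the tracked reference point yields `None` until the hand has clearly left PO, and then the selected direction.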
  • the example shows how a clear and intuitive selection of functions can be made possible, which after a short learning period the operator can perform from memory and without being distracted from the traffic.
  • the operator can thus make a selection, in particular a binary selection, with a single operating gesture and without loss of attention.
  • the operation can also be accelerated.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for operating a user interface (10) of a device, in particular of a motor vehicle, in which an optical sensor device (26) contactlessly observes a hand (24) of an operator, and a function (40, 42) of the device (12) is then selected and activated as a function of the direction (32, 34) of the movement of the hand (24). The object of the invention is to provide a device with a user interface that allows the operator to operate the device even while paying little attention. To this end, the sensor device (10) detects whether the operator moves the hand (24) into a predefined spatial region (30) and, once the hand (24) is detected in the spatial region (30), a function selection is activated. A wiping movement of the hand (24) towards an edge of the spatial region (30) is detected, and a direction of movement (32, 34) defined by the wiping movement is determined. Finally, a function (40, 42) associated with the direction of movement (32, 34) is selected.
PCT/EP2013/003880 2013-01-08 2013-12-20 User interface for the contactless selection of a device function WO2014108160A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102013000081.5A DE102013000081B4 (de) 2013-01-08 2013-01-08 Operating interface for the contactless selection of a device function
DE102013000081.5 2013-01-08

Publications (2)

Publication Number Publication Date
WO2014108160A2 (fr) 2014-07-17
WO2014108160A3 WO2014108160A3 (fr) 2014-11-27

Family

ID=49949611

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/003880 WO2014108160A2 (fr) 2013-01-08 2013-12-20 User interface for the contactless selection of a device function

Country Status (2)

Country Link
DE (1) DE102013000081B4 (fr)
WO (1) WO2014108160A2 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015006613A1 * 2015-05-21 2016-11-24 Audi Ag Operating system and method for operating an operating system for a motor vehicle
DE102018208827A1 * 2018-06-05 2019-12-05 Bayerische Motoren Werke Aktiengesellschaft User interface, means of transportation and method for determining a user input
DE102019206606B4 * 2019-05-08 2021-01-28 Psa Automobiles Sa Method for contactless interaction with a module, computer program product, module and motor vehicle

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050063564A1 (en) * 2003-08-11 2005-03-24 Keiichi Yamamoto Hand pattern switch device
US20060136846A1 (en) * 2004-12-20 2006-06-22 Sung-Ho Im User interface apparatus using hand gesture recognition and method thereof
US20110286676A1 (en) * 2010-05-20 2011-11-24 Edge3 Technologies Llc Systems and related methods for three dimensional gesture recognition in vehicles

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8411034B2 (en) * 2009-03-12 2013-04-02 Marc Boillot Sterile networked interface for medical systems
US8589824B2 (en) * 2006-07-13 2013-11-19 Northrop Grumman Systems Corporation Gesture recognition interface system
DE102009036369A1 * 2009-08-06 2011-02-10 Volkswagen Ag Method for operating an operating device and operating device in a vehicle
JP5648207B2 * 2009-09-04 2015-01-07 Hyundai Motor Company Vehicle operating device
US8547327B2 (en) * 2009-10-07 2013-10-01 Qualcomm Incorporated Proximity object tracker
US8817087B2 (en) * 2010-11-01 2014-08-26 Robert Bosch Gmbh Robust video-based handwriting and gesture recognition for in-car applications
US8861797B2 (en) * 2010-11-12 2014-10-14 At&T Intellectual Property I, L.P. Calibrating vision systems
US8619049B2 (en) * 2011-05-17 2013-12-31 Microsoft Corporation Monitoring interactions between two or more objects within an environment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050063564A1 (en) * 2003-08-11 2005-03-24 Keiichi Yamamoto Hand pattern switch device
US20060136846A1 (en) * 2004-12-20 2006-06-22 Sung-Ho Im User interface apparatus using hand gesture recognition and method thereof
US20110286676A1 (en) * 2010-05-20 2011-11-24 Edge3 Technologies Llc Systems and related methods for three dimensional gesture recognition in vehicles

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MARTIN ZOBL ET AL: "Gesture Components for Natural Interaction with In-Car Devices", 6 February 2004 (2004-02-06), GESTURE-BASED COMMUNICATION IN HUMAN-COMPUTER INTERACTION; [LECTURE NOTES IN COMPUTER SCIENCE; LECTURE NOTES IN ARTIFICIAL INTELLIGENCE; LNCS], SPRINGER-VERLAG, BERLIN/HEIDELBERG, PAGE(S) 448-459, XP019003059, ISBN: 978-3-540-21072-6, pages 448-459, the whole document *

Also Published As

Publication number Publication date
DE102013000081A1 (de) 2014-07-10
WO2014108160A3 (fr) 2014-11-27
DE102013000081B4 (de) 2018-11-15

Similar Documents

Publication Publication Date Title
EP2943367B1 (fr) Procédé de synchronisation de dispositifs d'affichage d'un véhicule automobile
EP2451672B1 (fr) Procédé et dispositif permettant de fournir une interface utilisateur dans un véhicule
EP1998996B1 (fr) Serveur interactif et procédé permettant de faire fonctionner le serveur interactif
EP3507681B1 (fr) Procédé d'interaction avec des contenus d'image qui sont représentés sur un dispositif d'affichage dans un véhicule
DE102013012394A1 (de) Verfahren und Vorrichtung zur Fernsteuerung einer Funktion eines Fahrzeugs
EP3358454B1 (fr) Interface utilisateur, véhicule et procédé de distinction de l'utilisateur
DE102012020607B4 (de) Kraftwagen mit einer Gestensteuerungseinrichtung sowie Verfahren zum Steuern eines Auswahlelements
WO2014108147A1 (fr) Zoom et déplacement d'un contenu d'image d'un dispositif d'affichage
DE102013000071B4 (de) Synchronisieren von Nutzdaten zwischen einem Kraftfahrzeug und einem mobilen Endgerät
EP2196359B1 (fr) Procédé de fonctionnement et dispositif de fonctionnement
DE102013000069B4 (de) Kraftfahrzeug-Bedienschnittstelle mit einem Bedienelement zum Erfassen einer Bedienhandlung
WO2014108160A2 (fr) Interface utilisateur destinée à la sélection sans fil d'une fonction d'un appareil
WO2014108150A2 (fr) Interface utilisateur pour une entrée de caractères manuscrite dans un appareil
EP3426516B1 (fr) Dispositif de commande et procédé pour détecter la sélection, par l'utilisateur, d'au moins une fonction de commande du dispositif de commande
EP1992515B1 (fr) Dispositif d'affichage polyvalent et de commande ainsi que procédé de fonctionnement d'un dispositif d'affichage polyvalent et de commande d'un véhicule automobile
DE102018204223A1 (de) Mobile, portable Bedienvorrichtung zum Bedienen eines mit der Bedienvorrichtung drahtlos gekoppelten Geräts, und Verfahren zum Betreiben eines Geräts mithilfe einer mobilen, portablen Bedienvorrichtung
DE102013014889A1 (de) Mauszeigersteuerung für eine Bedienvorrichtung
EP3108333B1 (fr) Interface utilisateur et procédé d'assistance d'un utilisateur lors de la commande d'une interface utilisateur
DE102015201722A1 (de) Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung
WO2015010829A1 (fr) Procédé de fonctionnement d'un dispositif d'entrée et dispositif d'entrée
EP2835719B1 (fr) Véhicule automobile doté d'un dispositif de commande commutable
DE102013000085A1 (de) Verfahren zum Wechseln eines Betriebsmodus eines Infotainmentsystems eines Kraftfahrzeugs
DE102019131944A1 (de) Verfahren zur Steuerung zumindest einer Anzeigeeinheit, Kraftfahrzeug und Computerprogrammprodukt
DE102012021252A1 (de) Bedienvorrichtung und Verfahren zum Ansteuern von Funktionseinheiten und Kraftfahrzeug
DE102019212278A1 (de) Bediensystem und Verfahren zum Betreiben des Bediensystems

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13818984

Country of ref document: EP

Kind code of ref document: A2

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
122 Ep: pct application non-entry in european phase

Ref document number: 13818984

Country of ref document: EP

Kind code of ref document: A2