WO2014108147A1 - Zooming and shifting of an image content of a display device - Google Patents

Zooming and shifting of an image content of a display device

Info

Publication number
WO2014108147A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
display device
sensor device
image content
motor vehicle
Prior art date
Application number
PCT/EP2013/003375
Other languages
German (de)
English (en)
Inventor
Michael SCHLITTENBAUER
Martin Roehder
Lorenz Bohrer
Original Assignee
Audi Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi Ag
Publication of WO2014108147A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour

Definitions

  • the invention relates to a method for operating a display device by means of which an image content is displayed on a screen.
  • a location and / or a scaling of the image content can be changed by moving a hand in front of the screen.
  • a zooming and / or shifting of the image content can be effected by a hand movement.
  • a position of the hand is repeatedly determined by an optical sensor device.
  • the invention also includes a motor vehicle, a mobile terminal such as a smartphone or a tablet PC, and a computer program product which upgrades a conventional mobile terminal so that it can carry out the method according to the invention.
  • the device has a touch-sensitive control panel over which a user can hold a hand in order to move or scale a map on a screen by moving the hand.
  • the movement of the hand is detected by a CCD camera or a CMOS camera.
  • a distance of the palm from the control panel can be detected by ultrasonic sensors. Another possibility for the distance determination is the evaluation of brightness values of the palm.
  • when the user withdraws the hand at the end of an operating action, the HMI device also detects this movement. It can then happen that the map section is shifted once more and its scale is changed. Such "tearing" at the end of an operating action should be avoided if possible.
  • a method for operating a navigation system is likewise known. According to it, a position of a finger is detected relative to a display device of the navigation device, and a representation of navigation information is adjusted as a function of the determined finger position. The user can thereby, for example, enlarge a region of the navigation information (scaling). Shifting the displayed section is likewise possible.
  • the position of the finger above the display surface can be detected with four optical sensors mounted at the corners of the display surface. For this purpose, the finger is illuminated by an LED with modulated light, which impinges on the sensors at different times. A run-time (time-of-flight) measurement then yields the position of the finger.
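
For illustration only (not part of the publication): with run times measured at four known sensor positions, the finger position can be recovered by multilateration. A minimal Python sketch, assuming the LED sits next to each sensor so that half the measured path equals the sensor-to-finger distance; all names and the geometry are illustrative:

    import numpy as np

    C = 299_792_458.0  # speed of light in m/s

    def finger_position(sensors: np.ndarray, runtimes: np.ndarray) -> np.ndarray:
        """Estimate the finger position from light run times measured at
        four sensors mounted at the corners of the display surface."""
        d = C * runtimes / 2.0  # sensor-to-finger distances (see assumption above)
        # Subtracting the first sphere equation |x - s_i|^2 = d_i^2 from the
        # others linearizes the problem into A x = b:
        A = 2.0 * (sensors[1:] - sensors[0])
        b = (np.sum(sensors[1:] ** 2, axis=1) - np.sum(sensors[0] ** 2)
             - d[1:] ** 2 + d[0] ** 2)
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos
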
  • to stop the control, the user can either first move the finger into a predetermined volume of space above the display surface in which a movement of the finger does not change the image content.
  • Another possibility is to change the "areal extent" of the finger or to carry out a specific movement with the finger; in this case, the user has to move the finger very close to the display surface and then move it away laterally.
  • Such "gestures" can be recognized by the system and used to stop the motion detection.
  • An embodiment provides that operating elements integrated into the map display can also be selected with the finger. For this purpose, the finger must dwell on the operating element for a predetermined operating period.
  • the invention has the object of providing a user with an easy-to-use display device which allows changing a position and/or a scaling (i.e. moving and/or zooming) of an image content on a screen.
  • the method according to the invention sets a position and / or a scaling of an image content on a screen of a display device as a function of a current position of a user's hand. If the hand changes its position, the position of the image content is thereby changed accordingly (shifting of the image content) and / or the scaling of the image content is changed (zooming in or out). The position of the hand is detected by an optical sensor device.
  • the display device may be, for example, that of a navigation device or an infotainment system of a motor vehicle.
  • the invention also encompasses embodiments of the method which are executed on a mobile terminal, for example a smartphone, a tablet PC, a personal digital assistant (PDA) or a notebook, so as to shift and/or zoom the image content of its screen.
  • the image content is only changed by the display device as a function of the determined position of the hand if the display device is in a mode which is referred to here as a coupling mode.
  • the hand first has to be "coupled" to the display device. For this purpose, according to the method, a relative finger position of several fingers of the hand is recognized by a gesture recognition device of the optical sensor device. Changing the finger position itself does not cause the described "tearing", because the position of the hand is decisive and not that of the individual fingers.
  • the display device is operated in the coupling mode only while the gesture recognition device signals that the finger position matches a predetermined operating gesture.
  • the gesture recognition device checks, as the operating gesture, whether at least two fingertips touch each other. By bringing the fingers together, the user thus activates the shifting or zooming process (coupling mode activated) and can then adjust the image content intuitively to his needs by moving his hand. If he then separates the fingertips again (coupling mode deactivated), the hand is decoupled from the display device and he can change the position of the hand arbitrarily without affecting the image content.
  • the described operating gesture (touching of fingertips) has the additional advantage that the user himself generates haptic feedback at the fingertips, so that it is always clear to him when the coupling mode is active.
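
A minimal sketch of such a fingertip-touch check, assuming the gesture recognition device delivers 3D fingertip coordinates in metres; the threshold value and all names are illustrative assumptions, not taken from the publication:

    import numpy as np

    TOUCH_THRESHOLD_M = 0.015  # ~1.5 cm, an assumed tolerance for sensor noise

    def coupling_active(thumb_tip: np.ndarray, index_tip: np.ndarray) -> bool:
        """Coupling mode is active while at least two fingertips touch, here
        approximated as thumb and index tip being closer than a threshold."""
        return float(np.linalg.norm(thumb_tip - index_tip)) < TOUCH_THRESHOLD_M
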
  • the inventive method has the advantage that the described "tearing" of the image content can now be easily avoided.
  • the independent detection of the hand position on the one hand and the finger position on the other hand allows the user to couple his hand movements with the image content at any time, as desired.
  • 3D image data of a 3D camera of the sensor device are preferably used as the basis, the 3D camera preferably being arranged outside the display device and monitoring a spatial region in front of the screen from there.
  • a TOF camera (TOF: time of flight) is preferably used for this purpose.
  • a TOF camera is characterized by a particularly large detection range, so that multiple screens can be monitored by means of a single TOF camera in the manner described and their image content controlled.
  • shading is not a problem with such a camera either; it results from the fact that, by means of a 3D camera, a hand can usually be viewed from one side only.
  • a further advantage of a TOF camera is that it can be used to record 3D image data even under changing lighting conditions. If infrared light is used for image acquisition, it can be reliably distinguished from, for example, warming solar radiation on the basis of the modulation pattern of the TOF light source.
  • a further advantage results if the display device sets, as a reference position, an initial position which the hand has when the coupling mode is activated, then determines in the coupling mode a distance of the current position of the hand from the reference position, and sets the scaling and/or the shift (change in position) as a function of this distance.
  • the user can "catch up" (regrip) in an advantageous manner, meaning that the user only has to perform small hand movements even for a large displacement. For example, for a large shift of the image content to the right, the user can first activate the coupling mode (e.g. bring the fingertips together), move the hand to the right, deactivate the coupling mode, move the hand back and repeat the process.
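
One way the reference-position logic and the "catching up" could fit together, sketched in Python under the assumption that a hand position and the coupling state arrive once per frame; the class, the gains and the axis conventions are illustrative, not taken from the publication:

    import numpy as np

    class MapController:
        """Latches a reference position P0 when the coupling mode is activated
        and maps the hand offset from P0 to pan and zoom of the image content."""

        PAN_GAIN = 500.0   # assumed: map pixels per metre of lateral movement
        ZOOM_GAIN = 3.0    # assumed: zoom octaves per metre of depth movement

        def __init__(self) -> None:
            self.p0 = None                   # reference position P0
            self.base_offset = np.zeros(2)   # pan committed by earlier grips
            self.base_zoom = 1.0             # zoom committed by earlier grips
            self.offset = np.zeros(2)
            self.zoom = 1.0

        def update(self, hand_pos: np.ndarray, coupled: bool) -> None:
            if not coupled:
                if self.p0 is not None:      # decoupling: commit, then release
                    self.base_offset = self.offset.copy()
                    self.base_zoom = self.zoom
                    self.p0 = None
                return
            if self.p0 is None:              # coupling: latch initial position
                self.p0 = hand_pos.copy()
            delta = hand_pos - self.p0       # distance from P0, per axis
            self.offset = self.base_offset + self.PAN_GAIN * delta[:2]
            self.zoom = self.base_zoom * 2.0 ** (self.ZOOM_GAIN * delta[2])

Because the committed state survives decoupling, repeated couple-move-release cycles accumulate, which is exactly the described "catching up": a large total displacement from several small hand movements.
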
  • the motor vehicle according to the invention has a display device with a screen for displaying an image content, for example a navigation map, as well as an optical sensor device.
  • the optical sensor device is designed to generate a signal as a function of a position of a hand in front of the screen, on the basis of which the display device then changes the position and/or scaling of the image content on the screen.
  • the motor vehicle according to the invention is characterized by a gesture recognition device and is designed to carry out an embodiment of the method according to the invention by means of this gesture recognition device.
  • a development of the motor vehicle according to the invention comprises a 3D camera for detecting the position of the hand and/or for determining the finger position, the 3D camera being arranged in a headliner of the motor vehicle. From this position, a plurality of screens can preferably be observed by means of a single 3D camera or a single stereo arrangement of two 3D cameras, and operation in the manner described is made possible for each of these screens. This allows multiple screens to be controlled with particularly little effort.
  • if a motor vehicle already has an operating device with an optical sensor device which is also suitable for carrying out an embodiment of the method according to the invention, it is provided in particular that, for a further display device which is distinct from this already existing operating device, the optical sensor device of the existing operating device is used to allow control of the image content by hand for the further display device.
  • by reusing this optical sensor device, no additional optical sensor device has to be provided for the further display device, which brings an advantageous reduction of the hardware expenditure when manufacturing the motor vehicle.
  • the invention also encompasses a computer program product with a program code stored on at least one storage medium, which is designed to execute an embodiment of the method according to the invention when the program code is executed by a processor device of a mobile terminal on the basis of image data of an optical sensor device of the mobile terminal.
  • the computer program is provided in particular in the form of a so-called "app" (application program), as used on smartphones or tablet PCs.
  • the invention also includes a correspondingly equipped mobile terminal with such a computer program product. The terminal may be provided in the embodiments already mentioned as examples (smartphone, tablet PC, etc.).
  • FIG. 2 is a sketch of an operating procedure as it is made possible for an operator by an embodiment of the method according to the invention.
  • the display device 12 can be, for example, an infotainment system, an audio system, a navigation system, a television system, a telephone, an instrument cluster or a head-up display.
  • the sensor device 10 comprises a measuring device 14 and a calculation unit 16.
  • the measuring device 14 comprises an optical sensor 18, which may be, for example, a TOF camera such as a PMD camera (PMD: photonic mixer device).
  • the optical sensor 18 can also be, for example, a stereo camera or even TOF cameras in stereo arrangement. In the example shown in FIG. 1, it has been assumed that the optical sensor 18 is a PMD camera.
  • the optical sensor 18 may be arranged, for example, in a headliner of the motor vehicle.
  • the optical sensor 18 may be configured in a manner known per se, i.e. a light source 20, e.g. an infrared light source, illuminates a detection area 22, for example a space above a center console of the motor vehicle. If an object is located in it, for example a hand 24 of the driver of the motor vehicle, the electromagnetic radiation emitted by the light source 20 is reflected by the hand 24 back to a sensor array 26. By means of the sensor array 26, 3D image data can then be generated which specify 3D coordinates for individual surface elements of the hand 24. The 3D image data are transmitted from the measuring device 14 to the calculation unit 16.
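
For orientation, a PMD/TOF camera of this kind derives a depth per pixel from the phase shift between the modulated light source 20 and the signal arriving at the sensor array 26, following the usual relation d = c·Δφ / (4π·f_mod). A worked sketch; the modulation frequency here is only an example value:

    import math

    C = 299_792_458.0  # speed of light in m/s
    F_MOD = 20e6       # example modulation frequency: 20 MHz

    def tof_distance(phase_shift_rad: float) -> float:
        """Distance from the measured phase shift; unambiguous up to
        c / (2 * F_MOD), i.e. about 7.5 m at 20 MHz."""
        return C * phase_shift_rad / (4.0 * math.pi * F_MOD)

    # A phase shift of pi/2 corresponds to roughly 1.87 m:
    print(round(tof_distance(math.pi / 2), 2))
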
  • the calculation unit 16 may be, for example, a control unit of the motor vehicle.
  • in the calculation unit 16, the signals are evaluated and the evaluated data are then made available to the vehicle, for example by being transmitted to the display device 12.
  • limbs, such as a hand, can be segmented from the 3D image data, whereby, for example, the position of a fingertip in the detection area 22 can be determined.
  • segmentation algorithms known per se can be used for this purpose.
  • the 3D image data of the sensor array 26 of the optical sensor 18 may also represent a sequence of successive 3D images, i.e. movements of the hand 24 can also be detected with the optical sensor 18. By following the trajectory of, for example, the fingertip in this 3D image sequence, in particular by tracking the position and the speed of the fingertip, a motion gesture indicated by the fingertip can be extrapolated from the trajectory.
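
A sketch of such trajectory tracking, assuming the segmentation step delivers one fingertip coordinate per 3D frame; the smoothing constant and all names are illustrative assumptions:

    import numpy as np

    class FingertipTracker:
        """Follows a fingertip across successive 3D frames and estimates its
        velocity, from which a motion gesture can be extrapolated."""

        def __init__(self, dt: float, alpha: float = 0.5) -> None:
            self.dt = dt        # frame interval in seconds
            self.alpha = alpha  # weight of the newest velocity sample
            self.pos = None
            self.vel = np.zeros(3)

        def update(self, fingertip: np.ndarray) -> np.ndarray:
            """Feed one 3D fingertip position (metres); returns smoothed velocity."""
            if self.pos is not None:
                raw_vel = (fingertip - self.pos) / self.dt
                self.vel = self.alpha * raw_vel + (1.0 - self.alpha) * self.vel
            self.pos = fingertip.copy()
            return self.vel
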
  • FIGS. 2 and 3 illustrate the mode of operation of the optical sensor device as a user interface in connection with a navigation device, which here serves as the display device 12; the display device 12 is therefore referred to below as the navigation device 12, with the same reference number.
  • the navigation device 12 displays on a screen 28, which may be installed, for example, in a center console of the motor vehicle, a road map 30 as image content.
  • the optical sensor 18 may, for example, be arranged in a headliner above the center console and observe a spatial region above the center console as the detection area 22, i.e. it generates 3D image data for objects located in this spatial region.
  • the measuring device 14 transmits the 3D image data to the calculation unit 16, which determines in the described manner whether a hand 24 is located in front of the screen 28 and which finger position the hand has.
  • the calculation unit 16 determines a current position P of the hand, for example of a reference point on the back of the hand 24, as well as a position of a fingertip 32 of a thumb 34 and of a fingertip 36 of an index finger 38.
  • the finger position of the thumb 34 and the index finger 38, in particular the relative position of the fingertips 32, 36, can be compared with a predetermined target gesture. This may be, for example, that the fingertips 32, 36 touch each other. If this is the case, this is signaled to the navigation device 12 together with the current position P.
  • the user can then move the road map 30 laterally on the screen 28 with his hand 24, i.e. up, down, right or left, and also change the scale of the road map 30. He does not have to touch the screen 28 for this purpose; he can simply either move his hand 24 correspondingly in front of the screen 28 (moving the map 30) or move it away from or towards the screen 28 (changing the scale).
  • FIG. 2 shows how the user can perform a movement in front of the screen 28 with the hand 24 without the image content of the screen 28 changing. For this, he keeps the fingertips 32, 36 separated from each other. In FIG. 2, this is symbolized by two separate circles around the fingertips 32, 36.
  • if the user has positioned his hand 24 in front of the screen 28 while the map view is activated (road map 30 is displayed), he can activate the control of the map by bringing together, as shown in FIG. 3, in the example the thumb 34 and the index finger 38, so that the fingertips 32, 36 touch. In FIG. 3, this state is symbolized by a single closed curve around both fingertips 32, 36.
  • the current position is then defined as the reference point P0 for the control of the navigation device 12.
  • the current position P of the hand 24 can subsequently be changed further. This changes a distance A between the current position P and the reference point P0.
  • the representation of the road map 30 is then changed as a function of the distance A and the direction of movement of the hand 24. For example, by movements 40 of the hand 24 towards and away from the screen 28, the user zooms into or out of the road map 30. With a horizontal or vertical movement 42 in front of the screen 28, the road map 30 is moved in the corresponding direction 42.
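
As a numerical sketch of this mapping (gains, signs and units are illustrative assumptions, not from the publication): a lateral movement 42 pans the road map, a depth movement 40 changes the scale.

    import numpy as np

    def map_update(delta: np.ndarray, offset: np.ndarray, scale: float):
        """delta: hand displacement (x, y, z) in metres relative to P0.
        Lateral components pan the map, the depth component zooms it."""
        new_offset = offset + 500.0 * delta[:2]       # assumed: 500 px per metre
        new_scale = scale * 2.0 ** (3.0 * delta[2])   # assumed: 3 octaves per metre
        return new_offset, new_scale

    # Hand moved 10 cm to the right and 5 cm towards the screen:
    off, sc = map_update(np.array([0.10, 0.0, -0.05]), np.zeros(2), 1.0)
    print(off, round(sc, 2))  # -> [50.  0.] and ~0.90
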
  • the example shows how the spatially correct localization of an occupant or of body parts of the occupant (for example by a TOF camera) can be used to detect gestures or hand configurations. If the user positions his hand in front of a display while the map view is activated, control by the hand can be activated or deactivated by means of a corresponding finger position. No button is needed for the interaction.
  • the user can navigate in the map with intuitive gestures by positioning the hand.
  • by means of a gesture recognition unit, which may be provided in the calculation unit 16, gestures of the fingers can be extracted in a manner known per se, whereby the position of fingertips can be determined at any time. If the trajectory of the hand, e.g. of a fingertip, is then followed, control signals for adjusting the position and/or scaling of the road map and other image content can be generated.
  • the driver does not need to locate keys or softkeys (panels on a screen) before using a shift and/or scaling function, but can activate the map control immediately, with the hand freely positioned in front of the screen 28. This also advantageously reduces the distraction of the driver when operating a navigation device or another display device while driving.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for operating a display device (12) by means of which an image content (30) is displayed on a screen (28). An optical sensor device (10) determines a position (P) of a hand (24) in front of the screen (28), and the image content is shifted and/or zoomed as a function of the current position (P). The object of the invention is to provide a user with an easy-to-use display device which allows the image content (30) to be shifted and/or zoomed. To this end, the image content (30) is changed by the display device as a function of the determined position (P) of the hand (24) only if the display device (12) is in a coupling mode. A gesture recognition device (16) of the optical sensor device recognizes a relative position of several fingers (34, 38) of the hand (24), and the display device (12) is operated in the coupling mode only while the gesture recognition device signals that the position of the fingers (34, 38) matches a predefined operating gesture.
PCT/EP2013/003375 2013-01-08 2013-11-08 Zooming and shifting of an image content of a display device WO2014108147A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102013000066.1A DE102013000066A1 (de) 2013-01-08 2013-01-08 Zooming and shifting of an image content of a display device
DE102013000066.1 2013-01-08

Publications (1)

Publication Number Publication Date
WO2014108147A1 (fr)

Family

ID=49582701

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/003375 WO2014108147A1 (fr) 2013-01-08 2013-11-08 Zooming and shifting of an image content of a display device

Country Status (2)

Country Link
DE (1) DE102013000066A1 (fr)
WO (1) WO2014108147A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014207637A1 (de) 2014-04-23 2015-10-29 Bayerische Motoren Werke Aktiengesellschaft Gesture interaction with a driver information system of a vehicle
DE102014017179B4 (de) * 2014-11-20 2022-10-06 Audi Ag Method for operating a navigation system of a motor vehicle by means of an operating gesture
DE102015211521A1 (de) * 2015-06-23 2016-12-29 Robert Bosch Gmbh Method for operating an input device, input device
DE102016212682A1 (de) 2016-07-12 2018-01-18 Audi Ag Gesture control by means of a time-of-flight camera system
DE102017213435A1 (de) 2017-08-02 2019-02-07 Audi Ag Method and device for operating an operating device in vehicles

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006037157B4 (de) 2005-09-12 2018-04-12 Brose Fahrzeugteile Gmbh & Co. Kommanditgesellschaft, Coburg Plastic door module for a motor vehicle door
US8411034B2 (en) * 2009-03-12 2013-04-02 Marc Boillot Sterile networked interface for medical systems
US9696808B2 (en) * 2006-07-13 2017-07-04 Northrop Grumman Systems Corporation Hand-gesture recognition method
JP2008250774A (ja) 2007-03-30 2008-10-16 Denso Corp Information device operating apparatus
WO2012020865A1 (fr) * 2010-08-13 2012-02-16 엘지전자 주식회사 Mobile terminal, display device and method for controlling same
US8817087B2 (en) * 2010-11-01 2014-08-26 Robert Bosch Gmbh Robust video-based handwriting and gesture recognition for in-car applications

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1830244A2 * 2006-03-01 2007-09-05 Audi Ag Method and device for operating at least two functional components of a system, in particular of a vehicle
US20110218696A1 (en) * 2007-06-05 2011-09-08 Reiko Okada Vehicle operating device
WO2010147600A2 * 2009-06-19 2010-12-23 Hewlett-Packard Development Company, L.P. Qualified command
DE102011089195A1 * 2011-06-30 2013-01-03 Johnson Controls Gmbh Device and method for the contactless detection of objects and/or persons and of gestures and/or operating actions carried out by them

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10821831B2 (en) 2016-09-01 2020-11-03 Volkswagen Aktiengesellschaft Method for interacting with image contents displayed on a display device in a transportation vehicle
WO2021082711A1 (fr) * 2019-10-30 2021-05-06 维沃移动通信有限公司 Image display method and electronic device

Also Published As

Publication number Publication date
DE102013000066A1 (de) 2014-07-10

Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13789710

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 13789710

Country of ref document: EP

Kind code of ref document: A1