EP3108332A1 - User interface and method for switching a first operating mode of a user interface into a 3D gesture mode - Google Patents

User interface and method for switching a first operating mode of a user interface into a 3D gesture mode

Info

Publication number
EP3108332A1
EP3108332A1
Authority
EP
European Patent Office
Prior art keywords
user interface
user
hand
predefined
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP15705545.0A
Other languages
German (de)
English (en)
Inventor
Holger Wild
Mark Peter Czelnik
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of EP3108332A1 (fr)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • The present invention relates to a user interface and a method for changing from a first operating mode of a user interface into a 3D gesture mode, by means of which the user interface can be operated by a plurality of gestures executed freely in space.
  • In particular, the present invention relates to avoiding erroneous operation through a comfortable yet secure initialization of a 3D gesture mode.
  • In 3D gesture operation, the gesture is detected and converted into control commands. Both the start of the 3D gesture operation and the avoidance of unintended inputs pose challenges here.
  • DE 10 2012 000 263 A1 discloses a method and a device for operating functions in a vehicle using gestures executed in three-dimensional space. If a detection device detects a hand or a finger of the user in a valid detection area for a first predetermined period of time, a gesture control is activated.
  • For the learning process of the user, it is proposed to permit blind operation by means of visual and/or acoustic feedback during a gesture operation.
  • DE 10 2006 037 156 A1 describes an interactive operating device by means of which gesture operation of graphical content displayed on a display device is made possible. Depending on a determined operating intention, the graphical content displayed on the display device is optimized for activating a function associated with it. For example, when a button is approached, the button is enlarged.
  • DE 10 2009 008 041 A1 discloses a method for operating a motor vehicle having a touchscreen, in which a proximity sensor system is provided for detecting a gesture for executing or initializing the execution of a function of the motor vehicle. If a gesture of a hand is performed in the vicinity of the touchscreen without touching it, the function of the motor vehicle is executed.
  • To solve the problem identified above, a method, a user interface, a computer program product, a signal sequence and a means of transportation are proposed.
  • The method serves to switch from a first operating mode, in which 3D gesture operation for starting a plurality of functions of the user interface, or of a computer system connected to it by communication technology, is not possible, into a 3D gesture mode, by means of which the user interface can be operated by a plurality of gestures executed freely in space.
  • The first operating mode can be based entirely on operation via hardware-configured buttons, rotary push controls and/or a touch-sensitive surface of a display unit (e.g., a touchscreen).
  • In the first operating mode, functions associated with graphical contents are not called by 3D gestures.
  • According to the invention, a hand of a user is detected in a predefined detection area of a sensor in a first step.
  • The sensor can, for example, be set up to detect gestures executed freely in three-dimensional space.
  • Subsequently, a predefined posture of the user's hand and/or a predefined movement of the user's hand is detected.
  • The predefined posture of the user's hand and/or the predefined movement of the hand can thus be used as a condition for starting the 3D gesture mode.
  • Furthermore, the method according to the invention comprises starting a timer.
  • The timer may be started in response to detecting the predefined posture of the user's hand and/or in response to detecting the predefined movement of the user's hand.
  • The system changes into the 3D gesture mode only if the predefined posture and/or movement is not ended before the timer expires.
  • When the timer expires, it is checked whether the predefined posture/movement is still in progress, and the change from the first operating mode into the 3D gesture mode takes place only in response to both conditions being met.
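The timer-gated switching condition described above can be sketched as a small state machine. The following is a minimal illustration in Python; the class name, method names and the dwell time are assumptions for illustration, not values specified by the patent:

```python
class GestureModeSwitch:
    """Sketch of the timer-gated switch from a first operating mode
    into the 3D gesture mode (names are illustrative assumptions)."""

    HOLD_SECONDS = 0.8  # assumed dwell time; the patent does not fix a value

    def __init__(self):
        self.mode = "first"           # start in the first operating mode
        self.timer_started_at = None  # timer not yet running

    def on_predefined_posture_detected(self, now):
        # Start the timer when the predefined posture/movement is first seen.
        if self.timer_started_at is None:
            self.timer_started_at = now

    def on_posture_ended(self):
        # Posture ended before the timer expired: stay in the first mode.
        self.timer_started_at = None

    def on_tick(self, now):
        # On timer expiry, switch only if the posture is still being held.
        if (self.timer_started_at is not None
                and now - self.timer_started_at >= self.HOLD_SECONDS):
            self.mode = "3d_gesture"
            self.timer_started_at = None
```

Note that both conditions of the method are enforced: the switch happens only when the timer has expired and the posture has not been ended in the meantime.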
  • The predefined posture may include, for example, a palm opened in the direction of a sensor used for 3D gesture recognition and/or in the direction of a display unit of the user interface.
  • The opened palm is substantially parallel to a surface of the display unit or oriented substantially perpendicularly in the direction of the sensor.
  • Alternatively or additionally, another gesture or other gestures may be predefined to start or end the gesture operation. Further examples are a so-called "thumbs-up" gesture or a gripping gesture (clenching into a fist a hand initially opened, for example, in the direction of the sensor used and/or in the direction of a display unit of the user interface, or bringing several or all fingertips together). Such gestures have proven to be reliably detectable by sensors and therefore suitable for starting the 3D gesture mode.
  • Subsequently, an optical and/or acoustic indication can be given to the user that the 3D gesture mode is now activated.
  • This indication can also be made already in response to the change into the 3D gesture mode. In this way it is avoided that the user maintains the predefined posture or carries out the movement for an unnecessarily long time, although 3D gesture operation can already take place.
  • As an optical indication, for example, a hand symbol can be displayed, in particular animated, in an edge area of a graphical content of a display unit.
  • As an acoustic indication, a sound signal, a click sound or a voice output can be provided.
  • According to a second aspect, a user interface is proposed, comprising a sensor for detecting gestures executed freely in space by a user's hand. Furthermore, the user interface has a display unit for displaying arbitrary graphical content.
  • The display unit may be configured, for example, as a matrix screen embedded in the dashboard of a means of transportation as an instrument cluster and/or as a central information display.
  • In addition, an evaluation unit, which may comprise a programmable processor, for example, is provided for detecting a plurality of gestures in the signals of the sensor.
  • Such evaluation units are also referred to as electronic control units (ECU).
  • The evaluation unit is set up to detect a predefined posture of the user's hand and/or a predefined movement of the user's hand. For this purpose, the evaluation unit can analyze the detected movement patterns and compare them with stored references (for example, in a local database or one accessed wirelessly). If the match is successful, in other words, if a detected 3D gesture is associated with the start of the 3D gesture mode, the evaluation unit switches the user interface according to the invention from a first operating mode into the 3D gesture mode.
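The comparison of a detected posture against stored references might look as follows in a minimal sketch; the feature names and the reference table are purely illustrative assumptions, not part of the patent:

```python
# Hypothetical reference table of hand postures, keyed by feature values
# an evaluation unit might extract from the sensor signals.
REFERENCES = {
    "open_palm": {"fingers_extended": 5, "palm_facing_sensor": True},
    "fist":      {"fingers_extended": 0, "palm_facing_sensor": True},
    "point":     {"fingers_extended": 1, "palm_facing_sensor": False},
}

START_GESTURE = "open_palm"  # gesture associated with starting the 3D mode


def match_reference(detected):
    """Return the name of the stored reference whose features all match
    the detected posture, or None if no reference matches."""
    for name, reference in REFERENCES.items():
        if all(detected.get(key) == value for key, value in reference.items()):
            return name
    return None


def starts_3d_gesture_mode(detected):
    # A successful match with the start gesture triggers the mode change.
    return match_reference(detected) == START_GESTURE
```

A real evaluation unit would compare continuous movement patterns rather than discrete feature dictionaries; the sketch only shows the match-then-switch logic.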
  • The sensor may be an optical sensor, which is arranged, for example, in the headliner of a means of transportation.
  • a sensor operating in the infrared range may also be provided to enable or support gesture recognition.
  • an optical and / or acoustic output unit for outputting an indication of the successful start of the 3-D gesture mode can be provided.
  • LEDs or matrix displays have prevailed as optical output units.
  • Loudspeakers are preferably used for outputting acoustic information (for example, speech output, sound signals, tones).
  • According to a third aspect of the present invention, a computer program product (e.g., a data memory) is proposed, on which instructions are stored which enable a programmable processor of an evaluation unit of a user interface according to the second-mentioned aspect of the present invention to perform the steps of a method according to the first-mentioned aspect of the invention.
  • The computer program product can be designed as a CD, DVD, Blu-ray disc, flash memory, hard disk, RAM/ROM, cache, etc.
  • According to a further aspect, a signal sequence is proposed which represents instructions enabling a programmable processor of an evaluation unit of a user interface according to the second-mentioned aspect of the present invention to carry out the steps of a method according to the first-mentioned aspect of the invention.
  • In this way, the provision of the instructions by information technology is also placed under protection for the case in which the storage means required for this purpose are outside the scope of the appended claims.
  • According to a further aspect, a user terminal is proposed, which may, for example, be designed as a portable electronic device.
  • In particular, wireless electronic communication devices such as smartphones, tablet PCs or laptops/netbooks are suitable.
  • Portable user terminals often already have all the hardware required to support 3D gesture operation. With regard to the user terminal and the associated features and advantages, reference is made to the above statements.
  • According to a further aspect, a means of transportation is proposed, which may be designed, for example, as a car, van, truck, aircraft and/or watercraft.
  • The display unit can be designed as a central screen fixedly built into a dashboard of the means of transportation and/or as an instrument cluster. In this way, the means of transportation is also set up to solve the problem underlying the invention or for use in a method according to the invention.
  • Figure 1 is a schematic overview of components of an embodiment of a user interface according to the invention in a car designed according to the invention;
  • Figure 2 is a schematic overview of components of an embodiment of a user interface according to the invention in a user terminal designed according to the invention;
  • FIG. 3 shows a diagram illustrating a first method step of a first exemplary embodiment of a method according to the invention;
  • FIG. 4 shows a selection of gestures that can be used according to the invention;
  • FIG. 5 shows a diagram illustrating a first method step of a second exemplary embodiment of a method according to the invention; and
  • FIG. 6 shows a flow chart illustrating steps of an embodiment of a method according to the invention.
  • Figure 1 shows a car 10 as a means of transportation, in which a screen 3 of a user interface according to the invention is inserted in the dashboard of the car 10 as a display unit. Below the screen 3, a sensor 5 is arranged, which spans a detection area 9 in space in front of the screen 3. A loudspeaker 11 is provided for outputting indications of the user interface 1. Finally, a data memory 8 for providing predefined references for gesture recognition is also included.
  • The aforementioned components are connected in terms of information technology to an evaluation unit in the form of an electronic control unit 7.
  • The electronic control unit 7 is also set up to display graphical content 2 on the screen 3.
  • FIG. 2 shows an overview of components of an embodiment of a user interface according to the invention in a user terminal 20 according to the invention in the form of a smartphone.
  • The screen 3 of the smartphone represents the display unit, a microprocessor 7 the evaluation unit, a flash memory 8 the memory means and a loudspeaker 11 a signal generator of the user interface 1 according to the invention.
  • In addition, an infrared LED strip 5 is connected as a sensor to the microprocessor 7 in terms of information technology.
  • The latter is also coupled to an antenna 17 for wireless communication with a wireless infrastructure or other user terminals.
  • A keyboard 6 of the smartphone 20 is shown partially cut away to allow a view of the components behind it.
  • FIG. 3 shows an operating step in which a menu 2 for selecting a radio station, as graphical content on a screen 3 serving as display unit, is to be operated by a user's hand 4.
  • To start the 3D gesture mode, the user's hand 4 executes a predefined gesture in the detection area of the user interface. In the example, this consists in the open hand pointing in the direction of the screen 3 with the five fingers of the hand 4 stretched out and moderately spread.
  • The evaluation unit (not shown) is set up to convert further gestures of the hand 4 into commands for controlling the menu 2 on the screen 3.
  • FIG. 4 shows possible hand postures and gestures for interaction with a user interface 1 according to the invention. In partial figure a, the open palm of the hand 4 points in the direction of the screen 3; the fingers of the hand 4 are stretched out and (slightly) spread. Partial figure b shows a pointing gesture of the hand 4, in which only the index finger is stretched out while the remaining fingers are balled into a fist. In partial figure c, a tapping or click gesture is illustrated, in which the index finger of the hand 4 is initially stretched out and then folded in the direction of the screen 3.
  • FIG. 5 shows an operating step of successful 3D gesture operation, which may follow the sign-on process discussed above.
  • FIG. 6 shows method steps of an exemplary embodiment of a method according to the invention for changing from a first operating mode of a user interface to a 3D gesture mode.
  • In a first step 100, graphical content, which is subsequently to be operated via 3D gestures, is displayed on a display unit.
  • The user then guides his hand into a predefined detection area of a sensor of the user interface.
  • In step 200, the user's hand is detected by the sensor, and in step 300 a predefined posture and/or movement of the user's hand is recognized.
  • Gestures that are reliably detectable by sensors and yet easily distinguishable from the user's usual postures and movements are suitable here.
  • In step 400, a timer is started, the expiry of which establishes a further condition for starting the 3D gesture mode. If the predefined posture/movement is ended before the timer expires (Y), the system remains in the first operating mode; accordingly, the process returns to step 100 in this case. If, however, the timer expires before the user's hand posture/movement is ended, the system switches from the first operating mode into the 3D gesture mode in step 500 and, in step 600, outputs an optical and acoustic indication that the 3D gesture mode has been successfully activated.
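The flow of steps 100 to 600 can be sketched as a simple loop over posture samples. The event representation (one "posture"/"no_posture" sample per tick) and the tick-based timer are illustrative assumptions; the step numbers mirror the flow chart:

```python
def run_mode_switch(events, hold_ticks=3):
    """Sketch of the Figure 6 flow: events is an iterable of 'posture' /
    'no_posture' samples, one per tick; hold_ticks stands in for the
    timer duration. Returns the list of executed step numbers."""
    steps = [100]            # step 100: display graphical content
    held = 0                 # ticks for which the posture has been held
    for sample in events:
        if sample == "posture":
            if held == 0:
                steps += [200, 300, 400]  # hand detected, posture
                                          # recognized, timer started
            held += 1
            if held >= hold_ticks:        # timer expired while still held
                steps += [500, 600]       # switch mode, output indication
                return steps
        else:
            if held:                      # posture ended before expiry
                steps.append(100)         # remain in the first mode
            held = 0
    return steps
```

For example, a posture held for the full timer duration ends in steps 500 and 600, while a posture released early returns the flow to step 100.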

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a user interface and a method for switching from a first operating mode of a user interface into a 3D gesture mode, by means of which the user interface can be operated by a plurality of gestures executed freely in space, referred to below as 3D gestures. The method comprises the following steps: detecting the user's hand (4) in a predefined area by means of a sensor, identifying a predefined posture of the user's hand (4) and/or a predefined movement of the user's hand (4), and switching from the first operating mode into the 3D gesture mode.
EP15705545.0A 2014-02-17 2015-02-06 Interface utilisateur et procédé de commutation d'un premier mode de commande d'une interface utilisateur en un mode gestuel en 3d Withdrawn EP3108332A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102014202833.7A DE102014202833A1 (de) 2014-02-17 2014-02-17 Anwenderschnittstelle und Verfahren zum Wechseln von einem ersten Bedienmodus einer Anwenderschnittstelle in einen 3D-Gesten-Modus
PCT/EP2015/052549 WO2015121173A1 (fr) 2014-02-17 2015-02-06 Interface utilisateur et procédé de commutation d'un premier mode de commande d'une interface utilisateur en un mode gestuel en 3d

Publications (1)

Publication Number Publication Date
EP3108332A1 true EP3108332A1 (fr) 2016-12-28

Family

ID=52544458

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15705545.0A Withdrawn EP3108332A1 (fr) 2014-02-17 2015-02-06 Interface utilisateur et procédé de commutation d'un premier mode de commande d'une interface utilisateur en un mode gestuel en 3d

Country Status (4)

Country Link
EP (1) EP3108332A1 (fr)
CN (1) CN106030462A (fr)
DE (1) DE102014202833A1 (fr)
WO (1) WO2015121173A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016011365A1 (de) * 2016-09-21 2018-03-22 Daimler Ag Verfahren zur Steuerung eines Kraftfahrzeugmoduls und Kraftfahrzeugmodul
DE102017201312A1 (de) 2017-01-27 2018-08-02 Audi Ag Verfahren zum Steuern eines Temperierungselements eines Behälterhalters

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110221666A1 (en) * 2009-11-24 2011-09-15 Not Yet Assigned Methods and Apparatus For Gesture Recognition Mode Control
DE202012005255U1 (de) * 2012-05-29 2012-06-26 Youse Gmbh Bedienvorrichtung mit einer Gestenüberwachungseinheit

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100575906B1 (ko) * 2002-10-25 2006-05-02 미츠비시 후소 트럭 앤드 버스 코포레이션 핸드 패턴 스위치 장치
JP3903968B2 (ja) * 2003-07-30 2007-04-11 日産自動車株式会社 非接触式情報入力装置
KR20060070280A (ko) * 2004-12-20 2006-06-23 한국전자통신연구원 손 제스처 인식을 이용한 사용자 인터페이스 장치 및 그방법
US7834847B2 (en) * 2005-12-01 2010-11-16 Navisense Method and system for activating a touchless control
DE102006037156A1 (de) 2006-03-22 2007-09-27 Volkswagen Ag Interaktive Bedienvorrichtung und Verfahren zum Betreiben der interaktiven Bedienvorrichtung
DE102009008041A1 (de) 2009-02-09 2010-08-12 Volkswagen Ag Verfahren zum Betrieb eines Kraftfahrzeuges mit einem Touchscreen
US8619029B2 (en) * 2009-05-22 2013-12-31 Motorola Mobility Llc Electronic device with sensing assembly and method for interpreting consecutive gestures
CN102221891A (zh) * 2011-07-13 2011-10-19 广州视源电子科技有限公司 实现视像手势识别的方法和系统
DE102012000263A1 (de) 2012-01-10 2013-07-11 Daimler Ag Verfahren und Vorrichtung zum Bedienen von Funktionen in einem Fahrzeug unter Verwendung von im dreidimensionalen Raum ausgeführten Gesten sowie betreffendes Computerprogrammprodukt

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110221666A1 (en) * 2009-11-24 2011-09-15 Not Yet Assigned Methods and Apparatus For Gesture Recognition Mode Control
DE202012005255U1 (de) * 2012-05-29 2012-06-26 Youse Gmbh Bedienvorrichtung mit einer Gestenüberwachungseinheit

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2015121173A1 *

Also Published As

Publication number Publication date
CN106030462A (zh) 2016-10-12
DE102014202833A1 (de) 2015-08-20
WO2015121173A1 (fr) 2015-08-20


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160919

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: WILD, HOLGER

Inventor name: CZELNIK, MARK PETER

17Q First examination report despatched

Effective date: 20181108

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20200305