EP3049909A1 - User interface and method for assisting a user in the operation of an operating unit

User interface and method for assisting a user in the operation of an operating unit

Info

Publication number
EP3049909A1
Authority
EP
European Patent Office
Prior art keywords
operating unit
user
predefined
button
sound
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP13770487.0A
Other languages
German (de)
English (en)
Inventor
Holger Wild
Mark Peter Czelnik
Gordon Seitz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volkswagen AG
Original Assignee
Volkswagen AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Volkswagen AG filed Critical Volkswagen AG
Publication of EP3049909A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/10
    • B60K35/26
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/044Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by capacitive means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • B60K2360/113
    • B60K2360/115
    • B60K2360/141
    • B60K2360/1438
    • B60K2360/1442
    • B60K2360/146
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04108Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction

Definitions

  • The present invention relates to a user interface and a method for assisting a user in the operation of a touch-sensitive operating unit.
  • In particular, the present invention relates to assisting the driver of a means of transport in operating an operating unit permanently installed in the means of transport while performing the driving task.
  • In addition to human-machine interfaces (HMIs) with rotary/push controls, pushbuttons, and other mechanical elements, touch-sensitive surfaces, which comprise for example capacitive sensors, are also known.
  • Furthermore, systems are known in which a user gesture executed in front of a screen is recognized without contact with the operating unit being required.
  • Such an operating step is referred to as "hovering."
  • The gesture is detected via optical sensors (cameras) and, alternatively or additionally, via capacitive sensors, and is assigned to a predetermined button.
  • By way of example, reference is made to the operating system "Windows 8" (registered trademark) and to "Air View" (registered trademark) of the Samsung Galaxy S4 (registered trademark).
  • WO 2009/062677 A2 shows a multimodal user interface of an infotainment system for inputting and presenting information, in which gestures are used in conjunction with voice commands to call predefined functions.
  • A speech recognition system is used to detect the voice commands.
  • The method according to the invention serves to assist a user in the operation of a touch-sensitive operating unit, which may be configured as an HMI built into a means of transport.
  • First, a presence of an input means in a predefined first area with respect to the operating unit is detected.
  • The input means may be, for example, a stylus, a user's finger, another body part of the user, etc.
  • The predefined first area may be in contact with the operating unit ("touch operation").
  • To detect the various inputs, the operating unit can comprise capacitive sensors, camera-based sensors, or the like.
  • In response, an audio output is generated by means of which the presence is acknowledged to the user.
  • The audio output comprises a first sound signal, which is context-specifically assigned to a first button displayed on the operating unit.
  • The first button may represent, for example, a predefined range of functions which the first sound signal symbolizes in a suitable manner.
  • The first sound signal can therefore also be understood as an "acoustic icon" (also called an "earcon").
  • The function associated with the button determines the character of the sound, so that on hearing it the user associates one particular function out of a plurality of functions most closely with it. In this way, reaching the predefined first area can trigger an audible indication of the function associated with the first button.
  • If the button is actuated (for example by a touch input, a gesture, etc.), the function assigned to the button is performed, which the user, according to the invention, has already recognized from the audio output. Accordingly, leaving the predefined first area can be acknowledged by the same or an alternative sound signal as part of an audio output.
  • In this way, the present invention supports the user in operating the operating unit, in particular while performing a driving task, during which the driver can devote only limited attention to the operating unit.
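
The interplay just described (an earcon acknowledging the presence of the input means before the button is actuated) can be illustrated compactly. The following Python fragment is a minimal sketch, not the patented implementation; the identifiers (Button, play, on_presence_detected) and the sound file names are hypothetical, and the audio back end is stubbed out with a print call.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Button:
        label: str
        earcon: str                 # sound signal symbolizing the button's function
        action: Callable[[], None]  # function performed when the button is actuated

    def play(sound: str) -> None:
        # Stand-in for the actual audio back end (e.g. a speaker in the vehicle).
        print(f"[audio] {sound}")

    def on_presence_detected(button: Button) -> None:
        # Presence in the predefined first area is acknowledged acoustically,
        # so the user learns the associated function before actuating anything.
        play(button.earcon)

    def on_actuation(button: Button) -> None:
        # The performed function is one the user has already recognized
        # from the preceding audio output.
        button.action()

    nav = Button("Navigation", "sonar.wav", lambda: print("start route guidance"))
    on_presence_detected(nav)  # finger enters the area -> earcon is played
    on_actuation(nav)          # finger actuates the button -> function runs
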
  • The method according to the invention further comprises detecting a presence of the input means in a predefined second area with respect to the operating unit.
  • This presence is acknowledged to the user by a predefined second audio output.
  • The second audio output includes a second sound signal that is context-specifically assigned to a second button displayed on the operating unit.
  • The second button stands for a different function than the first button; for the second sound signal in conjunction with the second button, the above applies accordingly.
  • On hearing the second sound signal, the user can thus infer the function associated with the second button and essentially rule out that the function associated with the first button would be performed. This further improves the assistance of the user in operating the user interface according to the invention while performing the driving task.
  • The predefined first area and the predefined second area may be delimited by a boundary surface lying substantially parallel to the surface of the operating unit and spaced from it in the direction perpendicular to that surface.
  • A first parallel boundary surface may be defined by the surface of the operating unit itself, or a hover area may be defined between the surface of the operating unit and the first boundary surface.
  • Vertical boundary surfaces of the first and second areas can, for example, coincide with the borders of a respective displayed button; in other words, the displayed button may be bounded by the border of a vertical projection of the first or second area onto the surface of the operating unit.
  • The first or the second area may be at a distance of -1 mm to +3 mm from the surface of the operating unit.
  • A negative distance range stands for a touch operation, while a positive distance range stands for a hover area, which typically extends up to a distance of 10 mm to 15 cm from the surface of the operating unit.
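
The distance bands just named can be expressed as a simple classifier. The sketch below is illustrative only: the function name is invented here, and the 150 mm upper bound merely encodes the "typically 10 mm to 15 cm" figure from the text rather than a value fixed by the invention.

    def classify_distance(a_mm: float) -> str:
        """Map the distance a of the input means from the screen surface to a mode."""
        if a_mm <= 0.0:
            return "touch"    # negative distances stand for a touch operation
        if a_mm <= 150.0:     # hover area, typically up to 10 mm - 15 cm
            return "hover"
        return "outside"      # no presence with respect to the operating unit

    for a in (-0.5, 25.0, 200.0):
        print(f"a = {a:6.1f} mm -> {classify_distance(a)}")
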
  • The method according to the invention can further comprise recognizing a transfer of the input means from the first area into a predefined third area with respect to the operating unit.
  • This transfer can be acknowledged by a predefined third audio output which comprises a third sound signal, context-specifically assigned to a third button displayed on the operating unit.
  • The third button stands for a third function, which is symbolized to the user by the third sound signal.
  • In this way, the user can be supported acoustically even while performing the driving task, so that his attention can remain essentially on the traffic.
  • The audio outputs may comprise a sample or a plurality of mutually related samples, which may be varied, in particular on repeated output.
  • A typical example is a clicking sound whose frequency can be changed depending on the operating step. For example, on reaching a hover area a clicking sound of a first, higher frequency can be reproduced, while on changing from a first hover area to a second hover area a clicking sound of a second, lower frequency is reproduced.
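
As a minimal sketch of this frequency variation (the concrete pitch values are hypothetical and chosen only to make the higher/lower relation concrete):

    def click_frequency(operating_step: str) -> int:
        """Return a click pitch in Hz for the given operating step."""
        if operating_step == "enter_hover":    # reaching a hover area
            return 1200                        # first, higher frequency
        if operating_step == "change_hover":   # changing between hover areas
            return 800                         # second, lower frequency
        raise ValueError(f"unknown operating step: {operating_step}")

    print(click_frequency("enter_hover"))   # -> 1200
    print(click_frequency("change_hover"))  # -> 800
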
  • The clicking sounds are an example of an only weakly context-specific association between button and sound. A closer association can be achieved, for example, by acknowledging a weather-related button with a thunder sound as the sound signal; alternatively, rain or wind noise can be used. In this way a current weather situation can even be conveyed, provided it can be aptly symbolized by one of the sounds.
  • Another example is the announcement of a telephone function by playing a DTMF tone sequence or a ringing tone as the sound signal.
  • Correspondingly, a navigation function can be announced by a sonar noise.
  • The aforementioned assignments represent relationships that are particularly intuitive to recognize and therefore help orient the user.
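
Such assignments amount to a small lookup table. The sketch below is merely illustrative: the file names are hypothetical, and the generic click models the weakly context-specific fallback mentioned above.

    EARCONS = {
        "weather":    "thunder.wav",       # alternatively rain.wav or wind.wav
        "telephone":  "dtmf_sequence.wav", # or a ringing tone
        "navigation": "sonar.wav",
    }

    def earcon_for(function: str) -> str:
        # Fall back to a generic click when no close association is defined.
        return EARCONS.get(function, "click.wav")

    print(earcon_for("weather"))  # -> thunder.wav
    print(earcon_for("media"))    # -> click.wav (weakly context-specific)
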
  • The method according to the invention can further comprise acknowledging the detected presence by a predefined alternative optical representation of a button displayed on the operating unit. In this case an increased ambient noise level can be partially compensated, since the alternative optical representation draws the user's attention to the operating unit, so that an audio output only weakly perceptible against the noise level can be recognized more easily.
  • Furthermore, the associations between a respective audio output and a respective button can be defined by the user. This can be done, for example, such that the user can assign a respective sound signal to a whole list of buttons in a configuration menu. Additional sound signals, e.g. self-designed or purchased ones, can also be added.
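
A configuration menu of this kind boils down to maintaining a sound pool plus a button-to-sound mapping. The sketch below assumes a JSON persistence format and hypothetical names; the document does not specify the actual storage layout.

    import json

    sound_pool = {"click.wav", "thunder.wav", "sonar.wav"}  # predefined pool
    sound_pool.add("my_custom_chime.wav")                   # self-designed or purchased

    assignments = {"navigation": "sonar.wav", "weather": "thunder.wav"}

    def assign(button: str, sound: str) -> None:
        # The configuration menu only offers sounds contained in the pool.
        if sound not in sound_pool:
            raise ValueError(f"{sound} is not in the sound pool")
        assignments[button] = sound

    assign("media", "my_custom_chime.wav")
    print(json.dumps(assignments, indent=2))  # persist, e.g. in a user profile
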
  • According to a second aspect of the invention, a user interface is proposed which can in particular be permanently installed in a motor vehicle.
  • Such user interfaces are also known as human-machine interfaces (HMIs).
  • The motor vehicle may be, for example, a road vehicle (car, truck) that is driven electrically, by an internal combustion engine, or as a hybrid.
  • The user interface comprises a touch-sensitive operating unit, a recognition device for recognizing a user input, and a processing device.
  • The touch-sensitive operating unit may comprise, for example, a display which, together with a touch-sensitive element, forms a touch screen.
  • The recognition device may comprise the touch-sensitive element and a camera system for detecting hover gestures.
  • The processing device may comprise a processor which is arranged within the HMI and adapted, for example, to process navigation commands.
  • The user interface is also configured to carry out a method as described in connection with the first aspect of the invention.
  • The operating unit may preferably be integrated centrally in the dashboard of a motor vehicle or arranged as an instrument cluster behind the steering device of the vehicle.
  • The recognition device may comprise a camera system and/or a capacitive sensor in order to be able to recognize and assign hover gestures and touch gestures.
  • The user interface may comprise a storage means in which reference signals representing predefined gestures are stored. The references may be read by the processing device and compared with inputs recognized by the recognition device. This increases the variety of usable operating steps, so that on average fewer successive operating steps have to be carried out.
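
How stored references might be compared with a recognized input is sketched below. The matching metric (mean point distance over equally sampled strokes) and all identifiers are assumptions for illustration; the document does not prescribe a particular comparison algorithm.

    import math

    # Reference signals representing predefined gestures, stored in the
    # storage means; simplified here to 2D stroke sample points.
    REFERENCES = {
        "swipe_right": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)],
        "swipe_up":    [(0.0, 1.0), (0.0, 0.5), (0.0, 0.0)],
    }

    def stroke_distance(a, b) -> float:
        # Mean point-to-point distance; assumes both strokes are equally sampled.
        return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

    def match(stroke) -> str:
        # The processing device reads the references and returns the closest one.
        return min(REFERENCES, key=lambda n: stroke_distance(stroke, REFERENCES[n]))

    print(match([(0.05, 0.0), (0.55, 0.1), (0.95, 0.0)]))  # -> swipe_right
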
  • Furthermore, a vehicle is proposed which comprises a user interface according to the second aspect of the invention.
  • The user interface may be arranged as a compact unit or as a distributed system within the vehicle. In this way existing hardware can be used, whereby the present invention can be realized essentially in the form of software.
  • Figure 1 is a schematic overview of components of an embodiment of a user interface according to the invention.
  • Figure 2 shows an illustration of a possible operating step of an exemplary embodiment of a user interface according to the invention.
  • Figure 3 shows an illustration of an alternative operating step of an exemplary embodiment of a user interface according to the invention.
  • Figure 4 is a flow chart illustrating the steps of an embodiment of a method according to the invention.
  • Figure 1 shows a means of transport 8 in which a screen of an HMI is embedded in the dashboard as operating unit 1.
  • A camera 6 is provided as a recognition device in the area of the windshield, in addition to a proximity sensor (not shown) based on light-barrier and capacitive technologies.
  • The camera 6 is connected to an electronic control unit 7, which serves as a processing device for data processing. The electronic control unit 7 is further connected to a speaker 9 for generating an audio output and to the operating unit 1, on whose screen a first button 10 and a second button 20 are shown.
  • Assigned to the buttons 10, 20 are respective cuboid areas 11, 21, shown in dashed lines, in which a presence of an input means is detected as a touch input or as hovering (depending on the respective distance a from the surface of the operating unit 1).
  • The areas 11, 21 are each defined by a first surface 12 or 22 oriented parallel to the surface of the operating unit 1, a second surface 13 or 23 oriented parallel to and remote from that surface, and four surfaces 14 or 24, 15 or 25, 16 or 26, and 17 or 27 oriented perpendicular to the surface of the operating unit 1, such that the buttons 10, 20 represent vertical projections of the areas 11, 21 onto the operating unit 1 and are bounded by them.
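
The cuboid areas 11, 21 lend themselves to a simple containment test. The sketch below assumes screen-plane coordinates in millimetres and a z axis perpendicular to the screen; the Area class and the concrete dimensions are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Area:
        # Horizontal borders: the four surfaces perpendicular to the screen
        # (14-17 or 24-27), coinciding with the borders of the button.
        x0: float
        x1: float
        y0: float
        y1: float
        # Vertical extent: surface 12/22 at (or just below) the screen and the
        # remote parallel surface 13/23 bounding the hover area, in mm.
        z0: float = -1.0
        z1: float = 150.0

        def contains(self, x: float, y: float, z: float) -> bool:
            return (self.x0 <= x <= self.x1 and
                    self.y0 <= y <= self.y1 and
                    self.z0 <= z <= self.z1)

    area_11 = Area(x0=0, x1=400, y0=0, y1=300)  # area in front of button 10
    print(area_11.contains(200, 150, 40))       # hovering finger -> True
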
  • Figure 2 shows a possible user interface as it can be displayed on the operating unit 1.
  • A map section is displayed, which corresponds to a part of a route calculated by a navigation system.
  • In an upper area a second button 20 is displayed, via which a currently playing title, its artist, and the album on which the title is contained are shown.
  • A third button 30 is shown, in which the weather in Braunschweig is displayed in the form of an icon together with a temperature in degrees Celsius and the current precipitation.
  • The hand of a user 2 is located in front of the first button 10.
  • By a first predefined audio output 3, which includes a sonar signal as an earcon, the presence of the hand is acknowledged, and the user is thereby given the orientation that he is operating the button 10 associated with a navigation function.
  • Figure 3 shows the view presented in connection with Figure 2, in which, however, an alternative gesture is performed by the hand of the user 2.
  • The hand moves along an arrow P from a first area 11 in front of the first button 10 into a third area in front of the third button 30.
  • The detected transition is acknowledged by a predefined second audio output 4, which includes a discreet, high-frequency chime.
  • The second audio output 4 corresponds to the content of the third button 30; a high-frequency soft chime can evoke a corresponding association in the user 2.
  • Figure 4 shows a flow chart illustrating steps of an embodiment of a method according to the invention. The method begins in step 100 with the assignment, by a user, of a first sound signal to a first button displayable on the operating unit. This can be done in a configuration menu of a user interface according to the invention, in which individual buttons, or all buttons, of the operating system of the user interface are given sound signals selected from a predefined (but expandable) pool. After the assignment, in step 200, the presence of an input means in a predefined first area with respect to the operating unit is detected (for example as a touch or hover gesture).
  • In step 300, the detected presence is acknowledged to the user by a predefined first audio output, the audio output comprising a first sound signal which is context-specifically assigned to the first button displayed on the operating unit.
  • The sound signal symbolizes the function associated with the button as a process that the user can associate with the first sound as easily as possible. Further, in step 400, the presence of the input means in a predefined second (other) area with respect to the operating unit (e.g. on or above a second button) is detected, and in step 500 the detected presence is acknowledged to the user by a predefined second (other) audio output.
  • The second audio output comprises a second sound signal, which represents an acoustic symbol for a function associated with the second button.
  • In step 600, the input means passes from the first area into a predefined third (again different) area with respect to the operating unit.
  • In step 700, this transfer is acknowledged to the user by a predefined third audio output, the third audio output comprising a third sound signal which is context-specifically assigned to a third button displayed on the operating unit. In other words, the third sound signal symbolizes to the user the function associated with the third button.
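
Read together, steps 100 to 700 form a small event-driven loop. The sketch below strings them together; the sensor events are simulated and all names and sound files are assumptions made for illustration.

    # Step 100: the user assigns sound signals to buttons
    # (compare the configuration sketch further above).
    assignments = {
        "button_1": "sonar.wav",
        "button_2": "chime.wav",
        "button_3": "thunder.wav",
    }

    def acknowledge(button: str) -> None:
        # Steps 300/500/700: acknowledge a detected presence or transfer
        # with the context-specific sound signal of the button concerned.
        print(f"[audio] {assignments[button]}")

    # Simulated recognition results from the sensors (steps 200/400/600).
    events = [
        ("presence", "button_1"),   # steps 200/300
        ("presence", "button_2"),   # steps 400/500
        ("transfer", "button_3"),   # steps 600/700
    ]

    current = None
    for kind, button in events:
        if kind == "presence" or button != current:
            acknowledge(button)
        current = button
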

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

User interface and method for assisting a user when operating a touch-sensitive operating unit (1). According to the invention, the method comprises the following steps: the presence of an input means (2), in particular a user's finger, is detected (200) in a predefined first area (11) with respect to the operating unit (1), and in response the detected presence is acknowledged (300) by a predefined first audio output (3) comprising a first sound signal which is context-specifically assigned to a first button (10, 20, 30) displayed on the operating unit (1).
EP13770487.0A 2013-09-27 2013-09-27 User interface and method for assisting a user in the operation of an operating unit Ceased EP3049909A1 (fr)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2013/070214 WO2015043652A1 (fr) 2013-09-27 2013-09-27 User interface and method for assisting a user in the operation of an operating unit

Publications (1)

Publication Number Publication Date
EP3049909A1 (fr) 2016-08-03

Family

ID=49261561

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13770487.0A Ceased EP3049909A1 (fr) 2013-09-27 2013-09-27 Interface d'utilisateur et procédé d'assistance d'un utilisateur lors de l'actionnement d'une unité de commande

Country Status (5)

Country Link
US (1) US10248382B2 (fr)
EP (1) EP3049909A1 (fr)
KR (1) KR101805328B1 (fr)
CN (1) CN105683903A (fr)
WO (1) WO2015043652A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9563329B1 (en) 2015-09-15 2017-02-07 Thunder Power Hong Kong Ltd. Interchangeable display of information panels on a dashboard
GB2572614B (en) * 2018-04-05 2021-12-08 Steris Solutions Ltd Handset for controlling a support device or a movable surface
CN114397996A (zh) * 2021-12-29 2022-04-26 杭州灵伴科技有限公司 Interaction prompting method, head-mounted display device, and computer-readable medium

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7760187B2 (en) * 2004-07-30 2010-07-20 Apple Inc. Visual expander
US7257536B1 (en) * 1999-11-23 2007-08-14 Radiant Systems, Inc. Audio request interaction system
US20080129520A1 (en) * 2006-12-01 2008-06-05 Apple Computer, Inc. Electronic device with enhanced audio feedback
DE102007039450A1 * 2007-08-21 2009-02-26 Siemens Ag Touch-sensitive screen and method for operating a touch-sensitive screen
DE102008051756A1 (de) 2007-11-12 2009-05-14 Volkswagen Ag Multimodal user interface of a driver assistance system for inputting and presenting information
US9170649B2 (en) * 2007-12-28 2015-10-27 Nokia Technologies Oy Audio and tactile feedback based on visual environment
US20090225043A1 (en) * 2008-03-05 2009-09-10 Plantronics, Inc. Touch Feedback With Hover
US20100250071A1 (en) * 2008-03-28 2010-09-30 Denso International America, Inc. Dual function touch switch with haptic feedback
JP2012527075A (ja) * 2009-05-13 2012-11-01 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Audible feedback of, and compliance with, lighting functions and lighting settings
DE102009032069A1 (de) * 2009-07-07 2011-01-13 Volkswagen Aktiengesellschaft Method and device for providing a user interface in a vehicle
DE102009036369A1 (de) 2009-08-06 2011-02-10 Volkswagen Ag Method for operating an operating device and operating device in a vehicle
TWI423112B (zh) * 2009-12-09 2014-01-11 Ind Tech Res Inst Portable virtual input operation device and operation method thereof
US10705794B2 (en) * 2010-01-18 2020-07-07 Apple Inc. Automatically adapting user interfaces for hands-free interaction
US8863256B1 (en) * 2011-01-14 2014-10-14 Cisco Technology, Inc. System and method for enabling secure transactions using flexible identity management in a vehicular environment
US9285944B1 (en) * 2011-04-22 2016-03-15 Angel A. Penilla Methods and systems for defining custom vehicle user interface configurations and cloud services for managing applications for the user interface and learned setting functions
WO2013029257A1 (fr) * 2011-08-31 2013-03-07 Ooros Automotive Co., Ltd. Système interactif de véhicule
KR101852821B1 (ko) * 2011-09-08 2018-04-27 엘지전자 주식회사 Mobile terminal and control method thereof
JP6311602B2 (ja) * 2012-06-15 2018-04-18 株式会社ニコン Electronic device
KR101943320B1 (ko) * 2012-09-21 2019-04-17 엘지전자 주식회사 Mobile terminal and control method thereof
KR101990037B1 (ko) * 2012-11-13 2019-06-18 엘지전자 주식회사 Mobile terminal and control method thereof
US9300779B2 (en) * 2013-03-15 2016-03-29 Blackberry Limited Stateful integration of a vehicle information system user interface with mobile device operations
US8818716B1 (en) * 2013-03-15 2014-08-26 Honda Motor Co., Ltd. System and method for gesture-based point of interest search

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
None *
See also references of WO2015043652A1 *

Also Published As

Publication number Publication date
CN105683903A (zh) 2016-06-15
KR101805328B1 (ko) 2017-12-07
US10248382B2 (en) 2019-04-02
KR20160057474A (ko) 2016-05-23
US20160239261A1 (en) 2016-08-18
WO2015043652A1 (fr) 2015-04-02

Similar Documents

Publication Publication Date Title
EP2223046B1 Multimodal user interface of a driver assistance system for inputting and presenting information
EP2451672B1 Method and device for providing a user interface in a vehicle
EP1853465B1 Method and device for the voice control of a device or a system in a motor vehicle
DE102014200993A1 User interface and method for adapting a view on a display unit
WO2019007811A1 User interface for a means of transport, and means of transport containing a user interface
EP3063611A1 Device and method for adapting the content of a status bar
EP2883738B1 Method and system for controlling functions of a motor vehicle
DE102013014887B4 Motor vehicle operating device with a low-distraction input mode
DE102017201799A1 User interface, means of transport and method for distinguishing between users
EP3049909A1 User interface and method for assisting a user in the operation of an operating unit
EP3049911B1 User interface and method for assisting a user in the operation of an operating unit
DE102009018590B4 Motor vehicle with an operating device and associated method
DE102015221304A1 Method and device for improving the recognition accuracy in the handwritten input of alphanumeric characters and gestures
WO2014117932A1 System-initiated help function per journey and/or per user for the operation of a device associated with a vehicle
DE102013016196A1 Motor vehicle operation by means of combined input modalities
DE102008033441B4 Method for operating an operating system for a vehicle, and operating system for a vehicle
DE102011116122A1 Method for providing an operating device in a vehicle, and operating device
DE102014202833A1 User interface and method for switching from a first operating mode of a user interface to a 3D gesture mode
DE102019129396A1 Graphical user interface, means of transport and method for operating a graphical user interface for a means of transport
EP3049910B1 User interface and method for assisting a user in the operation of an operating unit
DE102011015693A1 Method and device for providing a user interface, in particular in a vehicle
DE102014018423B4 Motor vehicle with a status display interpretable by peripheral vision, and corresponding method
DE102016218270A1 Method for operating a motor vehicle operating device with a speech recognizer, operating device and motor vehicle
DE102017200769A1 Method for operating an input device of a motor vehicle, and input device for a motor vehicle
DE102012021220A1 Operating arrangement for a motor vehicle

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20160428

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20190523

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20201010