WO2017140569A1 - Motor vehicle operating device and method for operating an operating device in order to effect an interaction between a virtual display plane and a hand - Google Patents

Motor vehicle operating device and method for operating an operating device in order to effect an interaction between a virtual display plane and a hand

Info

Publication number
WO2017140569A1
WO2017140569A1 (PCT/EP2017/052848)
Authority
WO
WIPO (PCT)
Prior art keywords
user
objects
hand
operating
interaction
Prior art date
Application number
PCT/EP2017/052848
Other languages
German (de)
English (en)
Inventor
Matthias WUNDERLICH
Original Assignee
Audi Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi Ag filed Critical Audi Ag
Publication of WO2017140569A1 publication Critical patent/WO2017140569A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K37/00 Dashboards
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/11 Instrument graphical user interfaces or menu aspects
    • B60K2360/111 Instrument graphical user interfaces or menu aspects for controlling multiple devices
    • B60K2360/143 Touch sensitive instrument input devices
    • B60K2360/1438 Touch screens
    • B60K2360/146 Instrument input by gesture
    • B60K2360/1464 3D-gesture
    • B60K2360/20 Optical features of instruments
    • B60K2360/21 Optical features of instruments using cameras

Definitions

  • the invention relates to a method for operating an operating device in order to effect an interaction between graphical objects which are displayed in a virtual display plane of a display device and a user's hand located outside this display device.
  • the invention also includes an operating device for carrying out the method according to the invention.
  • the operating device can be provided in particular in a motor vehicle, which is why the invention also includes a motor vehicle with the operating device according to the invention.
  • a mouse pointer or another representation can be displayed, for example, together with the graphical objects in the virtual display plane, which the user can then move from outside the virtual display plane by means of a real operating element, e.g. a joystick.
  • VR (virtual reality)
  • it is known from EP 2 930 603 A1 to place grouped objects into a different display state, group by group, when an approach of a user's hand to the display device is detected. The user can then manipulate this group through further operations. Again, after activating the group, it is first necessary to wait and see which command is actually to be performed on this group.
  • the invention is based on the object of effecting an interaction between graphical objects in a virtual display plane of a display device on the one hand and a user's hand located outside the display device on the other hand.
  • the object is achieved by the subject matter of the independent claims.
  • Advantageous developments of the invention are disclosed by the features of the dependent claims, the following description and the figures.
  • the invention provides a method for operating an operating device.
  • the operating device can be configured, for example, as an infotainment system (information entertainment system) of a motor vehicle.
  • the operating device can also be realized, for example, by a portable mobile terminal, that is, for example, a smartphone or a tablet PC.
  • the method effects an interaction between graphical objects which are displayed in a virtual display plane of a display device and a user's hand located outside this display device.
  • by means of a detection device, a predetermined hand gesture, performed by the hand freely in space without contact with the display device, is detected.
  • the hand gesture determines the interaction to be applied to one or more of the objects.
  • the user can perform a sliding gesture with his hand as if he were pushing objects floating freely in space away from him or pushing them aside.
  • when the hand gesture is detected, a control device activates a selection mode for the virtual display plane.
  • the control device thus changes from a normal operating mode into the selection mode.
  • a respectively assigned operating function can be activated, for example an air-conditioning system can be set or a navigation device configured.
  • once the selection mode is activated, the detection device receives from the user a user selection of at least one of the objects on which the hand gesture is to act.
  • the user thus determines by the user selection which of the objects are to be manipulated by the interaction represented or determined by the hand gesture. For example, the user selection determines which of the objects are to be moved by the sliding gesture.
  • the user selection can be made, for example, by tapping or touching the objects on the display device.
  • the associated operating function is not activated in this case.
  • the detection device finally acquires a confirmation gesture concluding the user selection.
  • the confirmation gesture finally establishes which of the objects, in total, are to be affected by the hand gesture.
  • the control device then applies the interaction determined by the hand gesture to the at least one object in the virtual display plane that was selected by the user selection. In the example described, the selected objects are then moved. In other words, the user establishes a manipulation function by the hand gesture, determines by the user selection the objects to which the manipulation function is to be applied, and finally the control device applies the selected manipulation function to the selected objects.
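The claimed sequence (free-space gesture, then selection mode, then marking by touch, then confirmation gesture, then batch application) can be sketched as a small state machine. All class and method names below are illustrative inventions, not taken from the patent:

```python
from enum import Enum, auto

class Mode(Enum):
    NORMAL = auto()     # touching an object activates its operating function
    SELECTION = auto()  # touching an object only marks it

class OperatingDevice:
    """Illustrative sketch of the claimed control flow (all names invented)."""

    def __init__(self, objects):
        self.objects = set(objects)   # graphical objects in the display plane
        self.mode = Mode.NORMAL
        self.pending_interaction = None
        self.selection = set()

    def on_hand_gesture(self, interaction):
        # a recognized free-space gesture (e.g. gripping/catching = "delete")
        self.pending_interaction = interaction
        self.mode = Mode.SELECTION

    def on_touch(self, obj):
        if self.mode is Mode.SELECTION:
            self.selection.add(obj)   # mark; do not activate the function
        # in NORMAL mode a touch would activate obj's operating function

    def on_confirmation_gesture(self):
        # only now is the interaction applied, to all selected objects at once
        if self.pending_interaction == "delete":
            self.objects -= self.selection
        self.mode = Mode.NORMAL
        self.pending_interaction = None
        self.selection.clear()

dev = OperatingDevice(["radio", "nav", "climate", "seat"])
dev.on_hand_gesture("delete")
dev.on_touch("radio")
dev.on_touch("seat")
dev.on_confirmation_gesture()
print(sorted(dev.objects))  # ['climate', 'nav']
```

Deferring the mutation to `on_confirmation_gesture` mirrors the development in which the user can still change the selection before anything is deleted.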
  • the invention provides the advantage that the user can determine freely in space, with a hand gesture, which interaction he wishes to apply to the objects, that is, for example, whether they are to be moved or deleted. Since the user cannot reach directly into the virtual display plane, for example with a virtual representation of his hand, to grasp or touch the objects to be changed or manipulated, the invention solves this problem by first determining the interaction and then, in the selection mode, determining by means of the user selection on which of the objects the interaction is to act. This saves the complex rendering of the user's hand as a virtual hand in the virtual display plane. Finally, the selected objects can then actually be manipulated according to the selected or specified interaction.
  • the invention includes optional developments, the characteristics of which provide additional advantages.
  • a gripping movement or catching position is recognized as the hand gesture.
  • the gripping movement may be, for example, a movement of the hand with the fingers open and then closing the fingers to the fist.
  • the catching posture can be, for example, the holding of the flat or curved hand with the palm upwards.
  • the user selection then determines which of the selected objects are pulled out of the virtual display plane or fall out of the virtual display plane.
  • the falling out can be visualized by a corresponding animation of the selected objects on the display device.
  • the user selection is received as an inverse selection, in that all objects except those marked by the user are included in the user selection. So the user first marks some objects. The user selection then includes all unmarked objects. This results in the advantage that the selection of a plurality of objects, namely in particular more than half of the objects, takes place with fewer operating steps.
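The inverse selection described above reduces to a single set difference. A minimal sketch, with invented names:

```python
def user_selection(objects, marked, inverse=False):
    """Return the set of objects the interaction should act on.

    With inverse=True, everything EXCEPT the marked objects is selected,
    which needs fewer operating steps when more than half of the objects
    are affected. (Illustrative helper; names are not from the patent.)
    """
    marked = set(marked)
    return set(objects) - marked if inverse else marked

icons = {"radio", "nav", "climate", "seat", "phone"}
# keep only "nav": mark it once and invert, instead of marking four objects
print(sorted(user_selection(icons, {"nav"}, inverse=True)))
# ['climate', 'phone', 'radio', 'seat']
```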
  • a further development provides that the user selection is concluded by the detection device only when the final confirmation gesture is recognized, and only then is the interaction applied by the control device simultaneously to each selected object. This has the advantage that the user can still change his mind during the user selection, i.e. before its conclusion, and can undo a selection, for example. With immediate deletion of an object this would not be possible.
  • the method for operating the operating device can be used for different fields of application.
  • a further development provides that a menu structure or a menu content and/or a content of a desktop view are adapted in this way. The operating menu or the desktop in this case represents the virtual display plane.
  • the interaction is used to adapt a content of an operating menu, provided in the virtual display plane for activating operating functions of the operating device, and/or a collection of icons (pictograms).
  • the user can remove menu items of the operating menu or remove icons. Only the menu entries or icons selected by the user remain.
  • as operating functions, for example, controlling an air-conditioning device and/or media playback (radio or MP3 playback device) and/or setting a seat position in a motor vehicle and/or operating a navigation device may be provided by the operating device.
  • a further development relates to the selection mode.
  • when the selection mode is activated, the objects are displayed by the control device graphically and/or dynamically, that is to say in their movement, differently than when the selection mode is inactive.
  • as a graphical change, for example, a color and/or shape and/or marking can be changed.
  • as a dynamic change, for example, a still or motionless representation on the one hand and a pulsating or rotating representation on the other hand can be provided.
  • the hand gesture is detected by means of a camera by the detection device.
  • the camera here means the actual image capture device as well as an associated image processing device by which a hand is recognized in image data of the image capture device and its position in space is detected.
  • a time-of-flight camera is used. This has the advantage that directly on the basis of the 3D image data of the time-of-flight camera, a relative position of the hand in space with respect to the camera can be detected.
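The stated advantage of a time-of-flight camera (a metric depth value per pixel) means the hand's relative position follows from a single pinhole back-projection, with no stereo matching. The intrinsic parameters and pixel coordinates below are hypothetical:

```python
def pixel_to_camera_xyz(u, v, depth_m, fx, fy, cx, cy):
    """Back-project one depth pixel into camera coordinates (meters).

    u, v       : pixel coordinates of the detected hand
    depth_m    : metric depth delivered by the TOF camera at that pixel
    fx, fy     : focal lengths in pixels; cx, cy: principal point
    (Illustrative pinhole model; intrinsics are invented.)
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# hypothetical intrinsics and a hand detected at pixel (400, 300), 0.5 m away
x, y, z = pixel_to_camera_xyz(400, 300, 0.5, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
print(x, y, z)  # 0.08 0.06 0.5
```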
  • a development provides that the user selection of the at least one object is detected by the detection device by means of a sensor matrix on a display surface of the display device, as a respective touch of the object. The user thus selects each object directly by touching it.
  • a sensor matrix can be realized, for example, by means of a touchscreen, by means of which both the display surface of the display device and the touch-sensitive sensor matrix can be provided.
  • the invention also provides an operating device which comprises a display device, a detection device and a control device.
  • the display device is set up to display graphic objects in a virtual display plane.
  • the detection device is set up to detect, outside the display device, a predetermined hand gesture of a user's hand which the user performs freely in space.
  • This predetermined hand gesture represents an interaction to be applied to at least one or more of the objects, that is, for example, a deletion function.
  • the detection device is further configured to receive from the user a user selection of the objects on which the hand gesture is to act.
  • the control device is configured to activate a selection mode for receiving the user selection and to implement or apply the interaction defined by the recognized hand gesture to the at least one object selected by the user selection in the virtual presentation plane.
  • the operating device performs an embodiment of the method according to the invention by means of its display device, detection device and control device.
  • the display device can be realized, for example, by a screen or a head-up display.
  • the detection device may comprise a camera, in particular a time-of-flight camera, and / or a sensor matrix in the manner described.
  • the control device can be realized on the basis of a processor device, for example a microcontroller or microprocessor.
  • the control device then has a corresponding program module, by means of which, when it is executed by the processor device, those method steps which relate to the control device are carried out.
  • the operating device can be provided in the manner described in a portable, mobile terminal, for example a smartphone or tablet PC.
  • the operating device is realized in a motor vehicle.
  • the invention also provides a motor vehicle with an embodiment of the operating device according to the invention.
  • the motor vehicle according to the invention is preferably designed as an automobile, in particular as a passenger car or truck.
  • FIG. 1 shows a schematic representation of an embodiment of the operating device according to the invention in a normal operating mode
  • FIG. 2 shows a schematic representation of the operating device of FIG. 1 upon detection of a hand gesture of a user
  • FIG. 3 shows a schematic illustration of the operating device in a selection mode different from the normal operating mode
  • Fig. 4 is a schematic illustration of a user's hand in carrying out a confirmation gesture
  • Figure 5 is a schematic representation of the operating device after returning from the selection mode to the normal operating mode.
  • the exemplary embodiment explained below is a preferred embodiment of the invention.
  • the described components of the embodiment each represent individual features of the invention which are to be considered independently of one another, which each also further develop the invention independently of one another, and which are thus also to be regarded as part of the invention individually or in a combination other than the one shown.
  • the described embodiment can also be supplemented by further features of the invention already described.
  • Fig. 1 shows an operating device 1 which may be provided, for example, in a motor vehicle 2 or (not shown) in a portable, mobile terminal, for example a smartphone or a tablet PC.
  • the operating device 1 may have a display device 3 with a display surface 4.
  • the display device 3 may be, for example, a screen or a head-up display.
  • a display content on the display surface 4 may be controlled by a controller 5, which may be realized, for example, on the basis of a microcontroller or microprocessor.
  • by the control device 5, device components 6, for example vehicle components of the motor vehicle 2, can be controlled via a corresponding control signal 7.
  • the device components 6 are shown in Fig. 1 only by a single element.
  • for each device component 6, a respective operating function 8 can be realized, for example a graphical user interface for operating the respective device component or a control function for triggering a device function of the respective device component.
  • by corresponding graphics data 9, the control device 5 can represent the operating functions 8 on the display surface 4 of the display device 3, each by a graphical object 10, such as an icon or a logo, or a menu item.
  • with the operating device 1, it is possible for a user to select a plurality of the graphical objects 10 simultaneously and thereby, for example, to determine which operating functions 8 are to be offered in a menu on the display surface 4, that is, to make a selection of the graphical objects 10 that are to be displayed at all.
  • the operating device 1 may comprise a detection device 11, which may comprise, for example, a camera 12, in particular a TOF camera, and / or a proximity-sensitive and / or touch-sensitive sensor matrix 13.
  • the sensor matrix 13 may be provided, for example, on the display surface 4, that is, the display device 3 is in this case a touch screen.
  • a detection area 14 of the camera 12 may also be directed to the display area 4.
  • the graphical objects 10 may be, for example, list entries of an operating menu or icons of a desktop. Accordingly, a display plane 15, which is shown on the display surface 4 and on which the objects 10 are arranged, forms a virtual display plane, such as an operating menu or a desktop. The user is now able to select which graphical objects 10 he wants to have displayed in the display plane 15. In other words, he can customize an operating menu or an arrangement of icons. By means of the operating device 1, it is hereby possible for him, by gesture operation in free space and on the sensor matrix 13 of the display device 3, to delete or hide menu contents or elements, that is, the graphical objects 10.
  • the display of graphic objects 10 shown in FIG. 1 may, for example, represent a standard orientation of the menu arrangement of individual menu items in the form of the graphic objects 10. Each menu item may represent, for example, one of the operating functions 8, for example for radio, navigation, climate control, seat adjustment.
  • FIG. 2 shows how the user can now specify that some of the graphical objects are to be removed from the presentation plane 15.
  • the user performs with his hand 16 a hand gesture 17 which signals the desire for a deletion process or the desire to hide some of the graphical objects 10, thus announcing an interaction that the user wants to effect with the hand 16 on the graphical objects 10.
  • the hand gesture 17 may, for example, be a posture of the hand 16 in which the hand is held with a flat, upward-turned palm or a curved, upward-turned palm, as if to catch the graphical objects 10 falling out of the virtual display plane 15 and thus out of the display device 3.
  • by the camera 12, the hand 16 performing the hand gesture 17 is imaged as 3D image data 18.
  • the detection device 11 detects the hand gesture 17 and signals it to the control device 5, which then activates a selection mode A for the display plane 15, as also shown in Fig. 2.
  • in the selection mode A, a graphical and/or dynamic representation of the objects 10 is modified in such a way that, in comparison to the normal operating mode illustrated in Fig. 1, the graphical objects 10 are animated and/or displayed in a different shape, size or color and/or in a dynamically varying form, for example pulsating larger and smaller. A wobbling or flashing can also be provided.
  • FIG. 3 shows how a selection of the graphic objects 10 to be removed is determined by the detection device 11 in the selection mode A.
  • a direct selection takes place, that is to say the user can mark the objects 10 to be removed.
  • the user selection can be detected via the sensor matrix 13 on the display surface 4.
  • the user can, for example, tap or touch the objects 10 to be selected with a finger 19 of the hand 16 on the display surface 4. These can then also be marked graphically as selected.
  • the marked or selected objects 10 represent a user selection 10 '.
  • the user can then execute freely in space a confirmation gesture 20, in which the user, for example, performs a swiping movement with the hand 16 in a predetermined swiping direction.
  • This confirmation gesture 20 can in turn be detected by the camera 12 and represented by 3D image data 18 so that it can be recognized by the image recognition device of the detection device 11 as the confirmation gesture 20.
  • the user selection is thereby concluded, and the control device 5 now applies the interaction or manipulation function established by the hand gesture 17 to the selected objects 10.
  • the selected objects 10 "fall out" of the display plane 15, for example by being shifted downwards in an animation 21 on the virtual display plane 15, as shown in Fig. 5, and then falling out of the virtual display plane 15, that is to say out of the display device 3, or disappearing; the animation 21 thus represents the falling out of the selected objects 10 and hence the interaction defined by the hand gesture 17.
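The "falling out" animation can be sketched as a sequence of frame offsets that moves a deselected icon downwards until it has left the display plane. Step size and coordinates below are invented for illustration:

```python
def fall_out_frames(y0, screen_h, dy=40):
    """Yield successive y positions for a 'falling out' animation of a
    deselected icon, ending once it has left the visible display plane.
    (Minimal sketch; step size and coordinate system are invented.)
    """
    y = y0
    while y < screen_h:
        yield y
        y += dy
    yield screen_h  # final frame: just past the bottom edge

frames = list(fall_out_frames(y0=100, screen_h=480))
print(frames[0], frames[-1])  # 100 480
```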
  • only the graphical objects 10 desired by the user then remain. Quick or direct access to the operating functions 8 is thus possible via the remaining graphical objects 10, without the user having to visually search for the right objects among a multitude of otherwise unneeded graphical objects.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for operating an operating device (1) in order to effect an interaction (21) between graphical objects (10) and a hand (16) of a user. In the method, a detection device (11) recognizes a predetermined hand gesture (17), performed without contact by the hand (16) freely in space, by which the interaction (21) to be applied to at least one or some of the objects (10) is determined; when the hand gesture (17) is recognized, a control device (5) activates a selection mode (A); with the selection mode (A) activated, the detection device (11) receives from the user a user selection (10') of at least one of the objects (10) on which the hand gesture (17) is to act, and detects a confirmation gesture (20) concluding the user selection (10'); and the control device (5) applies the interaction (21), determined by the recognized hand gesture (17), to the at least one object (10) selected by the user selection (10').
PCT/EP2017/052848 2016-02-19 2017-02-09 Dispositif de commande d'un véhicule automobile et procédé de fonctionnement d'un dispositif de commande pour provoquer un effet de changement entre un plan virtuel de représentation et une main WO2017140569A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102016001998.0 2016-02-19
DE102016001998.0A DE102016001998A1 (de) 2016-02-19 2016-02-19 Kraftfahrzeug-Bedienvorrichtung und Verfahren zum Betreiben einer Bedienvorrichtung, um eine Wechselwirkung zwischen einer virtuellen Darstellungsebene und einer Hand zu bewirken

Publications (1)

Publication Number Publication Date
WO2017140569A1 true WO2017140569A1 (fr) 2017-08-24

Family

ID=58016702

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/052848 WO2017140569A1 (fr) 2016-02-19 2017-02-09 Dispositif de commande d'un véhicule automobile et procédé de fonctionnement d'un dispositif de commande pour provoquer un effet de changement entre un plan virtuel de représentation et une main

Country Status (2)

Country Link
DE (1) DE102016001998A1 (fr)
WO (1) WO2017140569A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020078319A1 (fr) * 2018-10-15 2020-04-23 华为技术有限公司 Procédé de manipulation basé sur le geste et dispositif de terminal

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102018205664A1 (de) * 2018-04-13 2019-10-17 Bayerische Motoren Werke Aktiengesellschaft Vorrichtung zur Assistenz eines Insassen im Innenraum eines Kraftfahrzeugs
DE102019205097A1 (de) * 2019-04-09 2020-10-15 Volkswagen Aktiengesellschaft Verfahren zur Inszenierung eines Bedienmodalitätenwechsels eines Fahrzeugs, System zur Inszenierung eines Bedienmodalitätenwechsels eines Fahrzeugs


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009059867A1 (de) * 2009-12-21 2011-06-22 Volkswagen AG, 38440 Method and device for providing a graphical user interface
DE102014202836A1 (de) * 2014-02-17 2015-08-20 Volkswagen Aktiengesellschaft User interface and method for assisting a user in operating a user interface
US20150261659A1 (en) * 2014-03-12 2015-09-17 Bjoern BADER Usability testing of applications by assessing gesture inputs

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070157089A1 (en) 2005-12-30 2007-07-05 Van Os Marcel Portable Electronic Device with Interface Reconfiguration Mode
WO2010031454A1 (fr) * 2008-09-22 2010-03-25 Volkswagen Ag Display and operating system in a motor vehicle with user-activatable representation of display objects, and method for operating such a display and operating system
DE102008048825A1 (de) 2008-09-22 2010-03-25 Volkswagen Ag Display and operating system in a motor vehicle with user-influenceable representation of display objects, and method for operating such a display and operating system
US20110107268A1 (en) * 2009-11-05 2011-05-05 International Business Machines Corporation Managing large user selections in an application
US20110252346A1 (en) * 2010-04-07 2011-10-13 Imran Chaudhri Device, Method, and Graphical User Interface for Managing Folders
US20150143272A1 (en) * 2012-04-25 2015-05-21 Zte Corporation Method for performing batch management on desktop icon and digital mobile device
EP2930603A1 (fr) 2014-04-07 2015-10-14 Volkswagen Aktiengesellschaft Method and device for providing a graphical user interface in a vehicle

Also Published As

Publication number Publication date
DE102016001998A1 (de) 2017-08-24

Similar Documents

Publication Publication Date Title
EP2930049B1 (fr) User interface and method for adapting a view on a display unit
EP3113969B1 (fr) User interface and method for signaling a 3D position of an input means during gesture detection
EP3067244B1 (fr) Vehicle with a driving mode that adapts automatically to the situation
DE102010027915A1 (de) User interface device for controlling a vehicle multimedia system
EP2960099B1 (fr) User interface and method for adapting a setting of a means of transport using a display unit of a user interface
WO2015110227A1 (fr) User interface and method for adapting a view on a display unit
EP2867762B1 (fr) Method for receiving an input on a touch field
EP3234743B1 (fr) Method for operating an operating device of a vehicle in different operating modes, operating device, and motor vehicle
DE102015011647B3 (de) Motor vehicle operating device with several coupled screens
EP3508968A1 (fr) Method for operating a human-machine interface, and human-machine interface
DE102012020607A1 (de) Motor car with a gesture control device, and method for controlling a selection element
EP3508967A1 (fr) Method for operating a human-machine interface, and human-machine interface
WO2017140569A1 (fr) Motor vehicle operating device and method for operating an operating device in order to produce an effect of change between a virtual display plane and a hand
DE102011084345A1 (de) Operating system and method for displaying an operating surface
EP3347804B1 (fr) Operating device with character input and a delete function
WO2015169462A1 (fr) User interface and method for switching between screens of a user interface
EP3188922B1 (fr) Operating device and method for controlling functions of a vehicle, in particular of a motor vehicle
EP3426516B1 (fr) Operating device and method for detecting a user's selection of at least one operating function of the operating device
EP2885154B1 (fr) Method and operating and display device for operating an interactive operating device
DE102012218155A1 (de) Facilitating input on a touch-sensitive display in a vehicle
EP3108332A1 (fr) User interface and method for switching from a first operating mode of a user interface to a 3D gesture mode
WO2014114428A1 (fr) Method and system for gaze-direction-dependent control of a plurality of functional units, and motor vehicle and mobile terminal comprising such a system
DE102008023890A1 (de) Operating device with a display device, and method for operating it
WO2024046612A1 (fr) Controlling a function on board a motor vehicle
DE102016008049A1 (de) Method for operating an operating device, operating device, and motor vehicle

Legal Events

Date Code Title Description
DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17704452

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17704452

Country of ref document: EP

Kind code of ref document: A1