WO2014067798A1 - Providing an operating input using a head-mounted display - Google Patents

Providing an operating input using a head-mounted display

Info

Publication number
WO2014067798A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
gesture
menu
user
head
Prior art date
Application number
PCT/EP2013/071824
Other languages
German (de)
English (en)
Inventor
Felix Lauber
Wolfgang Spiessl
Original Assignee
Bayerische Motoren Werke Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke Aktiengesellschaft
Publication of WO2014067798A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • The invention relates to a method for providing an operating input using a head-mounted display, to a corresponding device, and to a computer program.
  • Head-mounted displays are known that are worn by a user and comprise displays arranged in front of one or both of the user's eyes. These displays may be semi-transparent, allowing the user to perceive both the representation on the display and the surrounding environment.
  • Document WO 01/56007 A1 mentions a system comprising a head-mounted display and a tracker carried on one hand of the user. A menu is displayed depending on the position of the tracker in the head-mounted display. However, document WO 01/56007 A1 does not deal with the interaction with the menu.
  • A method for providing an operating input using a head-mounted display worn by a user comprises: detecting the position of an object, in particular a hand of the user; displaying a menu with, in particular graphically displayed, selection options based on the detected position of the object; detecting a gesture of at least part of a hand, in particular of the hand whose position was detected, if the position of a hand was detected; recognizing that the detected gesture corresponds to a gesture associated with a selection option of the menu; and, in response to recognizing that the detected gesture corresponds to that gesture, selecting the associated selection option.
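By way of illustration only (not part of the publication), the claimed sequence of steps can be sketched as a minimal pipeline; the names `Menu`, `provide_operator_input` and the gesture labels are assumptions introduced for this sketch:

```python
from dataclasses import dataclass

@dataclass
class Menu:
    choices: list   # labels of the selection options
    anchor: tuple   # (x, y) position the menu is placed around, e.g. the hand

# Hypothetical mapping from a recognized gesture label to a choice index:
# showing n fingers selects the n-th (1-based) option.
GESTURE_TO_INDEX = {f"show_{n}": n - 1 for n in range(1, 6)}

def provide_operator_input(hand_position, detected_gesture, choices):
    """Sketch of the claimed steps: place a menu based on the detected
    position, then map a recognized gesture to one of its options."""
    menu = Menu(choices=choices, anchor=hand_position)
    index = GESTURE_TO_INDEX.get(detected_gesture)
    if index is not None and index < len(menu.choices):
        return menu.choices[index]   # the associated selection option
    return None                      # gesture not associated with any option
```

In a real system the gesture label would come from image processing of the camera frames; here it is simply passed in.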
  • This enables comfortable menu control for the user.
  • The object may be a steering wheel or a controller, and the menu may also comprise a list whose elements are the selection options. Lists can also be scrolled using the method, and further submenus, each assigned to an individual displayed option, can be called up.
  • In particular, the object is the user's hand.
  • The menu is then displayed based on the detected position of the user's hand and is thus oriented to the user's reach ("action radius") and body.
  • This type of menu placement leads to an intuitive input situation: because the user operates the menu by manipulating the position of the hand, which he habitually uses for inputs to computers, for example, a familiar situation is created and the menu is readily associated with the ability to make inputs.
  • This placement is also advantageous in that gestures can be designed so that the same defined movement always produces the same result, making the operating input easier to learn and understand. Comparatively complex sensors can be dispensed with; only a camera is needed.
  • A gesture may comprise moving at least part of the hand, or showing a number using the fingers of the hand.
  • The number shown by the fingers may be associated with the choice assigned the same number.
  • The method covers both cases in which the input is performed with the hand whose position determines where the input options are displayed, and cases in which the input is carried out with the other hand.
  • The detection of the position of the hand and of the gesture can be performed using one or more camera images and image processing known per se.
  • The camera can capture images in the visible light range or in the infrared range.
  • The camera can also provide depth information.
  • The camera may in particular be included in the head-mounted display itself, but may also be arranged, for example, on the dashboard or in the front region of the headliner of a motor vehicle.
  • The selection options are arranged along a circle adjacent to the position of the object.
  • The choices may also be displayed along a circle with the object substantially at its center.
  • This arrangement of the selection options advantageously keeps the distance the hand has to travel during a movement gesture toward a selection option as small as possible, which increases the input speed.
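As an illustration only, an even placement of options along such a circle around the detected position could be computed as follows (the function name and the coordinate convention, with 90° meaning "up", are assumptions of this sketch):

```python
import math

def circular_menu_positions(center, radius, n_choices, start_angle=90.0):
    """Place n_choices menu options evenly along a circle around `center`
    (e.g. the detected hand position). Angles are in degrees, measured
    counter-clockwise, so the first option is placed above the center."""
    positions = []
    for i in range(n_choices):
        angle = math.radians(start_angle - i * 360.0 / n_choices)
        x = center[0] + radius * math.cos(angle)
        y = center[1] + radius * math.sin(angle)
        positions.append((x, y))
    return positions
```

A renderer would then draw each option label at its computed position in the head-mounted display.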
  • The selection options can be displayed on the object or on the detected hand, in particular on the back of the hand. This reinforces the association of the selection options with the "input medium" even further.
  • The gesture associated with a selection option may comprise a movement of the hand in the direction of that option; in particular, the gesture is associated with the option displayed in the direction of the hand's movement from the detected position of the object, in particular of the detected hand.
  • The user thus moves his hand in the direction in which the choice is also displayed. Since the choices may be based on the position of the hand (in the case where the object is a hand), the same movement of the hand can always lead to the selection of the same choice regardless of the hand's absolute position. Even when the user uses the hand that does not determine the position of the choices, the movement is always the same relative to the position-determining hand.
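A minimal sketch of such direction-based selection, assuming option positions are expressed relative to the position-determining hand (the function name and the movement threshold are illustrative assumptions):

```python
import math

def select_by_movement(hand_start, hand_end, choice_positions, min_distance=0.2):
    """Return the index of the choice whose direction from the hand's
    starting position best matches the direction of the hand movement.
    Because positions are relative to the position-determining hand, the
    same movement selects the same choice wherever the hand is held."""
    dx = hand_end[0] - hand_start[0]
    dy = hand_end[1] - hand_start[1]
    if math.hypot(dx, dy) < min_distance:
        return None  # movement too small to count as a selection gesture
    move_angle = math.atan2(dy, dx)

    def angular_error(pos):
        choice_angle = math.atan2(pos[1] - hand_start[1], pos[0] - hand_start[0])
        diff = abs(choice_angle - move_angle)
        return min(diff, 2 * math.pi - diff)  # wrap around the circle

    return min(range(len(choice_positions)),
               key=lambda i: angular_error(choice_positions[i]))
```

The threshold `min_distance` guards against small, unintended hand deviations being interpreted as a selection.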
  • The gesture associated with a selection option may comprise indicating a number using extended fingers of the hand; the number is then assigned to a choice. This is advantageous, for example, with a large number of choices, or where small deviations in a hand movement could lead to an unwanted selection. This input method can also be provided as the standard input.
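The finger-count variant reduces to a simple lookup, sketched here with an assumed function name and 1-based numbering as in the figures:

```python
def choice_from_finger_count(extended_fingers, menu_choices):
    """Map the number of extended fingers to the identically numbered
    menu choice (1-based). Counts outside the menu select nothing."""
    if 1 <= extended_fingers <= len(menu_choices):
        return menu_choices[extended_fingers - 1]
    return None  # no choice carries this number
```

The finger count itself would be produced by the image-processing stage mentioned above.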
  • The head-mounted display may comprise a camera, and the method may further comprise the following step: detecting that the camera of the head-mounted display has been substantially aligned with the object, in particular the hand, and in particular with the back of the hand, for a predetermined time; the menu is then displayed in response to detecting this alignment.
  • The predetermined time is advantageously chosen so that unintentional activation, i.e. an unintended display of the menu, is avoided.
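This dwell-time activation can be sketched as a small state holder (class name and default dwell time are assumptions of this illustration):

```python
class DwellActivator:
    """Report activation only after the camera has stayed aligned with
    the back of the hand for `dwell_seconds`, so that briefly glancing
    at the hand does not open the menu unintentionally."""

    def __init__(self, dwell_seconds=1.0):
        self.dwell_seconds = dwell_seconds
        self.aligned_since = None  # timestamp when alignment began

    def update(self, aligned, timestamp):
        """Call once per camera frame; returns True once the menu
        should be displayed."""
        if not aligned:
            self.aligned_since = None  # alignment lost, restart the timer
            return False
        if self.aligned_since is None:
            self.aligned_since = timestamp
        return timestamp - self.aligned_since >= self.dwell_seconds
```

The `aligned` flag would come from detecting the back of the hand roughly centered in the camera image.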
  • This offers the user a convenient way to display the menu and to open the input facility. In the particularly advantageous case that the camera of the head-mounted display is arranged to capture images in the direction in which the head-mounted display is oriented, aligning the camera with the hand means that the user is also directing his gaze at the hand. This represents an intuitive activation of the menu input.
  • Alternatively, the number of outstretched fingers can be detected and the menu displayed in response.
  • The method may comprise: detecting the closing of the hand into a fist and, in response, closing the menu. Likewise, the menu can be opened again by opening the hand. This provides the user with an intuitive, easily and quickly executable gesture for interacting with the menu input facility.
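The open/close behaviour amounts to a tiny state transition, sketched here with assumed state labels (`"fist"`, `"open"`) standing in for the output of the gesture recognizer:

```python
def update_menu_state(menu_open, hand_state):
    """Close the menu when the hand closes into a fist, reopen it when
    the hand opens again; any other hand state leaves the menu unchanged."""
    if hand_state == "fist":
        return False
    if hand_state == "open":
        return True
    return menu_open
```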
  • The object may also be a steering wheel of a vehicle or a controller, that is, a dedicated physical structure provided for input.
  • A choice can then be selected.
  • An arrangement along the steering wheel or controller may determine the location and shape of the graphically displayed choices.
  • An apparatus comprises: a head-mounted display; a camera, which may in particular be included in the head-mounted display; and an electronic processing unit, the apparatus being adapted to carry out one of the methods presented above.
  • The electronic processing unit can be a programmable microcontroller or a general-purpose processor.
  • A computer program comprises instructions for performing one of the methods presented above.
  • The computer program may cause a computer to execute one of the methods presented above when the instructions are executed.
  • The invention is also based on the following considerations:
  • The user looks at his outstretched hand (with the back of the hand turned toward him). After a certain time, or based on the number of extended fingers, the system recognizes the interaction request and displays a, for example circularly arranged, menu in the head-mounted display.
  • The menu can be opened and closed, for example, by opening or closing a fist. Since the system can determine the position of the hand, the menu can be placed around the user's outstretched hand.
  • The user can then select a list entry by gestures of the hand (or hands), such as extending a defined number of fingers, moving a hand onto the corresponding menu item, or bringing in the second hand.
  • Alternatively, the steering wheel, for example, specifies the location of the menu.
  • A menu item can then be selected to navigate in the menu.
  • Fig. 1 shows a device according to an embodiment.
  • Fig. 2 shows the arrangement of choices of the menu according to an embodiment.
  • Fig. 3 shows an exemplary movement gesture for selecting a selection option according to an embodiment.
  • Fig. 4 shows an exemplary gesture of indicating a number for selecting a selection option according to an embodiment.
  • Fig. 1 shows an apparatus according to an embodiment.
  • The device comprises a camera 2, an electronic processing unit 3 and a head-mounted display 4. The camera 2 is arranged on the head-mounted display 4 and captures images in the same direction as the user's line of sight when the head-mounted display 4 is worn.
  • The camera 2 captures the area in front of it and thus also the hand 1 held there.
  • The images from the camera 2 are forwarded to the processing unit 3, which is adapted to recognize the position of the hand 1, the orientation of the hand 1 (here the angle α), a gesture of the hand 1, and the number, direction and type of the outstretched fingers.
  • The processing unit 3 uses the coordinate system drawn in FIG.
  • The processing unit 3 is adapted to determine a menu with selection options and the position at which the menu is to be displayed.
  • The position of the menu to be displayed is determined by the position of the hand 1.
  • An example of the position and arrangement of the menu is shown in FIG.
  • The menu to be displayed is forwarded by the processing unit 3 to the head-mounted display 4, which finally displays it.
  • The processing unit 3 is further adapted to detect a gesture of the detected hand in the camera images and to recognize that this gesture corresponds to a gesture to which a choice is assigned. Once this is recognized, that choice is selected.
  • Fig. 2 shows the arrangement of the options 5 of the menu as they appear to the user of the head-mounted display 4 and as they can be determined by the processing unit 3.
  • The options 5 are arranged along a circle which in turn adjoins the user's hand 1.
  • The individual selection options are numbered 1 to 5.
  • A user can select one of the options 5 by moving his hand in the direction of the selection element. If the user moves his hand 1, for example, in the direction of option No. 3, as indicated by the dashed line in Fig. 3, this gesture is detected and recognized as corresponding to the selection of option No. 3. Consequently, option No. 3 is activated.
  • Fig. 4 shows the case in which the user wants to select the third option and therefore, as a gesture, extends three fingers, which is interpreted as the number "3".

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for providing an operating input using a head-mounted display worn by a user. The method comprises: detecting the position of an object, in particular a hand of the user; displaying a menu with selection options based on the detected position of the object; detecting a gesture of at least part of a hand, in particular of the hand whose position was detected; recognizing that the detected gesture corresponds to a gesture to which a selection option of the menu is assigned; and, in response to recognizing that the detected gesture corresponds to that gesture, selecting the assigned selection option.
PCT/EP2013/071824 2012-10-30 2013-10-18 Mise à disposition d'une entrée de commande à l'aide d'un visiocasque WO2014067798A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102012219814.8A DE102012219814A1 (de) 2012-10-30 2012-10-30 Bereitstellen einer Bedienungseingabe unter Verwendung eines Head-mounted Displays
DE102012219814.8 2012-10-30

Publications (1)

Publication Number Publication Date
WO2014067798A1 true WO2014067798A1 (fr) 2014-05-08

Family

ID=49448143

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/071824 WO2014067798A1 (fr) 2012-10-30 2013-10-18 Mise à disposition d'une entrée de commande à l'aide d'un visiocasque

Country Status (2)

Country Link
DE (1) DE102012219814A1 (fr)
WO (1) WO2014067798A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113672158A (zh) * 2021-08-20 2021-11-19 上海电气集团股份有限公司 一种增强现实的人机交互方法及设备

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107003142A (zh) 2014-12-05 2017-08-01 奥迪股份公司 车辆特别是客运车辆的操作装置及其操作方法
DE102015006664A1 (de) * 2015-05-22 2016-11-24 Giesecke & Devrient Gmbh System und Verfahren zur Bearbeitung von Wertdokumenten
CN108958590A (zh) * 2017-05-26 2018-12-07 成都理想境界科技有限公司 应用于头戴式显示设备的菜单操作方法及头戴式显示设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001056007A1 (fr) 2000-01-28 2001-08-02 Intersense, Inc. Poursuite a auto-reference
US6771294B1 (en) * 1999-12-29 2004-08-03 Petri Pulli User interface
US20110197147A1 (en) * 2010-02-11 2011-08-11 Apple Inc. Projected display shared workspaces
WO2012082971A1 (fr) * 2010-12-16 2012-06-21 Siemens Corporation Systèmes et procédés pour une interface du regard et des gestes

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9223134B2 (en) * 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SEAN WHITE ET AL: "Interaction and presentation techniques for shake menus in tangible augmented reality", MIXED AND AUGMENTED REALITY, 2009. ISMAR 2009. 8TH IEEE INTERNATIONAL SYMPOSIUM ON, IEEE, PISCATAWAY, NJ, USA, 19 October 2009 (2009-10-19), pages 39 - 48, XP031568946, ISBN: 978-1-4244-5390-0 *


Also Published As

Publication number Publication date
DE102012219814A1 (de) 2014-04-30

Similar Documents

Publication Publication Date Title
EP1998996A1 (fr) Serveur interactif et procédé permettant de faire fonctionner le serveur interactif
DE102008048825A1 (de) Anzeige- und Bediensystem in einem Kraftfahrzeug mit nutzerbeeinflussbarer Darstellung von Anzeigeobjekten sowie Verfahren zum Betreiben eines solchen Anzeige- und Bediensystems
WO2013029772A2 (fr) Procédé et système fournissant une interface utilisateur graphique, en particulier dans un véhicule
DE102012020607B4 (de) Kraftwagen mit einer Gestensteuerungseinrichtung sowie Verfahren zum Steuern eines Auswahlelements
EP3067244A1 (fr) Vehicule avec mode de conduite s'adaptant automatiquement a la situation
DE102017113763B4 (de) Verfahren zum Betreiben einer Anzeigevorrichtung für ein Kraftfahrzeug sowie Kraftfahrzeug
WO2014067798A1 (fr) Mise à disposition d'une entrée de commande à l'aide d'un visiocasque
EP2953793B1 (fr) Système de commande d'une presse à imprimer
EP3508968A1 (fr) Procédé de fonctionnement d'une interface homme-machine ainsi qu'interface homme-machine
DE102014009302A1 (de) Verfahren zum Betreiben einer Virtual-Reality-Brille und System mit einer Virtual-Reality-Brille
WO2015162058A1 (fr) Interaction de gestes avec un système d'information de conducteur d'un véhicule
WO2017140569A1 (fr) Dispositif de commande d'un véhicule automobile et procédé de fonctionnement d'un dispositif de commande pour provoquer un effet de changement entre un plan virtuel de représentation et une main
WO2016184971A1 (fr) Procédé de fonctionnement d'un dispositif de commande ainsi que dispositif de commande pour un véhicule automobile
EP3025214B1 (fr) Procédé de fonctionnement d'un dispositif d'entrée et dispositif d'entrée
EP3274789B1 (fr) Ensemble de simulation de véhicule automobile pour la simulation d'un environnement virtuel avec un véhicule automobile virtuel et procédé de simulation d'un environnement virtuel
EP3093182B1 (fr) Moyen de deplacement, machine de travail, interface utilisateur et procede d'affichage d'un contenu d'un premier dispositif d'affichage sur un second dispositif d'affichage
EP3108332A1 (fr) Interface utilisateur et procédé de commutation d'un premier mode de commande d'une interface utilisateur en un mode gestuel en 3d
DE102012218155A1 (de) Erleichtern der Eingabe auf einer berührungsempfindlichen Anzeige in einem Fahrzeug
WO2014040807A1 (fr) Entrées par effleurement le long d'un seuil d'une surface tactile
WO2014114428A1 (fr) Procédé et système pour commander en fonction de la direction du regard une pluralité d'unités fonctionnelles et véhicule automobile et terminal mobile comprenant un tel système
WO2021028274A1 (fr) Système utilisateur et procédé pour faire fonctionner un système utilisateur
DE102014006945A1 (de) Fahrzeugsystem, Fahrzeug und Verfahren zum Reagieren auf Gesten
DE102013016490A1 (de) Kraftfahrzeug mit berührungslos aktivierbarem Handschrifterkenner
DE102016008049A1 (de) Verfahren zum Betreiben einer Bedienvorrichtung, Bedienvorrichtung und Kraftfahrzeug
DE102013021927A1 (de) Verfahren und System zum Betreiben einer Anzeigeeinrichtung sowie Vorrichtung mit einer Anzeigeeinrichtung

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13779805

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13779805

Country of ref document: EP

Kind code of ref document: A1