WO2017084253A1 - Control method applied to head-mounted device, and head-mounted device - Google Patents


Info

Publication number
WO2017084253A1
WO2017084253A1 (application PCT/CN2016/081916)
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
motion track
action
cursor
light
Prior art date
Application number
PCT/CN2016/081916
Other languages
English (en)
Chinese (zh)
Inventor
陈相金
Original Assignee
乐视控股(北京)有限公司
乐视致新电子科技(天津)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 乐视控股(北京)有限公司, 乐视致新电子科技(天津)有限公司
Publication of WO2017084253A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to the field of device control technologies, and in particular, to a control method applied to a head-mounted device, and to a head-mounted device.
  • a headset is, at a minimum, a head-mounted display device that gives the user a virtual reality experience.
  • other features can be integrated on the headset.
  • taking a game application as an example, the game console, audio, and the like can all be integrated into the headset.
  • a keyboard, a mouse, a remote controller, and the like are generally used for control. Specifically, the cursor is moved on the display interface of the display screen of the head-mounted device by means of the keyboard, mouse, or remote controller, and operations such as dragging and clicking are performed.
  • inputting an incorrect command, such as pressing a wrong button, may produce an unexpected result, causing an abnormality in the current application (for example, a currently running game) and, in severe cases, causing the system to run incorrectly.
  • the embodiments of the present invention provide a control method applied to a head-mounted device, and a head-mounted device, which solve the prior-art problem that the user must feel around to operate controls, with a high risk of misoperation that may cause the current application, or even the system, to run incorrectly.
  • An embodiment of the present invention provides a control method applied to a headset, including:
  • determining a gesture action and a gesture motion track according to acquired gesture information;
  • determining a cursor motion track on a display interface of the headset according to the gesture motion track; and
  • if the gesture action is a control action, and it is determined according to the cursor motion track that the cursor is located in a display area on which the control action acts, performing a control operation corresponding to the control action.
  • An embodiment of the present invention provides a headset device, including:
  • a gesture information sensor for acquiring gesture information
  • a processor configured to determine a gesture action and a gesture motion track according to the gesture information acquired by the gesture information sensor; determine a cursor motion track on the display interface of the display screen according to the gesture motion track; and, if the gesture action is a control action and it is determined according to the cursor motion track that the cursor is located in a display area on which the control action acts, perform a control operation corresponding to the control action;
  • Display for providing a display interface.
  • the control method applied to the head-mounted device and the head-mounted device provided by the embodiments of the present invention perform control by gesture recognition technology and therefore do not require the user to feel around to operate, which reduces the risk of misoperation and thereby improves system performance.
  • FIG. 1 is a flow chart of an embodiment of a control method applied to a head-mounted device according to the present invention;
  • FIG. 2 is a schematic structural view of an embodiment of a headset device according to the present invention.
  • FIG. 3 is a schematic diagram of a working principle of a gesture information sensor according to an embodiment of the present invention.
  • FIG. 1 is a schematic diagram of a control method applied to a headset device according to an embodiment of the present invention, where the method includes:
  • Step 100: Determine a gesture action and a gesture motion track according to the acquired gesture information.
  • suppose the head-mounted device is a game device or a component of a game device. The user wearing the headset then makes gestures during the game to manipulate the game application, such as adjusting the volume or controlling the motion of a game character.
  • the gesture information corresponding to a gesture made by the user during the game can be obtained by gesture recognition technology. In this step, the gesture action and the gesture motion track of the user during the game can be determined according to the acquired gesture information.
  • for example, when the user makes a right-to-left swipe, the corresponding gesture information can be obtained by gesture recognition technology; in step 100, a sliding gesture action can be determined according to this gesture information, and the gesture motion track can be determined to be a right-to-left track.
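As a rough illustration of step 100, a swipe direction could be classified from per-frame key-point positions. This is a minimal sketch only; the function name, the normalized 0..1 coordinate convention, and the thresholds are illustrative assumptions, not details from the patent.

```python
def classify_swipe(palm_positions, min_distance=0.2):
    """Return 'left', 'right', or None for a list of per-frame (x, y) palm positions."""
    if len(palm_positions) < 2:
        return None
    dx = palm_positions[-1][0] - palm_positions[0][0]
    dy = palm_positions[-1][1] - palm_positions[0][1]
    # Require a mostly horizontal motion of sufficient extent.
    if abs(dx) < min_distance or abs(dx) < abs(dy):
        return None
    return 'left' if dx < 0 else 'right'

# A right-to-left swipe: x decreases over successive frames.
track = [(0.8, 0.5), (0.6, 0.5), (0.4, 0.51), (0.1, 0.5)]
print(classify_swipe(track))  # 'left'
```

A real recognizer would of course work on many key points and frame timestamps; this only shows the direction-classification idea.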
  • Step 110: Determine a cursor motion track on the display interface of the headset according to the gesture motion track.
  • the cursor on the display interface can be controlled to move according to the determined motion track.
  • those skilled in the art can easily set the correspondence between the gesture motion track and the cursor motion track on the display interface.
  • the most intuitive way is that when the gesture motion trajectory is a right-to-left trajectory, then in step 110, the cursor motion trajectory on the display interface of the headset is determined to be a right-to-left trajectory on the screen.
  • the cursor motion track can also be set to a left-to-right or top-to-bottom track.
  • the shape of the cursor on the display interface of the head-mounted device is not limited.
  • the shape of the cursor can be an arrow, a blinking line segment, a shape of a hand, and the like.
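The step-110 mapping from a gesture motion track to a cursor motion track can be sketched as a simple coordinate transform. The screen resolution, the normalized input range, and the `mirror_x` option (for mapping a right-to-left hand motion to a left-to-right cursor motion, one of the alternatives the text allows) are illustrative assumptions.

```python
def gesture_to_cursor(track, width=1920, height=1080, mirror_x=False):
    """Map a gesture track of normalized (x, y) points in 0..1 to screen pixels."""
    cursor_track = []
    for x, y in track:
        if mirror_x:
            # Alternative correspondence: reverse the horizontal direction.
            x = 1.0 - x
        cursor_track.append((int(x * width), int(y * height)))
    return cursor_track

# Most intuitive correspondence: same direction on screen as in the air.
print(gesture_to_cursor([(0.0, 0.0), (1.0, 1.0)]))  # [(0, 0), (1920, 1080)]
```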
  • Step 120: If the gesture action is a control action, and it is determined according to the cursor motion track that the cursor is located in the display area on which the control action acts, perform a control operation corresponding to the control action.
  • otherwise, the cursor motion track is determined only according to the gesture motion track, and the cursor is simply controlled to move accordingly.
  • each control action corresponds to a display area.
  • this display area is also referred to as the display area on which the control action acts.
  • each control action corresponds to one control operation.
  • the control method applied to the head-mounted device provided by the embodiment of the invention performs control by gesture recognition technology and therefore does not require the user to feel around to operate, which reduces the risk of misoperation and further improves system performance.
  • the manner of representing the gesture information is various, for example, it can be represented by the depth information image of the gesture.
  • the implementation of the foregoing step 100 may be: determining a gesture action and a gesture motion track by performing gesture recognition on the depth information image of the acquired gesture.
  • the coordinates of the key point in the three-dimensional coordinate system are acquired from the depth information image of each frame of the gesture, and the gesture motion and the gesture motion track are determined according to the coordinates of the acquired key point.
  • the key points can include, but are not limited to: each finger, the palm, the arm, and the like.
  • whether a gesture action is complete can be determined by checking whether the coordinate change of the same key point across a predetermined number of depth information images is within a set threshold.
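The completion test just described — the same key point barely moving over a predetermined number of frames — might look like the following sketch. The window size and threshold values are illustrative assumptions.

```python
def gesture_finished(key_point_positions, n=5, threshold=0.01):
    """A gesture is considered complete when a key point's coordinates have
    changed less than `threshold` over the last `n` frames."""
    if len(key_point_positions) < n:
        return False
    window = key_point_positions[-n:]
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) <= threshold and (max(ys) - min(ys)) <= threshold

# A hand that has come to rest versus one still moving.
still = [(0.5, 0.5)] * 5
moving = [(0.1 * i, 0.5) for i in range(5)]
print(gesture_finished(still))   # True
print(gesture_finished(moving))  # False
```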
  • the depth information image of the gesture may be obtained by analyzing, by the light receiver, the received light intensity, and the light received by the light receiver is emitted by the light emitter and reflected by the occlusion object. It should be noted that the depth information image of the gesture may also be acquired by a multi-view stereo camera such as a binocular stereo camera.
  • the light receiver can be either a laser receiver or an infrared receiver.
  • before the cursor motion track on the display interface of the headset is determined according to the gesture motion track, the determined gesture motion track may be matched against valid gesture motion track templates. If the matching is successful, step 110 above determines the cursor motion track on the display interface of the headset according to the successfully matched gesture motion track; if the matching is unsuccessful, no corresponding action is triggered.
  • the specific content and number of the valid gesture motion track templates can be set according to actual needs, which is not limited by the present invention.
  • template matching is used as an example here because it is relatively easy to implement; however, any implementation capable of recognizing an invalid gesture motion track is applicable to this embodiment, and the embodiments of the present invention are not limited to template matching. The template-matching approach is described only to teach a person skilled in the art how to implement the present invention; the appropriate approach can be determined according to practical needs.
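One plausible reading of the trajectory template matching above: resample both tracks to a fixed length and accept the match when the mean point-wise distance stays below a tolerance. The resampling scheme, distance metric, and tolerance are all illustrative assumptions, not details given by the patent.

```python
import math

def resample(track, n=16):
    """Naively resample a track of (x, y) points to n points by index."""
    return [track[round(i * (len(track) - 1) / (n - 1))] for i in range(n)]

def match_template(track, template, tol=0.1):
    """Accept the track when its mean point-wise distance to the template is
    within tol; otherwise treat it as an invalid motion track."""
    a, b = resample(track), resample(template)
    dist = sum(math.hypot(pa[0] - pb[0], pa[1] - pb[1])
               for pa, pb in zip(a, b)) / len(a)
    return dist <= tol

# A valid horizontal-line template, and a vertical track that should not match it.
horizontal = [(i / 15, 0.0) for i in range(16)]
vertical = [(0.0, i / 15) for i in range(16)]
print(match_template(horizontal, horizontal))  # True
print(match_template(vertical, horizontal))    # False
```

A production system would more likely use time-warping or a trained classifier, but this captures the accept/reject behavior the text describes.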
  • the determined gesture action can be matched against control action templates; if it matches one, the gesture action is a control action.
  • control action template can be set according to actual needs, which is not limited by the present invention.
  • as above, template matching is used as an example because it is relatively easy to implement; however, any implementation capable of recognizing the control action is applicable to the present invention. The template-matching approach is described only to teach a person skilled in the art how to implement the present invention in detail; the appropriate approach can be determined according to practical needs.
  • the embodiment of the present invention further provides a head-mounted device, as shown in FIG. 2, including:
  • a gesture information sensor 201 configured to acquire gesture information
  • the processor 202 is configured to determine a gesture action and a gesture motion track according to the gesture information acquired by the gesture information sensor 201, and determine a cursor motion track on the display interface of the display screen 203 according to the gesture motion track; if the gesture action is a control action, and it is determined according to the cursor motion track that the cursor is located in a display area on which the control action acts, perform a control operation corresponding to the control action;
  • the display screen 203 is configured to provide a display interface.
  • the head-mounted device provided by the embodiment of the present invention performs control by gesture recognition technology and therefore does not require the user to feel around to operate, which reduces the risk of misoperation and thereby improves system performance.
  • the gesture information sensor may be a light emitter and a light receiver.
  • the light emitter is used to emit light;
  • the light receiver is used to analyze the received light intensity to obtain a depth information image of the gesture, and the light received by the light receiver is emitted by the light emitter and reflected by the obstruction.
  • the light emitter is a laser emitter
  • the light receiver is a laser receiver
  • the light emitter is an infrared emitter
  • the light receiver is an infrared receiver.
  • the light emitter is composed of a light source and a grating disposed in a light exiting direction of the light source.
  • gesture information sensor can also be implemented by other devices capable of acquiring gesture information, such as a multi-view camera.
  • the gesture information sensor and the display screen are located on the same side of the headset.
  • an infrared emitter and an infrared receiver are disposed on the headset and on the same side of the display screen.
  • the infrared emitter is composed of an infrared light source and a grating.
  • the memory of the head-mounted device stores correspondences between a plurality of control actions and the display areas on which they act, as well as their correspondences with control operations.
  • the action of the finger press acts on the display area where all the clickable button icons are located, and corresponds to the click operation
  • the action of the finger lift acts on the display area where all the clickable button icons are located, and corresponds to the operation of releasing the button.
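The stored correspondences above — control action, the display area it acts on, and the resulting control operation — could be represented as a simple lookup table. The action names, region names, and table shape here are hypothetical, chosen only to mirror the press/lift example.

```python
# Hypothetical correspondence table: action -> (regions it acts on, operation).
ACTION_TABLE = {
    'finger_press': ({'clickable_button'}, 'click'),
    'finger_lift':  ({'clickable_button'}, 'release'),
}

def operation_for(action, region):
    """Return the control operation for an action, but only when the cursor's
    region is one the action acts on; otherwise return None."""
    entry = ACTION_TABLE.get(action)
    if entry is None:
        return None
    regions, operation = entry
    return operation if region in regions else None

print(operation_for('finger_press', 'clickable_button'))  # 'click'
print(operation_for('finger_press', 'background'))        # None
```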
  • a plurality of effective gesture motion track templates are also stored in the memory of the headset.
  • a plurality of control action templates are also stored in the memory of the headset.
  • the infrared emitter emits infrared light
  • a human body within the infrared light projection range will reflect the light.
  • the infrared receiver collects light of different intensities, thereby generating a depth information image. Assume that the infrared receiver generates one frame of depth information image per millisecond.
  • the processor receives the depth information image generated by the infrared receiver, and determines the gesture motion and the gesture motion trajectory accordingly. Specifically, the processor analyzes the coordinates of the key points in the depth information image of each frame, and determines the gesture action and the gesture track according to the change of the coordinates of the key points.
  • the processor matches the determined gesture motion track against each valid gesture motion track template. If the matching is successful, for example against the valid template for linear motion, the subsequent process continues; otherwise the motion track is not responded to.
  • the processor determines a cursor motion track on the display interface of the display screen according to the successfully matched gesture motion track. If the shape of the cursor is a hand, the action of the hand displayed on the display interface can also be controlled according to the recognized gesture action.
  • the processor matches the determined gesture action against each control action template. If the matching is successful, for example against the finger-press control action template, and the cursor is located in the display area on which the gesture action acts, the control operation corresponding to the gesture action is performed, such as clicking a button icon. If the matching is unsuccessful, or the cursor is not in that display area, the gesture action is not responded to.
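This final dispatch step — executing a matched control action only when the cursor lies inside the display area the action acts on — can be sketched with a rectangle hit test. The button names and pixel coordinates are hypothetical.

```python
def hit_test(cursor, rect):
    """True when cursor (x, y) lies inside rect (left, top, width, height)."""
    x, y = cursor
    left, top, width, height = rect
    return left <= x <= left + width and top <= y <= top + height

def dispatch(action, cursor, buttons):
    """Perform a click only for a matched press action whose cursor is over a
    button; otherwise do not respond to the gesture action."""
    if action != 'finger_press':
        return None
    for name, rect in buttons.items():
        if hit_test(cursor, rect):
            return (name, 'click')
    return None

buttons = {'volume_up': (100, 100, 80, 40)}
print(dispatch('finger_press', (120, 110), buttons))  # ('volume_up', 'click')
print(dispatch('finger_press', (0, 0), buttons))      # None
```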
  • the device embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. A person of ordinary skill in the art can understand and implement this without creative effort.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a control method applied to a head-mounted device, and a head-mounted device. The method comprises: determining a gesture action and a gesture motion track according to acquired gesture information; determining a cursor motion track on a display interface of the head-mounted device according to the gesture motion track; and, if the gesture action is a control action and it is determined from the cursor motion track that the cursor is located in a display area on which the control action acts, performing a control operation corresponding to the control action. In the present invention, control is performed via gesture recognition technology without the user operating by feel, thereby reducing the risk of misoperation and thus improving system performance.
PCT/CN2016/081916 2015-11-20 2016-05-12 Control method applied to head-mounted device, and head-mounted device WO2017084253A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510811932.9A CN105892636A (zh) 2015-11-20 2015-11-20 一种应用于头戴设备的控制方法及头戴设备
CN201510811932.9 2015-11-20

Publications (1)

Publication Number Publication Date
WO2017084253A1 true WO2017084253A1 (fr) 2017-05-26

Family

ID=57002864

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/081916 WO2017084253A1 (fr) 2015-11-20 2016-05-12 Control method applied to head-mounted device, and head-mounted device

Country Status (2)

Country Link
CN (1) CN105892636A (fr)
WO (1) WO2017084253A1 (fr)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108268124A (zh) * 2016-12-30 2018-07-10 成都虚拟世界科技有限公司 基于头戴式显示设备的手势识别方法及装置
WO2018184233A1 (fr) * 2017-04-07 2018-10-11 深圳市柔宇科技有限公司 Procédé de reconnaissance de geste de la main et dispositif associé
CN107242882A (zh) * 2017-06-05 2017-10-13 上海瓴舸网络科技有限公司 一种b超显示辅助设备及其控制方法
CN109410691A (zh) * 2018-12-17 2019-03-01 深圳市中智仿真科技有限公司 一种手势控制功能的汽车驾培模拟机
CN112083796A (zh) * 2019-06-12 2020-12-15 Oppo广东移动通信有限公司 控制方法、头戴设备、移动终端和控制系统
CN110119209A (zh) * 2019-06-13 2019-08-13 漫谷科技股份公司 音频设备控制方法和装置
CN111601129B (zh) * 2020-06-05 2022-04-01 北京字节跳动网络技术有限公司 控制方法、装置、终端及存储介质
CN112162631B (zh) * 2020-09-18 2023-05-16 聚好看科技股份有限公司 一种交互设备、数据处理方法及介质
CN112988107A (zh) * 2021-04-25 2021-06-18 歌尔股份有限公司 一种音量调节方法、系统及头戴式设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150042680A1 (en) * 2013-08-08 2015-02-12 Pebbles Ltd. Method and device for controlling a near eye display
CN104571510A (zh) * 2014-12-30 2015-04-29 青岛歌尔声学科技有限公司 一种3d场景中输入手势的系统和方法
CN105045398A (zh) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 一种基于手势识别的虚拟现实交互设备
CN204790857U (zh) * 2015-07-24 2015-11-18 贺杰 一种手势控制器及一种虚拟现实系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9069164B2 (en) * 2011-07-12 2015-06-30 Google Inc. Methods and systems for a virtual input device
US9389420B2 (en) * 2012-06-14 2016-07-12 Qualcomm Incorporated User interface interaction for transparent head-mounted displays
EP2877909B1 (fr) * 2012-07-27 2018-12-26 Nokia Technologies Oy Interaction multimodale avec dispositif d'affichage près de l'oeil
CN104076907A (zh) * 2013-03-25 2014-10-01 联想(北京)有限公司 一种控制方法、装置和穿戴式电子设备

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150042680A1 (en) * 2013-08-08 2015-02-12 Pebbles Ltd. Method and device for controlling a near eye display
CN104571510A (zh) * 2014-12-30 2015-04-29 青岛歌尔声学科技有限公司 一种3d场景中输入手势的系统和方法
CN204790857U (zh) * 2015-07-24 2015-11-18 贺杰 一种手势控制器及一种虚拟现实系统
CN105045398A (zh) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 一种基于手势识别的虚拟现实交互设备

Also Published As

Publication number Publication date
CN105892636A (zh) 2016-08-24

Similar Documents

Publication Publication Date Title
WO2017084253A1 (fr) Control method applied to head-mounted device, and head-mounted device
US10653472B2 (en) Touch free operation of ablator workstation by use of depth sensors
US11983823B2 (en) Transmodal input fusion for a wearable system
US11614793B2 (en) Precision tracking of user interaction with a virtual input device
US10394334B2 (en) Gesture-based control system
US10452151B2 (en) Non-tactile interface systems and methods
US11294475B1 (en) Artificial reality multi-modal input switching model
CN106845335B (zh) 用于虚拟现实设备的手势识别方法、装置及虚拟现实设备
US9400548B2 (en) Gesture personalization and profile roaming
US9625993B2 (en) Touch free operation of devices by use of depth sensors
CN107710132B (zh) 用于施加用于表面约束控制的自由空间输入的方法和装置
EP3465414B1 (fr) Procédé, appareil et support lisible par ordinateur pour une interface tactile et vocale avec emplacement audio
CN107589827B (zh) 一种可识别手势的虚拟现实头盔及其手势识别方法
US11188145B2 (en) Gesture control systems
US10156938B2 (en) Information processing apparatus, method for controlling the same, and storage medium
WO2019214442A1 (fr) Procédé de commande de dispositif, appareil, dispositif de commande et support d'enregistrement
US20150234467A1 (en) Method and apparatus for gesture detection and display control
US20160320846A1 (en) Method for providing user commands to an electronic processor and related processor program and electronic circuit
JP2024534472A (ja) タッチレス式画像ベース入力インタフェース
WO2019184240A1 (fr) Procédé et dispositif de détermination de position interactive
WO2021241038A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations basé sur une opération d'entrée d'un utilisateur et programme informatique permettant d'exécuter ledit procédé
US20190310688A1 (en) Transcribing augmented reality keyboard input based on hand poses for improved typing accuracy
KR101374316B1 (ko) 시스루 디스플레이를 이용한 동작인식 장치 및 그 방법
CN105630176A (zh) 一种智能体感控制的方法及装置
Nagy et al. Evaluation of AI-Supported Input Methods in Augmented Reality Environment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16865435

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16865435

Country of ref document: EP

Kind code of ref document: A1