WO2018068584A1 - Virtual reality glasses and menu control method therefor - Google Patents

Virtual reality glasses and menu control method therefor

Info

Publication number
WO2018068584A1
WO2018068584A1 · PCT/CN2017/098434
Authority
WO
WIPO (PCT)
Prior art keywords
head
action
virtual reality
menu
reality glasses
Prior art date
Application number
PCT/CN2017/098434
Other languages
English (en)
French (fr)
Inventor
金鑫
Original Assignee
捷开通讯(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 捷开通讯(深圳)有限公司
Publication of WO2018068584A1 publication Critical patent/WO2018068584A1/zh

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • The present invention relates to the field of virtual reality technologies, and in particular to virtual reality glasses and a menu control method thereof.
  • Virtual reality (VR) glasses can use an intelligent mobile communication terminal as their display screen and realize virtual reality functions through applications on that terminal.
  • At present, VR glasses are mainly controlled through a controller of the intelligent mobile communication terminal or a Bluetooth remote control. During menu selection, moving up, down, left, or right among menu items is the most common control action, yet once the wearer has put on the VR glasses and can no longer see the controller or remote control, such operation is inconvenient.
  • Embodiments of the present invention provide virtual reality glasses and a menu control method thereof that enable menu operations through motion sensing.
  • The present invention provides virtual reality glasses that establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed; the terminal includes a somatosensory unit for sensing the current user's head action.
  • The virtual reality glasses include: a parsing module, connected to the somatosensory unit and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus. The somatosensory unit is a gyroscope sensor, and the parsing module is further configured to obtain, from the head action, the number of times the head moves in that direction.
  • The mapping module is further configured to map the number of head movements in that direction to the number of times the menu focus moves in the moving direction.
  • The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
  • The present invention further provides virtual reality glasses that establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed; the terminal includes a somatosensory unit for sensing the current user's head action.
  • The virtual reality glasses include: a parsing module, connected to the somatosensory unit and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus.
  • The somatosensory unit is a gyroscope sensor.
  • The parsing module is further configured to obtain, from the head action, the number of times the head moves in that direction.
  • The mapping module is further configured to map the number of head movements in that direction to the number of times the menu focus moves in the moving direction.
  • The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
  • The present invention also provides a menu control method for virtual reality glasses that establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed. The menu control method includes: sensing the current user's head action; parsing the head action to obtain its direction; and mapping the obtained direction of the head action to the moving direction of the menu focus.
  • The step of sensing the current user's head action includes: using a gyroscope sensor to sense the current user's head action.
  • The step of parsing the head action and obtaining its direction further includes: obtaining, from the head action, the number of times the head moves in that direction.
  • The step of mapping the obtained direction of the head action to the moving direction of the menu focus further includes: mapping the number of head movements in that direction to the number of times the menu focus moves in the moving direction.
  • The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
  • The beneficial effect of the present invention is that its virtual reality glasses include: a parsing module, connected to the sensing module and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus, so that menu operations can be performed through motion sensing.
  • FIG. 1 is a schematic structural diagram of virtual reality glasses according to an embodiment of the present invention.
  • FIG. 2 is a schematic structural diagram of a physical device of virtual reality glasses according to an embodiment of the present invention.
  • FIG. 3 is a schematic flow chart of a menu control method of virtual reality glasses according to an embodiment of the present invention.
  • FIG. 1 is a schematic structural diagram of virtual reality glasses according to an embodiment of the present invention.
  • The virtual reality glasses establish a connection with the intelligent mobile communication terminal, on which a virtual reality application is installed; the terminal includes a somatosensory unit for sensing the current user's head action.
  • As shown in FIG. 1, the virtual reality glasses 10 include a parsing module 11 and a mapping module 12.
  • The parsing module 11 is connected to the somatosensory unit and is configured to parse the head action and obtain its direction.
  • The mapping module 12 is connected to the parsing module 11 and is configured to map the obtained direction of the head action to the moving direction of the menu focus.
  • The menu focus can move a preset number of items from its original position along the moving direction, selecting the corresponding menu item.
  • The preset number of items may be set by the user as needed, for example one item, and is not limited herein.
  • The virtual reality glasses 10 then execute the function indicated by the menu item at the focus's current position, so that menu operations can be performed through motion sensing.
  • In this embodiment, the somatosensory unit is a gyroscope sensor disposed in the intelligent mobile communication terminal connected to the virtual reality glasses.
  • The gyroscope sensor measures rotation and deflection well, so the user's head movement can be accurately analyzed and judged.
  • The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
  • When the terminal is placed in the VR glasses and worn on the user's head, the gyroscope sensor can determine whether the user's head tilts up, nods down, shakes to the left, or shakes to the right.
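As a concrete illustration of how a gyroscope reading could be turned into one of these four head actions, the sketch below classifies a sample by its dominant angular-velocity axis. The axis convention, the function name, and the 1.0 rad/s threshold are all illustrative assumptions, not details given in the patent:

```python
# Hypothetical sketch: classifying a head action from gyroscope angular
# velocities. Axis signs and the 1.0 rad/s threshold are assumptions.

def classify_head_action(pitch_rate, yaw_rate, threshold=1.0):
    """Map peak angular velocities (rad/s) to one of the four head actions."""
    if abs(pitch_rate) < threshold and abs(yaw_rate) < threshold:
        return None  # no deliberate head action detected
    if abs(pitch_rate) >= abs(yaw_rate):
        # pitch dominates: tilting up or nodding down
        return "up" if pitch_rate > 0 else "down"
    # yaw dominates: shaking to the right or to the left
    return "right" if yaw_rate > 0 else "left"
```

In a real implementation the threshold would be tuned so that ordinary head drift while viewing content does not trigger menu movement.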
  • The parsing module 11 further obtains, from the head action, the number of times the head moves in that direction.
  • The mapping module 12 further maps the number of head movements in that direction to the number of times the menu focus moves in the moving direction. For example, if the somatosensory unit detects that the head tilts up once, then after parsing by the parsing module 11, the mapping module 12 moves the menu focus up one item according to the parsing result, and the virtual reality glasses 10 execute the function indicated by the menu item at the focus's current position.
  • If the somatosensory unit detects that the head shakes to the right twice, then after parsing by the parsing module 11, the mapping module 12 moves the menu focus two items to the right according to the parsing result to obtain the focus's current position, after which the virtual reality glasses 10 can execute the corresponding function.
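The direction-and-count mapping just described can be sketched roughly as follows. The 2-D menu grid, its dimensions, and the clamping at the grid edges are assumptions made for illustration, not specified by the patent:

```python
# Minimal sketch of the mapping step: a parsed direction plus repetition count
# moves the menu focus within a 2-D grid of menu items. Grid size and edge
# clamping are illustrative assumptions.

MOVES = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

def move_focus(focus, direction, count=1, rows=3, cols=4):
    """Move a (row, col) focus `count` items in `direction`, clamped to the grid."""
    dr, dc = MOVES[direction]
    row = min(max(focus[0] + dr * count, 0), rows - 1)
    col = min(max(focus[1] + dc * count, 0), cols - 1)
    return (row, col)
```

For instance, two rightward shakes starting from the second column would land the focus two columns to the right, matching the two-item example above.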
  • the virtual reality glasses 20 include a processor 210, a memory 211, and a data bus 212.
  • the processor 210 and the memory 211 are connected by a data bus 212 for mutual communication.
  • The memory 211 stores a program, and the processor 210 runs the program. The program is configured to: sense the current user's head action; parse the head action to obtain its direction; and map the obtained direction of the head action to the moving direction of the menu focus.
  • The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
  • The menu focus can move a preset number of items from its original position along the moving direction, selecting the corresponding menu item.
  • The preset number of items may be set by the user as needed, for example one item, and is not limited herein.
  • The virtual reality glasses 20 then execute the function indicated by the menu item at the focus's current position, so that menu operations can be performed through motion sensing.
  • The program is further configured to: obtain, from the head action, the number of times the head moves in that direction, and map that number to the number of times the menu focus moves in the moving direction. For example, if the head is detected tilting up once, then after parsing the menu focus moves up one item according to the parsing result, and the virtual reality glasses 20 execute the function indicated by the menu item at the focus's current position. If the head is detected shaking to the right twice, then after parsing the menu focus moves two items to the right according to the parsing result to obtain its current position, after which the virtual reality glasses 20 can execute the corresponding function.
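A minimal sketch of the three steps the stored program performs (sense, parse, map) might look like the following. The event format, the 1-D menu list, and all names are invented for illustration and are not part of the patent:

```python
# Assumed end-to-end sketch of the program's three steps: consume sensed
# (direction, count) head-action events, parse each into a focus step, and map
# it onto a simple vertical menu. Event format and menu are invented.

def run_menu_controller(gyro_events, menu_items):
    """Return the menu item selected after applying all sensed head actions."""
    focus = 0
    for direction, count in gyro_events:                     # step 1: sensed actions
        step = {"up": -1, "down": 1}.get(direction, 0)       # step 2: parse direction
        focus = min(max(focus + step * count, 0),            # step 3: map to focus,
                    len(menu_items) - 1)                     #         clamped to list
    return menu_items[focus]
```

A horizontal axis (left/right shakes) would be handled the same way against a second index, as in the grid sketch earlier in this document.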
  • the present invention also provides a menu control method for virtual reality glasses.
  • the virtual reality glasses establish a connection with the intelligent mobile communication terminal, and the virtual reality application is installed on the intelligent mobile communication terminal.
  • the menu control method of the virtual reality glasses includes:
  • Step S10: sense the current user's head action.
  • In this embodiment, a gyroscope sensor is used to sense the current user's head action.
  • The gyroscope sensor is disposed in the intelligent mobile communication terminal connected to the virtual reality glasses.
  • The gyroscope sensor measures rotation and deflection well, so the user's head movement can be accurately analyzed and judged.
  • The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
  • When the terminal is placed in the VR glasses and worn on the user's head, the gyroscope sensor can determine whether the user's head tilts up, nods down, shakes to the left, or shakes to the right.
  • Step S11: parse the head action to obtain its direction.
  • In step S11, the number of times the head moves in that direction is further obtained from the head action.
  • Step S12: map the obtained direction of the head action to the moving direction of the menu focus.
  • The menu focus can move a preset number of items from its original position along the moving direction, selecting the corresponding menu item.
  • The preset number of items may be set by the user as needed, for example one item, and is not limited herein.
  • The virtual reality glasses then execute the function indicated by the menu item at the focus's current position, so that menu operations can be performed through motion sensing.
  • In step S12, the number of head movements in that direction may further be mapped to the number of times the menu focus moves in the moving direction.
  • For example, if the head is detected tilting up once, then after parsing the menu focus moves up one item according to the parsing result, and the virtual reality glasses execute the function indicated by the menu item at the focus's current position. If the head is detected shaking to the right twice, then after parsing the menu focus moves two items to the right according to the parsing result to obtain its current position, after which the virtual reality glasses can execute the corresponding function, so that menu operations can be performed through motion sensing.
  • In summary, the virtual reality glasses of the present invention include: a parsing module, connected to the sensing module and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus, enabling menu operations through motion sensing.
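One way the counting sub-step of step S11 could plausibly work is to group successive detections of the same gesture into a single event. This sketch and its 1.0-second grouping window are assumptions, not taken from the patent:

```python
# Hypothetical sketch of the counting sub-step: collapse successive detections
# of the same head action into a single (direction, count) event. The 1.0 s
# grouping window is an assumed parameter.

def count_actions(detections, window=1.0):
    """Collapse (timestamp, direction) detections into (direction, count) events."""
    events = []  # list of (direction, count, last_timestamp)
    for t, direction in detections:
        if events and events[-1][0] == direction and t - events[-1][2] <= window:
            events[-1] = (direction, events[-1][1] + 1, t)  # same gesture repeated
        else:
            events.append((direction, 1, t))                # a new gesture begins
    return [(d, n) for d, n, _ in events]
```

Under this grouping, two rightward shakes half a second apart become a single ("right", 2) event, which the mapping step would translate into a two-item focus move.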

Abstract

The present invention discloses virtual reality glasses and a menu control method thereof. The virtual reality glasses establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed; the terminal includes a somatosensory unit for sensing the current user's head action. The virtual reality glasses include: a parsing module, connected to the somatosensory unit and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus. In this way, the present invention enables menu operations through motion sensing.

Description

Virtual reality glasses and menu control method therefor
[Technical Field]
The present invention relates to the field of virtual reality technologies, and in particular to virtual reality glasses and a menu control method thereof.
[Background]
Virtual reality (VR) glasses can use an intelligent mobile communication terminal as their display screen and realize virtual reality functions through applications on that terminal. At present, VR glasses are mainly controlled through a controller of the intelligent mobile communication terminal or a Bluetooth remote control. During menu selection, moving up, down, left, or right among menu items is the most common control action, yet once the wearer has put on the VR glasses and can no longer see the controller or remote control, such operation is inconvenient.
[Summary of the Invention]
Embodiments of the present invention provide virtual reality glasses and a menu control method thereof that enable menu operations through motion sensing.
The present invention provides virtual reality glasses that establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed; the terminal includes a somatosensory unit for sensing the current user's head action. The virtual reality glasses include: a parsing module, connected to the somatosensory unit and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus. The somatosensory unit is a gyroscope sensor, and the parsing module is further configured to obtain, from the head action, the number of times the head moves in that direction.
The mapping module is further configured to map the number of head movements in that direction to the number of times the menu focus moves in the moving direction.
The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
The present invention further provides virtual reality glasses that establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed; the terminal includes a somatosensory unit for sensing the current user's head action. The virtual reality glasses include: a parsing module, connected to the somatosensory unit and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus.
The somatosensory unit is a gyroscope sensor.
The parsing module is further configured to obtain, from the head action, the number of times the head moves in that direction.
The mapping module is further configured to map the number of head movements in that direction to the number of times the menu focus moves in the moving direction.
The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
The present invention also provides a menu control method for virtual reality glasses that establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed. The menu control method includes: sensing the current user's head action; parsing the head action to obtain its direction; and mapping the obtained direction of the head action to the moving direction of the menu focus.
The step of sensing the current user's head action includes: using a gyroscope sensor to sense the current user's head action.
The step of parsing the head action and obtaining its direction further includes: obtaining, from the head action, the number of times the head moves in that direction.
The step of mapping the obtained direction of the head action to the moving direction of the menu focus further includes: mapping the number of head movements in that direction to the number of times the menu focus moves in the moving direction.
The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
Through the above solution, the beneficial effect of the present invention is that its virtual reality glasses include: a parsing module, connected to the sensing module and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus, enabling menu operations through motion sensing.
[Brief Description of the Drawings]
To describe the technical solutions in the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. Apparently, the drawings described below are merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort. In the drawings:
FIG. 1 is a schematic structural diagram of virtual reality glasses according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a physical device of virtual reality glasses according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of a menu control method for virtual reality glasses according to an embodiment of the present invention.
[Detailed Description]
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
FIG. 1 is a schematic structural diagram of virtual reality glasses according to an embodiment of the present invention. The virtual reality glasses establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed; the terminal includes a somatosensory unit for sensing the current user's head action. As shown in FIG. 1, the virtual reality glasses 10 include a parsing module 11 and a mapping module 12. The parsing module 11 is connected to the somatosensory unit and is configured to parse the head action and obtain its direction. The mapping module 12 is connected to the parsing module 11 and is configured to map the obtained direction of the head action to the moving direction of the menu focus. The menu focus can move a preset number of items from its original position along the moving direction, selecting the corresponding menu item. The preset number of items may be set by the user as needed, for example one item, and is not limited herein. The virtual reality glasses 10 then execute the function indicated by the menu item at the focus's current position, so that menu operations can be performed through motion sensing.
In this embodiment of the present invention, the somatosensory unit is a gyroscope sensor disposed in the intelligent mobile communication terminal connected to the virtual reality glasses. The gyroscope sensor measures rotation and deflection well, so the user's head movement can be accurately analyzed and judged. The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right. When the terminal is placed in the VR glasses and worn on the user's head, the gyroscope sensor can determine whether the user's head tilts up, nods down, shakes to the left, or shakes to the right.
In this embodiment of the present invention, the parsing module 11 further obtains, from the head action, the number of times the head moves in that direction. The mapping module 12 then maps the number of head movements in that direction to the number of times the menu focus moves in the moving direction. For example, if the somatosensory unit detects that the head tilts up once, then after parsing by the parsing module 11, the mapping module 12 moves the menu focus up one item according to the parsing result, and the virtual reality glasses 10 execute the function indicated by the menu item at the focus's current position. If the somatosensory unit detects that the head shakes to the right twice, then after parsing by the parsing module 11, the mapping module 12 moves the menu focus two items to the right according to the parsing result to obtain the focus's current position, after which the virtual reality glasses 10 can execute the corresponding function.
FIG. 2 is a schematic structural diagram of a physical device of the virtual reality glasses according to an embodiment of the present invention. As shown in FIG. 2, the virtual reality glasses 20 include a processor 210, a memory 211, and a data bus 212. The processor 210 and the memory 211 are connected through the data bus 212 for mutual communication.
The memory 211 stores a program, and the processor 210 runs the program. The program is configured to:
sense the current user's head action;
parse the head action to obtain its direction;
map the obtained direction of the head action to the moving direction of the menu focus.
In this embodiment of the present invention, the head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right. The menu focus can move a preset number of items from its original position along the moving direction, selecting the corresponding menu item. The preset number of items may be set by the user as needed, for example one item, and is not limited herein. The virtual reality glasses 20 then execute the function indicated by the menu item at the focus's current position, so that menu operations can be performed through motion sensing.
In this embodiment of the present invention, the program is further configured to: obtain, from the head action, the number of times the head moves in that direction, and further map that number to the number of times the menu focus moves in the moving direction. For example, if the head is detected tilting up once, then after parsing the menu focus moves up one item according to the parsing result, and the virtual reality glasses 20 execute the function indicated by the menu item at the focus's current position. If the head is detected shaking to the right twice, then after parsing the menu focus moves two items to the right according to the parsing result to obtain its current position, after which the virtual reality glasses 20 can execute the corresponding function.
Referring to FIG. 3, the present invention also provides a menu control method for virtual reality glasses. The virtual reality glasses establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed. As shown in FIG. 3, the menu control method for the virtual reality glasses includes:
Step S10: sense the current user's head action.
In this embodiment of the present invention, a gyroscope sensor is used to sense the current user's head action. The gyroscope sensor is disposed in the intelligent mobile communication terminal connected to the virtual reality glasses. The gyroscope sensor measures rotation and deflection well, so the user's head movement can be accurately analyzed and judged. The head action includes at least one of tilting up, nodding down, shaking to the left, and shaking to the right. When the terminal is placed in the VR glasses and worn on the user's head, the gyroscope sensor can determine whether the user's head tilts up, nods down, shakes to the left, or shakes to the right.
Step S11: parse the head action to obtain its direction.
In step S11, the number of times the head moves in that direction is further obtained from the head action.
Step S12: map the obtained direction of the head action to the moving direction of the menu focus.
The menu focus can move a preset number of items from its original position along the moving direction, selecting the corresponding menu item. The preset number of items may be set by the user as needed, for example one item, and is not limited herein. The virtual reality glasses then execute the function indicated by the menu item at the focus's current position, so that menu operations can be performed through motion sensing.
In step S12, the number of head movements in that direction may further be mapped to the number of times the menu focus moves in the moving direction.
For example, if the head is detected tilting up once, then after parsing the menu focus moves up one item according to the parsing result, and the virtual reality glasses execute the function indicated by the menu item at the focus's current position. If the head is detected shaking to the right twice, then after parsing the menu focus moves two items to the right according to the parsing result to obtain its current position, after which the virtual reality glasses can execute the corresponding function, so that menu operations can be performed through motion sensing.
In summary, the virtual reality glasses of the present invention include: a parsing module, connected to the sensing module and configured to parse the head action and obtain its direction; and a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of the menu focus, enabling menu operations through motion sensing.
The foregoing are merely embodiments of the present invention and are not intended to limit its patent scope. Any equivalent structure or equivalent process transformation made using the contents of this specification and the accompanying drawings, applied directly or indirectly in other related technical fields, shall likewise fall within the patent protection scope of the present invention.

Claims (13)

  1. Virtual reality glasses, wherein the virtual reality glasses establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed, the intelligent mobile communication terminal comprising a somatosensory unit for sensing a current user's head action, the virtual reality glasses comprising:
    a parsing module, connected to the somatosensory unit and configured to parse the head action and obtain the direction of the head action;
    a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of a menu focus;
    wherein the somatosensory unit is a gyroscope sensor, and the parsing module is further configured to obtain, from the head action, the number of times the head moves in said direction.
  2. The virtual reality glasses according to claim 1, wherein the mapping module is further configured to map the number of head movements in said direction to the number of times the menu focus moves in said moving direction.
  3. The virtual reality glasses according to claim 1, wherein the head action comprises at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
  4. Virtual reality glasses, wherein the virtual reality glasses establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed, the intelligent mobile communication terminal comprising a somatosensory unit for sensing a current user's head action, the virtual reality glasses comprising:
    a parsing module, connected to the somatosensory unit and configured to parse the head action and obtain the direction of the head action;
    a mapping module, connected to the parsing module and configured to map the obtained direction of the head action to the moving direction of a menu focus.
  5. The virtual reality glasses according to claim 4, wherein the somatosensory unit is a gyroscope sensor.
  6. The virtual reality glasses according to claim 4, wherein the parsing module is further configured to obtain, from the head action, the number of times the head moves in said direction.
  7. The virtual reality glasses according to claim 6, wherein the mapping module is further configured to map the number of head movements in said direction to the number of times the menu focus moves in said moving direction.
  8. The virtual reality glasses according to claim 4, wherein the head action comprises at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
  9. A menu control method for virtual reality glasses, wherein the virtual reality glasses establish a connection with an intelligent mobile communication terminal on which a virtual reality application is installed, the menu control method comprising:
    sensing a current user's head action;
    parsing the head action to obtain the direction of the head action;
    mapping the obtained direction of the head action to the moving direction of a menu focus.
  10. The menu control method according to claim 9, wherein the step of sensing the current user's head action comprises: using a gyroscope sensor to sense the current user's head action.
  11. The menu control method according to claim 9, wherein the step of parsing the head action and obtaining the direction of the head action further comprises: obtaining, from the head action, the number of times the head moves in said direction.
  12. The menu control method according to claim 11, wherein the step of mapping the obtained direction of the head action to the moving direction of the menu focus further comprises: mapping the number of head movements in said direction to the number of times the menu focus moves in said moving direction.
  13. The menu control method according to claim 9, wherein the head action comprises at least one of tilting up, nodding down, shaking to the left, and shaking to the right.
PCT/CN2017/098434 2016-10-11 2017-08-22 Virtual reality glasses and menu control method therefor WO2018068584A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610888755.9 2016-10-11
CN201610888755.9A CN106371612A (zh) 2016-10-11 2016-10-11 Virtual reality glasses and menu control method therefor

Publications (1)

Publication Number Publication Date
WO2018068584A1 (zh) 2018-04-19

Family

ID=57896599

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/098434 WO2018068584A1 (zh) 2016-10-11 2017-08-22 Virtual reality glasses and menu control method therefor

Country Status (2)

Country Link
CN (1) CN106371612A (zh)
WO (1) WO2018068584A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106371612A (zh) * 2016-10-11 2017-02-01 惠州Tcl移动通信有限公司 Virtual reality glasses and menu control method therefor
CN107102725B (zh) * 2017-03-09 2020-01-14 国网山东省电力公司济宁供电公司 Control method and system for virtual-reality movement based on a somatosensory handle
CN107025784B (zh) * 2017-03-30 2020-11-27 北京奇艺世纪科技有限公司 Remote controller, head-mounted device and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105393192A (zh) * 2013-06-28 2016-03-09 微软技术许可有限责任公司 Web-like hierarchical menu display configuration for a near-eye display
CN205427767U (zh) * 2016-01-28 2016-08-03 深圳云院线科技有限公司 Virtual reality device
CN106371612A (zh) * 2016-10-11 2017-02-01 惠州Tcl移动通信有限公司 Virtual reality glasses and menu control method therefor

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102906623A (zh) * 2010-02-28 2013-01-30 奥斯特豪特集团有限公司 Local advertising content on an interactive head-mounted eyepiece
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
CN105867613A (zh) * 2016-03-21 2016-08-17 乐视致新电子科技(天津)有限公司 Head-control interaction method and device based on a virtual reality system


Also Published As

Publication number Publication date
CN106371612A (zh) 2017-02-01

Similar Documents

Publication Publication Date Title
WO2018068584A1 (zh) Virtual reality glasses and menu control method therefor
JP2021072136A (ja) Method and apparatus for combining muscle-activity sensor signals and inertial sensor signals for gesture-based control
WO2017118075A1 (zh) Human-computer interaction system, method and device
WO2018217060A1 (en) Method and wearable device for performing actions using body sensor array
WO2016165665A1 (zh) Somatosensory interaction system and somatosensory interaction method
WO2014135023A1 (zh) Human-computer interaction method and system for a smart terminal
WO2017113807A1 (zh) Smart-watch-based smart-home control method, and smart watch
CN111459060A (zh) Robot software system and robot thereof
CN107621777A (zh) Electronic device and acquisition control method
WO2015126197A1 (ko) Remote operation device and method using camera-centered virtual touch
WO2016088981A1 (ko) Method, device, system and non-transitory computer-readable recording medium for providing a user interface
CN111938608A (zh) AR glasses, monitoring system and monitoring method for intelligent elderly care
WO2017215223A1 (zh) VR system, wearable device for controlling a VR device, and method therefor
WO2016114496A1 (ko) Method for providing a user interface through a head-mounted display using gaze recognition and bio-signals, device using same, and computer-readable recording medium
CN109839827B (zh) Gesture-recognition smart-home control system based on full-space position information
CN107765850A (zh) Sign-language recognition system based on electronic skin and multi-sensor fusion
WO2016049842A1 (zh) Hybrid interaction method for portable or wearable smart devices
Seetharamu et al. TV remote control via wearable smart watch device
WO2015081646A1 (zh) Touch-screen operation method and touch-screen device
CN105975065A (zh) Screen control method and device for a smart watch, and smart watch
US11340703B1 (en) Smart glasses based configuration of programming code
CN112426709B (zh) Forearm motion posture recognition method, and interface-interaction control method and device
CN211241839U (zh) Data glove for gesture recognition
CN207270687U (zh) Binocular handle system and device, and AR device
CN213149712U (zh) Human hand-motion recognition device based on infrared sensors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17859784

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the EP bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30.08.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17859784

Country of ref document: EP

Kind code of ref document: A1