WO2018049747A1 - Focus positioning method and apparatus for a virtual reality device, and virtual reality device - Google Patents

Focus positioning method and apparatus for a virtual reality device, and virtual reality device

Info

Publication number
WO2018049747A1
WO2018049747A1 (PCT/CN2016/110949)
Authority
WO
WIPO (PCT)
Prior art keywords
focus
eyebrow
virtual reality
reality device
detected
Prior art date
Application number
PCT/CN2016/110949
Other languages
English (en)
French (fr)
Inventor
尹左水
Original Assignee
歌尔科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 歌尔科技有限公司
Publication of WO2018049747A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present invention relates to the field of virtual reality device technologies, and more particularly, to a focus positioning method for a virtual reality device, a focus positioning device for a virtual reality device, and a virtual reality device.
  • helmet-type virtual reality (VR) devices
  • existing virtual reality devices generally require an adapted handheld device for interaction between the user and the VR scene; for example, buttons on the handheld device are used to make the focus appear, to adjust the focus position, and to click on an interface element after the focus position is locked.
  • a focus positioning method for a virtual reality device comprising:
  • controlling a focus to appear on the screen
  • the click action is performed in accordance with the detected focus position determination operation.
  • detecting the eyebrow operation corresponding to the eyebrow region of the virtual reality device comprises:
  • the focus positioning method further includes:
  • the focus position determining operation is two consecutive eyebrow operations, wherein a time interval of two consecutive eyebrow operations is set to be smaller than the set time.
  • the focus position adjustment operation is a head swing operation or an eye movement operation.
  • a focus positioning apparatus for a virtual reality device comprising:
  • a module for detecting an eyebrow operation corresponding to an eyebrow region of the virtual reality device
  • a module for adjusting a position of the focus according to a focus position adjustment operation during the presence of the focus and
  • a module for performing a click action according to a focus position determination operation during the presence of the focus.
  • the module for detecting an eyebrow operation corresponding to an eyebrow region of the virtual reality device includes:
  • the focus positioning device further includes:
  • a virtual reality device comprising the focus positioning device according to the second aspect of the invention.
  • a virtual reality device comprising a memory and a processor, the memory for storing instructions for controlling the processor to perform the focus positioning method according to the first aspect of the present invention.
  • a computer readable storage medium storing program code for performing the focus positioning method according to the first aspect of the invention.
  • FIG. 1 is a flow chart of an embodiment of a focus positioning method in accordance with the present invention.
  • FIG. 2 is a flow chart of another embodiment of a focus positioning method in accordance with the present invention.
  • FIG. 3 is a block schematic diagram of an embodiment of a focus positioning device in accordance with the present invention.
  • FIG. 4 is a block schematic diagram of another embodiment of a focus positioning device in accordance with the present invention.
  • FIG. 5 is a block schematic diagram of an implementation of an electronic device in accordance with the present invention.
  • FIG. 1 is a flow chart of one embodiment of a focus positioning method in accordance with the present invention.
  • the focus positioning method of the present invention comprises the following steps:
  • Step S110: after power-on, determining whether an eyebrow operation corresponding to the eyebrow region of the virtual reality device is detected; if yes, step S120 is performed, and if not, step S110 is executed cyclically.
  • the eyebrow region of the virtual reality device is the part of the virtual reality device corresponding to the area between the user's eyebrows. Therefore, the eyebrow operation may be a squeezing operation performed by the user on this part of the virtual reality device by frowning, raising the eyebrows, or the like.
  • the detection of the eyebrow operation on the eyebrow region of the virtual reality device can employ, without being limited to, the following embodiments.
  • Step S111a Acquire a pressure signal output by a pressure sensor mounted on an area of the eyebrow of the virtual reality device.
  • the pressure sensor can be, for example, a piezoelectric pressure sensor with relatively high sensitivity.
  • Step S112a: according to the pressure signal, it is determined whether the current pressure value exceeds a set value, and if so, it is determined that an eyebrow operation has occurred.
  • the set value can be determined from experimental data, i.e., by measuring over many trials, while users wear the virtual reality device, the pressure that the above eyebrow operation exerts on the eyebrow region of the device, and selecting an appropriate set value from the measured data so as to provide both good operability and a low false-trigger rate.
  • Step S111b Acquire an image acquired by the image capturing device installed on the virtual reality device.
  • Step S112b: extracting features of the user's eyebrow region (including the eyebrows) from the image.
  • Step S113b: the extracted features are compared with the eyebrow-region shape and/or relative position stored in the database as a reference, and if the comparison succeeds, it is determined that an eyebrow operation has occurred.
  • Step S111c Acquire a proximity signal output by the distance sensor installed in the eyebrow region of the virtual reality device.
  • Step S112c: determining, according to the proximity signal, whether the proximity amount exceeds a set amount, and if so, determining that an eyebrow operation has occurred.
  • the proximity amount may be, for example, the difference between a standard distance and the current distance; the standard distance may be set manually, or may be collected by the distance sensor according to the user's selection.
  • Step S120: the focus is controlled to appear on the screen.
  • step S130 during the existence of the focus, the position of the focus is adjusted according to the detected focus position adjustment operation, that is, the adjustment of the focus position is performed in step S130.
  • the focus position adjustment operation may be, but not limited to, a head swing operation or an eye movement operation or the like.
  • the position of the focus on the screen can be adjusted correspondingly by detecting the up, down, left, and right swings of the user's head.
  • the head swing operation can be identified based on data collected by an inertial sensor or the like.
  • the position of the focus on the screen can be adjusted correspondingly by detecting the up, down, left, and right movements of the user's eyeball.
  • the eye movement operation can be identified based on the image acquired by the image acquisition device.
  • Step S140 during the existence of the focus, the click operation is performed according to the detected focus position determining operation, that is, the click operation of the interface element at the position where the focus is located is performed at step S140.
  • the focus position determining operation may be, for example, a nodding operation, a closed-eye operation, two consecutive eyebrow operations, or the like; however, the operations corresponding to the various functions should differ from one another so that they can be distinguished within the system.
  • the nodding operation can be identified based on the data collected by the inertial sensor.
  • the closed eye operation can be identified based on the image acquired by the image capture device.
  • the two consecutive eyebrow operations can be judged with reference to the description of step S110; it is only necessary to additionally check whether the time interval between the two eyebrow operations satisfies the definition of a continuous operation in time.
  • FIG. 2 is a flow chart of another embodiment of a focus positioning method in accordance with the present invention.
  • this embodiment adds steps S210 to S230 on the basis of the embodiment shown in FIG. 1.
  • Step S210: after the focus has appeared on the screen according to the detected eyebrow operation, it is determined whether the detected eyebrow operation has disappeared; if so, step S220 is performed, and if not, step S210 is executed cyclically.
  • in step S210, the specific embodiment for judging whether the detected eyebrow operation has disappeared corresponds to the embodiment in step S110 for detecting the eyebrow operation on the eyebrow region of the virtual reality device.
  • taking the detection of the eyebrow operation from the pressure signal as an example, determining whether the detected eyebrow operation has disappeared may further include:
  • Step S211a Acquire a pressure signal output by the pressure sensor mounted on the eyebrow area of the virtual reality device.
  • Step S212a: according to the pressure signal, it is determined whether the current pressure value has dropped below the set value, and if so, it is determined that the detected eyebrow operation has disappeared.
  • Step S220: determining whether the disappearance time of the eyebrow operation has reached the set time; if so, step S230 is executed, so as to avoid erroneously making the focus disappear due to a misoperation; if not, the process returns to step S210 to continue checking that the eyebrow operation is still absent.
  • in this embodiment, the user needs to maintain the eyebrow operation until step S140 has been performed; otherwise, once the detected eyebrow operation has been absent for the set time, the current operation ends and the process returns to step S110 to detect the eyebrow operation again.
  • Step S230: the focus is controlled to disappear.
  • the time interval between two consecutive eyebrow operations should be set to be less than the above set time.
  • alternatively, the focus may be kept present once it has appeared, until step S140 is completed, at which point the focus is controlled to disappear.
  • FIG. 3 is a block schematic diagram of one embodiment of a focus positioning device in accordance with the present invention.
  • the focus positioning device includes a module 301 for detecting an eyebrow operation corresponding to an eyebrow region of the virtual reality device; a module 302 for controlling the focus to appear on the screen according to the detected eyebrow operation; a module 303 for adjusting the position of the focus according to a focus position adjustment operation during the presence of the focus; and a module 304 for performing a click action according to a focus position determination operation during the presence of the focus.
  • the module 301 for detecting an eyebrow operation corresponding to an eyebrow region of the virtual reality device may further include: a unit for acquiring a pressure signal output by a pressure sensor mounted on the eyebrow region; and a unit for determining, according to the pressure signal, whether the current pressure value exceeds the set value, and determining that an eyebrow operation has occurred when the set value is exceeded.
  • FIG. 4 is a block schematic diagram of another embodiment of a focus positioning device in accordance with the present invention.
  • the focus positioning apparatus of the present invention further comprises a module 305 for controlling the focus on the screen to disappear when the detected eyebrow operation disappears and the disappearance time reaches the set time.
  • the focus position determining operation may be two consecutive eyebrow operations, wherein the time interval between two consecutive eyebrow operations is set to be less than the set time.
  • the focus position adjustment operation may be a head swing operation or an eye movement operation.
  • the present invention also provides a virtual reality device including any of the above-described focus positioning devices.
  • the virtual reality device should also include the sensor described above for use with the corresponding focus positioning device.
  • Figure 5 is a block schematic diagram of one embodiment of the virtual reality device in hardware configuration.
  • the virtual reality device 500 includes a memory 501 and a processor 502; the memory 501 stores instructions for controlling the processor 502 to perform the focus positioning method of any embodiment of the present invention.
  • the processor can be, for example, a central processing unit CPU, a microprocessor MCU, or the like.
  • the memory includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a nonvolatile memory such as a hard disk, and the like.
  • the virtual reality device 500 may further include an interface device 503, an input device 504, a display device 505, a communication device 506, a speaker 507, a microphone 508, and the like.
  • the virtual reality device 500 of the present invention may involve only some of these devices, for example, the processor 502, the memory 501, and the like.
  • the communication device 506 is capable of wired or wireless communication, for example.
  • the above interface device 503 includes, for example, a headphone jack, a USB interface, and the like.
  • the input device 504 described above may include, for example, a touch screen, a button, and the like.
  • the display device 505 described above is, for example, a liquid crystal display, a touch display, or the like.
  • the invention can be an apparatus, method and/or computer program product.
  • the computer program product can comprise a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement various aspects of the present invention.
  • the computer readable storage medium can be a tangible device that can hold and store the instructions used by the instruction execution device.
  • the computer readable storage medium can be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of more specific examples of computer readable storage media includes: portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), static random access memory (SRAM), portable compact disc read only memory (CD-ROM), digital versatile discs (DVD), memory sticks, floppy disks, and mechanical encoding devices such as punch cards or raised structures in a groove with instructions stored thereon, as well as any suitable combination of the above.
  • a computer readable storage medium as used herein is not to be interpreted as a transient signal itself, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., a light pulse through a fiber optic cable), or an electrical signal transmitted through a wire.
  • the computer readable program instructions described herein can be downloaded from a computer readable storage medium to various computing/processing devices or downloaded to an external computer or external storage device over a network, such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in each computing/processing device.
  • computer program instructions for performing the operations of the present invention may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented languages such as Smalltalk and C++, and conventional procedural languages such as the "C" language or similar programming languages.
  • the computer readable program instructions can execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or wide area network (WAN), or can be connected to an external computer (e.g., through the Internet using an Internet service provider).
  • in some embodiments, customized electronic circuitry, such as a programmable logic circuit, a field programmable gate array (FPGA), or a programmable logic array (PLA), can be personalized by utilizing state information of the computer readable program instructions, and this electronic circuitry can execute the computer readable program instructions to implement various aspects of the present invention.
  • the computer readable program instructions can be provided to a processor of a general purpose computer, a special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • the computer readable program instructions can also be stored in a computer readable storage medium that directs the computer, programmable data processing apparatus, and/or other devices to operate in a particular manner, such that the computer readable medium storing the instructions comprises an article of manufacture that includes instructions implementing various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • the computer readable program instructions can also be loaded onto a computer, other programmable data processing device, or other device to perform a series of operational steps on a computer, other programmable data processing device or other device to produce a computer-implemented process.
  • instructions executed on a computer, other programmable data processing apparatus, or other device implement the functions/acts recited in one or more of the flowcharts and/or block diagrams.
  • each block in the flowcharts or block diagrams can represent a module, a program segment, or a portion of instructions that comprises one or more executable instructions for implementing the specified logical functions.
  • in some alternative implementations, the functions noted in the blocks can also occur in an order different from that illustrated in the drawings. For example, two consecutive blocks may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or acts, or by a combination of dedicated hardware and computer instructions. It is well known to those skilled in the art that implementation in hardware, implementation in software, and implementation by a combination of software and hardware are all equivalent.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention discloses a focus positioning method and apparatus for a virtual reality device, and a virtual reality device. The method includes: detecting an eyebrow operation corresponding to the eyebrow region of the virtual reality device; controlling a focus to appear on the screen according to the detected eyebrow operation; during the presence of the focus, adjusting the position of the focus according to a detected focus position adjustment operation; and during the presence of the focus, performing a click action according to a detected focus position determination operation.

Description

Focus positioning method and apparatus for a virtual reality device, and virtual reality device
Technical Field
The present invention relates to the technical field of virtual reality devices, and more particularly to a focus positioning method for a virtual reality device, a focus positioning apparatus for a virtual reality device, and a virtual reality device.
Background Art
In recent years, helmet-type virtual reality (VR) devices have become increasingly common. Existing virtual reality devices generally require an adapted handheld device for interaction between the user and the VR scene; for example, buttons on the handheld device are used to make the focus appear, to adjust the focus position, and to click on an interface element after the focus position is locked.
Summary of the Invention
According to a first aspect of the present invention, there is provided a focus positioning method for a virtual reality device, comprising:
detecting an eyebrow operation corresponding to the eyebrow region of the virtual reality device;
controlling a focus to appear on the screen according to the detected eyebrow operation;
during the presence of the focus, adjusting the position of the focus according to a detected focus position adjustment operation;
during the presence of the focus, performing a click action according to a detected focus position determination operation.
Optionally, detecting the eyebrow operation corresponding to the eyebrow region of the virtual reality device comprises:
acquiring a pressure signal output by a pressure sensor mounted on the eyebrow region;
determining, according to the pressure signal, whether the current pressure value exceeds a set value, and if so, determining that an eyebrow operation has occurred.
Optionally, the focus positioning method further comprises:
if the detected eyebrow operation disappears and the disappearance time reaches a set time:
controlling the focus on the screen to disappear.
Optionally, the focus position determination operation is two consecutive eyebrow operations, wherein the time interval between the two consecutive eyebrow operations is set to be less than the set time.
Optionally, the focus position adjustment operation is a head swing operation or an eye movement operation.
According to a second aspect of the present invention, there is provided a focus positioning apparatus for a virtual reality device, comprising:
a module for detecting an eyebrow operation corresponding to the eyebrow region of the virtual reality device;
a module for controlling a focus to appear on the screen according to the detected eyebrow operation;
a module for adjusting the position of the focus according to a focus position adjustment operation during the presence of the focus; and
a module for performing a click action according to a focus position determination operation during the presence of the focus.
Optionally, the module for detecting an eyebrow operation corresponding to the eyebrow region of the virtual reality device comprises:
a unit for acquiring a pressure signal output by a pressure sensor mounted on the eyebrow region; and
a unit for determining, according to the pressure signal, whether the current pressure value exceeds a set value, and determining that an eyebrow operation has occurred when the set value is exceeded.
Optionally, the focus positioning apparatus further comprises:
a module for controlling the focus on the screen to disappear when the detected eyebrow operation disappears and the disappearance time reaches a set time.
According to a third aspect of the present invention, there is provided a virtual reality device comprising the focus positioning apparatus according to the second aspect of the present invention.
According to a fourth aspect of the present invention, there is provided a virtual reality device comprising a memory and a processor, the memory being configured to store instructions for controlling the processor to perform the focus positioning method according to the first aspect of the present invention.
According to a fifth aspect of the present invention, there is provided a computer-readable storage medium storing program code for performing the focus positioning method according to the first aspect of the present invention.
Further features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments of the present invention with reference to the accompanying drawings.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the present invention and, together with the description, serve to explain the principles of the present invention.
FIG. 1 is a flowchart of an embodiment of the focus positioning method according to the present invention;
FIG. 2 is a flowchart of another embodiment of the focus positioning method according to the present invention;
FIG. 3 is a block diagram of an embodiment of the focus positioning apparatus according to the present invention;
FIG. 4 is a block diagram of another embodiment of the focus positioning apparatus according to the present invention;
FIG. 5 is a block diagram of an embodiment of the electronic device according to the present invention.
Detailed Description of the Embodiments
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of the components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the present invention.
The following description of at least one exemplary embodiment is merely illustrative and in no way serves as any limitation on the present invention or its application or uses.
Techniques, methods, and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques, methods, and devices should be regarded as part of the specification.
In all the examples shown and discussed herein, any specific value should be interpreted as merely exemplary rather than limiting; other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be discussed further in subsequent drawings.
FIG. 1 is a flowchart of an embodiment of the focus positioning method according to the present invention.
As shown in FIG. 1, the focus positioning method of the present invention comprises the following steps:
Step S110: after power-on, determine whether an eyebrow operation corresponding to the eyebrow region of the virtual reality device is detected; if so, perform step S120; if not, repeat step S110.
The eyebrow region of the virtual reality device here is the part of the virtual reality device corresponding to the area between the user's eyebrows. The eyebrow operation may therefore be a squeezing operation that the user applies to this part of the virtual reality device by frowning, raising the eyebrows, or similar movements.
Accordingly, the detection of the eyebrow operation on the eyebrow region of the virtual reality device may adopt, without being limited to, the following embodiments.
Embodiment 1:
Step S111a: acquire a pressure signal output by a pressure sensor mounted on the eyebrow region of the virtual reality device.
For the pressure sensor, a piezoelectric pressure sensor with relatively high sensitivity may be used, for example.
Step S112a: determine from the pressure signal whether the current pressure value exceeds a set value; if so, determine that an eyebrow operation has occurred.
The set value may be determined from experimental data, i.e., by measuring over many trials, while users wear the virtual reality device, the pressure that the above eyebrow operation exerts on the eyebrow region of the device, and selecting an appropriate set value from the measured data so as to provide both good operability and a low false-trigger rate.
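The threshold logic of this embodiment, including an experimentally chosen set value, can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the pressure units, the calibration margin, and the function names are assumptions for the example.

```python
def calibrate_set_value(measured_pressures, margin=0.5):
    # Pick a set value between the resting baseline and the typical
    # squeeze pressure observed in trials, trading off operability
    # against the false-trigger rate (margin is a tuning assumption).
    baseline = min(measured_pressures)
    squeeze = max(measured_pressures)
    return baseline + margin * (squeeze - baseline)

def is_brow_operation(pressure_value, set_value):
    # Step S112a: an eyebrow operation is deemed to occur when the
    # current pressure value exceeds the set value.
    return pressure_value > set_value

# Illustrative trial data (arbitrary units) from users wearing the device:
# low values are resting pressure, high values are deliberate squeezes.
trials = [0.10, 0.12, 0.90, 1.10, 0.11, 1.00]
set_value = calibrate_set_value(trials)
print(is_brow_operation(0.15, set_value))  # resting pressure: no trigger
print(is_brow_operation(1.05, set_value))  # squeeze: triggers
```

A smaller margin favors easy triggering; a larger one lowers the false-trigger rate, matching the trade-off described above.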
Embodiment 2:
Step S111b: acquire an image captured by an image acquisition device mounted on the virtual reality device.
Step S112b: extract features of the user's eyebrow region (including the eyebrows) from the image.
Step S113b: compare the extracted features with the reference eyebrow-region shape and/or relative position stored in a database; if the comparison succeeds, determine that an eyebrow operation has occurred.
Alternatively, determine whether the set eyebrow operation has occurred from the change in position of the extracted features across images captured over consecutive frames; if the position change matches the motion profile of the set eyebrow operation, determine that an eyebrow operation has occurred.
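Step S113b's comparison against stored reference features can be sketched as a distance check between feature vectors. The two-component feature (a normalized inter-brow distance and eyebrow height) and the tolerance are illustrative assumptions, not the patent's actual feature set.

```python
import math

def match_features(extracted, reference, tolerance=0.1):
    # Step S113b: the comparison "succeeds" when the extracted
    # eyebrow-region features lie within `tolerance` (Euclidean
    # distance) of the reference stored in the database.
    return math.dist(extracted, reference) <= tolerance

# Illustrative reference captured while the user frowns:
# [normalized inter-brow distance, normalized eyebrow height]
reference_frown = [0.30, 0.55]

print(match_features([0.31, 0.54], reference_frown))  # close match
print(match_features([0.45, 0.70], reference_frown))  # neutral face
```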
Embodiment 3:
Step S111c: acquire a proximity signal output by a distance sensor mounted on the eyebrow region of the virtual reality device.
Step S112c: determine from the proximity signal whether the proximity amount exceeds a set amount; if so, determine that an eyebrow operation has occurred.
The proximity amount may be, for example, the difference between a standard distance and the current distance; the standard distance may be set manually, or may be calibrated from distance-sensor readings according to the user's selection.
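Steps S111c/S112c can be sketched with the proximity amount computed as described. Calibrating the standard distance by averaging resting readings is one plausible reading of collecting it via the distance sensor; all numeric values are assumptions for the example.

```python
def proximity_amount(standard_distance, current_distance):
    # The proximity amount is the difference between the standard
    # distance and the current distance reported by the sensor.
    return standard_distance - current_distance

def is_brow_operation_by_proximity(standard, current, set_amount):
    # Step S112c: an eyebrow operation is deemed to occur when the
    # proximity amount exceeds the set amount.
    return proximity_amount(standard, current) > set_amount

# Standard distance calibrated from readings taken while the user's
# brow is at rest (millimetres; illustrative values).
resting_readings = [5.1, 5.0, 5.2]
standard = sum(resting_readings) / len(resting_readings)

print(is_brow_operation_by_proximity(standard, 4.9, set_amount=1.0))  # at rest
print(is_brow_operation_by_proximity(standard, 3.5, set_amount=1.0))  # squeeze
```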
Step S120: control the focus to appear on the screen.
Step S130: during the presence of the focus, adjust the position of the focus according to the detected focus position adjustment operation; that is, focus position adjustment is carried out in step S130.
The focus position adjustment operation may be, but is not limited to, a head swing operation, an eye movement operation, or the like.
Taking the head swing operation as an example, the position of the focus on the screen can be adjusted correspondingly by detecting upward, downward, leftward, and rightward swings of the user's head. The head swing operation can be recognized from data collected by an inertial sensor or the like.
Taking the eye movement operation as an example, the position of the focus on the screen can be adjusted correspondingly by detecting upward, downward, leftward, and rightward movements of the user's eyeballs. The eye movement operation can be recognized from images captured by the image acquisition device.
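The mapping from a head swing to the on-screen focus position can be sketched as follows; the gain (pixels per radian of head rotation), the screen size, and the clamping behaviour are illustrative assumptions, not parameters from the patent.

```python
def adjust_focus(position, yaw_delta, pitch_delta, screen=(1920, 1080), gain=200):
    # Map head-swing deltas (radians, e.g. from an inertial sensor) to
    # a new focus position, clamped to the screen bounds. A rightward
    # swing moves the focus right; a downward pitch moves it down.
    x, y = position
    x = min(max(x + gain * yaw_delta, 0), screen[0] - 1)
    y = min(max(y - gain * pitch_delta, 0), screen[1] - 1)
    return (x, y)

focus = (960, 540)                      # start at the screen centre
focus = adjust_focus(focus, 0.5, 0.0)   # swing right: focus moves right
focus = adjust_focus(focus, 0.0, -0.2)  # pitch down: focus moves down
print(focus)
```

An eye-movement variant would use gaze-angle deltas from the image acquisition device in place of the inertial deltas, with the same mapping shape.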
Step S140: during the presence of the focus, perform a click action according to the detected focus position determination operation; that is, a click on the interface element at the position of the focus is performed in step S140.
The focus position determination operation may be, for example, a nodding operation, a closed-eye operation, two consecutive eyebrow operations, or the like; however, the operations corresponding to the various functions should differ from one another so that they can be distinguished within the system.
The nodding operation can be recognized from data collected by the inertial sensor. The closed-eye operation can be recognized from images captured by the image acquisition device. Two consecutive eyebrow operations can be judged with reference to the description of step S110; it is only necessary to additionally check whether the time interval between the two eyebrow operations satisfies the definition of a continuous operation in time.
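The double-squeeze check can be sketched as an interval test between the timestamps of two detected eyebrow operations. The 0.4 s default is an illustrative value; per the text, it must be set below the focus-disappearance set time of the FIG. 2 embodiment.

```python
def is_double_brow_click(first_time, second_time, max_interval=0.4):
    # Two consecutive eyebrow operations count as the focus position
    # determination operation only when the interval between them
    # satisfies the "continuous operation" time definition.
    return 0 < second_time - first_time < max_interval

print(is_double_brow_click(10.00, 10.25))  # quick double squeeze
print(is_double_brow_click(10.00, 11.50))  # too slow: two separate operations
```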
FIG. 2 is a flowchart of another embodiment of the focus positioning method according to the present invention.
As shown in FIG. 2, this embodiment adds steps S210 to S230 on the basis of the embodiment shown in FIG. 1.
In step S210, after the focus has been made to appear on the screen according to the detected eyebrow operation, determine whether the detected eyebrow operation has disappeared; if so, perform step S220; if not, repeat step S210.
In step S210, the specific embodiment for judging whether the detected eyebrow operation has disappeared corresponds to the embodiment used in step S110 for detecting the eyebrow operation on the eyebrow region of the virtual reality device.
Taking the detection of the eyebrow operation from the pressure signal in Embodiment 1 above as an example, judging whether the detected eyebrow operation has disappeared may further include:
Step S211a: acquire the pressure signal output by the pressure sensor mounted on the eyebrow region of the virtual reality device.
Step S212a: determine from the pressure signal whether the current pressure value has dropped below the set value; if so, determine that the detected eyebrow operation has disappeared.
Step S220: determine whether the disappearance time of the eyebrow operation has reached the set time; if so, perform step S230, so as to avoid erroneously making the focus disappear due to a misoperation; if not, return to step S210 to continue checking that the eyebrow operation is still absent.
This means that, in this embodiment, the user needs to maintain the eyebrow operation until step S140 has been performed; otherwise, once the detected eyebrow operation has been absent for the set time, the current operation ends and the process returns to step S110 to detect the eyebrow operation again.
Step S230: control the focus to disappear.
In this embodiment, if two consecutive eyebrow operations are used as the focus position determination operation, the time interval between the two consecutive eyebrow operations should be set to be less than the above set time.
In another embodiment, the focus may instead be kept present once it has appeared, until step S140 is completed, at which point the focus is made to disappear.
In yet another embodiment, the focus may also be made to disappear if neither a focus position adjustment operation nor a focus position determination operation is detected within a certain time after the focus appears.
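The FIG. 2 lifecycle (focus appears on a squeeze and disappears only after the squeeze has been absent for the set time) can be sketched as a small state machine; timestamps in seconds and the 1.0 s set time are illustrative assumptions.

```python
class FocusController:
    # Sketch of steps S110/S120 plus S210-S230: the focus appears when
    # the eyebrow operation is detected and disappears only after the
    # operation has been absent for `set_time` seconds.

    def __init__(self, set_time=1.0):
        self.set_time = set_time
        self.focus_visible = False
        self.released_at = None  # time the eyebrow operation last disappeared

    def update(self, brow_active, now):
        if brow_active:
            self.focus_visible = True  # S120: focus appears
            self.released_at = None    # operation present again (S210 loop)
        elif self.focus_visible:
            if self.released_at is None:
                self.released_at = now      # operation just disappeared
            elif now - self.released_at >= self.set_time:
                self.focus_visible = False  # S220/S230: absence reached set time
        return self.focus_visible

fc = FocusController(set_time=1.0)
print(fc.update(True, 0.0))   # squeeze: focus appears
print(fc.update(False, 0.5))  # released, timer starts
print(fc.update(False, 0.9))  # 0.4 s of absence: focus stays
print(fc.update(False, 1.6))  # 1.1 s of absence: focus disappears
```

Re-squeezing before the set time elapses resets the timer, which is what protects against the misoperation case described in step S220.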
FIG. 3 is a block diagram of an embodiment of the focus positioning apparatus according to the present invention.
As shown in FIG. 3, the focus positioning apparatus includes a module 301 for detecting an eyebrow operation corresponding to the eyebrow region of the virtual reality device; a module 302 for controlling the focus to appear on the screen according to the detected eyebrow operation; a module 303 for adjusting the position of the focus according to a focus position adjustment operation during the presence of the focus; and a module 304 for performing a click action according to a focus position determination operation during the presence of the focus.
The module 301 for detecting an eyebrow operation corresponding to the eyebrow region of the virtual reality device may further include: a unit for acquiring a pressure signal output by a pressure sensor mounted on the eyebrow region; and a unit for determining, according to the pressure signal, whether the current pressure value exceeds the set value, and determining that an eyebrow operation has occurred when the set value is exceeded.
FIG. 4 is a block diagram of another embodiment of the focus positioning apparatus according to the present invention.
As shown in FIG. 4, in this embodiment the focus positioning apparatus of the present invention further includes a module 305 for controlling the focus on the screen to disappear when the detected eyebrow operation disappears and the disappearance time reaches the set time.
In a specific embodiment of the focus positioning apparatus of the present invention, the focus position determination operation may be two consecutive eyebrow operations, wherein the time interval between the two consecutive eyebrow operations is set to be less than the above set time.
In a specific embodiment of the focus positioning apparatus of the present invention, the focus position adjustment operation may be a head swing operation or an eye movement operation.
The present invention further provides a virtual reality device comprising any of the focus positioning apparatuses described above. In addition, the virtual reality device should also include the sensors described above for use with the corresponding focus positioning apparatus.
FIG. 5 is a block diagram of an embodiment of the hardware configuration of the virtual reality device.
As shown in FIG. 5, the virtual reality device 500 includes a memory 501 and a processor 502; the memory 501 is configured to store instructions for controlling the processor 502 to perform the focus positioning method according to any embodiment of the present invention.
The processor may be, for example, a central processing unit (CPU), a microcontroller unit (MCU), or the like. The memory includes, for example, ROM (read-only memory), RAM (random-access memory), non-volatile memory such as a hard disk, and the like.
As shown in FIG. 5, the virtual reality device 500 may further include an interface device 503, an input device 504, a display device 505, a communication device 506, a speaker 507, a microphone 508, and the like. Although multiple devices are shown in FIG. 5, the virtual reality device 500 of the present invention may involve only some of them, for example, the processor 502, the memory 501, and the like.
The communication device 506 is capable of wired or wireless communication, for example.
The interface device 503 includes, for example, a headphone jack, a USB interface, and the like.
The input device 504 may include, for example, a touch screen, buttons, and the like.
The display device 505 is, for example, a liquid crystal display, a touch display, or the like.
The embodiments above mainly describe their differences from one another, but it should be clear to those skilled in the art that the embodiments above may be used individually or in combination as needed.
The embodiments in this specification are described in a progressive manner; for identical or similar parts, the embodiments may refer to one another, and each embodiment focuses on its differences from the others. For the apparatus embodiments, since they correspond to the method embodiments, the description is relatively brief; for relevant details, refer to the description of the corresponding parts of the method embodiments. The system embodiments described above are merely illustrative, and the modules described as separate components may or may not be physically separate.
The present invention may be an apparatus, a method, and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement various aspects of the present invention.
The computer-readable storage medium may be a tangible device capable of holding and storing instructions for use by an instruction execution device. The computer-readable storage medium may be, for example, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: a portable computer diskette, a hard disk, random-access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random-access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions stored thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse through a fiber-optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described herein can be downloaded from a computer-readable storage medium to respective computing/processing devices, or downloaded to an external computer or external storage device via a network such as the Internet, a local area network, a wide area network, and/or a wireless network. The network may comprise copper transmission cables, optical fiber transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter card or network interface in each computing/processing device receives the computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium within the respective computing/processing device.
The computer program instructions for carrying out the operations of the present invention may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), can be personalized by utilizing state information of the computer-readable program instructions, and this electronic circuitry can execute the computer-readable program instructions to implement various aspects of the present invention.
Various aspects of the present invention are described herein with reference to flowcharts and/or block diagrams of methods, apparatuses (systems), and computer program products according to embodiments of the present invention. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams. These computer-readable program instructions may also be stored in a computer-readable storage medium, where the instructions cause the computer, the programmable data processing apparatus, and/or other devices to operate in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture that includes instructions implementing various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The computer-readable program instructions may also be loaded onto a computer, another programmable data processing apparatus, or another device, so that a series of operational steps is performed on the computer, the other programmable data processing apparatus, or the other device to produce a computer-implemented process, whereby the instructions executed on the computer, the other programmable data processing apparatus, or the other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the figures illustrate the possible architectures, functionality, and operation of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in the blocks may also occur in an order different from that noted in the figures. For example, two consecutive blocks may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending upon the functionality involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or acts, or by a combination of dedicated hardware and computer instructions. It is well known to those skilled in the art that implementation in hardware, implementation in software, and implementation by a combination of software and hardware are all equivalent.
The embodiments of the present invention have been described above; the foregoing description is exemplary rather than exhaustive and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, their practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present invention is defined by the appended claims.

Claims (11)

  1. A focus positioning method for a virtual reality device, characterized by comprising:
    detecting an eyebrow operation corresponding to the eyebrow region of the virtual reality device;
    controlling a focus to appear on the screen according to the detected eyebrow operation;
    during the presence of the focus, adjusting the position of the focus according to a detected focus position adjustment operation;
    during the presence of the focus, performing a click action according to a detected focus position determination operation.
  2. The focus positioning method according to claim 1, characterized in that detecting the eyebrow operation corresponding to the eyebrow region of the virtual reality device comprises:
    acquiring a pressure signal output by a pressure sensor mounted on the eyebrow region;
    determining, according to the pressure signal, whether the current pressure value exceeds a set value, and if so, determining that an eyebrow operation has occurred.
  3. The focus positioning method according to claim 1 or 2, characterized in that the focus positioning method further comprises:
    if the detected eyebrow operation disappears and the disappearance time reaches a set time:
    controlling the focus on the screen to disappear.
  4. The focus positioning method according to claim 3, characterized in that the focus position determination operation is two consecutive eyebrow operations, wherein the time interval between the two consecutive eyebrow operations is set to be less than the set time.
  5. The focus positioning method according to any one of claims 1 to 4, characterized in that the focus position adjustment operation is a head swing operation or an eye movement operation.
  6. A focus positioning apparatus for a virtual reality device, characterized by comprising:
    a module for detecting an eyebrow operation corresponding to the eyebrow region of the virtual reality device;
    a module for controlling a focus to appear on the screen according to the detected eyebrow operation;
    a module for adjusting the position of the focus according to a focus position adjustment operation during the presence of the focus; and
    a module for performing a click action according to a focus position determination operation during the presence of the focus.
  7. The focus positioning apparatus according to claim 6, characterized in that the module for detecting the eyebrow operation corresponding to the eyebrow region of the virtual reality device comprises:
    a unit for acquiring a pressure signal output by a pressure sensor mounted on the eyebrow region; and
    a unit for determining, according to the pressure signal, whether the current pressure value exceeds a set value, and determining that an eyebrow operation has occurred when the set value is exceeded.
  8. The focus positioning apparatus according to claim 6 or 7, characterized in that the focus positioning apparatus further comprises:
    a module for controlling the focus on the screen to disappear when the detected eyebrow operation disappears and the disappearance time reaches a set time.
  9. A virtual reality device, characterized by comprising the focus positioning apparatus according to any one of claims 6 to 8.
  10. A virtual reality device, comprising a memory and a processor, characterized in that the memory is configured to store instructions for controlling the processor to perform the focus positioning method according to any one of claims 1 to 5.
  11. A computer-readable storage medium, characterized in that it stores program code for performing the focus positioning method according to any one of claims 1 to 5.
PCT/CN2016/110949 2016-09-14 2016-12-20 Focus positioning method and apparatus for virtual reality device, and virtual reality device WO2018049747A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610825656.6 2016-09-14
CN201610825656.6A CN106648047A (zh) 2016-09-14 2016-09-14 Focus positioning method and apparatus for virtual reality device, and virtual reality device

Publications (1)

Publication Number Publication Date
WO2018049747A1 true WO2018049747A1 (zh) 2018-03-22

Family

ID=58852286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/110949 WO2018049747A1 (zh) 2016-09-14 2016-12-20 Focus positioning method and apparatus for virtual reality device, and virtual reality device

Country Status (2)

Country Link
CN (1) CN106648047A (zh)
WO (1) WO2018049747A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101311882A (zh) * 2007-05-23 2008-11-26 华为技术有限公司 Gaze-tracking human-computer interaction method and apparatus
WO2010064361A1 (ja) * 2008-12-02 2010-06-10 ブラザー工業株式会社 Head-mounted display
CN104055478A (zh) * 2014-07-08 2014-09-24 金纯 Medical endoscope control system based on gaze-tracking control
CN105378632A (zh) * 2013-06-12 2016-03-02 微软技术许可有限责任公司 User focus controlled directional user input
CN105573478A (zh) * 2014-10-13 2016-05-11 北京三星通信技术研究有限公司 Input control method and apparatus for portable device
CN105824409A (zh) * 2016-02-16 2016-08-03 乐视致新电子科技(天津)有限公司 Interaction control method and apparatus for virtual reality

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9244539B2 (en) * 2014-01-07 2016-01-26 Microsoft Technology Licensing, Llc Target positioning with gaze tracking
CN204695230U (zh) * 2015-05-25 2015-10-07 谢培树 Head-operated electronic glasses
CN105739691A (zh) * 2016-01-26 2016-07-06 宋宏 Virtual experiential electronic manual system
CN105807915A (zh) * 2016-02-24 2016-07-27 北京小鸟看看科技有限公司 Virtual mouse control method, control apparatus, and head-mounted display device
CN105867626A (zh) * 2016-04-12 2016-08-17 京东方科技集团股份有限公司 Head-mounted virtual reality device, control method therefor, and virtual reality system
CN105929953A (zh) * 2016-04-18 2016-09-07 北京小鸟看看科技有限公司 Operation guidance method and apparatus in a 3D immersive environment, and virtual reality device


Also Published As

Publication number Publication date
CN106648047A (zh) 2017-05-10

Similar Documents

Publication Publication Date Title
CN107666581B (zh) Method for providing video content and electronic device supporting the same
KR102444061B1 (ko) Electronic device and method capable of voice recognition
CN106462685B (zh) Wearable electronic device and method for protecting the same
KR102457724B1 (ko) Method for performing image processing and electronic device therefor
US10254828B2 (en) Detection of improper viewing posture
US9632618B2 (en) Expanding touch zones of graphical user interface widgets displayed on a screen of a device without programming changes
WO2023279704A1 (zh) Live streaming method and apparatus, computer device, storage medium, and program
US9700200B2 (en) Detecting visual impairment through normal use of a mobile device
BR112016026613B1 (pt) Sistema e método para fornecer retroalimentação háptica para auxiliar na captura de imagens
WO2018082162A1 (zh) Function triggering method and apparatus for virtual reality device, and virtual reality device
JP2015142181A5 (zh)
KR20160121287A (ko) Method and apparatus for displaying a screen based on an event
KR102356450B1 (ko) Electronic device having a connection part and operating method therefor
US10642417B2 (en) Method and apparatus for determining touching action and display device
US9626580B2 (en) Defining region for motion detection
JP2017215666A (ja) Control device, control method, and program
US10877297B2 (en) Monitoring component of the position of a head mounted device
US20200018926A1 (en) Information processing apparatus, information processing method, and program
KR20160031217A (ko) Control method and electronic device processing the method
US20160048665A1 (en) Unlocking an electronic device
JP5838271B2 (ja) Biological sensor and power-saving mode setting method
US10468022B2 (en) Multi mode voice assistant for the hearing disabled
KR102457247B1 (ko) Electronic device for processing an image and control method therefor
WO2018049747A1 (zh) Focus positioning method and apparatus for virtual reality device, and virtual reality device
US20180144280A1 (en) System and method for analyzing the focus of a person engaged in a task

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16916126

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16916126

Country of ref document: EP

Kind code of ref document: A1