WO2022105677A1 - Keyboard see-through method and apparatus for a virtual reality device, and virtual reality device - Google Patents

Keyboard see-through method and apparatus for a virtual reality device, and virtual reality device

Info

Publication number
WO2022105677A1
WO2022105677A1 · PCT/CN2021/130200 · CN2021130200W
Authority
WO
WIPO (PCT)
Prior art keywords
keyboard
hands
user
hand
virtual reality
Prior art date
Application number
PCT/CN2021/130200
Other languages
English (en)
French (fr)
Inventor
张明
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司
Priority to US18/037,969 (published as US20240004477A1)
Publication of WO2022105677A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/20 Scenes; Scene-specific elements in augmented reality scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language

Definitions

  • The present application relates to the technical field of virtual reality, and in particular to a keyboard see-through method and apparatus for a virtual reality device, and a virtual reality device.
  • With the growing range of application scenarios for virtual reality glasses ("VR glasses"), productivity tools have become a new class of application. In production scenarios, fast keyboard input is required in most cases, and the closed use environment of VR glasses has become an obstacle to keyboard input.
  • In view of this, the main purpose of this application is to provide a keyboard see-through method and apparatus for a virtual reality device, and a virtual reality device, so as to solve the technical problem that the existing keyboard see-through methods of virtual reality devices are relatively complicated and perform poorly.
  • A keyboard see-through method for a virtual reality device, comprising: recognizing the hand motions of the user's two hands; if the two-hand motions satisfy a preset activation action, activating the keyboard see-through function of the virtual reality device; under the keyboard see-through function, recognizing the positions of the user's two hands; and determining a keyboard see-through display area according to the positions of the user's two hands, so as to display a physical keyboard in a real scene in the keyboard see-through display area.
  • A keyboard see-through apparatus for a virtual reality device, comprising:
  • a two-hand motion recognition unit, used to recognize the hand motions of the user's two hands;
  • a keyboard see-through function activation unit, used to activate the keyboard see-through function of the virtual reality device if the user's two-hand motions satisfy a preset activation action;
  • a two-hand position recognition unit, used to recognize the positions of the user's two hands under the keyboard see-through function;
  • a keyboard see-through display area determination unit, used to determine the keyboard see-through display area according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in that area.
  • A virtual reality device, comprising: a processor, and a memory storing computer-executable instructions, wherein the executable instructions, when executed by the processor, implement the foregoing keyboard see-through method of the virtual reality device.
  • A computer-readable storage medium storing one or more programs which, when executed by a processor, implement the aforementioned keyboard see-through method of the virtual reality device.
  • The keyboard see-through method of the virtual reality device first recognizes the hand motions of the user's two hands and matches them against a preset activation action, thereby determining whether the user wants to activate the keyboard see-through function of the virtual reality device. If the two-hand motions match the preset activation action, the keyboard see-through function of the virtual reality device can be activated; under that function, the positions of the user's two hands are then further recognized, and the keyboard see-through display area for keyboard display is determined from those positions, so that the user can operate the physical keyboard in the real scene within the keyboard see-through display area.
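  • The four-step flow above can be sketched as a single per-frame control step. This is only an illustrative sketch: the recognizer callables are hypothetical stand-ins for a headset's own hand-tracking API, and only the control flow reflects the described method.

```python
# Sketch of the four-step keyboard see-through flow described above.
# recognize_gesture / matches_activation / recognize_positions /
# compute_area are hypothetical placeholders supplied by the caller.

def keyboard_see_through_step(recognize_gesture, matches_activation,
                              recognize_positions, compute_area, state):
    """Run one iteration of the see-through logic.

    state: dict with key 'active' (bool), persisted across frames.
    Returns the see-through display area, or None while inactive.
    """
    if not state["active"]:
        # Steps S210/S220: recognize the two-hand gesture and match it
        # against the preset activation action.
        if matches_activation(recognize_gesture()):
            state["active"] = True
        else:
            return None
    # Step S230: with the function active, recognize both hand positions.
    left, right = recognize_positions()
    # Step S240: derive the see-through display area from the positions.
    return compute_area(left, right)
```

A caller would invoke this once per tracking frame, keeping `state` between calls.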
  • The keyboard see-through method of the virtual reality device in the embodiments of the present application uses existing hand-motion recognition algorithms to recognize the user's hand motions and hand positions, and determines the keyboard see-through display area on that basis. Compared with the traditional keyboard see-through scheme, it greatly reduces the required computing power and computational complexity, offers higher compatibility, and obtains a more accurate keyboard see-through area, which greatly improves the user experience.
  • FIG. 1 is a schematic diagram of a keyboard perspective method in the prior art
  • FIG. 2 is a flowchart of a keyboard perspective method of a virtual reality device according to an embodiment of the application
  • FIG. 3 is a schematic diagram of hand motion recognition according to an embodiment of the present application.
  • FIG. 4 is a perspective display effect diagram of a keyboard in VR glasses according to an embodiment of the application.
  • FIG. 5 is a schematic diagram of a preset activation action according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a perspective display area of a keyboard according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of a perspective display area of a keyboard according to another embodiment of the present application.
  • FIG. 8 is a schematic diagram of a preset closing action according to an embodiment of the present application.
  • FIG. 9 is a block diagram of a keyboard perspective device of a virtual reality device according to an embodiment of the application.
  • FIG. 10 is a schematic structural diagram of a virtual reality device in an embodiment of the present application.
  • Virtual reality technology is a computer simulation system that can create and let users experience virtual worlds. It uses computers to generate a simulated environment and immerses users in that environment. Virtual reality technology combines data from real life with electronic signals generated by computer technology and various output devices to transform them into phenomena that people can perceive. These phenomena may be real objects that exist in reality, or substances invisible to the naked eye, represented by three-dimensional models.
  • the virtual reality device in this application may refer to VR glasses.
  • The VR glasses use a head-mounted display device to shut out the user's vision and hearing of the outside world and guide the user into the feeling of being in a virtual environment. The display principle is that the left-eye and right-eye screens display the images for the left and right eyes respectively; after the human eyes acquire this differing information, a stereoscopic impression is produced in the mind.
  • the following description will be given by taking VR glasses as a specific application example of a virtual reality device.
  • the keyboard perspective method for a virtual reality device includes the following steps S210 to S240:
  • Step S210: recognize the hand motions of the user's two hands.
  • As shown in FIG. 3, existing VR glasses are generally equipped with a binocular camera at the front of the glasses, which is used to collect external environment information and capture the user's posture and motion information, such as hand motion information.
  • In existing virtual reality application scenarios, computer vision technology is usually used for hand motion recognition, and the results are often used for hand-motion-based user interface operations or for motion-sensing games based on hand motions. In the embodiments of the present application, the information collected by the camera built into existing VR glasses can likewise be used to recognize the motions of the user's two hands, so that the keyboard see-through display area can be determined in combination with those motions.
  • Of course, besides the binocular camera above, a monocular camera or another type of camera can also be used to collect hand motion information; the specific type of camera can be flexibly chosen by those skilled in the art according to actual needs and is not specifically limited here.
  • The above recognition of the user's two-hand motions may be performed in real time, so as to respond to the user's needs promptly; of course, to save device power, it may also be performed once every preset interval. The specific frequency at which the two-hand motions are recognized can be flexibly set by those skilled in the art according to actual needs and is not specifically limited here.
  • Step S220: if the user's two-hand motions satisfy the preset activation action, activate the keyboard see-through function of the virtual reality device.
  • After the user's two-hand motions are obtained, the recognized motions can be matched against the preset activation action; if the match succeeds, the keyboard see-through function of the VR glasses can be activated.
  • the type of the preset activation action can be flexibly set by those skilled in the art according to actual needs, which is not specifically limited here.
  • It should be noted that "activating the keyboard see-through function of the virtual reality device" in this step can be understood as merely activating the function: the VR glasses have not yet entered the see-through state, that is, the user currently still cannot see the real scene, and the subsequent steps are needed to determine the keyboard see-through display area in the virtual scene.
  • Alternatively, it can be understood as the VR glasses having already entered the see-through state, so that the user can currently see the real scene; but to avoid disturbing the user's immersive experience too much, the keyboard see-through display area in the virtual scene can be re-determined through the subsequent steps.
  • Step S230: under the keyboard see-through function, recognize the positions of the user's two hands. To accurately determine the keyboard see-through display area, the positions of the user's two hands can be further recognized under the keyboard see-through function, in combination with the recognized two-hand motions, as the basis for determining the keyboard see-through display area.
  • The recognition of the positions of the user's two hands here may recognize the two palms separately, or of course recognize both hands together.
  • Step S240: determine the keyboard see-through display area according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in the keyboard see-through display area.
  • After the positions of the user's two hands are obtained, the extent of the keyboard see-through display area can be determined from them. Referring to FIG. 4, which shows the see-through display effect of a keyboard in VR glasses according to an embodiment of the present application, the user can operate the physical keyboard in the real scene within the keyboard see-through display area.
  • It should be noted that step S210 in the above embodiment specifies that the motions of both of the user's hands are recognized. However, since the main function of step S210 is to determine, from the recognized hand motions, whether to activate the keyboard see-through function of the VR glasses, the preset activation action may also be set as a one-handed action, with the recognition of the two-hand motions and positions carried out when the keyboard see-through display area is selected.
  • The keyboard see-through method of the virtual reality device of the embodiments of the present application uses existing hand-motion recognition algorithms to recognize the user's hand motions and hand positions and determines the keyboard see-through display area on that basis, which greatly reduces the required computing power; compared with the traditional keyboard see-through scheme, the user experience is greatly improved.
  • In an embodiment, activating the keyboard see-through function of the virtual reality device includes: if the user's two-hand motion is the action of spreading both hands downward and bringing them together, determining that the user's two-hand motion satisfies the preset activation action.
  • That is, the preset activation action in this embodiment of the present application may be the action shown in FIG. 5: if it is recognized that the user's hands are spread downward and approaching each other, the two-hand motion is considered to satisfy the preset activation action, and the keyboard see-through function of the VR glasses is activated.
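  • A simple heuristic for the "palms spread downward and approaching" action can be sketched from tracked palm data. This is a sketch under stated assumptions: the frame format and the facing-down threshold are illustrative choices, not taken from the patent.

```python
def is_activation_gesture(frames, facing_down_thresh=-0.7):
    """Heuristic check for 'palms spread downward and approaching'.

    frames: time-ordered list of tuples
        (left_center, right_center, left_normal_y, right_normal_y),
    where each center is an (x, y) point and normal_y is the vertical
    component of the palm normal (negative = palm facing down).
    The threshold value is illustrative.
    """
    if len(frames) < 2:
        return False
    # Both palms must face downward in every frame.
    for _, _, ln, rn in frames:
        if ln > facing_down_thresh or rn > facing_down_thresh:
            return False

    # The distance between the palms must be (weakly) decreasing,
    # i.e. the hands are approaching each other.
    def dist(frame):
        (lx, ly), (rx, ry) = frame[0], frame[1]
        return ((lx - rx) ** 2 + (ly - ry) ** 2) ** 0.5

    dists = [dist(f) for f in frames]
    return all(b <= a for a, b in zip(dists, dists[1:]))
```

Other preset activation actions would simply swap in a different predicate over the same tracked frames.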
  • Of course, besides the above preset activation action, those skilled in the art can also set other types of preset activation actions according to actual requirements, which are not listed one by one here.
  • In an embodiment, determining the keyboard see-through display area according to the positions of the user's two hands includes: determining the circumscribed rectangular area of the two hands according to their positions; enlarging the determined circumscribed rectangular area by preset multiples; and using the enlarged circumscribed rectangular area as the keyboard see-through display area.
  • The above preset enlargement multiples can specifically be divided into a multiple in the horizontal direction and multiples in the vertical direction; the vertical multiples may specifically include an upward multiple y1 and a downward multiple y2, and the enlargement multiples in different directions can be configured according to actual conditions.
  • Referring to FIG. 6, a schematic diagram of a keyboard see-through display area according to an embodiment of the present application is provided.
  • The enlargement multiple in the horizontal direction can also be related to the distance between the hands: if the hands are close together or touching, the horizontal multiple can be set to a larger value, and if the hands are far apart, it can be set to a smaller value.
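  • The circumscribed-rectangle variant above can be sketched as follows. The default multiples here are illustrative assumptions (the patent leaves them configurable), and a caller could make the horizontal multiple depend on hand distance as just described.

```python
def keyboard_area_from_rect(left_pts, right_pts,
                            x_mult=0.5, up_mult=0.5, down_mult=0.25):
    """Keyboard see-through area from the circumscribed rectangle of
    all hand keypoints, enlarged per direction.

    left_pts / right_pts: lists of (x, y) keypoints for each hand.
    x_mult is applied to the width on each horizontal side; up_mult
    and down_mult (the y1/y2 multiples above) to the height upward
    (+y) and downward (-y). Returns (x_min, y_min, x_max, y_max).
    Multiple values are illustrative defaults, not from the patent.
    """
    pts = list(left_pts) + list(right_pts)
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    w, h = x1 - x0, y1 - y0
    return (x0 - x_mult * w, y0 - down_mult * h,
            x1 + x_mult * w, y1 + up_mult * h)
```

To couple the horizontal multiple to hand distance, compute the gap between the two hands' bounding boxes first and pass a larger `x_mult` when the gap is small.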
  • In another embodiment, determining the keyboard see-through display area according to the positions of the user's two hands includes: determining the circumscribed square area of either of the user's palms; determining a proportional length and a proportional width for expansion according to the size of the circumscribed square area; determining the center positions of the user's left and right palms; connecting the two center positions with a line; and expanding by the proportional length in the vertical direction and the proportional width in the horizontal direction from the midpoint of the line to obtain the keyboard see-through display area.
  • Specifically, the circumscribed square of either of the user's palms may be determined first. Since the circumscribed squares of the two palms are essentially the same size, which palm is used has no effect on the subsequent determination of the keyboard see-through display area. The proportional length and proportional width used for expansion can then be determined from the size of the circumscribed square.
  • For example, if the side length of the circumscribed square is a, the area can be expanded by 1.5a to the left and to the right, by a upward, and by 0.5a downward; the keyboard see-through display area obtained after expansion then measures 1.5a × 3a (height × width).
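  • This palm-square variant can be sketched directly from the midpoint of the line joining the palm centers. The 1.5a / a / 0.5a defaults are the example figures from the description; everything else (names, coordinate convention) is an illustrative assumption.

```python
def keyboard_area_from_palms(left_center, right_center, side,
                             left_mult=1.5, right_mult=1.5,
                             up_mult=1.0, down_mult=0.5):
    """Keyboard see-through area built around the midpoint of the line
    joining the two palm centers.

    side: side length a of a palm's circumscribed square.
    The defaults expand 1.5a left/right, a upward (+y) and 0.5a
    downward, giving a 1.5a-tall by 3a-wide area as in the example.
    Returns (x_min, y_min, x_max, y_max).
    """
    mx = (left_center[0] + right_center[0]) / 2
    my = (left_center[1] + right_center[1]) / 2
    return (mx - left_mult * side, my - down_mult * side,
            mx + right_mult * side, my + up_mult * side)
```

Because only the midpoint and one palm's square size are needed, this variant is insensitive to exactly how far apart the hands rest.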
  • In an embodiment, the method further includes: if the user's two-hand motion satisfies a preset closing action, closing the keyboard see-through function of the virtual reality device.
  • In practice, the user's need for the keyboard see-through display function of the VR glasses may be only temporary. Therefore, to ensure that the user can quickly return to the immersive experience of the virtual scene after using the function, it is also possible to detect whether the user has made a hand motion for closing the keyboard see-through function of the VR glasses; if the user's hand motion is detected to match the preset closing action, the keyboard see-through display function of the VR glasses can be closed.
  • As shown in FIG. 8, which provides a schematic diagram of a preset closing action, the user can close the keyboard see-through display function of the VR glasses by spreading both hands upward and placing them in front of the eyes. Of course, other closing actions can also be flexibly set according to actual needs, which is not specifically limited here.
  • Furthermore, more complex activation/closing conditions can be set. For example, the duration of the recognized activation/closing hand motion can be timed, and only if it exceeds a preset time threshold is the user considered to want to activate/close the keyboard see-through display function of the VR glasses. The number of times the activation/closing motion is performed can also be counted, and only when a preset count is reached is the user considered to want to activate/close the function. How to configure the activation/closing conditions of the keyboard see-through display function can be flexibly set by those skilled in the art according to the actual situation and is not listed one by one here.
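  • Both conditions just described (a hold-duration threshold and a repetition-count threshold) can be sketched as one small per-frame trigger. Frame counting stands in for the time threshold here, and the threshold values are illustrative assumptions.

```python
class HoldOrRepeatTrigger:
    """Trigger only after the gesture has been held continuously for
    `hold_frames` frames, or performed `repeat_count` separate times.

    Feed one boolean per tracking frame via update(); thresholds are
    illustrative, not taken from the patent.
    """

    def __init__(self, hold_frames=30, repeat_count=3):
        self.hold_frames = hold_frames
        self.repeat_count = repeat_count
        self._held = 0
        self._repeats = 0
        self._prev = False

    def update(self, gesture_present):
        """Return True once either threshold condition is met."""
        if gesture_present:
            self._held += 1
            if not self._prev:        # rising edge: one more repetition
                self._repeats += 1
        else:
            self._held = 0            # continuity broken
        self._prev = gesture_present
        return (self._held >= self.hold_frames
                or self._repeats >= self.repeat_count)
```

The same trigger can guard both activation and closing, with one instance per action.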
  • FIG. 9 shows a block diagram of a keyboard see-through device of a virtual reality device according to an embodiment of the present application.
  • The keyboard see-through apparatus 900 of the virtual reality device includes: a two-hand motion recognition unit 910, a keyboard see-through function activation unit 920, a two-hand position recognition unit 930, and a keyboard see-through display area determination unit 940, wherein:
  • the two-hand motion recognition unit 910 is used to recognize the hand motions of the user's two hands;
  • the keyboard see-through function activation unit 920 is configured to activate the keyboard see-through function of the virtual reality device if the user's two-hand motions satisfy the preset activation action;
  • the two-hand position recognition unit 930 is used to recognize the positions of the user's two hands under the keyboard see-through function; and
  • the keyboard see-through display area determination unit 940 is configured to determine the keyboard see-through display area according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in that area.
  • In an embodiment, the keyboard see-through function activation unit 920 is specifically configured to determine that the user's two-hand motion satisfies the preset activation action if it is the action of spreading both hands downward and bringing them together.
  • In an embodiment, the keyboard see-through display area determination unit 940 is specifically configured to: determine the circumscribed rectangular area of the two hands according to the positions of the user's two hands; enlarge the determined circumscribed rectangular area by preset multiples; and use the enlarged circumscribed rectangular area as the keyboard see-through display area.
  • In another embodiment, the keyboard see-through display area determination unit 940 is specifically configured to: determine the circumscribed square area of either of the user's palms; determine a proportional length and proportional width for expansion according to the size of the circumscribed square area; determine the center positions of the user's left and right palms; connect the two center positions with a line; and expand by the proportional length vertically and the proportional width horizontally from the midpoint of the line to obtain the keyboard see-through display area.
  • In an embodiment, the apparatus further includes: a keyboard see-through function closing unit, configured to close the keyboard see-through function of the virtual reality device if the user's two-hand motion satisfies a preset closing action.
  • FIG. 10 is a schematic structural diagram of a virtual reality device.
  • the virtual reality device includes a memory, a processor, and optionally an interface module, a communication module, and the like.
  • The memory may include high-speed random-access memory (RAM).
  • The virtual reality device may also include hardware required for other services.
  • The processor, interface module, communication module and memory can be connected to each other through an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like.
  • the bus can be divided into address bus, data bus, control bus and so on. For ease of representation, only one bidirectional arrow is shown in FIG. 10, but it does not mean that there is only one bus or one type of bus.
  • The memory is used for storing computer-executable instructions and provides them to the processor through the internal bus.
  • The processor executes the computer-executable instructions stored in the memory and is specifically configured to implement the following operations: recognizing the hand motions of the user's two hands; if the two-hand motions satisfy a preset activation action, activating the keyboard see-through function of the virtual reality device; under the keyboard see-through function, recognizing the positions of the user's two hands; and determining the keyboard see-through display area according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in the keyboard see-through display area.
  • a processor may be an integrated circuit chip with signal processing capabilities.
  • each step of the above-mentioned method can be completed by a hardware integrated logic circuit in a processor or an instruction in the form of software.
  • The above processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the steps of the methods disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
  • the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware.
  • The virtual reality device can also perform the steps of the keyboard see-through method of the virtual reality device in FIG. 2 and realize the functions of that method in the embodiment shown in FIG. 2.
  • The embodiments of the present application further provide a computer-readable storage medium storing one or more programs which, when executed by a processor, implement the aforementioned keyboard see-through method of the virtual reality device, and are specifically used to execute: recognizing the hand motions of the user's two hands; if the two-hand motions satisfy a preset activation action, activating the keyboard see-through function of the virtual reality device; under the keyboard see-through function, recognizing the positions of the user's two hands; and determining the keyboard see-through display area according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in the keyboard see-through display area.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) embodying computer-usable program code.
  • These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • Memory may include non-persistent storage in computer-readable media, random-access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
  • Computer-readable media include persistent and non-persistent, removable and non-removable media, in which information storage may be implemented by any method or technology.
  • Information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
  • computer-readable media does not include transitory computer-readable media, such as modulated data signals and carrier waves.

Abstract

This application discloses a keyboard see-through method and apparatus for a virtual reality device, and a virtual reality device. The method includes: recognizing the hand motions of a user's two hands; if the user's two-hand motions satisfy a preset activation action, activating the keyboard see-through function of the virtual reality device; under the keyboard see-through function, recognizing the positions of the user's two hands; and determining a keyboard see-through display area according to the positions of the user's two hands, so as to display a physical keyboard in the real scene in the keyboard see-through display area. The keyboard see-through method of this application uses existing hand-motion recognition algorithms to recognize the user's hand motions and hand positions and determines the keyboard see-through display area on that basis, greatly reducing the required computing power and computational complexity, offering higher compatibility and a more accurate keyboard see-through area, and, compared with traditional keyboard see-through schemes, greatly improving the user experience.

Description

Keyboard see-through method and apparatus for a virtual reality device, and virtual reality device
Cross-reference to related applications
This application is based on, and claims priority to, the Chinese patent application with application number 202011319697.0, filed on November 23, 2020 and entitled "Keyboard see-through method and apparatus for a virtual reality device, and virtual reality device", the entire contents of which are incorporated herein by reference.
Technical field
This application relates to the technical field of virtual reality, and in particular to a keyboard see-through method and apparatus for a virtual reality device, and a virtual reality device.
Background
With the growing range of application scenarios for virtual reality glasses ("VR glasses"), "productivity tools" have become a new class of application. In production scenarios, fast keyboard input is required in most cases, and the closed use environment of VR glasses has become an obstacle to keyboard input.
To solve the above problem, a solution has been proposed in the prior art, as shown in FIG. 1: the front camera on the VR glasses directly recognizes the position of the keyboard in the field of view, and this position is then displayed in see-through mode within the virtual scene shown by the VR glasses, so that the physical keyboard in the real scene can be seen in the virtual scene displayed by the VR glasses.
However, the inventor found that recognizing the keyboard in the field of view in the above solution requires additional computing power, and, because there are many keyboard models, the compatibility of the recognition algorithm is not good enough, so the final recognition result is affected.
Summary
In view of this, the main purpose of this application is to provide a keyboard see-through method and apparatus for a virtual reality device, and a virtual reality device, so as to solve the technical problem that the existing keyboard see-through methods of virtual reality devices are relatively complicated and perform poorly.
According to a first aspect of this application, a keyboard see-through method for a virtual reality device is provided, comprising:
recognizing the hand motions of a user's two hands;
if the user's two-hand motions satisfy a preset activation action, activating the keyboard see-through function of the virtual reality device;
under the keyboard see-through function, recognizing the positions of the user's two hands;
determining a keyboard see-through display area according to the positions of the user's two hands, so as to display a physical keyboard in the real scene in the keyboard see-through display area.
According to a second aspect of this application, a keyboard see-through apparatus for a virtual reality device is provided, comprising:
a two-hand motion recognition unit, used to recognize the hand motions of a user's two hands;
a keyboard see-through function activation unit, used to activate the keyboard see-through function of the virtual reality device if the user's two-hand motions satisfy a preset activation action;
a two-hand position recognition unit, used to recognize the positions of the user's two hands under the keyboard see-through function;
a keyboard see-through display area determination unit, used to determine a keyboard see-through display area according to the positions of the user's two hands, so as to display a physical keyboard in the real scene in the keyboard see-through display area.
According to a third aspect of this application, a virtual reality device is provided, comprising: a processor, and a memory storing computer-executable instructions,
wherein the executable instructions, when executed by the processor, implement the aforementioned keyboard see-through method of the virtual reality device.
According to a fourth aspect of this application, a computer-readable storage medium is provided, the computer-readable storage medium storing one or more programs which, when executed by a processor, implement the aforementioned keyboard see-through method of the virtual reality device.
The beneficial effects of this application are as follows: the keyboard see-through method of the virtual reality device of the embodiments of this application first recognizes the hand motions of the user's two hands and matches them against a preset activation action, thereby determining whether the user wants to activate the keyboard see-through function of the virtual reality device. If the two-hand motions match the preset activation action, the keyboard see-through function of the virtual reality device can be activated; under that function, the positions of the user's two hands are then further recognized, and the keyboard see-through display area for keyboard display is determined from those positions, so that the user can operate the physical keyboard in the real scene within the keyboard see-through display area. The keyboard see-through method of the embodiments of this application uses existing hand-motion recognition algorithms to recognize the user's hand motions and hand positions and determines the keyboard see-through display area on that basis; compared with the traditional keyboard see-through scheme, it greatly reduces the required computing power and computational complexity, offers higher compatibility, obtains a more accurate keyboard see-through area, and greatly improves the user experience.
Brief description of the drawings
By reading the following detailed description of the preferred embodiments, various other advantages and benefits will become clear to those of ordinary skill in the art. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered limiting of this application. Throughout the drawings, the same reference symbols denote the same components. In the drawings:
FIG. 1 is a schematic diagram of a keyboard see-through method in the prior art;
FIG. 2 is a flowchart of a keyboard see-through method for a virtual reality device according to an embodiment of this application;
FIG. 3 is a schematic diagram of hand motion recognition according to an embodiment of this application;
FIG. 4 is a see-through display effect diagram of a keyboard in VR glasses according to an embodiment of this application;
FIG. 5 is a schematic diagram of a preset activation action according to an embodiment of this application;
FIG. 6 is a schematic diagram of a keyboard see-through display area according to an embodiment of this application;
FIG. 7 is a schematic diagram of a keyboard see-through display area according to another embodiment of this application;
FIG. 8 is a schematic diagram of a preset closing action according to an embodiment of this application;
FIG. 9 is a block diagram of a keyboard see-through apparatus of a virtual reality device according to an embodiment of this application;
FIG. 10 is a schematic structural diagram of a virtual reality device in an embodiment of this application.
Detailed Description of the Embodiments
Exemplary embodiments of the present application will be described in more detail below with reference to the accompanying drawings. These embodiments are provided so that the present application will be understood more thoroughly and its scope fully conveyed to those skilled in the art. Although exemplary embodiments of the present application are shown in the drawings, it should be understood that the present application may be implemented in various forms and should not be limited by the embodiments set forth herein.
Virtual reality technology is a computer simulation technology that can create and let users experience a virtual world; it uses a computer to generate a simulated environment into which the user is immersed. Virtual reality technology takes real-life data and, through electronic signals generated by computer technology combined with various output devices, transforms them into phenomena that people can perceive; these phenomena may be real objects in reality or substances invisible to the naked eye, represented by three-dimensional models. The virtual reality device of the present application may be VR glasses: a head-mounted display device that shuts off the user's vision and hearing from the outside world and induces the sensation of being in a virtual environment. Its display principle is that the left-eye and right-eye screens display the images for the left and right eyes respectively; after the human eyes receive this differing information, a stereoscopic impression is produced in the mind. For convenience of description, VR glasses will be used below as a specific application example of a virtual reality device.
Fig. 2 shows a schematic flowchart of a keyboard perspective method for a virtual reality device according to an embodiment of the present application. Referring to Fig. 2, the method includes the following steps S210 to S240.
Step S210: recognize the two-hand gestures of a user.
When performing keyboard perspective display on VR glasses, the user's two-hand gestures may first be recognized. As shown in Fig. 3, existing VR glasses are generally equipped with a binocular camera at the front of the glasses for collecting external environment information and capturing the user's posture and motion information, such as hand gesture information. In existing virtual reality application scenarios, hand gestures are usually recognized using computer vision technology, and the recognition results are often used for gesture-based user interface operations or hand-based motion-sensing games. In the embodiments of the present application, the information collected by the camera built into existing VR glasses can likewise be used to recognize the user's two-hand gestures, so that the keyboard perspective display region can be determined in combination with these gestures.
Of course, besides the above binocular camera, a monocular camera or another type of camera may also be used to collect hand gesture information; the specific type of camera can be flexibly chosen by those skilled in the art according to actual needs and is not specifically limited here.
When using computer vision technology to recognize two-hand gestures, the following method may be used. First, hand gesture features and a hand gesture model are designed; features are extracted from hand gesture samples and the model is trained, finally establishing the hand gesture model. On this basis, new hand gesture images are collected by the binocular camera and preprocessed; hand gesture segmentation is then performed on the images to accurately extract the hand regions, followed by hand gesture feature extraction; finally, the previously established model classifies and recognizes the input hand gestures.
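The pipeline described above — segment the hand region, extract features, then classify against a trained model — can be sketched as follows. This is a minimal illustration only: the threshold segmentation, the three-value feature vector, and the nearest-centroid "model" are placeholder assumptions, not the recognition algorithm actually deployed on the VR glasses.

```python
import numpy as np

def segment_hand(frame, threshold=0.5):
    """Separate candidate hand pixels from the background by simple thresholding."""
    return (frame > threshold).astype(np.uint8)

def extract_features(mask):
    """Reduce the hand mask to a small feature vector:
    fill ratio plus the normalized centroid of the hand pixels."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return np.zeros(3)
    h, w = mask.shape
    return np.array([mask.mean(), ys.mean() / h, xs.mean() / w])

def classify(features, centroids):
    """Nearest-centroid classification against pre-trained gesture classes."""
    labels = list(centroids)
    dists = [np.linalg.norm(features - centroids[k]) for k in labels]
    return labels[int(np.argmin(dists))]

# Toy "trained model": one feature centroid per gesture class.
centroids = {
    "spread_down": np.array([0.30, 0.75, 0.50]),  # hands low in the frame
    "raise_up":    np.array([0.30, 0.25, 0.50]),  # hands high in the frame
}

# Synthetic camera frame with a bright blob in the lower half.
frame = np.zeros((100, 100))
frame[70:90, 30:70] = 1.0

gesture = classify(extract_features(segment_hand(frame)), centroids)  # "spread_down"
```

In a real system each stage would be replaced by the trained components described in the text; the structure of the pipeline, not the placeholder logic, is the point here.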
Of course, besides the above recognition method, those skilled in the art may also choose other gesture recognition approaches according to actual needs, which are not specifically limited here.
In addition, the above recognition of the user's two-hand gestures may be performed in real time, so as to respond to the user's needs promptly. Alternatively, to save device power, the two-hand gestures may be recognized once every preset time interval; the specific recognition frequency can be flexibly set by those skilled in the art according to actual needs and is not specifically limited here.
Step S220: if the user's two-hand gestures satisfy a preset activation gesture, activate the keyboard perspective function of the virtual reality device.
After the user's two-hand gestures are obtained, it is necessary to further determine whether they are intended to activate the keyboard perspective function of the VR glasses. To this end, the recognized two-hand gestures can be matched against a preset activation gesture; if the match succeeds, the keyboard perspective function of the VR glasses can be activated. The type of the preset activation gesture can be flexibly set by those skilled in the art according to actual needs and is not specifically limited here.
It should be noted that "activating the keyboard perspective function of the virtual reality device" in this step can be understood as merely activating the function: the VR glasses have not actually entered the perspective state, that is, the user cannot yet see the real scene, and the subsequent steps are needed to determine the keyboard perspective display region in the virtual scene. Alternatively, it can be understood as the VR glasses having already entered the perspective state, so that the user can currently see the real scene; in that case, to avoid disturbing the user's immersive experience too much, the keyboard perspective display region in the virtual scene can be re-determined through the subsequent steps.
Step S230: under the keyboard perspective function, recognize the positions of the user's two hands.
To determine the keyboard perspective display region accurately, the positions of the user's two hands may be further recognized under the keyboard perspective function, in combination with the recognized two-hand gestures, as the basis for determining the region. It should be noted that the two palms may be recognized separately here, or both hands may be recognized together.
Step S240: determine the keyboard perspective display region according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in the keyboard perspective display region.
Based on the recognized positions of the user's two hands, the extent of the keyboard perspective display region can then be determined. Fig. 4 shows a keyboard perspective display effect in VR glasses according to an embodiment of the present application; the user can operate the physical keyboard in the real scene within the keyboard perspective display region.
It should be noted that, since in practical scenarios the user usually operates the keyboard with both hands, step S210 above specifies that the two-hand gestures of the user are recognized. Of course, considering that the main purpose of step S210 is to determine, from the recognized hand gestures, whether to activate the keyboard perspective function of the VR glasses, the preset activation gesture may also be set as a one-hand gesture, with the two-hand gestures and two-hand positions recognized later when the keyboard perspective display region is determined.
The keyboard perspective method of the embodiments of the present application uses existing hand-gesture recognition algorithms to recognize the user's hand gestures and hand positions and determines the keyboard perspective display region on that basis, which greatly reduces the required computing power and computational complexity, offers better compatibility, and yields a more accurate keyboard perspective region; compared with conventional keyboard perspective solutions, it greatly improves the user experience.
In an embodiment of the present application, activating the keyboard perspective function of the virtual reality device if the user's two-hand gestures satisfy the preset activation gesture includes: if the user's two-hand gestures are a gesture of spreading both hands downward and bringing them close together, determining that the user's two-hand gestures satisfy the preset activation gesture.
The preset activation gesture of this embodiment may be the gesture shown in Fig. 5: if the user's two hands are recognized as spreading downward and moving close together, the user's two-hand gestures are deemed to satisfy the preset activation gesture, and the keyboard perspective function of the VR glasses is activated. Of course, besides this type of preset activation gesture, those skilled in the art may also set other types according to actual needs, which are not enumerated here.
In an embodiment of the present application, determining the keyboard perspective display region according to the positions of the user's two hands includes: determining a circumscribed rectangular region of the two hands according to the positions of the user's two hands; and enlarging the determined circumscribed rectangular region by a preset multiple, taking the enlarged circumscribed rectangular region as the keyboard perspective display region.
In practical scenarios, if the user wants to operate the physical keyboard in the keyboard perspective display region, the user's hands will make the gesture shown in Fig. 5 of spreading downward and moving close together, keeping the two hands as close to the same horizontal line as possible. Therefore, when determining the keyboard perspective display region according to the positions of the user's two hands, this embodiment may first determine a circumscribed rectangle of the two hands. Since in actual operation the region occupied by the keyboard will exceed the area covered by this circumscribed rectangle, the determined rectangle can be appropriately enlarged by a certain multiple, whose value can be flexibly set according to actual needs so as to cover keyboards of as many models as possible. Finally, the enlarged circumscribed rectangular region is taken as the keyboard perspective display region, in which the user can operate the physical keyboard in the real scene.
The preset enlargement multiples can be divided into horizontal multiples and vertical multiples. Further, the horizontal multiples may include a leftward multiple x1 and a rightward multiple x2, and the vertical multiples may include an upward multiple y1 and a downward multiple y2, each configurable according to the actual situation. Fig. 6 shows a schematic diagram of a keyboard perspective display region according to an embodiment of the present application.
In addition, the horizontal enlargement multiples may also depend on the distance between the two hands: if the hands are very close or touching, the horizontal multiples can be set to larger values; if the hands are farther apart, the horizontal multiples can be set to smaller values.
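The circumscribed-rectangle scheme above can be sketched as follows. The per-direction fractions (x1, x2, y1, y2) and the hand-distance heuristic are example values chosen for illustration, since the text leaves the concrete multiples configurable.

```python
def hands_bounding_rect(left_pts, right_pts):
    """Axis-aligned rectangle (xmin, ymin, xmax, ymax) covering both hands."""
    pts = left_pts + right_pts
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return min(xs), min(ys), max(xs), max(ys)

def keyboard_region(left_pts, right_pts, x1=0.3, x2=0.3, y1=0.2, y2=0.2):
    """Enlarge the joint bounding rectangle by per-direction fractions of
    its size; grow more horizontally when the hands are close together,
    so the region still covers the full keyboard width."""
    xmin, ymin, xmax, ymax = hands_bounding_rect(left_pts, right_pts)
    w, h = xmax - xmin, ymax - ymin
    left_xmax = max(p[0] for p in left_pts)
    right_xmin = min(p[0] for p in right_pts)
    if right_xmin - left_xmax < 0.1 * w:  # hands near or touching
        x1, x2 = x1 * 2, x2 * 2
    return (xmin - x1 * w, ymin - y1 * h, xmax + x2 * w, ymax + y2 * h)

# Hands given as lists of (x, y) keypoints in image coordinates.
left = [(10, 40), (30, 60)]
right = [(70, 40), (90, 60)]
region = keyboard_region(left, right)  # (-14.0, 36.0, 114.0, 64.0)
```

Negative coordinates simply mean the region extends past the left image border and would be clipped at display time.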
In an embodiment of the present application, determining the keyboard perspective display region according to the positions of the user's two hands includes: determining a circumscribed square region of either palm of the user; determining a proportional length and a proportional width for expansion according to the size of the circumscribed square region; and determining the center positions of the user's left palm and right palm, connecting the two center positions with a line, and expanding from the midpoint of the line by the proportional length in the vertical direction and by the proportional width in the horizontal direction, to obtain the keyboard perspective display region.
As shown in Fig. 7, when determining the keyboard perspective display region according to the positions of the user's two hands, this embodiment may first determine the circumscribed square of either palm. Generally, the circumscribed squares of the two palms are essentially the same size, so which palm is used has no effect on the subsequent determination of the region. The proportional length and proportional width for expansion can then be determined according to the size of the circumscribed square. When expanding the region by the proportional length and width, the centers of the two palms are first connected with a line; then, from the midpoint of this line, the region is expanded by the proportional length in the vertical direction and by the proportional width in the horizontal direction, yielding the final keyboard perspective display region. Since the expansion is based on the midpoint of the line, it is not affected by the distance between the two hands.
For example, assuming the side length of the circumscribed square is a, the region may be expanded from the midpoint of the line by 1.5a to the left and to the right, by a upward, and by 0.5a downward, so that the resulting keyboard perspective display region measures 1.5a by 3a.
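The worked example can be written out directly; the only added assumption is a screen coordinate convention in which y grows downward, so "expand upward" subtracts from y.

```python
def keyboard_region_from_palms(left_center, right_center, a):
    """Expand from the midpoint of the line joining the two palm centers:
    1.5a to each side, a upward, 0.5a downward, where a is the side
    length of a palm's circumscribed square."""
    mx = (left_center[0] + right_center[0]) / 2
    my = (left_center[1] + right_center[1]) / 2
    return (mx - 1.5 * a,   # left edge
            my - 1.0 * a,   # top edge: expand upward by a
            mx + 1.5 * a,   # right edge
            my + 0.5 * a)   # bottom edge: expand downward by 0.5a

# Palm centers 40 px apart, palm square of side a = 20.
region = keyboard_region_from_palms((40, 50), (80, 50), a=20)  # (30.0, 30.0, 90.0, 60.0)
```

The resulting region is always 3a wide and 1.5a tall and, because it is anchored at the midpoint of the connecting line, it does not change with the spacing between the hands.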
In an embodiment of the present application, the method further includes: if the user's two-hand gestures satisfy a preset deactivation gesture, deactivating the keyboard perspective function of the virtual reality device.
In practical scenarios, the user's need to activate the keyboard perspective display function of the VR glasses may be only temporary. Therefore, to ensure that the user can quickly return to the immersive virtual experience after using the function, it is also possible to detect whether the user makes a hand gesture intended to deactivate the keyboard perspective function of the VR glasses; if the user's hand gesture is detected to match a preset deactivation gesture, the keyboard perspective display function of the VR glasses can be deactivated.
Fig. 8 shows a schematic diagram of a preset deactivation gesture: the user can deactivate the keyboard perspective display function of the VR glasses by spreading both hands upward and placing them in front of the eyes. Of course, besides deactivating the keyboard perspective display region in the virtual scene based on the preset deactivation gesture shown in Fig. 8, other deactivation gestures can be flexibly set according to actual needs and are not specifically limited here.
In an embodiment of the present application, to avoid accidental operation by the user, more complex activation/deactivation conditions can be set on top of the user's hand gesture satisfying the preset activation/deactivation gesture. For example, the duration of the recognized activation/deactivation gesture can be measured; if it exceeds a preset time threshold, the user is deemed to want to activate/deactivate the keyboard perspective display function of the VR glasses. The number of times the activation/deactivation gesture is performed can also be counted; if it reaches a preset number of executions, the user is likewise deemed to want to activate/deactivate the function. How to configure the activation/deactivation conditions of the keyboard perspective display function can be flexibly decided by those skilled in the art according to the actual situation and is not enumerated here.
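Such an anti-false-trigger guard can be sketched as a small per-frame state machine; the hold-time and repeat-count thresholds below are illustrative defaults, not values from the text.

```python
class GestureDebouncer:
    """Fire only when the activation/deactivation gesture is held long
    enough, or has been performed enough separate times."""

    def __init__(self, hold_time=1.0, min_count=2):
        self.hold_time = hold_time
        self.min_count = min_count
        self.held_since = None   # timestamp when the current hold started
        self.count = 0           # completed gestures so far

    def update(self, gesture_active, now):
        """Feed one recognition result per frame; True means 'toggle now'."""
        if not gesture_active:
            if self.held_since is not None:
                self.count += 1  # a gesture just ended
            self.held_since = None
            return False
        if self.held_since is None:
            self.held_since = now
        held_long_enough = now - self.held_since >= self.hold_time
        return held_long_enough or self.count + 1 >= self.min_count

deb = GestureDebouncer(hold_time=1.0, min_count=3)
fired_early = deb.update(True, now=0.0)  # gesture just started -> False
fired_late = deb.update(True, now=1.2)   # held past the threshold -> True
```

Either condition alone (duration or repetition) suffices to toggle, matching the two options described in the text.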
Belonging to the same technical concept as the aforementioned keyboard perspective method, an embodiment of the present application also provides a keyboard perspective apparatus for a virtual reality device. Fig. 9 shows a block diagram of such an apparatus according to an embodiment of the present application. Referring to Fig. 9, the keyboard perspective apparatus 900 includes: a two-hand gesture recognition unit 910, a keyboard perspective function activation unit 920, a two-hand position recognition unit 930, and a keyboard perspective display region determination unit 940. Specifically:
the two-hand gesture recognition unit 910 is configured to recognize two-hand gestures of a user;
the keyboard perspective function activation unit 920 is configured to activate the keyboard perspective function of the virtual reality device if the user's two-hand gestures satisfy a preset activation gesture;
the two-hand position recognition unit 930 is configured to recognize the positions of the user's two hands under the keyboard perspective function; and
the keyboard perspective display region determination unit 940 is configured to determine the keyboard perspective display region according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in the keyboard perspective display region.
In an embodiment of the present application, the keyboard perspective function activation unit 920 is specifically configured to: if the user's two-hand gestures are a gesture of spreading both hands downward and bringing them close together, determine that the user's two-hand gestures satisfy the preset activation gesture.
In an embodiment of the present application, the keyboard perspective display region determination unit 940 is specifically configured to: determine a circumscribed rectangular region of the two hands according to the positions of the user's two hands; and enlarge the determined circumscribed rectangular region by a preset multiple, taking the enlarged circumscribed rectangular region as the keyboard perspective display region.
In an embodiment of the present application, the keyboard perspective display region determination unit 940 is specifically configured to: determine a circumscribed square region of either palm of the user; determine a proportional length and a proportional width for expansion according to the size of the circumscribed square region; and determine the center positions of the user's left and right palms, connect the two center positions with a line, and expand from the midpoint of the line by the proportional length in the vertical direction and by the proportional width in the horizontal direction, to obtain the keyboard perspective display region.
In an embodiment of the present application, the apparatus further includes: a keyboard perspective function deactivation unit, configured to deactivate the keyboard perspective function of the virtual reality device if the user's two-hand gestures satisfy a preset deactivation gesture.
It should be noted that:
Fig. 10 shows a schematic structural diagram of a virtual reality device. Referring to Fig. 10, at the hardware level the virtual reality device includes a memory and a processor, and optionally an interface module, a communication module, and the like. The memory may include internal memory, such as high-speed random-access memory (RAM), and may also include non-volatile memory, such as at least one magnetic disk memory. Of course, the virtual reality device may also include hardware required by other services.
The processor, interface module, communication module, and memory may be interconnected through an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus can be divided into an address bus, a data bus, a control bus, etc. For ease of representation, only one double-headed arrow is shown in Fig. 10, but this does not mean there is only one bus or one type of bus.
The memory is used to store computer-executable instructions, which it provides to the processor through the internal bus.
The processor executes the computer-executable instructions stored in the memory and is specifically configured to perform the following operations:
recognizing two-hand gestures of a user;
if the user's two-hand gestures satisfy a preset activation gesture, activating the keyboard perspective function of the virtual reality device;
under the keyboard perspective function, recognizing the positions of the user's two hands; and
determining the keyboard perspective display region according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in the keyboard perspective display region.
The functions performed by the keyboard perspective apparatus for a virtual reality device disclosed in the embodiment shown in Fig. 9 of the present application can be applied in a processor or implemented by a processor. The processor may be an integrated circuit chip with signal processing capability. During implementation, each step of the above method can be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software. The above processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, capable of implementing or executing the methods, steps, and logic block diagrams disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as random-access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The virtual reality device can also perform the steps of the keyboard perspective method for a virtual reality device shown in Fig. 2 and implement the functions of that method in the embodiment shown in Fig. 2, which are not repeated here.
An embodiment of the present application also provides a computer-readable storage medium storing one or more programs which, when executed by a processor, implement the aforementioned keyboard perspective method for a virtual reality device, and are specifically configured to perform:
recognizing two-hand gestures of a user;
if the user's two-hand gestures satisfy a preset activation gesture, activating the keyboard perspective function of the virtual reality device;
under the keyboard perspective function, recognizing the positions of the user's two hands; and
determining the keyboard perspective display region according to the positions of the user's two hands, so as to display the physical keyboard in the real scene in the keyboard perspective display region.
Those skilled in the art should understand that embodiments of the present application may be provided as a method, a system, or a computer program product. Therefore, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) containing computer-usable program code.
The present application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce means for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include non-permanent storage in computer-readable media, in the form of random-access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage can be implemented by any method or technology. Information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassette, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
It should also be noted that the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
The above are merely embodiments of the present application and are not intended to limit the present application. For those skilled in the art, the present application may have various modifications and variations. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application shall be included within the scope of the claims of the present application.

Claims (10)

  1. A keyboard perspective method for a virtual reality device, comprising:
    recognizing two-hand gestures of a user;
    if the two-hand gestures of the user satisfy a preset activation gesture, activating a keyboard perspective function of the virtual reality device;
    under the keyboard perspective function, recognizing positions of the user's two hands; and
    determining a keyboard perspective display region according to the positions of the user's two hands, so as to display a physical keyboard in a real scene in the keyboard perspective display region.
  2. The method according to claim 1, wherein activating the keyboard perspective function of the virtual reality device if the two-hand gestures of the user satisfy the preset activation gesture comprises:
    if the two-hand gestures of the user are a gesture of spreading both hands downward and bringing them close together, determining that the two-hand gestures of the user satisfy the preset activation gesture.
  3. The method according to claim 1, wherein determining the keyboard perspective display region according to the positions of the user's two hands comprises:
    determining a circumscribed rectangular region of the two hands according to the positions of the user's two hands; and
    enlarging the determined circumscribed rectangular region by a preset multiple, and taking the enlarged circumscribed rectangular region as the keyboard perspective display region.
  4. The method according to claim 1, wherein determining the keyboard perspective display region according to the positions of the user's two hands comprises:
    determining a circumscribed square region of either palm of the user;
    determining, according to the size of the circumscribed square region, a proportional length and a proportional width for expansion; and
    determining a center position of the user's left palm and a center position of the user's right palm, connecting the left palm center position and the right palm center position with a line, and expanding by the proportional length in the vertical direction and by the proportional width in the horizontal direction from the midpoint of the line, to obtain the keyboard perspective display region.
  5. The method according to claim 1, further comprising:
    if the two-hand gestures of the user satisfy a preset deactivation gesture, deactivating the keyboard perspective function of the virtual reality device.
  6. A keyboard perspective apparatus for a virtual reality device, comprising:
    a two-hand gesture recognition unit, configured to recognize two-hand gestures of a user;
    a keyboard perspective function activation unit, configured to activate a keyboard perspective function of the virtual reality device if the two-hand gestures of the user satisfy a preset activation gesture;
    a two-hand position recognition unit, configured to recognize positions of the user's two hands under the keyboard perspective function; and
    a keyboard perspective display region determination unit, configured to determine a keyboard perspective display region according to the positions of the user's two hands, so as to display a physical keyboard in a real scene in the keyboard perspective display region.
  7. The apparatus according to claim 6, wherein the keyboard perspective function activation unit is specifically configured to:
    if the two-hand gestures of the user are a gesture of spreading both hands downward and bringing them close together, determine that the two-hand gestures of the user satisfy the preset activation gesture.
  8. The apparatus according to claim 6, wherein the keyboard perspective display region determination unit is specifically configured to:
    determine a circumscribed rectangular region of the two hands according to the positions of the user's two hands; and
    enlarge the determined circumscribed rectangular region by a preset multiple, and take the enlarged circumscribed rectangular region as the keyboard perspective display region.
  9. The apparatus according to claim 6, wherein the keyboard perspective display region determination unit is specifically configured to:
    determine a circumscribed square region of either palm of the user;
    determine, according to the size of the circumscribed square region, a proportional length and a proportional width for expansion; and
    determine a center position of the user's left palm and a center position of the user's right palm, connect the left palm center position and the right palm center position with a line, and expand by the proportional length in the vertical direction and by the proportional width in the horizontal direction from the midpoint of the line, to obtain the keyboard perspective display region.
  10. A virtual reality device, comprising: a processor, and a memory storing computer-executable instructions,
    wherein the executable instructions, when executed by the processor, implement the keyboard perspective method for a virtual reality device according to any one of claims 1 to 5.
PCT/CN2021/130200 2020-11-23 2021-11-12 Keyboard perspective method and apparatus for virtual reality device, and virtual reality device WO2022105677A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/037,969 US20240004477A1 (en) 2020-11-23 2021-11-12 Keyboard perspective method and apparatus for virtual reality device, and virtual reality device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011319697.0A 2020-11-23 Keyboard perspective method and apparatus for virtual reality device, and virtual reality device
CN202011319697.0 2020-11-23

Publications (1)

Publication Number Publication Date
WO2022105677A1 true WO2022105677A1 (zh) 2022-05-27

Family

ID=74738578

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/130200 WO2022105677A1 (zh) 2020-11-23 2021-11-12 Keyboard perspective method and apparatus for virtual reality device, and virtual reality device

Country Status (3)

Country Link
US (1) US20240004477A1 (zh)
CN (1) CN112445341B (zh)
WO (1) WO2022105677A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112462937B (zh) * 2020-11-23 2022-11-08 Qingdao Pico Technology Co., Ltd. Local perspective method and apparatus for virtual reality device, and virtual reality device
CN112445341B (zh) * 2020-11-23 2022-11-08 Qingdao Pico Technology Co., Ltd. Keyboard perspective method and apparatus for virtual reality device, and virtual reality device
EP4305506A1 (en) * 2021-03-12 2024-01-17 Telefonaktiebolaget LM Ericsson (publ) Electronic device, and methods of the electronic device for generating feedback related to an interaction with a touch input arrangement

Citations (6)

Publication number Priority date Publication date Assignee Title
US20020130844A1 (en) * 1998-12-31 2002-09-19 Natoli Anthony James Francis Virtual reality keyboard system and method
CN105975067A (zh) * 2016-04-28 2016-09-28 Shanghai Chuangmi Technology Co., Ltd. Key input device and method applied to virtual reality products
CN106537261A (zh) * 2014-07-15 2017-03-22 Microsoft Technology Licensing, LLC Holographic keyboard display
CN108334203A (zh) * 2018-04-13 2018-07-27 Beijing Institute of Technology Virtual-real fusion keyboard system for virtual reality
CN110832441A (zh) * 2017-05-19 2020-02-21 Magic Leap, Inc. Keyboard for virtual, augmented, and mixed reality display systems
CN112445341A (zh) * 2020-11-23 2021-03-05 Qingdao Pico Technology Co., Ltd. Keyboard perspective method and apparatus for virtual reality device, and virtual reality device

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US9865089B2 (en) * 2014-07-25 2018-01-09 Microsoft Technology Licensing, Llc Virtual reality environment with real world objects
CN106484085B (zh) * 2015-08-31 2019-07-23 Beijing Samsung Telecommunication Technology Research Co., Ltd. Method for displaying a real object in a head-mounted display, and head-mounted display therefor
CN107368179A (zh) * 2017-06-12 2017-11-21 Guangdong Wangjin Holdings Co., Ltd. Input method and apparatus for a virtual reality system
WO2019136248A1 (en) * 2018-01-05 2019-07-11 Google Llc Selecting content to render on display of assistant device
CN108401452B (zh) * 2018-02-23 2021-05-07 Hong Kong Applied Science and Technology Research Institute Co., Ltd. Apparatus and method for performing real target detection and control using a virtual reality head-mounted display system
CN108646997A (zh) * 2018-05-14 2018-10-12 Liu Zhiyong Method for a virtual and augmented reality device to interact with other wireless devices
CN109885174A (zh) * 2019-02-28 2019-06-14 Nubia Technology Co., Ltd. Gesture control method and apparatus, mobile terminal, and storage medium
US11137908B2 (en) * 2019-04-15 2021-10-05 Apple Inc. Keyboard operation with head-mounted device
CN111415422B (zh) * 2020-04-17 2022-03-18 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Virtual object adjustment method and apparatus, storage medium, and augmented reality device


Also Published As

Publication number Publication date
US20240004477A1 (en) 2024-01-04
CN112445341A (zh) 2021-03-05
CN112445341B (zh) 2022-11-08


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21893824

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18037969

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30/08/2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21893824

Country of ref document: EP

Kind code of ref document: A1