WO2022105677A1 - Keyboard perspective method and apparatus for virtual reality device, and virtual reality device - Google Patents
Keyboard perspective method and apparatus for virtual reality device, and virtual reality device
- Publication number
- WO2022105677A1 (PCT/CN2021/130200)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- keyboard
- hands
- user
- hand
- virtual reality
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Definitions
- the present application relates to the technical field of virtual reality, and in particular, to a keyboard perspective method and device of a virtual reality device, and a virtual reality device.
- VR glasses (virtual reality glasses) used as productivity tools have become a new class of application.
- fast keyboard input is required in most such cases, and the closed use environment of VR glasses has become an obstacle to keyboard input.
- the main purpose of this application is to provide a keyboard perspective method and apparatus for a virtual reality device, and a virtual reality device, which solve the technical problem that existing keyboard perspective methods for virtual reality devices are relatively complicated and perform poorly.
- a keyboard perspective method of a virtual reality device comprising:
- a keyboard perspective display area is determined according to the positions of the hands of the user, so as to display a physical keyboard in a real scene in the keyboard perspective display area.
- a keyboard see-through device for a virtual reality device comprising:
- a two-hand hand motion recognition unit used to recognize the user's two-hand hand motion
- a keyboard see-through function activation unit used for activating the keyboard see-through function of the virtual reality device if the user's two-hand motion meets a preset activation action
- a two-hand hand position recognition unit used for recognizing the user's two-hand hand position under the keyboard perspective function
- a virtual reality device comprising: a processor, a memory storing computer-executable instructions,
- the executable instructions when executed by the processor, implement the foregoing keyboard see-through method of the virtual reality device.
- a computer-readable storage medium stores one or more programs which, when executed by a processor, implement the aforementioned keyboard perspective method of the virtual reality device.
- the keyboard perspective method of the virtual reality device first recognizes the user's two-hand motion, then matches it against the preset activation action, and can thereby determine whether the user wants to activate the keyboard see-through feature of the virtual reality device. If the user's two-hand motion matches the preset activation action, the keyboard perspective function of the virtual reality device is activated; then, under the keyboard perspective function, the positions of the user's two hands are further identified, and the keyboard perspective display area for keyboard display is determined from those positions, so that the user can operate the physical keyboard in the real scene within the keyboard perspective display area.
- the keyboard perspective method of the virtual reality device in the embodiment of the present application uses an existing hand motion recognition algorithm to recognize the user's hand motion and hand position, and determines the keyboard perspective display area on that basis. Compared with traditional keyboard perspective schemes, it greatly reduces the required computing power and computational complexity, has higher compatibility, and obtains a more accurate keyboard perspective area, which greatly improves the user experience.
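The flow described above (recognize the two-hand motion, match it against the activation action, then derive the display area from the hand positions) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the gesture label `palms_down_approaching` and the helper names are assumptions, and the area shown is the plain union rectangle before any preset-multiple enlargement.

```python
def union_box(a, b):
    """Axis-aligned union of two (x0, y0, x1, y1) hand bounding boxes."""
    return (min(a[0], b[0]), min(a[1], b[1]),
            max(a[2], b[2]), max(a[3], b[3]))

def see_through_step(state, gesture, left_box=None, right_box=None):
    """One recognition frame of the described flow (steps S210-S240)."""
    # S210/S220: match the recognized two-hand gesture against the
    # (hypothetical) activation label and activate the function on success.
    if not state["active"] and gesture == "palms_down_approaching":
        state["active"] = True
    # S230/S240: under the see-through function, derive the display area
    # from the two hand positions.
    if state["active"] and left_box and right_box:
        state["area"] = union_box(left_box, right_box)
    return state
```

Feeding an idle frame leaves the function off; a matching gesture followed by both hand boxes yields the rectangle that later steps enlarge into the display area.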
- FIG. 1 is a schematic diagram of a keyboard perspective method in the prior art
- FIG. 2 is a flowchart of a keyboard perspective method of a virtual reality device according to an embodiment of the application
- FIG. 3 is a schematic diagram of hand motion recognition according to an embodiment of the present application.
- FIG. 4 is a perspective display effect diagram of a keyboard in VR glasses according to an embodiment of the application.
- FIG. 5 is a schematic diagram of a preset activation action according to an embodiment of the present application.
- FIG. 6 is a schematic diagram of a perspective display area of a keyboard according to an embodiment of the present application.
- FIG. 7 is a schematic diagram of a perspective display area of a keyboard according to another embodiment of the present application.
- FIG. 8 is a schematic diagram of a preset closing action according to an embodiment of the present application.
- FIG. 9 is a block diagram of a keyboard perspective device of a virtual reality device according to an embodiment of the application.
- FIG. 10 is a schematic structural diagram of a virtual reality device in an embodiment of the present application.
- Virtual reality technology is a computer simulation technology that can create and let users experience virtual worlds. It uses computers to generate a simulated environment and immerses users in that environment, combining real-world data with electronic signals generated by computer technology and various output devices to produce phenomena that people can perceive. These phenomena may represent real objects in reality, or things invisible to the naked eye, rendered as three-dimensional models.
- the virtual reality device in this application may refer to VR glasses.
- the VR glasses use a head-mounted display device to shut out the user's vision and hearing of the outside world and guide the user into the feeling of being in a virtual environment.
- the display principle is binocular: the screen displays separate images for the left and right eyes, and after receiving these slightly differing images the human brain fuses them into a three-dimensional impression.
- the following description will be given by taking VR glasses as a specific application example of a virtual reality device.
- the keyboard perspective method for a virtual reality device includes the following steps S210 to S240:
- Step S210 identifying the hand movements of the user's hands.
- existing VR glasses are equipped with a binocular camera at the external front end of the glasses, which is used to collect external environment information and capture the user's posture and motion information such as hand motion information.
- computer vision technology is usually used for hand motion recognition, and the results of hand motion recognition are often used to perform user interface operations based on hand motions, or some hand motion games.
- the information collected by the camera in existing VR glasses can also be used to identify the movements of the user's two hands, so as to determine the keyboard perspective display area in combination with those movements.
- a monocular camera or other types of cameras can also be used.
- the specific type of camera to be used can be flexibly set by those skilled in the art according to actual needs, and is not specifically limited here.
- the above-mentioned recognition of the user's two-hand motion can be performed in real time, so as to respond to the user's needs promptly; alternatively, to save device power, it can be performed at preset intervals.
- the specific frequency used to identify the hand movements of both hands can be flexibly set by those skilled in the art according to actual needs, which is not specifically limited here.
- Step S220, if the user's two-hand motion meets the preset activation action, activating the keyboard see-through function of the virtual reality device.
- the identified two-hand motion of the user can be matched against the preset activation action; if the match succeeds, the keyboard see-through function of the VR glasses is activated.
- the type of the preset activation action can be flexibly set by those skilled in the art according to actual needs, which is not specifically limited here.
- activating the keyboard see-through function of the virtual reality device in this step can be understood as only activating the keyboard see-through function of the VR glasses.
- if the VR glasses have not yet entered the see-through state, that is, the user currently cannot see the real scene, the subsequent steps need to be performed to determine the keyboard see-through display area in the virtual scene.
- if the VR glasses have already entered the see-through state and the user can currently see the real scene, the keyboard see-through display area in the virtual scene can nevertheless be re-determined through the subsequent steps, so as to avoid too much impact on the user's immersive experience.
- under the keyboard see-through function, the positions of the user's two hands can be further identified and combined with the previously identified two-hand motion as the basis for determining the keyboard see-through display area.
- the identification of the positions of the user's two hands here may identify the two palms separately or, of course, identify the two hands together.
- Step S240 determining the keyboard perspective display area according to the positions of the hands of the user, so as to display the physical keyboard in the real scene in the keyboard perspective display area.
- the range of the keyboard perspective display area can be determined.
- FIG. 4 provides a see-through display effect diagram of the keyboard in VR glasses according to an embodiment of the present application. The user can operate the physical keyboard in the real scene within the keyboard see-through display area.
- step S210 in the above embodiment specifies that the motions of both of the user's hands are recognized.
- the main function of step S210 is to determine, from the recognized hand motion, whether to activate the keyboard see-through function of the VR glasses.
- the preset activation action can therefore also be set as a one-handed action, with recognition of both hands' motions and positions carried out only when the see-through display area is being determined.
- the keyboard perspective method of the virtual reality device of the embodiment of the present application uses an existing hand motion recognition algorithm to recognize the user's hand motion and hand position and determines the keyboard perspective display area on that basis; compared with the traditional keyboard perspective scheme, it greatly reduces the required computing power and greatly improves the user experience.
- activating the keyboard perspective function of the virtual reality device includes: if the user's two-hand motion is an action of spreading both hands downward and bringing them together, determining that the user's two-hand motion satisfies the preset activation action.
- the preset activation action in this embodiment of the present application may be the action shown in FIG. 5. If it is recognized that the user's hands are spread downward and brought close together, the user's two-hand motion is deemed to satisfy the preset activation action, and the keyboard see-through function of the VR glasses is activated.
- preset activation actions those skilled in the art can also set other types of preset activation actions according to actual requirements, which are not listed one by one here.
- determining the keyboard see-through display area according to the positions of the user's two hands includes: determining the circumscribed rectangular area of both hands according to the positions of the user's two hands; enlarging the determined circumscribed rectangular area by a preset multiple; and using the enlarged circumscribed rectangular area as the keyboard see-through display area.
- the above-mentioned preset multiples of magnification can be specifically divided into multiples in the horizontal direction and multiples in the vertical direction.
- the multiples in the vertical direction may specifically include a vertically upward multiple y1 and a vertically downward multiple y2, and the magnification multiples in different directions can be configured according to actual conditions.
- FIG. 6 a schematic diagram of a perspective display area of a keyboard according to an embodiment of the present application is provided.
- the size of the magnification in the horizontal direction can also be related to the distance between the hands. If the distance between the hands is small or touching, the magnification in the horizontal direction can be set to a larger value, and if the distance between the hands is large, the magnification in the horizontal direction can be set to a smaller value.
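This rectangle-based variant can be sketched as follows. The multiple values used here (and the convention that y grows downward, so "upward" subtracts from y) are illustrative assumptions, not figures from the patent.

```python
def keyboard_area_from_hands(left_hand, right_hand,
                             kx=0.5, ky_up=0.3, ky_down=0.2):
    """Circumscribed rectangle of both hands, enlarged per direction.

    Boxes are (x0, y0, x1, y1) with y increasing downward. kx is the
    horizontal multiple; ky_up/ky_down are the vertical multiples, which
    the text notes can be configured separately (kx could also be chosen
    larger when the hands are close together, smaller when far apart).
    """
    x0 = min(left_hand[0], right_hand[0])
    y0 = min(left_hand[1], right_hand[1])
    x1 = max(left_hand[2], right_hand[2])
    y1 = max(left_hand[3], right_hand[3])
    w, h = x1 - x0, y1 - y0
    return (x0 - kx * w, y0 - ky_up * h,    # expand left and upward
            x1 + kx * w, y1 + ky_down * h)  # expand right and downward
```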
- determining the keyboard see-through display area according to the positions of the user's two hands may instead include: determining the circumscribed square area of either of the user's palms; determining, according to the size of that square, the proportional length and proportional width used for expansion; determining the center positions of the user's left and right palms, connecting them with a line, and expanding from the midpoint of the line by the proportional length in the vertical direction and the proportional width in the horizontal direction to obtain the keyboard see-through display area.
- the circumscribed square of any one of the user's palms may also be determined first.
- the circumscribed squares of the two palms are basically the same size, so which palm's circumscribed square is used has no influence on the subsequent determination of the keyboard see-through display area.
- the scaled length and scaled width for expansion can then be determined based on the size of the enclosing square.
- for example, if the side length of the circumscribed square is a, the area can be expanded by 1.5a to the left and right, by a upward, and by 0.5a downward; the keyboard see-through display area obtained after expansion then measures 1.5a x 3a (height x width).
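The square-based example above can be sketched as follows, assuming image coordinates with y increasing downward (so "upward" subtracts from y); the 1.5a / a / 0.5a proportions are the ones given in the example.

```python
def keyboard_area_from_palms(left_center, right_center, a):
    """Expand around the midpoint of the palm-center line.

    `a` is the side length of the circumscribed square of one palm.
    Expansion: 1.5a left/right, a upward, 0.5a downward, giving an
    area of size 1.5a x 3a (height x width).
    """
    mx = (left_center[0] + right_center[0]) / 2
    my = (left_center[1] + right_center[1]) / 2
    return (mx - 1.5 * a, my - 1.0 * a,   # left edge, top edge
            mx + 1.5 * a, my + 0.5 * a)   # right edge, bottom edge
```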
- the method further includes: if the user's hands and hands meet a preset closing action, closing the keyboard see-through function of the virtual reality device.
- the user's need for the keyboard see-through display function of the VR glasses may be only temporary. To ensure the user can quickly return to the immersive experience of the virtual scene after using the function, the system can also detect whether the user has made a hand action to turn the function off; if the user's hand motion is detected to match the preset closing action, the keyboard see-through display function of the VR glasses is turned off.
- FIG. 8 a schematic diagram of a preset closing action is provided.
- the user can turn off the keyboard see-through display function of the VR glasses by performing an action of spreading his hands upward and placing them in front of his eyes.
- other closing actions can also be flexibly set according to actual needs, which is not specifically limited here.
- a more complex activation/deactivation condition can also be set. For example, the duration of the identified activation/deactivation hand action can be timed, and only if it exceeds a preset time threshold is the user considered to want to activate/deactivate the keyboard see-through display function of the VR glasses. Alternatively, the number of times the user performs the activation/deactivation action can be counted, and the function is toggled only when a preset count is reached. How to configure the activation/deactivation conditions of the keyboard see-through display function can be flexibly set by those skilled in the art according to the actual situation, and the options are not listed one by one here.
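A duration-based condition of this kind can be sketched as a simple per-frame counter; the frame threshold and gesture label below are illustrative assumptions, not values from the patent.

```python
class GestureHoldDetector:
    """Fires once a target gesture has persisted for `hold_frames` frames."""

    def __init__(self, target, hold_frames=30):
        self.target = target
        self.hold_frames = hold_frames
        self.count = 0

    def update(self, gesture):
        """Feed one frame's recognized gesture; return True on trigger."""
        self.count = self.count + 1 if gesture == self.target else 0
        if self.count >= self.hold_frames:
            self.count = 0  # reset so the trigger fires only once per hold
            return True
        return False
```

One detector instance per action (activation and closing) would toggle the see-through function whenever `update` returns True.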
- FIG. 9 shows a block diagram of a keyboard see-through device of a virtual reality device according to an embodiment of the present application.
- the keyboard see-through device 900 of the virtual reality device includes: a two-hand motion recognition unit 910, a keyboard see-through function activation unit 920, a two-hand position recognition unit 930, and a keyboard see-through display area determination unit 940, wherein:
- a two-hand motion recognition unit 910, used to recognize the user's two-hand motion;
- a keyboard see-through function activation unit 920, configured to activate the keyboard see-through function of the virtual reality device if the user's two-hand motion meets the preset activation action;
- a two-hand position recognition unit 930, used to identify the positions of the user's two hands under the keyboard see-through function;
- the keyboard see-through display area determining unit 940 is configured to determine the keyboard see-through display area according to the positions of the hands of the user, so as to display the physical keyboard in the real scene in the keyboard see-through display area.
- the keyboard see-through function activation unit 920 is specifically configured to determine that the user's two-hand motion meets the preset activation action if that motion is an action of spreading both hands downward and bringing them together.
- the keyboard perspective display area determination unit 940 is specifically configured to: determine the circumscribed rectangular area of both hands according to the positions of the user's two hands; enlarge the determined circumscribed rectangular area by a preset multiple; and use the enlarged circumscribed rectangular area as the keyboard perspective display area.
- the keyboard perspective display area determination unit 940 may alternatively be configured to: determine the circumscribed square area of either of the user's palms; determine, according to the size of the circumscribed square area, the proportional length and proportional width for expansion; determine the center positions of the user's left and right palms, connect them with a line, and expand from the midpoint of the line by the proportional length vertically and the proportional width horizontally to obtain the keyboard perspective display area.
- the device further includes: a keyboard see-through function closing unit, configured to disable the keyboard see-through function of the virtual reality device if the user's hands and hands meet a preset closing action.
- FIG. 10 is a schematic structural diagram of a virtual reality device.
- the virtual reality device includes a memory, a processor, and optionally an interface module, a communication module, and the like.
- the memory may include internal memory, such as high-speed random-access memory (RAM).
- the virtual reality device may also include hardware required by other businesses.
- the processor, interface module, communication module and memory can be connected to each other through an internal bus, which can be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect, peripheral component interconnect standard) bus or EISA (Extended Industry Standard Architecture, Extended Industry Standard Architecture) bus, etc.
- the bus can be divided into address bus, data bus, control bus and so on. For ease of representation, only one bidirectional arrow is shown in FIG. 10, but it does not mean that there is only one bus or one type of bus.
- Memory for storing computer-executable instructions.
- the memory provides computer-executable instructions to the processor through an internal bus.
- the processor executes the computer-executable instructions stored in the memory, and is specifically configured to implement the following operations:
- the keyboard perspective display area is determined according to the position of the user's hands and hands, so as to display the physical keyboard in the real scene in the keyboard perspective display area.
- a processor may be an integrated circuit chip with signal processing capabilities.
- each step of the above-mentioned method can be completed by a hardware integrated logic circuit in a processor or an instruction in the form of software.
- the above-mentioned processor can be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it can also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
- a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
- the steps of the methods disclosed in conjunction with the embodiments of the present application may be directly embodied as executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
- the software module may be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other storage media mature in the art.
- the storage medium is located in the memory, and the processor reads the information in the memory, and completes the steps of the above method in combination with its hardware.
- the virtual reality device can also perform the steps of the keyboard perspective method of the virtual reality device in FIG. 2, and realize the functions of the keyboard perspective method of the virtual reality device in the embodiment shown in FIG. 2.
- the embodiments of the present application further provide a computer-readable storage medium storing one or more programs which, when executed by a processor, implement the aforementioned keyboard perspective method of the virtual reality device, and are specifically used to execute:
- the keyboard perspective display area is determined according to the position of the user's hands and hands, so as to display the physical keyboard in the real scene in the keyboard perspective display area.
- the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) embodying computer-usable program code.
- These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture comprising instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
- a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
- Memory may include non-persistent storage in computer-readable media, random-access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
- Computer-readable media include persistent and non-persistent, removable and non-removable media; information storage may be implemented by any method or technology.
- Information may be computer readable instructions, data structures, modules of programs, or other data.
- Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
- computer-readable media does not include transitory computer-readable media, such as modulated data signals and carrier waves.
Claims (10)
- A keyboard perspective method for a virtual reality device, characterized by comprising: recognizing motions of both of a user's hands; if the user's two-hand motion satisfies a preset activation action, activating a keyboard perspective function of the virtual reality device; under the keyboard perspective function, recognizing positions of the user's hands; and determining a keyboard perspective display region according to the positions of the user's hands, so as to display a physical keyboard in a real scene in the keyboard perspective display region.
- The method according to claim 1, wherein activating the keyboard perspective function of the virtual reality device if the user's two-hand motion satisfies the preset activation action comprises: if the user's two-hand motion is a motion of spreading both hands downward and bringing them close together, determining that the user's two-hand motion satisfies the preset activation action.
- The method according to claim 1, wherein determining the keyboard perspective display region according to the positions of the user's hands comprises: determining a bounding rectangle region of both hands according to the positions of the user's hands; and enlarging the determined bounding rectangle region by a preset multiple, and taking the enlarged bounding rectangle region as the keyboard perspective display region.
- The method according to claim 1, wherein determining the keyboard perspective display region according to the positions of the user's hands comprises: determining a bounding square region of either palm of the user; determining, according to a size of the bounding square region, a proportional length and a proportional width for extension; and determining a center position of the user's left palm and a center position of the user's right palm, connecting the left-palm center position and the right-palm center position with a line, and extending by the proportional length in a vertical direction and by the proportional width in a horizontal direction from a midpoint of the line, to obtain the keyboard perspective display region.
- The method according to claim 1, wherein the method further comprises: if the user's two-hand motion satisfies a preset deactivation action, deactivating the keyboard perspective function of the virtual reality device.
- A keyboard perspective apparatus for a virtual reality device, characterized by comprising: a two-hand motion recognition unit, configured to recognize motions of a user's hands; a keyboard perspective function activation unit, configured to activate a keyboard perspective function of the virtual reality device if the user's two-hand motion satisfies a preset activation action; a two-hand position recognition unit, configured to recognize positions of the user's hands under the keyboard perspective function; and a keyboard perspective display region determination unit, configured to determine a keyboard perspective display region according to the positions of the user's hands, so as to display a physical keyboard in a real scene in the keyboard perspective display region.
- The apparatus according to claim 6, wherein the keyboard perspective function activation unit is specifically configured to: if the user's two-hand motion is a motion of spreading both hands downward and bringing them close together, determine that the user's two-hand motion satisfies the preset activation action.
- The apparatus according to claim 6, wherein the keyboard perspective display region determination unit is specifically configured to: determine a bounding rectangle region of both hands according to the positions of the user's hands; and enlarge the determined bounding rectangle region by a preset multiple, and take the enlarged bounding rectangle region as the keyboard perspective display region.
- The apparatus according to claim 6, wherein the keyboard perspective display region determination unit is specifically configured to: determine a bounding square region of either palm of the user; determine, according to a size of the bounding square region, a proportional length and a proportional width for extension; and determine a center position of the user's left palm and a center position of the user's right palm, connect the left-palm center position and the right-palm center position with a line, and extend by the proportional length in a vertical direction and by the proportional width in a horizontal direction from a midpoint of the line, to obtain the keyboard perspective display region.
- A virtual reality device, characterized by comprising: a processor, and a memory storing computer-executable instructions which, when executed by the processor, implement the keyboard perspective method for a virtual reality device according to any one of claims 1 to 5.
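The two region-determination approaches recited in claims 3 and 4 (and mirrored in claims 8 and 9) can be illustrated in code. The sketch below is not from the patent itself: the `Rect` type, function names, and the `scale`/`len_ratio`/`wid_ratio` values are assumptions chosen for demonstration; the patent only requires "a preset multiple" and proportional extensions derived from a palm's bounding square.

```python
# Illustrative sketch of the keyboard-perspective display-region computation.
# Coordinates are 2D points in the device's display plane; all numeric
# parameters below are assumed values, not taken from the patent.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge
    y: float  # top edge
    w: float  # width
    h: float  # height

    @property
    def center(self):
        return (self.x + self.w / 2, self.y + self.h / 2)

def region_from_bounding_rect(hand_points, scale=1.5):
    """Claim 3: bounding rectangle of both hands, enlarged about its
    center by a preset multiple (here assumed to be 1.5)."""
    xs = [p[0] for p in hand_points]
    ys = [p[1] for p in hand_points]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    w = (x1 - x0) * scale
    h = (y1 - y0) * scale
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    return Rect(cx - w / 2, cy - h / 2, w, h)

def region_from_palm_centers(left_center, right_center, palm_square_side,
                             len_ratio=2.0, wid_ratio=3.0):
    """Claim 4: take the midpoint of the line joining the two palm centers,
    then extend vertically by a proportional length and horizontally by a
    proportional width, both derived from the side of the square bounding
    one palm (the ratios here are illustrative)."""
    mx = (left_center[0] + right_center[0]) / 2
    my = (left_center[1] + right_center[1]) / 2
    half_w = palm_square_side * wid_ratio / 2  # horizontal extension
    half_h = palm_square_side * len_ratio / 2  # vertical extension
    return Rect(mx - half_w, my - half_h, 2 * half_w, 2 * half_h)
```

Either function yields the rectangle in which the pass-through camera image of the physical keyboard would be composited into the virtual scene; the enlargement/extension exists so the region covers the keyboard under the hands rather than the hands alone.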
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/037,969 US20240004477A1 (en) | 2020-11-23 | 2021-11-12 | Keyboard perspective method and apparatus for virtual reality device, and virtual reality device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011319697.0A CN112445341B (zh) | 2020-11-23 | 2020-11-23 | 虚拟现实设备的键盘透视方法、装置及虚拟现实设备 |
CN202011319697.0 | 2020-11-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022105677A1 (zh) | 2022-05-27 |
Family
ID=74738578
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/130200 WO2022105677A1 (zh) | 2020-11-23 | 2021-11-12 | 虚拟现实设备的键盘透视方法、装置及虚拟现实设备 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240004477A1 (zh) |
CN (1) | CN112445341B (zh) |
WO (1) | WO2022105677A1 (zh) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112462937B (zh) * | 2020-11-23 | 2022-11-08 | 青岛小鸟看看科技有限公司 | Local perspective method and apparatus for virtual reality device, and virtual reality device |
CN112445341B (zh) * | 2020-11-23 | 2022-11-08 | 青岛小鸟看看科技有限公司 | Keyboard perspective method and apparatus for virtual reality device, and virtual reality device |
EP4305506A1 (en) * | 2021-03-12 | 2024-01-17 | Telefonaktiebolaget LM Ericsson (publ) | Electronic device, and methods of the electronic device for generating feedback related to an interaction with a touch input arrangement |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020130844A1 (en) * | 1998-12-31 | 2002-09-19 | Natoli Anthony James Francis | Virtual reality keyboard system and method |
CN105975067A (zh) * | 2016-04-28 | 2016-09-28 | 上海创米科技有限公司 | Key input device and method applied to virtual reality products |
CN106537261A (zh) * | 2014-07-15 | 2017-03-22 | 微软技术许可有限责任公司 | Holographic keyboard display |
CN108334203A (zh) * | 2018-04-13 | 2018-07-27 | 北京理工大学 | Virtual-real fusion keyboard system for virtual reality |
CN110832441A (zh) * | 2017-05-19 | 2020-02-21 | 奇跃公司 | Keyboards for virtual, augmented, and mixed reality display systems |
CN112445341A (zh) * | 2020-11-23 | 2021-03-05 | 青岛小鸟看看科技有限公司 | Keyboard perspective method and apparatus for virtual reality device, and virtual reality device |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9865089B2 (en) * | 2014-07-25 | 2018-01-09 | Microsoft Technology Licensing, Llc | Virtual reality environment with real world objects |
CN106484085B (zh) * | 2015-08-31 | 2019-07-23 | 北京三星通信技术研究有限公司 | Method for displaying a real object in a head-mounted display, and head-mounted display therefor |
CN107368179A (zh) * | 2017-06-12 | 2017-11-21 | 广东网金控股股份有限公司 | Input method and apparatus for a virtual reality system |
WO2019136248A1 (en) * | 2018-01-05 | 2019-07-11 | Google Llc | Selecting content to render on display of assistant device |
CN108401452B (zh) * | 2018-02-23 | 2021-05-07 | 香港应用科技研究院有限公司 | Apparatus and method for performing real target detection and control using a virtual reality head-mounted display system |
CN108646997A (zh) * | 2018-05-14 | 2018-10-12 | 刘智勇 | Method for a virtual and augmented reality device to interact with other wireless devices |
CN109885174A (zh) * | 2019-02-28 | 2019-06-14 | 努比亚技术有限公司 | Gesture control method and apparatus, mobile terminal, and storage medium |
US11137908B2 (en) * | 2019-04-15 | 2021-10-05 | Apple Inc. | Keyboard operation with head-mounted device |
CN111415422B (zh) * | 2020-04-17 | 2022-03-18 | Oppo广东移动通信有限公司 | Virtual object adjustment method and apparatus, storage medium, and augmented reality device |
-
2020
- 2020-11-23 CN CN202011319697.0A patent/CN112445341B/zh active Active
-
2021
- 2021-11-12 US US18/037,969 patent/US20240004477A1/en active Pending
- 2021-11-12 WO PCT/CN2021/130200 patent/WO2022105677A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20240004477A1 (en) | 2024-01-04 |
CN112445341A (zh) | 2021-03-05 |
CN112445341B (zh) | 2022-11-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022105677A1 (zh) | Keyboard perspective method and apparatus for virtual reality device, and virtual reality device | |
US11354825B2 (en) | Method, apparatus for generating special effect based on face, and electronic device | |
JP6984840B2 (ja) | Real-time comment display method and electronic device | |
TWI654539B (zh) | Virtual reality interaction method, apparatus and system | |
WO2020001013A1 (zh) | Image processing method and apparatus, computer-readable storage medium, and terminal | |
WO2022105919A1 (zh) | Local perspective method and apparatus for virtual reality device, and virtual reality device | |
TWI687901B (zh) | Safety monitoring method and apparatus for virtual reality device, and virtual reality device | |
WO2020019665A1 (zh) | Face-based three-dimensional special effect generation method and apparatus, and electronic device | |
US10255690B2 (en) | System and method to modify display of augmented reality content | |
US11176355B2 (en) | Facial image processing method and apparatus, electronic device and computer readable storage medium | |
WO2022237268A1 (zh) | Information input method and apparatus for head-mounted display device, and head-mounted display device | |
WO2020019664A1 (zh) | Face-based deformed image generation method and apparatus | |
WO2020024692A1 (zh) | Human-computer interaction method and apparatus | |
WO2017052880A1 (en) | Augmented reality with off-screen motion sensing | |
TWI544367B (zh) | Gesture recognition and control method and apparatus | |
WO2021248857A1 (zh) | Obstacle attribute discrimination method and system, and intelligent robot | |
JP2014059869A (ja) | Eye search method, and eye state detection device and eye search device using the method | |
CN113282167B (zh) | Interaction method and apparatus for head-mounted display device, and head-mounted display device | |
CN114510142B (zh) | Gesture recognition method based on two-dimensional images, and system and electronic device therefor | |
CN114327063A (zh) | Interaction method and apparatus for target virtual object, electronic device, and storage medium | |
CN114092608A (zh) | Expression processing method and apparatus, computer-readable storage medium, and electronic device | |
CN107526439A (zh) | Interface return method and apparatus | |
CN108121442A (zh) | Operation method and apparatus for three-dimensional space display interface, and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21893824 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18037969 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30/08/2023) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21893824 Country of ref document: EP Kind code of ref document: A1 |