CN112198962A - Method for interacting with virtual reality equipment and virtual reality equipment


Info

Publication number
CN112198962A
CN112198962A (application CN202011061911.7A)
Authority
CN
China
Prior art keywords
preset
finger
user interface
ray control
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011061911.7A
Other languages
Chinese (zh)
Other versions
CN112198962B (en)
Inventor
杨彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Media Network Technology Co Ltd
Original Assignee
Qingdao Hisense Media Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Media Network Technology Co Ltd filed Critical Qingdao Hisense Media Network Technology Co Ltd
Priority to CN202011061911.7A priority Critical patent/CN112198962B/en
Publication of CN112198962A publication Critical patent/CN112198962A/en
Application granted granted Critical
Publication of CN112198962B publication Critical patent/CN112198962B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/107 Static hand or arm

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method for interacting with a virtual reality device, and a virtual reality device, the method comprising: receiving a hand image captured by a camera; when a preset finger is recognized in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger; and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface. In some embodiments, the user's fingers can thus interact with the virtual reality device directly, avoiding the inconvenience of the handheld handles used in the prior art.

Description

Method for interacting with virtual reality equipment and virtual reality equipment
Technical Field
The invention relates to the technical field of virtual reality devices, and in particular to a method for interacting with a virtual reality device and a virtual reality device.
Background
Virtual reality (VR) technology simulates a virtual environment to give a person a sense of immersion in that environment. With the continuous development of productivity and of science and technology, demand for VR technology is growing in more and more industries.
Most VR hardware on the market today takes the form of VR headsets, which immerse the wearer in a virtual world. The main way of interacting with a VR headset is through a handheld handle: the handle connects to the headset via Bluetooth and transmits information such as its position and button states, and the headset makes the corresponding interaction. The drawback of the handheld handle is that interaction depends heavily on it; when the handle runs out of power, the user loses an otherwise convenient and reliable input device. Moreover, many current handles are heavy and bulky, and holding one for a long time becomes a burden. Since VR places great weight on the sense of immersion, interacting through a handheld handle works against the user's immersive experience.
Disclosure of Invention
In order to solve this technical problem, the invention provides a method for interacting with a virtual reality device, and a virtual reality device, so as to improve the user experience.
In a first aspect, the present invention provides a virtual reality device, including:
a display for displaying a user interface;
a controller to:
receiving a hand image captured by a camera;
when a preset finger is recognized in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger;
and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
In a second aspect, the present invention provides a virtual display device, including:
a display for displaying a user interface;
a controller to:
receiving an image captured by a camera;
when a preset article is recognized in the image, generating a ray control in the user interface according to the relative position of the recognized preset article;
and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
In a third aspect, the present invention provides a method for interacting with a virtual reality device, including:
receiving a hand image captured by a camera;
when a preset finger is recognized in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger;
and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
In a fourth aspect, the present invention provides a method for interacting with a virtual reality device, including:
receiving an image captured by a camera;
when a preset article is recognized in the image, generating a ray control in the user interface according to the relative position of the recognized preset article;
and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
The technical scheme provided by this application is a method for interacting with a virtual reality device, and a virtual reality device, the method comprising: receiving a hand image captured by a camera; when a preset finger is recognized in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger; and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface. In some embodiments, the user's fingers can thus interact with the virtual reality device directly, avoiding the inconvenience of the handheld handles used in the prior art.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of the skeletal key points of a hand;
Fig. 2 is a flow chart of a method of interacting with a virtual reality device;
Fig. 3 is a schematic diagram of a user interface.
Detailed Description
To make the objects, embodiments, and advantages of the present application clearer, exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is to be understood that the described exemplary embodiments are only some, and not all, of the embodiments of the present application.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments described herein without inventive effort are intended to fall within the scope of the claims appended hereto. In addition, while the disclosure herein is presented in terms of one or more exemplary examples, it should be appreciated that individual aspects of the disclosure may on their own constitute a complete embodiment.
It should be noted that the brief descriptions of terms in the present application are given only for convenience in understanding the embodiments described below and are not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be understood in their ordinary and customary meaning.
The terms "first", "second", "third", and the like in the description and claims of this application and in the drawings are used to distinguish between similar or analogous objects or entities and do not necessarily define a particular order or sequence unless otherwise indicated. It is to be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments can, for example, operate in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
By contrast, the interaction mode used with the virtual reality device in some embodiments is gesture-based, which avoids the inconvenience that the handheld handle described in the background brings to the user.
In some embodiments, the virtual reality device may have its own camera or be connected to an external camera. When the virtual reality device is turned on, it displays a user interface and starts the camera at the same time; once the camera is on, it can capture gestures, and the captured hand images are rendered into the 3D user interface of the virtual reality device in real time.
In some embodiments, the virtual reality device is provided with a binocular camera; an exemplary such virtual reality device is a VR headset.
The binocular camera in some embodiments has two cameras. Like a person's two eyes, a binocular camera can acquire the depth information of the photographed object through stereoscopic imaging: just as the world seen by human eyes is three-dimensional, the world captured by a binocular camera is three-dimensional as well. The number of cameras is not limited, as long as the depth information of the object can be obtained.
Some embodiments provide a method of interacting with a virtual reality device in which a recognized finger operates in place of the handle: the finger interacts with the virtual reality device, performs highlighting, and so on. As shown in fig. 2, the method includes:
and S100, receiving a hand image shot by the camera.
S200, when a preset finger is recognized in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger.
In some embodiments, hand skeletal point data generated by the camera from the captured hand image may be received from the camera, and the ray control is generated in the user interface from the hand skeletal point data.
In some embodiments, the preset finger may be the right index finger, since the right hand is the dominant hand of most people; of course, to accommodate users whose dominant hand is the left, the preset finger may also be the left index finger in some embodiments. In addition, to avoid fatiguing a user who would otherwise always use the index finger, any finger on the left or right hand can serve as the preset finger.
In some embodiments, the finger that the user extends while bending the other fingers is used as the preset finger.
In some embodiments, before identifying the preset finger from the hand image, the method further comprises: recognizing a first preset gesture from the hand image; when the first preset gesture is recognized, recognizing the preset finger; and if the first preset gesture is not recognized from the hand image, not recognizing the preset finger from the hand image. Illustratively, the first preset gesture is that only the preset finger is extended while the other fingers are clenched. Generation of the ray control is thus triggered only by the preset gesture, avoiding meaningless finger recognition.
In some embodiments, the gesture recognition functionality of the virtual reality device may be turned on or off; it is generally on by default.
In some embodiments, when the user spreads the palm and waves it twice in front of the virtual reality device, the device recognizes the gesture and turns the gesture interaction function off. When the gesture interaction function is off, waving the same open palm twice in front of the virtual reality device turns it back on. In other embodiments, the gesture interaction function can be turned on in the user interface through auxiliary operations, for example key operations on the virtual reality device.
In some embodiments, the step of generating a ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger, when the preset finger is recognized in the hand image, comprises:
identifying a preset finger coordinate set of the preset finger in the hand image in a first three-dimensional coordinate system, wherein the preset finger coordinate set represents the relative positions of the skeletal key points of the preset finger;
determining the position coordinates of the ray control in a second three-dimensional coordinate system according to the preset finger coordinate set, the second three-dimensional coordinate system in some embodiments being a coordinate system of the user interface;
and generating the ray control in the user interface according to the position coordinates of the ray control.
In some embodiments, the step of determining the position coordinates of the ray control in the second three-dimensional coordinate system according to the preset finger coordinate set comprises:
converting a preset finger coordinate set in the first three-dimensional coordinate system into a preset finger coordinate set in a second three-dimensional coordinate system;
Illustratively, let the coordinates of the first bone point (labeled 5 in fig. 1), the second bone point (labeled 6 in fig. 1), the third bone point (labeled 7 in fig. 1), and the fourth bone point (labeled 8 in fig. 1) in the first three-dimensional coordinate system be the three-dimensional vector coordinates Vector1, Vector2, Vector3, and Vector4, respectively. When the first bone point is set as the origin (0, 0, 0) of the second three-dimensional coordinate system, the coordinates of the second, third, and fourth bone points in the second three-dimensional coordinate system are calculated by the following formulas:
the coordinates of the second bone point in the second three-dimensional coordinate system are Vector2 - Vector1 + (0, 0, 0);
the coordinates of the third bone point in the second three-dimensional coordinate system are Vector3 - Vector1 + (0, 0, 0);
and the coordinates of the fourth bone point in the second three-dimensional coordinate system are Vector4 - Vector1 + (0, 0, 0).
In some embodiments, the second three-dimensional coordinate system is a Unity world coordinate system.
Screening out the coordinates of preset skeletal key points from the preset finger coordinate set in the second three-dimensional coordinate system;
and determining the position coordinates of the ray control in the second three-dimensional coordinate system according to the coordinates of the preset skeletal key points.
In some embodiments, the step of determining the position coordinates of the ray control in the second three-dimensional coordinate system according to the coordinates of the preset skeletal key points comprises:
all the skeletal key points on the preset finger comprise a first bone point, a second bone point, a third bone point, and a fourth bone point arranged in order from the finger root to the fingertip, the preset skeletal key points comprising the third bone point and the fourth bone point;
and determining the position coordinates of the ray control in the second three-dimensional coordinate system such that the ray control lies between the fingertip of the preset finger and the other controls in the user interface, the direction of the ray control being set from the third bone point toward the fourth bone point, where the fingertip coordinates can be determined from the coordinates of the fourth bone point.
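As an illustration only, the following Unity C# sketch assembles a ray control from the four bone points under the conventions above: coordinates are re-origined on the first bone point, the ray starts at the fourth bone point (the fingertip), and its direction runs from the third bone point toward the fourth. The class name, field names, and default ray length are assumptions, not part of the patent.

```csharp
using UnityEngine;

// A minimal sketch, assuming the four bone points of the preset finger have
// already been obtained in the first coordinate system. Names and the default
// ray length are illustrative assumptions.
public class FingerRayControl : MonoBehaviour
{
    public LineRenderer rayRenderer;   // renders the ray control in the UI
    public float defaultLength = 10f;  // assumed length when no control is pointed at

    // bone1..bone4 are ordered from finger root to fingertip.
    public void UpdateRay(Vector3 bone1, Vector3 bone2, Vector3 bone3, Vector3 bone4)
    {
        // Re-origin on the first bone point: p = VectorN - Vector1 + (0, 0, 0).
        Vector3 p3 = bone3 - bone1;
        Vector3 p4 = bone4 - bone1;

        // Direction from the third bone point toward the fourth; the fourth
        // bone point stands in for the fingertip.
        Vector3 direction = (p4 - p3).normalized;

        rayRenderer.positionCount = 2;
        rayRenderer.SetPosition(0, p4);                             // start at the fingertip
        rayRenderer.SetPosition(1, p4 + direction * defaultLength); // extend into the scene
    }
}
```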
S300, changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
In some embodiments, the other controls in the user interface also have corresponding positions in the second three-dimensional coordinate system. From the position of the ray control and the positions of the other controls, the direction of the ray control can be used to determine which control the ray control points at and to make that control change its display state, as illustrated in fig. 3. In some embodiments, the length of the ray control changes dynamically according to the object it points at; for example, in some embodiments the user interface seen by a user of the virtual reality device also includes a 3D background picture. When a control is pointed at, the length of the ray is the distance from the finger to the control; when no control is pointed at, the ray control can lengthen and point to the distant 3D picture.
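A minimal Unity C# sketch of this hit test follows, assuming every interactive control carries a collider; Physics.Raycast is one possible implementation of the intersection test the paragraph describes, not necessarily the one used in the patent, and the highlight color and far length are assumptions.

```csharp
using UnityEngine;

// A minimal sketch: find the control the ray points at and adjust the ray
// length dynamically, as described above.
public class RayPointing : MonoBehaviour
{
    public LineRenderer rayRenderer;
    public float farLength = 50f; // assumed length toward the distant 3D picture

    public void ResolvePointedControl(Vector3 fingertip, Vector3 direction)
    {
        rayRenderer.SetPosition(0, fingertip);

        if (Physics.Raycast(fingertip, direction, out RaycastHit hit, farLength))
        {
            // A control is pointed at: the ray length becomes the distance
            // from the finger to the control, and the control changes state.
            rayRenderer.SetPosition(1, hit.point);
            Renderer control = hit.collider.GetComponent<Renderer>();
            if (control != null)
                control.material.color = Color.yellow; // placeholder highlight
        }
        else
        {
            // Nothing pointed at: lengthen the ray toward the 3D background.
            rayRenderer.SetPosition(1, fingertip + direction * farLength);
        }
    }
}
```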
In some embodiments, taking the Unity coordinate system as the second coordinate system, the processor first renders the image to be displayed in the Unity coordinate system and then performs a three-dimensional projection on the rendered data. Alternatively, when the display comprises a left-eye sub-display and a right-eye sub-display, images from two different viewing angles are obtained from the rendered data and displayed on the left-eye and right-eye sub-displays respectively; or, when the display comprises only one stereoscopic display, images from two different viewing angles are obtained from the rendered data and displayed on the stereoscopic display in a time-sharing manner, so that the persistence-of-vision effect lets the user "see" a stereoscopic image.
In some embodiments, rendering of the ray control is likewise performed in the Unity coordinate system, and the generation of the three-dimensional projection, or of the images from the different perspectives, starts after the ray rendering is completed.
In some embodiments, the length of the ray control is determined first, and the control to be highlighted is then determined from the straight line along the length direction of the ray control and the positions of the displayed controls in the Unity coordinate system;
in other embodiments, the length of the ray control is determined from the straight line along its length direction and the point where that line meets the displayed image in the Unity coordinate system, and the control to be highlighted is determined from the same straight line and the positions of the displayed controls.
In some embodiments, the step of changing the display state of the control pointed at by the ray control in the user interface, so as to highlight it, includes:
displaying the control pointed at by the ray control in a highlighted state;
or enlarging the control and displaying it between the hand and the control's original position.
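A minimal Unity C# sketch of the two display-state changes just listed; the highlight color, the scale factor, and the halfway interpolation point are assumptions.

```csharp
using UnityEngine;

// A minimal sketch of the two options above: highlight in place, or enlarge
// and move between the hand and the control's original position.
public class ControlHighlighter : MonoBehaviour
{
    private Vector3 originalPosition;
    private Vector3 originalScale;

    void Awake()
    {
        originalPosition = transform.position;
        originalScale = transform.localScale;
    }

    // Option 1: display the control in a highlighted state.
    public void Highlight(bool on)
    {
        GetComponent<Renderer>().material.color = on ? Color.yellow : Color.white;
    }

    // Option 2: enlarge the control and display it between the hand and its
    // original position (halfway is an assumption).
    public void EnlargeTowardHand(Vector3 handPosition)
    {
        transform.position = Vector3.Lerp(originalPosition, handPosition, 0.5f);
        transform.localScale = originalScale * 1.5f;
    }

    // Restore the original display state (used by the third preset gesture).
    public void Restore()
    {
        transform.position = originalPosition;
        transform.localScale = originalScale;
        GetComponent<Renderer>().material.color = Color.white;
    }
}
```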
In some embodiments, a virtual hand control is generated at the same time as the ray control, the virtual hand control being located at the near end of the ray control in the displayed image. The method further comprises controlling the generation of the virtual hand control in the user interface according to the recognized hand image:
determining a hand coordinate set in the first three-dimensional coordinate system;
converting the hand coordinate set in the first three-dimensional coordinate system into a hand coordinate set in the second three-dimensional coordinate system; and generating the hand control in the user interface according to the hand coordinate set in the second three-dimensional coordinate system.
In some embodiments, after the virtual reality device is turned on, if the hand is within the detection range of the camera, a hand control is displayed in the user interface of the virtual reality device. Hand-tracking technology captures the real position and gesture of the hand in real time, and from those data a virtual hand is rendered in the user interface of the virtual reality device in real time.
The hand control is rendered in the Unity (game engine) layer, and the technical flow is as follows. The Android side obtains the lattice data of the skeletal key points of the fingers and the upper-arm trunk from the images of the binocular camera; the lattice data comprise the skeletal key-point names and the (X, Y, Z) coordinates of the skeletal key points. The Android side transmits the data of each image frame to Unity as a Json string. Unity processes the data transmitted from the Android side and corrects the received (X, Y, Z) coordinates into the Unity world coordinate system. The correction takes the center point of the overlapping portion of the images captured by the binocular camera as the center point of the Unity world coordinate system: if a hand key point is offset relative to the overlapping images captured by the binocular camera, its offset relative to the camera center is converted into an offset from the center of the Unity world coordinate system, thereby converting the (X, Y, Z) coordinates delivered by the binocular camera into the Unity world coordinate system.
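The patent does not give the Json schema, so the field names below are invented purely for illustration; only the overall flow (per-frame Json from the Android side, parsed in Unity and re-centered on the overlap center of the binocular images) comes from the text above.

```csharp
using System;
using UnityEngine;

// A minimal sketch of receiving per-frame key-point data from the Android
// side. The Json field names are assumptions; the patent only says the data
// contain skeletal key-point names and their (X, Y, Z) coordinates.
[Serializable]
public class BonePoint
{
    public string name;
    public float x, y, z;
}

[Serializable]
public class HandFrame
{
    public BonePoint[] points;
}

public class HandDataReceiver : MonoBehaviour
{
    // Center of the overlapping region of the binocular images, expressed in
    // camera coordinates; it is mapped to the Unity world origin.
    public Vector3 cameraOverlapCenter;

    // Called once per frame with the Json string sent from the Android side.
    public Vector3[] ParseFrame(string json)
    {
        HandFrame frame = JsonUtility.FromJson<HandFrame>(json);
        var corrected = new Vector3[frame.points.Length];
        for (int i = 0; i < frame.points.Length; i++)
        {
            BonePoint p = frame.points[i];
            // The offset from the camera center becomes the offset from the
            // Unity world origin.
            corrected[i] = new Vector3(p.x, p.y, p.z) - cameraOverlapCenter;
        }
        return corrected;
    }
}
```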
The Unity side obtains the key-point coordinates transmitted by the Android side in real time, converts them into the Unity world coordinate system, and then draws the hand from the skeletal key-point coordinates. Each palm comprises 21 skeletal key points, as shown in fig. 1. The hand contour points are determined from the skeletal key points of the hand and the hand thickness, where the hand thickness refers to the width of a finger or of the palm; the hand contour is drawn with Unity's line renderer (LineRenderer). The hand contour points are the points on both sides of each skeletal key point at a distance equal to the hand thickness. According to the hand contour, the hand is filled with color. In some embodiments, the transparency of the hand fill color is set to 70%, which completes the hand rendering in Unity.
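A minimal Unity C# sketch of drawing the contour with a LineRenderer, assuming the contour points have already been computed as described; the concrete fill color is an assumption, with its alpha set for the 70% transparency mentioned above.

```csharp
using UnityEngine;

// A minimal sketch: trace the hand contour with Unity's LineRenderer and
// apply a fill color at 70% transparency.
public class HandContourDrawer : MonoBehaviour
{
    public LineRenderer contourRenderer;

    public void Draw(Vector3[] contourPoints)
    {
        contourRenderer.loop = true; // close the outline into a contour
        contourRenderer.positionCount = contourPoints.Length;
        contourRenderer.SetPositions(contourPoints);

        // Fill color with alpha 0.7, matching the 70% transparency above;
        // the RGB values are an assumed placeholder.
        Color fill = new Color(1f, 0.85f, 0.7f, 0.7f);
        contourRenderer.startColor = fill;
        contourRenderer.endColor = fill;
    }
}
```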
After the hand drawing on the Unity side is finished, the real hand motions of a user wearing the virtual reality device can be displayed in the user interface of the virtual reality device in real time, the user's gestures can be recognized, and the corresponding operations can be carried out. The technical flow of gesture recognition is as follows:
gesture recognition is realized based on a binocular camera of the virtual reality equipment. Left and right visual images of gesture actions of a user are collected from a binocular camera, stereoscopic matching is carried out, a visual difference image is obtained, and the internal parameters and the external parameters of the camera are utilized to carry out triangular calculation so as to obtain a depth image. And segmenting the visual image by using a gesture segmentation algorithm, segmenting the hand and the background environment, and realizing the cutout of the hand. And tracking the hand by using a gesture tracking algorithm, and acquiring the position information of the hand in real time. Joint points of the hand can be acquired through a hand recognition algorithm. According to the acquired information and position information of the hand, the hand and the upper limb can be drawn in real time. According to the gesture tracking algorithm and the motion tracks of the hand joint points, gesture recognition matching is carried out on the gesture, such as a scissor-hand gesture, a two-finger pinch gesture and the like.
In some embodiments, the method further comprises: determining the coordinates of the finger contour according to the coordinates of all the skeletal key points on the preset finger, the finger contour coordinates being used to draw a finger image in the user interface.
In some embodiments, the step of determining the coordinates of the finger contour according to the coordinates of all the skeletal key points on the preset finger includes:
determining the coordinates of the two side edges of the finger corresponding to each skeletal key point, using the preset finger width and taking each skeletal key point on the preset finger as a center point;
and determining the finger contour coordinates from the coordinates of the two side edges of the finger and the fingertip coordinates.
Illustratively, if the Unity world coordinates of the fourth bone point are (x, y, z) and the preset finger width is 0.13, the coordinates of the two side edges of the finger corresponding to the fourth bone point are (x - 0.13, y, z) and (x + 0.13, y, z), respectively.
Since the fourth bone point is the skeletal key point closest to the fingertip, the fingertip coordinates are generated from the Unity world coordinates of the fourth bone point. Illustratively, setting the distance from the fourth bone point to the upper edge of the finger to 0.13, the fingertip coordinates are (x, y + 0.13, z).
In some embodiments, the finger contour coordinates are the two side-edge coordinates corresponding to each of the four skeletal key points on the finger plus the fingertip coordinates, nine coordinate points in total. In some embodiments, when the finger image is drawn from the finger contour coordinates with a Bezier curve, the fill transparency of the finger image may be set to a color value of 0.7.
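A minimal Unity C# sketch of assembling the nine contour points from the four bone points, mirroring the worked example above (offsets along the world X and Y axes, width 0.13); a production implementation would offset perpendicular to the finger's actual direction.

```csharp
using UnityEngine;

// A minimal sketch: four left-edge points, the fingertip, and four right-edge
// points, nine contour points in total, ordered so they trace the outline.
public static class FingerContour
{
    public static Vector3[] ContourPoints(Vector3[] bonePoints, float width = 0.13f)
    {
        // bonePoints[0..3]: first to fourth bone point, finger root to tip.
        var contour = new Vector3[9];

        // Left edge, root to tip: (x - width, y, z) for each bone point.
        for (int i = 0; i < 4; i++)
            contour[i] = bonePoints[i] + new Vector3(-width, 0f, 0f);

        // Fingertip, offset above the fourth bone point: (x, y + width, z).
        contour[4] = bonePoints[3] + new Vector3(0f, width, 0f);

        // Right edge, tip back to root: (x + width, y, z) for each bone point.
        for (int i = 0; i < 4; i++)
            contour[5 + i] = bonePoints[3 - i] + new Vector3(width, 0f, 0f);

        return contour;
    }
}
```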
In some embodiments, the method further comprises: when the user moves the preset finger, the finger image in the user interface moves correspondingly in real time so as to select different controls.
In some embodiments, the method further comprises: when the user moves the head, moving the coordinates of the ray control in the same direction, so that the relative position of the ray control within the window remains unchanged. When the user moves the head, the content displayed in the middle of the user interface changes with the direction of head movement; for example, when the user turns the head to the left, the middle of the user interface changes from the originally displayed content to the content to the left of it. Because a head rotation introduces an angular difference between the head and the actual hand, the ray control would otherwise shift, which is uncomfortable for a user watching the user interface. Some embodiments therefore move the ray control coordinates together with the head, so that the relative position of the ray control within the window remains unchanged. The window refers to the entire user interface the user sees through the virtual reality device. In this way, the display position of the ray control stays fixed while the user moves the head. In some embodiments, the position of the ray control may be reset when the virtual reality device is initialized.
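One way to keep the ray control's position in the window unchanged, sketched below in Unity C#, is to parent the control to the head-tracked camera so that it inherits every head movement; this parenting trick is an implementation choice of this sketch, not something the patent prescribes.

```csharp
using UnityEngine;

// A minimal sketch: parenting the ray control to the head-tracked camera
// makes it follow head movement, so its position in the window stays fixed.
public class RayViewportAnchor : MonoBehaviour
{
    public Transform rayControl;

    void Start()
    {
        // Camera.main follows the user's head in a typical VR rig.
        rayControl.SetParent(Camera.main.transform, worldPositionStays: true);
    }

    // Reset the ray control, e.g. when the virtual reality device is initialized.
    public void ResetPosition(Vector3 defaultLocalPosition)
    {
        rayControl.localPosition = defaultLocalPosition;
    }
}
```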
In some embodiments, the method further comprises: after the display state of the control pointed at by the ray control has changed, recognizing a second preset gesture and switching the user interface to the link content corresponding to the pointed-at control. For example, the second preset gesture may be a pinch of the user's thumb and middle finger.
In some embodiments, the method further comprises: after the display state of the control pointed at by the ray control has changed, recognizing a third preset gesture and restoring the control to its original display state. For example, the third preset gesture may be a pinch of the user's thumb and ring finger.
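A minimal Unity C# sketch of recognizing the two pinch gestures by thresholding fingertip distances; the threshold value is an assumption, and real gesture matching (per the tracking pipeline above) would also consider the motion trajectories.

```csharp
using UnityEngine;

// A minimal sketch: the second and third preset gestures read as pinches,
// detected when two fingertips come within an assumed threshold distance.
public class PinchRecognizer : MonoBehaviour
{
    public float pinchThreshold = 0.02f; // meters; assumed value

    // Second preset gesture: thumb-middle pinch (follow the control's link).
    public bool IsThumbMiddlePinch(Vector3 thumbTip, Vector3 middleTip)
    {
        return Vector3.Distance(thumbTip, middleTip) < pinchThreshold;
    }

    // Third preset gesture: thumb-ring pinch (restore the display state).
    public bool IsThumbRingPinch(Vector3 thumbTip, Vector3 ringTip)
    {
        return Vector3.Distance(thumbTip, ringTip) < pinchThreshold;
    }
}
```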
In addition to the embodiments that interact with the virtual reality device using a finger, other embodiments use another item in place of a finger to interact with the virtual reality device.
Another method provided by some embodiments for interacting with a virtual reality device, as shown in fig. 3, includes:
receiving an image captured by a camera; when a preset article is recognized in the image, generating a ray control in the user interface according to the relative position of the recognized preset article; and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
For example, the preset article may be a pen: when the pen is recognized, the ray control can be drawn from the position of the pen and the pointing direction of the pen tip. Of course, these embodiments do not limit the type of the preset article, as long as the intent of the application is not violated.
In some embodiments, when the virtual reality device moves, the hand images are re-acquired for generating the ray control.
In some embodiments, when the virtual reality device moves, the coordinates of the ray control are moved in the second coordinate system according to the three-dimensional movement variables, so that the position of the ray control in the display window remains relatively stable.
In some embodiments, when the virtual reality device moves, the image area to be displayed is determined according to the three-dimensional movement variables, and the image to be displayed is shown at the position of the window control.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a necessary general-purpose hardware platform. In a specific implementation, the present invention also provides a computer storage medium that may store a program; when the computer storage medium is located in a display device, executing the program may include the program steps of the method for interacting with a virtual reality device that the controller 250 is configured to perform. The computer storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention that follow its general principles, including such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains. The specification and examples are to be regarded as illustrative only, with the true scope and spirit of the invention being indicated by the following claims.

Claims (15)

1. A virtual reality device, comprising:
a display for displaying a user interface;
a controller to:
receiving a hand image captured by a camera;
when a preset finger is recognized in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger;
and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
2. The virtual reality device of claim 1, wherein the controller is configured to generate the ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger, when the preset finger is recognized in the hand image, according to the following steps:
identifying a preset finger coordinate set of the preset finger in the hand image in a first three-dimensional coordinate system, wherein the preset finger coordinate set represents the relative positions of the skeletal key points of the preset finger;
determining the position coordinates of the ray control in a second three-dimensional coordinate system according to the preset finger coordinate set;
and generating the ray control in the user interface according to the position coordinates of the ray control.
3. The virtual reality device of claim 2, wherein the controller is configured to determine the position coordinates of the ray control in the second three-dimensional coordinate system according to the preset finger coordinate set by:
converting the preset finger coordinate set in the first three-dimensional coordinate system into a preset finger coordinate set in the second three-dimensional coordinate system;
screening out the coordinates of preset skeletal key points from the preset finger coordinate set in the second three-dimensional coordinate system;
and determining the position coordinates of the ray control in the second three-dimensional coordinate system according to the coordinates of the preset skeletal key points.
4. The virtual reality device of claim 3, wherein the controller is configured to determine the position coordinates of the ray control in the second three-dimensional coordinate system according to the coordinates of the preset skeletal key points according to the following steps:
all the skeletal key points on the preset finger comprise a first bone point, a second bone point, a third bone point, and a fourth bone point arranged in order from the finger root to the fingertip, the preset skeletal key points comprising the third bone point and the fourth bone point;
and determining the position coordinates of the ray control in the second three-dimensional coordinate system such that the ray control lies between the fingertip of the preset finger and the other controls in the user interface, the direction of the ray control being set from the third bone point toward the fourth bone point, where the fingertip coordinates can be determined from the coordinates of the fourth bone point.
5. The virtual reality device of claim 4, wherein the controller is further configured to: determine the coordinates of the finger contour according to the coordinates of all the skeletal key points on the preset finger, the finger contour coordinates being used to draw a finger image in the user interface.
6. The virtual reality device of claim 5, wherein the controller is configured to determine the coordinates of the finger contour based on the coordinates of all the skeletal key points on the preset finger according to the following steps:
determining the coordinates of the two side edges of the finger corresponding to each skeletal key point, using the preset finger width and taking each skeletal key point on the preset finger as a center point;
and determining the finger contour coordinates from the coordinates of the two side edges of the finger and the fingertip coordinates.
7. The virtual reality device of claim 5, wherein the controller is further configured to: when the user moves the preset finger, move the finger image in the user interface correspondingly in real time so as to select different controls.
8. The virtual reality device of claim 1, wherein the controller is configured to change the display state of the control pointed at by the ray control in the user interface according to the following steps:
displaying the control pointed at by the ray control in a highlighted state;
or enlarging the control and displaying it between the hand and the control's original position.
9. The virtual reality device of claim 1, wherein the controller is further configured to: when the user moves the head, move the coordinates of the ray control in the same direction, so that the relative position of the ray control within the window remains unchanged.
10. The virtual reality device of claim 1, wherein the controller, before identifying the preset finger from the hand image, is further configured to:
recognize a first preset gesture from the hand image;
recognize the preset finger when the first preset gesture is recognized;
and not recognize the preset finger from the hand image if the first preset gesture is not recognized from the hand image.
11. The virtual reality device of claim 1, wherein the controller is further configured to: after the display state of the control pointed at by the ray control has changed, recognize a second preset gesture, and switch the user interface to the link content corresponding to the pointed-at control.
12. The virtual reality device of claim 1, wherein the controller is further configured to: after the display state of the control pointed at by the ray control has changed, recognize a third preset gesture, and restore the control to its original display state.
13. A virtual display device, comprising:
a display for displaying a user interface;
a controller to:
receiving an image captured by a camera;
when a preset article is recognized in the image, generating a ray control in the user interface according to the relative position of the recognized preset article;
and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
14. A method of interacting with a virtual reality device, comprising:
receiving a hand image captured by a camera;
when a preset finger is recognized in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the recognized preset finger;
and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
15. A method of interacting with a virtual reality device, comprising:
receiving an image captured by a camera;
when a preset article is recognized in the image, generating a ray control in the user interface according to the relative position of the recognized preset article;
and changing the display state of the control pointed at by the ray control, according to the position of the ray control and the positions of the other controls in the user interface.
CN202011061911.7A 2020-09-30 2020-09-30 Method for interacting with virtual reality equipment and virtual reality equipment Active CN112198962B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011061911.7A CN112198962B (en) 2020-09-30 2020-09-30 Method for interacting with virtual reality equipment and virtual reality equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011061911.7A CN112198962B (en) 2020-09-30 2020-09-30 Method for interacting with virtual reality equipment and virtual reality equipment

Publications (2)

Publication Number Publication Date
CN112198962A (en) 2021-01-08
CN112198962B (en) 2023-04-28

Family

ID=74013569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011061911.7A Active CN112198962B (en) 2020-09-30 2020-09-30 Method for interacting with virtual reality equipment and virtual reality equipment

Country Status (1)

Country Link
CN (1) CN112198962B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817447A (en) * 2021-01-25 2021-05-18 暗物智能科技(广州)有限公司 AR content display method and system
CN113238650A (en) * 2021-04-15 2021-08-10 青岛小鸟看看科技有限公司 Gesture recognition and control method and device and virtual reality equipment
CN113282166A (en) * 2021-05-08 2021-08-20 青岛小鸟看看科技有限公司 Interaction method and device of head-mounted display equipment and head-mounted display equipment
TWI776522B (en) * 2021-03-23 2022-09-01 宏達國際電子股份有限公司 Method for interacting with virtual environment, electronic device, and computer readable storage medium
CN115145395A (en) * 2022-07-01 2022-10-04 江西意孚欧科技有限公司 Virtual reality interaction control method and system and virtual reality equipment
WO2023226578A1 (en) * 2022-05-27 2023-11-30 腾讯科技(深圳)有限公司 Palm contour extraction method and apparatus, and control instruction generation method and apparatus
TWI829467B (en) * 2021-12-20 2024-01-11 宏達國際電子股份有限公司 Method for interacting with virtual world, host, and computer readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080136775A1 (en) * 2006-12-08 2008-06-12 Conant Carson V Virtual input device for computing
CN103971102A (en) * 2014-05-21 2014-08-06 南京大学 Static gesture recognition method based on finger contour and decision-making trees
CN105867599A (en) * 2015-08-17 2016-08-17 乐视致新电子科技(天津)有限公司 Gesture control method and device
CN106325509A (en) * 2016-08-19 2017-01-11 北京暴风魔镜科技有限公司 Three-dimensional gesture recognition method and system
CN107340868A (en) * 2017-07-05 2017-11-10 北京奇艺世纪科技有限公司 A kind of data processing method, device and VR equipment
CN108230383A (en) * 2017-03-29 2018-06-29 北京市商汤科技开发有限公司 Hand three-dimensional data determines method, apparatus and electronic equipment
CN108549490A (en) * 2018-05-03 2018-09-18 林潼 A kind of gesture identification interactive approach based on Leap Motion equipment
CN109196447A (en) * 2016-03-31 2019-01-11 奇跃公司 Use the interaction of posture and more DOF controllers and 3D virtual objects

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080136775A1 (en) * 2006-12-08 2008-06-12 Conant Carson V Virtual input device for computing
CN103971102A (en) * 2014-05-21 2014-08-06 南京大学 Static gesture recognition method based on finger contour and decision-making trees
CN105867599A (en) * 2015-08-17 2016-08-17 乐视致新电子科技(天津)有限公司 Gesture control method and device
CN109196447A (en) * 2016-03-31 2019-01-11 奇跃公司 Use the interaction of posture and more DOF controllers and 3D virtual objects
CN106325509A (en) * 2016-08-19 2017-01-11 北京暴风魔镜科技有限公司 Three-dimensional gesture recognition method and system
CN108230383A (en) * 2017-03-29 2018-06-29 北京市商汤科技开发有限公司 Hand three-dimensional data determines method, apparatus and electronic equipment
CN107340868A (en) * 2017-07-05 2017-11-10 北京奇艺世纪科技有限公司 A kind of data processing method, device and VR equipment
CN108549490A (en) * 2018-05-03 2018-09-18 林潼 A kind of gesture identification interactive approach based on Leap Motion equipment

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817447A (en) * 2021-01-25 2021-05-18 暗物智能科技(广州)有限公司 AR content display method and system
CN112817447B (en) * 2021-01-25 2024-05-07 暗物智能科技(广州)有限公司 AR content display method and system
TWI776522B (en) * 2021-03-23 2022-09-01 宏達國際電子股份有限公司 Method for interacting with virtual environment, electronic device, and computer readable storage medium
CN113238650A (en) * 2021-04-15 2021-08-10 青岛小鸟看看科技有限公司 Gesture recognition and control method and device and virtual reality equipment
US11947729B2 (en) 2021-04-15 2024-04-02 Qingdao Pico Technology Co., Ltd. Gesture recognition method and device, gesture control method and device and virtual reality apparatus
CN113282166A (en) * 2021-05-08 2021-08-20 青岛小鸟看看科技有限公司 Interaction method and device of head-mounted display equipment and head-mounted display equipment
TWI829467B (en) * 2021-12-20 2024-01-11 宏達國際電子股份有限公司 Method for interacting with virtual world, host, and computer readable storage medium
US11954266B2 (en) 2021-12-20 2024-04-09 Htc Corporation Method for interacting with virtual world, host, and computer readable storage medium
WO2023226578A1 (en) * 2022-05-27 2023-11-30 腾讯科技(深圳)有限公司 Palm contour extraction method and apparatus, and control instruction generation method and apparatus
CN115145395A (en) * 2022-07-01 2022-10-04 江西意孚欧科技有限公司 Virtual reality interaction control method and system and virtual reality equipment
CN115145395B (en) * 2022-07-01 2023-12-12 山西极智峰数字科技有限公司 Virtual reality interaction control method and system and virtual reality equipment

Also Published As

Publication number Publication date
CN112198962B (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN112198962B (en) Method for interacting with virtual reality equipment and virtual reality equipment
US20220084279A1 (en) Methods for manipulating objects in an environment
JP6393367B2 (en) Tracking display system, tracking display program, tracking display method, wearable device using them, tracking display program for wearable device, and operation method of wearable device
CN105045398B (en) A kind of virtual reality interactive device based on gesture identification
JP5936155B2 (en) 3D user interface device and 3D operation method
CN110363867B (en) Virtual decorating system, method, device and medium
US20170293364A1 (en) Gesture-based control system
US10514767B2 (en) Information processing apparatus and information processing method
CN110456907A (en) Control method, device, terminal device and the storage medium of virtual screen
KR20140010616A (en) Apparatus and method for processing manipulation of 3d virtual object
JP6165485B2 (en) AR gesture user interface system for mobile terminals
CN108509026B (en) Remote maintenance support system and method based on enhanced interaction mode
US9979946B2 (en) I/O device, I/O program, and I/O method
US9933853B2 (en) Display control device, display control program, and display control method
JPWO2018198910A1 (en) Information processing apparatus, information processing apparatus control method, and program
CN104813258A (en) Data input device
JPWO2014141504A1 (en) 3D user interface device and 3D operation processing method
JP6506443B1 (en) Image generation apparatus and image generation program
KR101654311B1 (en) User motion perception method and apparatus
CN115328304A (en) 2D-3D fused virtual reality interaction method and device
KR20210045898A (en) Methods and apparatus for controlling hand posture-based virtual menus in mixed reality
CN109102571B (en) Virtual image control method, device, equipment and storage medium thereof
JP2019192224A (en) Image generation device and image generation program
CN109993059A (en) Binocular vision and object recognition technique on intelligent electronic device based on single camera
CN111651031A (en) Virtual content display method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant