CN112198962B - Method for interacting with virtual reality equipment and virtual reality equipment


Info

Publication number
CN112198962B
Authority
CN
China
Prior art keywords
finger
preset
point
coordinate system
coordinates
Prior art date
Legal status
Active
Application number
CN202011061911.7A
Other languages
Chinese (zh)
Other versions
CN112198962A (en)
Inventor
杨彬
Current Assignee
Juhaokan Technology Co Ltd
Original Assignee
Juhaokan Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Juhaokan Technology Co Ltd
Priority to CN202011061911.7A
Publication of CN112198962A
Application granted
Publication of CN112198962B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm

Abstract

The invention discloses a method for interacting with a virtual reality device, and a virtual reality device. The method includes: receiving a hand image captured by a camera; when a preset finger is identified in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the identified preset finger; and changing the display state of the control pointed to by the ray control in the user interface according to the position of the ray control and the positions of other controls in the user interface. In some embodiments, the user's finger can thus interact with the virtual reality device, avoiding the inconvenience of the handheld controllers used in the prior art.

Description

Method for interacting with virtual reality equipment and virtual reality equipment
Technical Field
0001. The present invention relates to the field of virtual reality devices, and in particular, to a method for interacting with a virtual reality device and a virtual reality device.
Background
0002. Virtual reality (VR) technology is implemented, at its most basic, by simulating a virtual environment to give the user a sense of immersion. With the continuous development of social productivity and of science and technology, VR technology is increasingly in demand across various industries.
0003. The VR hardware devices on the market are mainly VR headsets, which immerse the wearer in a virtual world. At present, interaction with a VR headset is mainly performed through a handheld controller: the controller is connected to the headset via Bluetooth, information such as the controller's position and key presses is transmitted to the headset, and the headset responds accordingly. The drawback of existing handheld controllers is that the interaction depends on the controller itself; when the controller has no power, the user loses a convenient and reliable interaction device. Moreover, many existing controllers are heavy and bulky, and holding one for a long time becomes a burden. VR is about immersion, and operating through a handheld controller detracts from the user's sense of immersion.
Disclosure of Invention
0004. In order to solve the above technical problems, the present invention provides a method for interacting with a virtual reality device, and a virtual reality device, so as to improve the user experience.
0005. In a first aspect, the present invention provides a virtual reality device, including:
0006. a display for displaying a user interface;
0007. a controller for:
0008. receiving a hand image captured by a camera;
0009. when a preset finger is identified in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the identified preset finger; and
0010. changing the display state of the control pointed to by the ray control in the user interface according to the position of the ray control and the positions of other controls in the user interface.
0011. In a second aspect, the present invention provides a virtual reality device, including:
0012. a display for displaying a user interface;
0013. a controller for:
0014. receiving an image captured by a camera;
0015. when a preset article is identified in the image, generating a ray control in the user interface according to the relative position of the identified preset article; and
0016. changing the display state of the control pointed to by the ray control in the user interface according to the position of the ray control and the positions of other controls in the user interface.
0017. In a third aspect, the present invention provides a method for interacting with a virtual reality device, including:
0018. receiving a hand image captured by a camera;
0019. when a preset finger is identified in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the identified preset finger; and
0020. changing the display state of the control pointed to by the ray control in the user interface according to the position of the ray control and the positions of other controls in the user interface.
0021. In a fourth aspect, the present invention provides a method for interacting with a virtual reality device, including:
0022. receiving an image captured by a camera;
0023. when a preset article is identified in the image, generating a ray control in the user interface according to the relative position of the identified preset article; and
0024. changing the display state of the control pointed to by the ray control in the user interface according to the position of the ray control and the positions of other controls in the user interface.
0025. The technical solution provided by the application includes a method for interacting with a virtual reality device, and a virtual reality device, wherein the method includes: receiving a hand image captured by a camera; when a preset finger is identified in the hand image, generating a ray control in the user interface according to the relative positions of the skeletal key points of the identified preset finger; and changing the display state of the control pointed to by the ray control in the user interface according to the position of the ray control and the positions of other controls in the user interface. In some embodiments, the user's finger can thus interact with the virtual reality device, avoiding the inconvenience of the handheld controllers used in the prior art.
Drawings
0026. In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments are briefly described below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that a person skilled in the art may obtain other drawings from them without inventive effort.
0027. A schematic diagram of skeletal keypoints of a hand is shown schematically in fig. 1;
0028. a flowchart of a method of interacting with a virtual reality device is exemplarily shown in fig. 2;
0029. a schematic of a user interface is shown schematically in fig. 3.
Detailed Description
0030. For purposes of clarity, the embodiments and advantages of the present application are described below clearly and completely with reference to the accompanying drawings of the exemplary embodiments. It is apparent that the exemplary embodiments described are only some, and not all, of the embodiments of the present application.
0031. Based on the exemplary embodiments described herein, all other embodiments that may be obtained by one of ordinary skill in the art without inventive effort fall within the scope of the appended claims. Furthermore, while the disclosure is presented in the context of one or more exemplary embodiments, it should be appreciated that the individual aspects of the disclosure may each constitute a complete embodiment on their own.
0032. It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
0033. The terms "first," "second," "third," and the like in the description, the claims, and the above drawings are used to distinguish between similar objects or entities, and do not necessarily describe a particular sequence or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that, for example, the operations may be performed in sequences other than those illustrated or otherwise described according to some embodiments.
0034. Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
0035. The term "module" as used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
0036. The VR hardware devices on the market are mainly VR headsets, which immerse the wearer in a virtual world. At present, interaction with a VR headset is mainly performed through a handheld controller: the controller is connected to the headset via Bluetooth, information such as the controller's position and key presses is transmitted to the headset, and the headset responds accordingly. The drawback of existing handheld controllers is that the interaction depends on the controller itself; when the controller has no power, the user loses a convenient and reliable interaction device. Moreover, many existing controllers are heavy and bulky, and holding one for a long time becomes a burden. VR is about immersion, and operating through a handheld controller detracts from the user's sense of immersion.
0037. The interaction mode used with the virtual reality device in some embodiments is gesture-based, which avoids the inconvenience that a handheld controller brings to the user.
0038. In some embodiments, the virtual reality device may have a built-in camera or be connected to an external camera. After the virtual reality device is turned on, it displays a user interface and starts the camera at the same time; once started, the camera can perform motion capture of gestures, and the captured hand images are rendered into the 3D user interface of the virtual reality device in real time.
0039. In some embodiments, the virtual reality device is a VR headset device with a binocular camera.
0040. The binocular camera in some embodiments is provided with two cameras. Like a pair of human eyes, the binocular camera can obtain depth information of a photographed object through stereoscopic imaging: just as the world seen by human eyes is three-dimensional, the world captured by the binocular camera is three-dimensional as well. The number of cameras is not limited, as long as depth information of the object can be obtained.
0041. Some embodiments provide a method of interacting with a virtual reality device in which a recognized finger operates in place of a handheld controller: the user interacts with the virtual reality device by finger, triggering the highlighting and other feedback that the controller would otherwise provide. As shown in fig. 2, the method includes:
0042. S100: receiving a hand image captured by a camera.
0043. S200: when a preset finger is identified in the hand image, generating a ray control in the user interface based on the relative positions of the skeletal key points of the identified preset finger.
0044. In some embodiments, hand skeletal point data generated by the camera from the captured hand image may also be received from the camera, and the ray control may be generated in the user interface according to the hand skeletal point data.
0045. In some embodiments, since most people are right-handed, the preset finger may be the right index finger; of course, some embodiments may set the preset finger to the left index finger for the convenience of left-handed users. In addition, to keep the user from tiring of always using the index finger, any finger on the left or right hand can serve as the preset finger.
0046. In some embodiments, a finger the user extends while the other fingers are bent can be used as the preset finger.
0047. In some embodiments, before identifying the preset finger from the hand image, the method further includes: identifying a first preset gesture from the hand image; when the first preset gesture is recognized, recognizing the preset finger; and if the first preset gesture is not recognized from the hand image, not recognizing the preset finger from the hand image. The first preset gesture is that only the preset finger is extended while the other fingers are clenched. In this way, generation of the ray control is triggered by the preset gesture, avoiding meaningless finger recognition.
0048. In some embodiments, the gesture recognition function of the virtual reality device may be turned on or off, and is typically on by default.
0049. In some embodiments, when the user waves a palm twice in front of the virtual reality device, the device turns off the gesture interaction function after recognizing the gesture. When the gesture interaction function is off, waving the palm twice in front of the virtual reality device with the same gesture turns it back on. In other embodiments, the gesture interaction function may also be turned on in the user interface through an auxiliary operation, such as a key press on the virtual reality device.
0050. In some embodiments, when the preset finger is identified from the hand image, the step of generating the ray control in the user interface according to the relative positions of the skeletal key points of the identified preset finger includes:
0051. identifying a preset finger coordinate set of the preset finger in the hand image in a first three-dimensional coordinate system, wherein the preset finger coordinate set represents the relative positions of the skeletal key points of the preset finger;
0052. determining the position coordinates of the ray control in a second three-dimensional coordinate system according to the preset finger coordinate set, where in some embodiments the second three-dimensional coordinate system is the coordinate system of the user interface; and
0053. generating the ray control in the user interface according to the position coordinates of the ray control.
0054. In some embodiments, the step of determining the position coordinates of the ray control in the second three-dimensional coordinate system according to the preset finger coordinate set includes:
0055. converting the preset finger coordinate set in the first three-dimensional coordinate system into a preset finger coordinate set in a second three-dimensional coordinate system;
0056. Illustratively, the coordinates of the first bone point (labeled 5 in fig. 1), the second bone point (labeled 6 in fig. 1), the third bone point (labeled 7 in fig. 1), and the fourth bone point (labeled 8 in fig. 1) in the first three-dimensional coordinate system are the three-dimensional vector coordinates Vector1, Vector2, Vector3, and Vector4, respectively. When the first bone point is set as the origin (0, 0, 0) of the second three-dimensional coordinate system, the coordinates of the second, third, and fourth bone points in the second three-dimensional coordinate system are calculated by the following formulas:
0057. coordinates of the second bone point in the second three-dimensional coordinate system = Vector2 - Vector1 + (0, 0, 0);
0058. coordinates of the third bone point in the second three-dimensional coordinate system = Vector3 - Vector1 + (0, 0, 0);
0059. coordinates of the fourth bone point in the second three-dimensional coordinate system = Vector4 - Vector1 + (0, 0, 0).
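The translation above is a simple per-point offset. The following Unity-style C# sketch illustrates it; the class and parameter names are ours, not the patent's.

```csharp
using UnityEngine;

// Minimal sketch of paragraphs 0057-0059: the first bone point (Vector1)
// becomes the origin of the second coordinate system, and every other point
// keeps its offset relative to it.
public static class FingerCoordinateMapper
{
    public static Vector3[] ToSecondSystem(Vector3[] firstSystemPoints)
    {
        Vector3 origin = Vector3.zero;          // first bone point maps to (0, 0, 0)
        Vector3 vector1 = firstSystemPoints[0]; // first bone point in the first system
        var result = new Vector3[firstSystemPoints.Length];
        for (int i = 0; i < firstSystemPoints.Length; i++)
        {
            // e.g. second bone point: Vector2 - Vector1 + (0, 0, 0)
            result[i] = firstSystemPoints[i] - vector1 + origin;
        }
        return result;
    }
}
```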
0060. In some embodiments, the second three-dimensional coordinate system is a Unity world coordinate system.
0061. screening out the coordinates of the preset skeletal key points from the preset finger coordinate set in the second three-dimensional coordinate system; and
0062. determining the position coordinates of the ray control in the second three-dimensional coordinate system according to the coordinates of the preset skeletal key points.
0063. In some embodiments, the step of determining the position coordinates of the ray control in the second three-dimensional coordinate system according to the coordinates of the preset bone key points includes:
0064. all the skeletal key points on the preset finger include a first bone point, a second bone point, a third bone point, and a fourth bone point, arranged in the direction from the finger root to the fingertip, and the preset skeletal key points include the third bone point and the fourth bone point;
0065. determining the position coordinates of the ray control in the second three-dimensional coordinate system, where the ray control is located between the fingertip of the preset finger and the other controls in the user interface, the direction of the ray control is set as the direction from the third bone point to the fourth bone point, and the fingertip coordinates can be determined from the coordinates of the fourth bone point.
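A minimal sketch of this step in Unity C#, assuming the two bone points are already in the second coordinate system (the builder name is ours):

```csharp
using UnityEngine;

public static class RayControlBuilder
{
    // Start the ray at the fourth bone point (the fingertip side) and aim it
    // in the direction from the third bone point to the fourth, per paragraph 0065.
    public static Ray BuildRay(Vector3 thirdBonePoint, Vector3 fourthBonePoint)
    {
        Vector3 direction = (fourthBonePoint - thirdBonePoint).normalized;
        return new Ray(fourthBonePoint, direction);
    }
}
```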
0066. S300: changing the display state of the control pointed to by the ray control in the user interface according to the position of the ray control and the positions of the other controls in the user interface.
0067. In some embodiments, the other controls in the user interface also have corresponding positions in the second three-dimensional coordinate system. According to the position of the ray control and the positions of the other controls in the user interface, the direction of the ray control may be used to determine which control the ray control points to, and that control is then made to change its display state, as illustrated in fig. 3. In some embodiments, the length of the ray control may change dynamically according to the object it points to. Illustratively, the user interface that the user sees through the virtual reality device in some embodiments further includes a 3D background. When the ray points to a control, its length is the distance from the finger to that control; when it does not point to a control, the ray control can be extended to point at the distant 3D scene.
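One way to realize the pointing test and the dynamic length is a physics raycast against the controls' colliders. The Unity C# sketch below assumes each control carries a collider and that 50 units stands in for the distant 3D background; neither is specified by the patent.

```csharp
using UnityEngine;

public class RayPointer : MonoBehaviour
{
    public LineRenderer lineRenderer;      // renders the visible ray control
    public float backgroundDistance = 50f; // assumed length toward the 3D scene

    public void UpdateRay(Ray ray)
    {
        float length = backgroundDistance;
        if (Physics.Raycast(ray, out RaycastHit hit, backgroundDistance))
        {
            // Pointing at a control: shorten the ray to the finger-to-control
            // distance; the hit control would change its display state here.
            length = hit.distance;
        }
        lineRenderer.positionCount = 2;
        lineRenderer.SetPosition(0, ray.origin);
        lineRenderer.SetPosition(1, ray.origin + ray.direction * length);
    }
}
```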
0068. In some embodiments, taking the second coordinate system as the Unity coordinate system as an example, the processor needs to render in the Unity coordinate system according to the image to be displayed and then perform three-dimensional projection according to the rendered data. Alternatively, when the display includes a left-eye sub-display and a right-eye sub-display, two images of different viewing angles are obtained from the rendered data and displayed on the left-eye and right-eye sub-displays respectively; or, when the display includes only one stereoscopic display, two images of different viewing angles are obtained from the rendered data and displayed on the stereoscopic display in a time-sharing manner, so that the user "sees" a stereoscopic image through the persistence-of-vision effect.
0069. In some embodiments, the ray control also needs to be rendered in the Unity coordinate system, and the three-dimensional projection, or the generation of the images with different viewing angles, begins after the ray rendering is completed.
0070. In some embodiments, the length of the ray control is first determined, and the control to be highlighted is then determined from the straight line along the ray control's length direction and the positions, in the Unity coordinate system, of the controls available for highlighting;
0071. in some embodiments, the length of the ray control is determined from the straight line along the ray control's length direction and its intersection with the image to be displayed in the Unity coordinate system, and the control to be highlighted is determined from that straight line and the positions of the controls available for highlighting.
0072. In some embodiments, the step of changing the display state of the control pointed to by the ray control in the user interface includes:
0073. displaying the control pointed to by the ray control in a highlighted state;
0074. or magnifying the control and displaying it between the hand and the home position where the control is located.
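A sketch of the two display-state options, plus restoring the original state (used later with the third preset gesture); the Renderer-based highlight color and the 1.5x magnification factor are assumptions:

```csharp
using UnityEngine;

public class PointableControl : MonoBehaviour
{
    public Renderer controlRenderer;
    Vector3 homePosition, homeScale;
    Color homeColor;

    void Awake()
    {
        homePosition = transform.position;
        homeScale = transform.localScale;
        homeColor = controlRenderer.material.color;
    }

    // Option 1: display the control in a highlighted state.
    public void Highlight() => controlRenderer.material.color = Color.yellow;

    // Option 2: magnify the control and show it between the hand and its home position.
    public void MagnifyTowardHand(Vector3 handPosition)
    {
        transform.localScale = homeScale * 1.5f;
        transform.position = Vector3.Lerp(homePosition, handPosition, 0.5f);
    }

    // Restore the original display state (e.g. on the third preset gesture).
    public void Restore()
    {
        transform.position = homePosition;
        transform.localScale = homeScale;
        controlRenderer.material.color = homeColor;
    }
}
```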
0075. In some embodiments, when the ray control is generated, a virtual hand control is also generated; the virtual hand control is located at the end of the ray control away from the displayed image. The method further includes controlling generation of the virtual hand control in the user interface according to the identified hand image:
0076. determining a hand coordinate set in a first three-dimensional coordinate system;
0077. converting the hand coordinate set in the first three-dimensional coordinate system into a hand coordinate set in a second three-dimensional coordinate system; and generating a hand control on a user interface according to the hand coordinate set in the second three-dimensional coordinate system.
0078. In some embodiments, after the virtual reality device is turned on, if the hand is within the detection range of the camera, a hand control is displayed in the user interface of the virtual reality device: using hand tracking technology, the real hand position and gesture are captured in real time, and from these data the virtual hand is rendered into the user interface of the virtual reality device in real time.
0079. Rendering of the hand control is performed at the Unity (game engine) layer. The technical implementation flow is as follows: the Android end acquires, from the image captured by the binocular camera, the dot matrix data of the fingers and the skeletal key points of the upper arm and torso, where the dot matrix data include the skeletal key point names and their (X, Y, Z) coordinates; the data acquired by the Android end are transmitted to Unity as Json strings, one frame of image data at a time; and Unity processes the data transmitted by the Android end and corrects the transmitted (X, Y, Z) coordinates into the Unity world coordinate system. The correction takes the center point of the overlapping region of the images captured by the binocular camera as the center point of the Unity world coordinate system: if a hand key point is offset from the center of the overlapping binocular image, the offset from the binocular camera's center point is mapped to the corresponding offset from the center point of the Unity world coordinate system, so that the (X, Y, Z) coordinates transmitted from the binocular camera can be converted into the Unity world coordinate system.
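The correction itself is an offset mapping. A minimal sketch, under the assumption that camera coordinates and Unity world units share a scale (the patent does not state one):

```csharp
using UnityEngine;

public static class CameraToWorld
{
    // Map an (X, Y, Z) coordinate from the binocular camera into the Unity
    // world coordinate system: the center of the overlapping region of the
    // two camera images becomes the Unity world origin, and the keypoint's
    // offset from that center is carried over unchanged.
    public static Vector3 Correct(Vector3 cameraCoord, Vector3 overlapCenter,
                                  float unitsPerCameraUnit = 1f) // assumed scale
    {
        Vector3 offset = cameraCoord - overlapCenter;
        return offset * unitsPerCameraUnit;
    }
}
```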
0080. After acquiring the key point coordinates transmitted by the Android end in real time and converting them into the Unity world coordinate system, the Unity end draws the hand according to the skeletal key point coordinates. Each palm includes 21 skeletal key points, as shown in fig. 1. The hand contour points are determined using the hand skeletal key points and the hand thickness, where the hand thickness refers to the width of a finger or of the palm, and the hand contour is drawn using a Unity line renderer (LineRenderer). The hand contour points are points located on both sides of the hand skeletal key points at a distance equal to the hand thickness. The hand is then color-filled according to the contour. In some embodiments, the transparency of the hand fill color is set to 70%, completing the hand drawing at the Unity end.
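A sketch of the contour step with Unity's LineRenderer. The fixed lateral direction and the traversal order over the keypoints are simplifications (real hand keypoints span five fingers), and the fill color is an assumption apart from its 70% transparency:

```csharp
using UnityEngine;

public class HandOutline : MonoBehaviour
{
    public LineRenderer lineRenderer;
    public float thickness = 0.13f; // hand thickness: width of a finger or the palm

    public void Draw(Vector3[] keypoints) // e.g. the 21 skeletal key points
    {
        var contour = new Vector3[keypoints.Length * 2];
        for (int i = 0; i < keypoints.Length; i++)
        {
            // Contour points sit on both sides of each keypoint, offset by the
            // hand thickness; a fixed lateral axis is assumed here.
            Vector3 side = Vector3.right * thickness;
            contour[i] = keypoints[i] - side;                      // one edge
            contour[contour.Length - 1 - i] = keypoints[i] + side; // other edge
        }
        lineRenderer.loop = true;
        lineRenderer.positionCount = contour.Length;
        lineRenderer.SetPositions(contour);
        // Fill color with 70% transparency, per the description above.
        lineRenderer.startColor = new Color(1f, 0.8f, 0.6f, 0.7f);
        lineRenderer.endColor = lineRenderer.startColor;
    }
}
```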
0081. After the hand drawing at the Unity end is finished, the real hand motions of a user wearing the virtual reality device can be displayed in the user interface of the virtual reality device in real time, the user's gestures can be recognized, and specific operations can be responded to. The gesture recognition technique includes the following steps:
0082. Gesture recognition is implemented based on the binocular camera of the virtual reality device. Left and right visual images of the user's gesture are acquired from the binocular camera and stereo matching is performed to obtain a disparity map; triangulation is then performed using the camera's intrinsic and extrinsic parameters to obtain a depth image. The visual image is segmented using a gesture segmentation algorithm to separate the human hand from the background environment, extracting the hand. The hand is tracked using a gesture tracking algorithm, and the hand position information is acquired in real time. The joint points of the hand can be obtained through a hand recognition algorithm. Based on the acquired hand information and position information, the hands and upper limbs can be drawn in real time. According to the gesture tracking algorithm and the motion trajectories of the hand joint points, gestures are recognized and matched, for example against a scissor-hand gesture, a two-finger pinch gesture, and the like.
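The triangulation step is only named here; for a rectified stereo pair it reduces to the standard relation Z = f·B/d between focal length, baseline, and disparity, which the following sketch assumes:

```csharp
public static class StereoDepth
{
    // Depth of a matched point from its disparity, assuming a rectified pair:
    // Z = f * B / d (focal length in pixels, baseline in meters, disparity in pixels).
    public static float DepthFromDisparity(float focalLengthPx, float baselineM,
                                           float disparityPx)
    {
        if (disparityPx <= 0f) return float.PositiveInfinity; // no match / at infinity
        return focalLengthPx * baselineM / disparityPx;
    }
}
```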
0083. In some embodiments, the method further comprises: and determining finger contour coordinates according to coordinates of all skeleton key points on a preset finger, wherein the finger contour coordinates are used for drawing a finger image on a user interface.
0084. In some embodiments, the step of determining the coordinates of the outline of the finger according to the coordinates of all skeletal key points on the preset finger includes:
0085. taking each skeletal key point on the preset finger as a center point, determining the two side-edge coordinates of the finger corresponding to each skeletal key point using the preset finger width; and
0086. determining the finger contour coordinates using the side-edge coordinates of the finger and the fingertip coordinates.
0087. For example, if the Unity world coordinates corresponding to the fourth bone point are (x, y, z) and the preset finger width is 0.13, the side-edge coordinates of the finger corresponding to the fourth bone point are (x-0.13, y, z) and (x+0.13, y, z), respectively.
0088. Since the fourth bone point is the skeletal key point closest to the fingertip, the fingertip coordinates are generated from the Unity world coordinates of the fourth bone point. Illustratively, if the distance from the fourth bone point to the upper edge of the finger is set to 0.13, the fingertip coordinates are (x, y+0.13, z).
0089. Some embodiments determine the finger contour coordinates from the side-edge coordinates corresponding to each of the four skeletal key points on the finger together with the fingertip coordinates, nine coordinate points in total. In some embodiments, when the finger image is drawn with Bezier curves according to the finger contour coordinates, the fill transparency of the finger image may be set to a color value of 0.7.
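Putting paragraphs 0085-0089 together, a sketch that produces the nine contour points (the ordering of the returned points is our choice):

```csharp
using UnityEngine;

public static class FingerContour
{
    const float Width = 0.13f; // preset finger width / tip distance from 0087-0088

    // bonePoints: the four bone points in Unity world coordinates, root to tip.
    public static Vector3[] ContourPoints(Vector3[] bonePoints)
    {
        var contour = new Vector3[9];
        for (int i = 0; i < 4; i++)
        {
            contour[i] = bonePoints[i] + new Vector3(-Width, 0f, 0f);    // one side edge
            contour[8 - i] = bonePoints[i] + new Vector3(Width, 0f, 0f); // other side edge
        }
        // Fingertip: the fourth bone point shifted toward the upper finger edge.
        contour[4] = bonePoints[3] + new Vector3(0f, Width, 0f);
        return contour; // nine points: up one side, over the tip, down the other
    }
}
```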
0090. In some embodiments, the method further includes: when the user moves the preset finger, the finger image on the user interface moves correspondingly in real time, so that different controls can be selected.
0091. In some embodiments, the method further includes: when the user moves the head, moving the coordinates of the ray control in the same direction according to the head movement, so that the relative position of the ray control in the window remains unchanged. When the user moves the head, the content displayed at the middle of the user interface may change according to the direction of head movement; illustratively, when the user turns the head to the left, the middle of the user interface changes from the originally displayed content to the content located to its left. If the ray control did not follow, the head rotation would create an angular difference between the ray control and the actual hand, making the user interface uncomfortable to watch. Some embodiments therefore move the coordinates of the ray control in the same direction as the user's head so that its relative position in the window remains unchanged, where the window refers to the entire user interface the user sees through the virtual reality device. Thus, when the user moves the head, the display position of the ray control remains unchanged. In some embodiments, the position of the ray control may be reset when the virtual reality device is initialized.
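One straightforward realization is to record the ray control's pose relative to the head camera once and re-apply it every frame, so the control follows the head and keeps its place in the window. The patent states only that the coordinates move in the same direction as the head, so the rig-relative anchoring below is an assumption:

```csharp
using UnityEngine;

public class RayAnchoredToHead : MonoBehaviour
{
    public Transform headCamera; // the headset's camera rig
    Vector3 localOffset;
    Quaternion localRotation;

    void Start()
    {
        // Record the ray control's pose relative to the head once
        // (this could also be reset when the device is initialized).
        localOffset = headCamera.InverseTransformPoint(transform.position);
        localRotation = Quaternion.Inverse(headCamera.rotation) * transform.rotation;
    }

    void LateUpdate()
    {
        // Re-apply the same relative pose every frame, following the head,
        // so the ray control's position in the window stays unchanged.
        transform.position = headCamera.TransformPoint(localOffset);
        transform.rotation = headCamera.rotation * localRotation;
    }
}
```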
0092. In some embodiments, the method further includes: after the display state of the control pointed to by the ray control in the user interface is changed, recognizing a second preset gesture, and switching the user interface according to the link content corresponding to the pointed-to control. For example, the second preset gesture may be a pinching motion of the user's thumb and middle finger.
0093. In some embodiments, the method further includes: after the display state of the control pointed to by the ray control in the user interface is changed, recognizing a third preset gesture, and controlling the control to restore its original display state. For example, the third preset gesture may be a pinching motion of the user's thumb and ring finger.
0094. Besides the finger used to interact with the virtual reality device in the embodiments above, other embodiments may use other articles in place of the finger to interact with the virtual reality device.
0095. Another method for interacting with a virtual reality device provided by some embodiments, as shown in fig. 3, includes:
0096. receiving an image captured by a camera; when a preset article is identified in the image, generating a ray control in the user interface according to the relative position of the identified preset article; and changing the display state of the control pointed to by the ray control in the user interface according to the position of the ray control and the positions of other controls in the user interface.
0097. The preset article may be a pen: when the pen is identified, the ray control may be drawn by determining the position of the pen and the pointing direction of the pen tip. Some embodiments do not limit the type of the preset article, as long as the intent of the application is not violated.
0098. In some embodiments, when the virtual reality device moves, the hand image is reacquired to generate the ray control.
0099. In some embodiments, when the virtual reality device moves, the coordinates of the ray control are moved in the second coordinate system according to the three-dimensional movement variables, so that the position of the ray control in the display window remains relatively stable.
0100. In some embodiments, when the virtual reality device moves, the image area to be displayed is determined according to the three-dimensional movement variables, and the image to be displayed is shown at the position of the window control.
0101. It will be apparent to those skilled in the art that the techniques of the embodiments of the present invention may be implemented by software plus a necessary general-purpose hardware platform. In a specific implementation, the present invention also provides a computer storage medium that may store a program; when executed, the program may include the program steps of the method for interacting with a virtual reality device that the controller 250 is configured to perform. The computer storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
0102. Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention that follow its general principles, including such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.

Claims (10)

1. A virtual reality device, comprising:
a display for displaying a user interface;
a controller for:
receiving a hand image shot by a camera;
when a preset finger is identified from a hand image, identifying a preset finger coordinate set of the preset finger in a first three-dimensional coordinate system in the hand image, wherein the preset finger coordinate set represents the relative positions of the skeletal key points of the preset finger, and the preset finger coordinate set comprises coordinates of a first skeleton point, a second skeleton point, a third skeleton point and a fourth skeleton point which are arranged in the direction from the finger root to the fingertip, wherein the fourth skeleton point indicates the fingertip of the preset finger, the third skeleton point indicates the joint point closest to the fingertip, and the first skeleton point indicates the root of the preset finger;
converting a preset finger coordinate set in the first three-dimensional coordinate system into a preset finger coordinate set in a second three-dimensional coordinate system, wherein the origin of coordinates of the second three-dimensional coordinate system is the first bone point, the second three-dimensional coordinate system is different from the first three-dimensional coordinate system, and the user interface is positioned in the second three-dimensional coordinate system;
taking the coordinates of the fourth bone point in the second three-dimensional coordinate system as a starting point, drawing a visible ray control according to the direction of the third bone point pointing to the fourth bone point in the second three-dimensional coordinate system as a ray direction, and displaying the ray control on the user interface;
when the ray control points to other controls in the user interface, changing the display state of the other controls.
2. The virtual reality device of claim 1, wherein the controller is further configured to: and determining finger contour coordinates according to coordinates of all skeletal key points on the preset finger in the second three-dimensional coordinate system, wherein the finger contour coordinates are used for drawing a finger image on the user interface.
3. The virtual reality device of claim 2, wherein the controller performs determining finger profile coordinates from coordinates of all skeletal keypoints on a preset finger according to the following steps:
taking each skeleton key point on a preset finger as a central point, and determining the coordinates of the two sides of the finger corresponding to each skeleton key point by utilizing the width of the preset finger;
and determining the finger contour coordinates by utilizing the edge coordinates of the two sides of the finger and the fingertip coordinates.
4. The virtual reality device of claim 1, wherein the controller is further configured to:
when the hand of the user is identified to be moved, determining a preset finger coordinate set of the hand after the movement in the original second three-dimensional coordinate system; and drawing a new visible ray control by taking the coordinates of a fourth skeleton point in the preset finger coordinate set of the hand after movement as a starting point and taking the direction of the third skeleton point pointing to the fourth skeleton point as a ray direction so as to select different other controls.
5. The virtual reality device of claim 1, wherein the controller is configured to perform changing a display state of a control pointed to by the ray control in a user interface according to:
displaying a control pointed by the ray control in a user interface as a highlight state;
or magnifying the control and displaying the control between the hand and the home position where the control is located.
6. The virtual reality device of claim 1, wherein the controller is further configured to: when the user moves the head, the coordinates of the ray control are moved in the same direction according to the movement of the head, so that the relative position of the ray control in the window is kept unchanged.
7. The virtual reality device of claim 1, wherein the controller is further configured to, prior to identifying the preset finger from the hand image:
identifying a first preset gesture from the hand image;
when the first preset gesture is recognized, recognizing a preset finger;
and if the first preset gesture is not recognized from the hand image, not recognizing the preset finger from the hand image.
8. The virtual reality device of claim 1, wherein the controller is further configured to: after the display state of the control pointed by the ray control in the user interface is changed, a second preset gesture is recognized, and switching of the user interface is performed according to the link content corresponding to the pointed control.
9. The virtual reality device of claim 1, wherein the controller is further configured to: and after the display state of the control pointed by the ray control in the user interface is changed, recognizing a third preset gesture, and controlling the control to restore to the original display state.
10. A method of interacting with a virtual reality device, comprising:
receiving a hand image shot by a camera;
when a preset finger is identified from a hand image, identifying a preset finger coordinate set of the preset finger in a first three-dimensional coordinate system in the hand image, wherein the preset finger coordinate set represents the relative positions of the skeletal key points of the preset finger, and the preset finger coordinate set comprises coordinates of a first skeleton point, a second skeleton point, a third skeleton point and a fourth skeleton point which are arranged in the direction from the finger root to the fingertip, wherein the fourth skeleton point indicates the fingertip of the preset finger, the third skeleton point indicates the joint point closest to the fingertip, and the first skeleton point indicates the root of the preset finger;
converting the preset finger coordinate set in the first three-dimensional coordinate system into a preset finger coordinate set in a second three-dimensional coordinate system, wherein the origin of coordinates of the second three-dimensional coordinate system is the first bone point, the second three-dimensional coordinate system is different from the first three-dimensional coordinate system, and the user interface is positioned in the second three-dimensional coordinate system;
taking the coordinates of the fourth bone point in the second three-dimensional coordinate system as a starting point, drawing a visible ray control according to the direction of the third bone point pointing to the fourth bone point in the second three-dimensional coordinate system as a ray direction, and displaying the ray control on the user interface;
when the ray control points to other controls in the user interface, changing the display state of the other controls.
CN202011061911.7A 2020-09-30 2020-09-30 Method for interacting with virtual reality equipment and virtual reality equipment Active CN112198962B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011061911.7A CN112198962B (en) 2020-09-30 2020-09-30 Method for interacting with virtual reality equipment and virtual reality equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011061911.7A CN112198962B (en) 2020-09-30 2020-09-30 Method for interacting with virtual reality equipment and virtual reality equipment

Publications (2)

Publication Number Publication Date
CN112198962A CN112198962A (en) 2021-01-08
CN112198962B (en) 2023-04-28

Family

ID=74013569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011061911.7A Active CN112198962B (en) 2020-09-30 2020-09-30 Method for interacting with virtual reality equipment and virtual reality equipment

Country Status (1)

Country Link
CN (1) CN112198962B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112817447A (en) * 2021-01-25 2021-05-18 暗物智能科技(广州)有限公司 AR content display method and system
US20220308659A1 (en) * 2021-03-23 2022-09-29 Htc Corporation Method for interacting with virtual environment, electronic device, and computer readable storage medium
CN113238650B (en) * 2021-04-15 2023-04-07 青岛小鸟看看科技有限公司 Gesture recognition and control method and device and virtual reality equipment
CN113282166A (en) * 2021-05-08 2021-08-20 青岛小鸟看看科技有限公司 Interaction method and device of head-mounted display equipment and head-mounted display equipment
US11954266B2 (en) * 2021-12-20 2024-04-09 Htc Corporation Method for interacting with virtual world, host, and computer readable storage medium
CN117173734A (en) * 2022-05-27 2023-12-05 腾讯科技(深圳)有限公司 Palm contour extraction and control instruction generation method and device and computer equipment
CN115145395B (en) * 2022-07-01 2023-12-12 山西极智峰数字科技有限公司 Virtual reality interaction control method and system and virtual reality equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103971102A (en) * 2014-05-21 2014-08-06 南京大学 Static gesture recognition method based on finger contour and decision-making trees
CN105867599A (en) * 2015-08-17 2016-08-17 乐视致新电子科技(天津)有限公司 Gesture control method and device
CN106325509A (en) * 2016-08-19 2017-01-11 北京暴风魔镜科技有限公司 Three-dimensional gesture recognition method and system
CN107340868A (en) * 2017-07-05 2017-11-10 北京奇艺世纪科技有限公司 A kind of data processing method, device and VR equipment
CN108230383A (en) * 2017-03-29 2018-06-29 北京市商汤科技开发有限公司 Hand three-dimensional data determines method, apparatus and electronic equipment
CN108549490A (en) * 2018-05-03 2018-09-18 林潼 A kind of gesture identification interactive approach based on Leap Motion equipment
CN109196447A (en) * 2016-03-31 2019-01-11 奇跃公司 Use the interaction of posture and more DOF controllers and 3D virtual objects

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080136775A1 (en) * 2006-12-08 2008-06-12 Conant Carson V Virtual input device for computing


Also Published As

Publication number Publication date
CN112198962A (en) 2021-01-08

Similar Documents

Publication Publication Date Title
CN112198962B (en) Method for interacting with virtual reality equipment and virtual reality equipment
US20210177124A1 (en) Information processing apparatus, information processing method, and computer-readable storage medium
US20220084279A1 (en) Methods for manipulating objects in an environment
CN105045398B (en) A kind of virtual reality interactive device based on gesture identification
US10095030B2 (en) Shape recognition device, shape recognition program, and shape recognition method
US11947729B2 (en) Gesture recognition method and device, gesture control method and device and virtual reality apparatus
CN103793060B (en) A kind of user interactive system and method
JP5936155B2 (en) 3D user interface device and 3D operation method
JP6057396B2 (en) 3D user interface device and 3D operation processing method
CN110363867B (en) Virtual decorating system, method, device and medium
CN110456907A (en) Control method, device, terminal device and the storage medium of virtual screen
CN102779000B (en) User interaction system and method
CN109144252B (en) Object determination method, device, equipment and storage medium
CN107357434A (en) Information input equipment, system and method under a kind of reality environment
US20240104813A1 (en) Techniques for enabling drawing in a computer-generated reality environment
JP6506443B1 (en) Image generation apparatus and image generation program
US10171800B2 (en) Input/output device, input/output program, and input/output method that provide visual recognition of object to add a sense of distance
KR101719927B1 (en) Real-time make up mirror simulation apparatus using leap motion
CN109102571B (en) Virtual image control method, device, equipment and storage medium thereof
CN106502401A (en) A kind of display control method and device
CN110717993B (en) Interaction method, system and medium of split type AR glasses system
CN112613374A (en) Face visible region analyzing and segmenting method, face making-up method and mobile terminal
CN117369649B (en) Virtual reality interaction system and method based on proprioception
KR102612430B1 (en) System for deep learning-based user hand gesture recognition using transfer learning and providing virtual reality contents
CN117435055A (en) Man-machine interaction method for gesture enhanced eyeball tracking based on spatial stereoscopic display

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant