CN113703571B - Virtual reality man-machine interaction method, device, equipment and medium - Google Patents


Info

Publication number
CN113703571B
CN113703571B (application CN202110972624.XA)
Authority
CN
China
Prior art keywords
virtual
virtual object
human hand
field
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110972624.XA
Other languages
Chinese (zh)
Other versions
CN113703571A (en)
Inventor
梁枫
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Priority to CN202110972624.XA
Publication of CN113703571A
Application granted
Publication of CN113703571B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a method, a device, equipment and a medium for virtual reality man-machine interaction. The method comprises the following steps: screening out the virtual object actually interacting with the human hand; acquiring the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object; determining the moving speed of a virtual cursor arranged in the virtual field of view according to those positions and the content of the virtual object; if the virtual object is in the far field region, acquiring the distance between the virtual object and the human hand in the virtual field of view from their respective positions, and determining the moving speed of the virtual cursor according to that distance and a first preset relation, the first preset relation being set in correspondence with the content of the virtual object. The far field region is the region the user's hand cannot reach even with the arm fully extended. The method thereby lets a person control a virtual object accurately while avoiding user fatigue.

Description

Virtual reality man-machine interaction method, device, equipment and medium
Technical Field
The embodiment of the invention relates to the technical field of virtual reality, in particular to a method, a device, equipment and a medium for virtual reality man-machine interaction.
Background
Currently, user interaction with virtual objects in AR/VR/MR devices is done mainly in two ways. In the first, the device identifies and tracks the user's hands and reconstructs them at the corresponding virtual positions so that the user can interact with virtual objects directly; however, the interaction range is limited to arm's reach, and the user must frequently move the hands in a tiring posture (the outstretched grasping posture shown in fig. 7), so the user fatigues easily and cannot use the device for long. In the second, the user wears a handle that emits a virtual light ray; the user sees where the ray points and presses a button to interact with a distant object. In this way the types and number of interactions are limited to key combinations, and the approach is ill-suited to close-range interaction; moreover, the light spot is large in the near field and small in the far field, making it hard for the user to identify. Furthermore, the farther the object, the faster the spot moves for the same hand motion (if the object is 2 meters from the hand and the hand turns 10 degrees, the spot moves about 34 cm; at 4 meters the same motion moves it about 68 cm), so when interacting with objects through a virtual light ray, control is less accurate, interactivity is poorer, the learning cost is higher and the efficiency lower.
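The spot amplification described above follows from the arc-length relation s = r·θ. A quick numerical check of the quoted figures (an illustrative sketch, not part of the patent):

```python
import math

def spot_displacement(distance_m: float, angle_deg: float) -> float:
    """Arc length swept by a ray-cast light spot when the hand rotates
    through angle_deg while pointing at an object distance_m away (s = r*theta)."""
    return distance_m * math.radians(angle_deg)

# A 10-degree turn at 2 m moves the spot ~0.35 m; at 4 m, ~0.70 m --
# roughly the 34 cm / 68 cm figures quoted above.
print(round(spot_displacement(2.0, 10.0), 2))
print(round(spot_displacement(4.0, 10.0), 2))
```

Doubling the distance doubles the displacement for the same hand rotation, which is exactly the loss of precision the background attributes to ray-based interaction.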
Disclosure of Invention
The invention provides a method, a device, equipment and a medium for virtual reality man-machine interaction, which enable a person to accurately control a virtual object in the virtual world while avoiding user fatigue during that precise control.
In order to achieve the above object, an embodiment of a first aspect of the present invention provides a method for virtual reality human-computer interaction, including the following steps:
screening out the virtual object actually interacting with a human hand;
acquiring the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
determining the moving speed of a virtual cursor arranged in the virtual field of view according to the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
if the virtual object is in the far field region, acquiring the distance between the virtual object and the human hand in the virtual field of view according to their respective positions;
determining and setting the moving speed of the virtual cursor according to the distance and a first preset relation, the first preset relation being set in correspondence with the content of the virtual object;
wherein the far field region is the region the user's hand cannot reach even with the arm fully extended.
According to an embodiment of the present invention, determining the moving speed of the virtual cursor arranged in the virtual field of view according to the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object includes:
if the human hand is in the comfort zone and the virtual object is in the comfort zone, the moving speed is the same as the moving speed of the human hand;
if the human hand is in the comfort zone and the virtual object is in the near field region, acquiring the distance between the virtual object and the human hand in the virtual field of view according to their respective positions, and determining the moving speed of the virtual cursor according to the distance and a second preset relation, the second preset relation being set in correspondence with the content of the virtual object;
if the human hand is in the near field region and the virtual object is in the comfort zone or the near field region, the moving speed is the same as the moving speed of the human hand;
wherein the comfort zone is the region swept by the palm circling with the elbow as the fulcrum, and the near field region is the region swept by the palm with the arm fully extended, excluding the comfort zone.
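The speed rules above can be sketched as a small dispatch function (an illustrative sketch; the zone names and the `relation` callback are shorthand standing in for the content-specific first/second preset relations):

```python
def cursor_speed(hand_zone, object_zone, hand_speed, distance, relation):
    """Pick the cursor speed from the (hand zone, object zone) pair.
    relation(distance) stands in for the content-specific preset relation;
    hand_speed is used when the cursor tracks the hand 1:1."""
    if object_zone == "far":
        return relation(distance)          # first preset relation
    if hand_zone == "comfort" and object_zone == "near":
        return relation(distance)          # second preset relation
    return hand_speed                      # cursor mirrors the hand
```

With, say, `relation = lambda d: 2 * d`, a hand in the comfort zone reaching for a near-field object gets a distance-scaled cursor speed, while objects at hand are tracked at the hand's own speed.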
According to one embodiment of the present invention, the method for virtual reality human-computer interaction further includes: determining whether the virtual cursor arranged in the virtual field of view is displayed according to the positions of the virtual object and the human hand in the virtual field of view;
if the human hand is in the comfort zone and the virtual object is in the comfort zone, the virtual cursor is not displayed;
if the human hand is in the comfort zone and the virtual object is in the near field region or the far field region, the virtual cursor is displayed;
if the human hand is in the near field region and the virtual object is in the comfort zone or the near field region, the virtual cursor is not displayed;
if the human hand is in the near field region and the virtual object is in the far field region, the virtual cursor is displayed;
wherein the comfort zone is the region swept by the palm circling with the elbow as the fulcrum, and the near field region is the region swept by the palm with the arm fully extended, excluding the comfort zone.
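The display rules above reduce to a small predicate (an illustrative sketch; the zone names are shorthand for the regions defined above):

```python
def cursor_visible(hand_zone: str, object_zone: str) -> bool:
    """Show the virtual cursor only when the object lies beyond the
    hand's current reach; otherwise the real hand interacts directly."""
    if object_zone == "far":
        return True                                   # never reachable
    return hand_zone == "comfort" and object_zone == "near"
```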
According to an embodiment of the present invention, before determining whether the virtual cursor is displayed according to the positions of the virtual object and the human hand in the virtual field of view, the method further includes:
determining whether the virtual cursor arranged in the virtual field of view is displayed according to the content of the virtual object;
if the content of the virtual object produces a feedback action with the human hand, the virtual cursor is displayed; if there is no feedback action, the step of determining whether the virtual cursor is displayed according to the positions of the virtual object and the human hand in the virtual field of view is performed.
According to one embodiment of the invention, the virtual cursor displayed in the virtual field of view is kept at a first preset distance from the virtual object.
According to one embodiment of the present invention, when the distance between the virtual cursor and the virtual object becomes smaller than the first preset distance, the virtual cursor snaps onto (is adsorbed by) the virtual object.
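The adsorption behavior can be sketched as a simple position clamp (an illustrative sketch; the standoff and snap distances are the "first preset distance" of the text, chosen arbitrarily here):

```python
import math

def place_cursor(cursor_pos, object_pos, snap_dist):
    """Snap the cursor onto the object once it comes within snap_dist
    (the first preset distance); otherwise leave it where it is, letting
    the caller hold it at the standoff distance from the object."""
    if math.dist(cursor_pos, object_pos) < snap_dist:
        return object_pos
    return cursor_pos
```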
According to one embodiment of the present invention, the method for virtual reality human-computer interaction further includes:
acquiring the gesture of the human hand;
and adjusting one or more of the size, the moving speed and the shape of the virtual cursor displayed in the virtual field of view according to the gesture of the human hand.
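The gesture-driven adjustment could be a simple lookup (a sketch; the gesture names and effect values below are hypothetical — the patent only names the adjustable properties: size, moving speed, and shape):

```python
# Hypothetical gesture-to-cursor adjustments.
GESTURE_EFFECTS = {
    "pinch":     {"size": 0.5},                 # shrink for fine work
    "open_palm": {"size": 1.5, "speed": 1.2},   # enlarge and speed up
    "point":     {"shape": "arrow"},
}

def apply_gesture(cursor: dict, gesture: str) -> dict:
    """Return a new cursor state with the gesture's tweaks merged in;
    unknown gestures leave the cursor unchanged."""
    return {**cursor, **GESTURE_EFFECTS.get(gesture, {})}
```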
According to one embodiment of the present invention, screening out the virtual object actually interacting with the human hand includes:
acquiring the positions of a plurality of virtual objects and of the human hand in the virtual field of view;
screening the virtual objects that may interact with the human hand according to the position of each virtual object and of the human hand in the virtual field of view, and placing them in an interaction list;
screening out the virtual object actually interacting with the human hand from the interaction list.
According to one embodiment of the present invention, screening the virtual objects that may interact with the human hand according to the position of each virtual object and of the human hand in the virtual field of view and placing them in the interaction list comprises:
if the human hand is in the comfort zone and the virtual object is in the comfort zone, placing the virtual object in the interaction list;
if the human hand is in the comfort zone and the virtual object is in the near field region or the far field region, emitting a virtual ray; if the virtual ray passes within a second preset distance of the virtual object, placing the virtual object in the interaction list;
if the human hand is in the near field region and the virtual object is in the near field region, placing the virtual object in the interaction list;
if the human hand is in the near field region and the virtual object is in the far field region, emitting a virtual ray; if the virtual ray passes within the second preset distance of the virtual object, placing the virtual object in the interaction list;
wherein the comfort zone is the region swept by the palm circling with the elbow as the fulcrum, and the near field region is the region swept by the palm with the arm fully extended, excluding the comfort zone.
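The admission rules above can be sketched per object (an illustrative sketch; `ray_within_range` models the "virtual ray within the second preset distance" condition, and combinations the text does not enumerate default to exclusion here — an assumption):

```python
def admit_to_list(hand_zone, object_zone, ray_within_range):
    """Decide whether one virtual object joins the interaction list."""
    if hand_zone == "comfort" and object_zone == "comfort":
        return True                       # directly reachable
    if hand_zone == "comfort" and object_zone in ("near", "far"):
        return ray_within_range           # needs the cast ray
    if hand_zone == "near" and object_zone == "near":
        return True                       # reachable with the arm extended
    if hand_zone == "near" and object_zone == "far":
        return ray_within_range           # needs the cast ray
    return False                          # combinations not enumerated
```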
According to one embodiment of the invention, the virtual cursor is a 3D model of the palm.
To achieve the above object, an embodiment of a second aspect of the present invention provides a device for virtual reality human-computer interaction, including:
a screening module, for screening out the virtual object actually interacting with the human hand;
an acquisition module, for acquiring the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
a moving speed determining module, for determining the moving speed of a virtual cursor arranged in the virtual field of view according to the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
wherein, if the virtual object is in the far field region, the module acquires the distance between the virtual object and the human hand in the virtual field of view according to their respective positions,
and determines the moving speed of the virtual cursor according to the distance and a first preset relation, the first preset relation being set in correspondence with the content of the virtual object; the far field region is the region the user's hand cannot reach even with the arm fully extended.
To achieve the above object, an embodiment of a third aspect of the present invention provides an electronic device for virtual reality human-computer interaction, the electronic device including:
one or more processors;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of virtual reality human-machine interaction as described previously.
To achieve the above object, a fourth aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method of virtual reality human-machine interaction as described above.
The method, device, equipment and medium for virtual reality man-machine interaction provided by the embodiments of the invention operate as follows: screen out the virtual object actually interacting with the human hand; acquire the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object; determine the moving speed of a virtual cursor arranged in the virtual field of view according to those positions and the content of the virtual object; if the virtual object is in the far field region, acquire the distance between the virtual object and the human hand in the virtual field of view from their respective positions, and determine the moving speed of the virtual cursor according to the distance and a first preset relation, the first preset relation being set in correspondence with the content of the virtual object; the far field region is the region the user's hand cannot reach even with the arm fully extended. In this way a person can precisely control a virtual object in the virtual world without fatiguing.
Drawings
FIG. 1 is a flow chart of a method of virtual reality human-machine interaction according to an embodiment of the invention;
FIG. 2 is a flow chart of a method of virtual reality human-machine interaction according to one embodiment of the invention;
FIG. 3 is a flow chart of a method of virtual reality human-machine interaction according to another embodiment of the invention;
FIG. 4 is a flow chart of a method of virtual reality human-machine interaction according to yet another embodiment of the invention;
FIG. 5 is a schematic block diagram of a virtual reality human-machine interaction device according to an embodiment of the present invention;
FIG. 6 is a schematic block diagram of an electronic device for virtual reality human-machine interaction according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an interactive gesture of a user in the prior art;
FIG. 8 is a schematic illustration of a comfortable standing posture of a user in a method of virtual reality human-machine interaction according to an embodiment of the present invention;
FIG. 9 is a schematic illustration of a comfortable standing posture of a user in a method of virtual reality human-machine interaction according to an embodiment of the invention;
FIG. 10 is a schematic illustration of a comfortable standing posture of a user in a method of virtual reality human-machine interaction according to another embodiment of the invention;
FIG. 11 is a schematic illustration of a comfortable standing posture of a user in a method of virtual reality human-machine interaction according to yet another embodiment of the present invention;
FIG. 12 is a schematic illustration of a comfortable sitting posture of a user in a method of virtual reality human-machine interaction according to an embodiment of the invention;
FIG. 13 is a schematic illustration of a comfortable sitting posture of a user in a method of virtual reality human-machine interaction according to an embodiment of the invention;
FIG. 14 is a schematic illustration of a comfortable sitting posture of a user in a method of virtual reality human-machine interaction according to another embodiment of the invention;
FIG. 15 is a schematic view of a virtual cursor in a virtual reality man-machine interaction method according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
At present, in virtual reality man-machine interaction, the user typically wears a handle from which a virtual light ray is emitted and displayed; the user sees where the ray points and presses a button to interact with a distant object. When the user's hand moves at a first speed, the light spot on the object moves at a second, higher speed: the spot speed equals the angular speed of the hand multiplied by the distance (the radius), so the farther the object, the faster the spot. A very small hand movement thus produces a very fast spot movement, the spot quickly sweeps past a small object (a cup), and the object cannot be controlled accurately. Conversely, relative to a much larger object (a building), even a fast hand movement carries the spot across it only slowly.
In addition, the comfortable viewing distance is related to the size of the content (see the recommended placement distances for televisions and displays). The larger the content (e.g., a display window), the greater the comfortable viewing distance. For example, a 14-inch notebook screen is typically placed 60-70 cm from the user, which is at the edge of the interactive comfort zone. Due to the nature of AR/VR/MR content, however, a virtual window is typically 14 inches or larger; that is, a visually comfortable window will usually be placed in the interactive near field, outside the interactive comfort zone. In this case, the method used by HoloLens 2 has the user continuously move the elbow to click interactable content (buttons, etc.) outside the comfort zone, which also fatigues the user quickly: empirically, users feel tired within 20 minutes of continuously sliding pages to browse the web. This is not acceptable for everyday work.
In order to solve the above problems, the embodiments of the invention provide a virtual reality man-machine interaction method that enables a person in the virtual world to precisely control a virtual object.
Example 1
Fig. 1 is a flowchart of a method for virtual reality human-computer interaction according to an embodiment of the present invention. As shown in fig. 1, the method comprises the steps of:
S110, screening out virtual objects actually interacted with a human hand;
the virtual object is a static object (cube, toy gun, etc.), a dynamic object (animal, etc.), a network interface, etc. displayed in the virtual field of view.
S120, the positions of the virtual object and the human hand in the virtual view field and the content of the virtual object are obtained;
it will be appreciated that in this embodiment the virtual field of view is divided into three zones, the comfort zone, the near field zone and the far field zone, wherein the person is in a standing position (as shown in fig. 8 to 11), typically the elbow is behind the body such that the vertical line of the center of gravity of the whole arm is substantially parallel to the vertical direction of the body, i.e. the elbow is the fulcrum, the forearm is in an acute angle position with respect to the forearm, or the person is in a sitting position (as shown in fig. 12 to 14), typically the elbow is supported on the table top such that the forearm is in an acute angle position with respect to the forearm; in the two postures, the big arm does not basically move, so that the arm is not easy to feel tired; the reach of the palm will then be referred to as the comfort zone in both positions, i.e. in a position in which the forearm is substantially not moving and the forearm is at an acute angle to the forearm. In addition, the lower arm and the upper arm are unfolded at an obtuse angle, and the palm is in a range of touch, called near field region, in both standing and sitting postures. The near field region is a region remaining except for the comfort region. Furthermore, the region where the whole arm is not extended is called far field region, regardless of whether the person is in standing or sitting position. The ranges of the comfort zone, the near field zone and the far field zone can be preset in advance, or the comfort zone, the near field zone and the far field zone of different users can be defined in a personalized way through deep learning, that is, in the process of initializing the virtual reality device, the users can be prompted to take various gestures in the comfort zone, and the standards of judging the comfort zone according to the individual users are recorded and learned.
S130, determining the moving speed of a virtual cursor arranged in the virtual field of view according to the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
the content of the virtual object can be a stereoscopic pistol, a water bath, a building, a cup, a webpage, a little rabbit and the like, namely the content which is additionally represented by the virtual object.
S131, if the virtual object is in the far field region, acquiring the distance between the virtual object and the human hand in the virtual field of view according to their respective positions; determining and setting the moving speed of the virtual cursor according to the distance and the first preset relation, the first preset relation being set in correspondence with the content of the virtual object;
After the positions of the virtual object and the human hand in the virtual field of view are determined, if the virtual object is in the far field region and the human hand is in the near field region or the comfort zone, the hand cannot reach the virtual object. To interact with it, a virtual cursor can be placed in the virtual field of view and a correspondence established between the hand and the cursor, so that moving the hand moves the cursor. The cursor, detached from the hand and placed close to the virtual object, interacts with the virtual object in place of the real hand, and the user experiences the interaction as if it were performed by hand.
Based on this, the movement speed of the virtual cursor can be determined from the distance between the human hand and the virtual object in the virtual field of view, and the first preset relationship. The first preset relationship may be set in advance.
For example, the first preset relationship may be nonlinear, where V is the corrected moving speed of the virtual cursor, r is the distance between the human hand and the virtual object in the virtual field of view, ω is the angular speed of the hand movement, and a is a constant proportionality factor. The corrected speed V obtained from such a first preset relationship depends nonlinearly (sublinearly) on r, so that V is smaller than the uncorrected speed, i.e. the cursor slows down. This case suits tiny virtual objects. For example, if the virtual object is a cup with a handle, the user needs to experience grasping the handle; the hand motion in the comfort zone or near field region should then not be too large, and the cursor moving more slowly than before correction lets the user better experience grasping the cup handle.
Alternatively, the first preset relationship may be V = aωr, with the same symbols. The corrected speed V then depends linearly on r and exceeds the uncorrected speed, i.e. the cursor speeds up.
Furthermore, the first preset relationship may simply be V = ωr, i.e. for that content no correction of the cursor speed is needed; whether the two cases above increase or decrease V is judged relative to V = ωr.
Thus, the moving speed of the virtual cursor is determined from the distance between the human hand and the virtual object in the virtual field of view and the first preset relation, so that it differs from the uncorrected speed. The first preset relationship is tied to the content of the virtual object itself and is fixed once that content and its position in the virtual field of view are determined; its setting may also be generated through deep learning. The user can therefore grasp the virtual object more accurately, improving the user experience.
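The three candidate relations can be compared side by side (a sketch; the patent omits the exact nonlinear formula, so the square-root form below is only an assumed example of a sublinear correction):

```python
import math

def corrected_speed(omega, r, mode, a=1.0):
    """omega: angular speed of the hand; r: hand-object distance.
    'identity' is the uncorrected V = omega*r; 'linear' is V = a*omega*r;
    'sublinear' slows the cursor for fine control (assumed sqrt form)."""
    if mode == "sublinear":
        return a * omega * math.sqrt(r)
    if mode == "linear":
        return a * omega * r
    return omega * r
```

For a cup handle 4 m away, the sublinear mode halves the cursor speed relative to the uncorrected ray, which is the fine-control behavior the text describes.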
S132, if the human hand is in the comfort zone and the virtual object is in the comfort zone, the moving speed is the same as that of the human hand;
That is, when the user's hand rests comfortably in the comfort zone and the virtual object is also within it, the virtual cursor moves at the same speed as the hand and may coincide with it; since the object is at hand, no speed correction is needed, and the user genuinely experiences interacting with the virtual object.
S133, if the human hand is in the comfort zone and the virtual object is in the near field region, acquiring the distance between the virtual object and the human hand in the virtual field of view according to their respective positions, and determining the moving speed of the virtual cursor according to the distance and a second preset relation, the second preset relation being set in correspondence with the content of the virtual object;
That is, the user keeps the hand comfortably in the comfort zone while the virtual object lies in the near field region; reaching it would require straightening the arm, which the user prefers not to do. The moving speed of the virtual cursor is then determined from the distance and the second preset relation, which may be V' = ωr: the angular speed ω of the user's hand is measured, and the cursor speed follows from the relation and the distance. The cursor is detached from the hand and placed close to the virtual object. This keeps the user comfortable and avoids fatigue without impairing the interactive experience.
In this step the settings of step S131 may be referred to: increasing or decreasing the cursor speed according to the content of the virtual object is judged relative to V' = ωr.
S134, if the human hand is in the near field region and the virtual object is in the comfort zone or the near field region, the moving speed is the same as that of the human hand;
That is, the user has extended the arm so that the hand is in the near field region and can touch a virtual object in the comfort zone or near field region; the virtual cursor then moves at the same speed as the hand and may coincide with it, and the user genuinely experiences interacting with the virtual object.
Example two
Fig. 2 is a flowchart of a method for virtual reality human-machine interaction according to an embodiment of the invention. This embodiment is a further optimization on the basis of embodiment one, as shown in fig. 2, the method comprising:
s210, screening out virtual objects actually interacted with a human hand;
s220, acquiring positions of the virtual object and the human hand in a virtual view field respectively;
step S210 and step S220 are already described in detail in the first embodiment, and are not described here again.
S230, determining whether a virtual cursor arranged in the virtual view field is displayed in the virtual view field according to the positions of the virtual object and the human hand in the virtual view field respectively;
s231, if the human hand is in the comfort zone and the virtual object is in the comfort zone, the virtual cursor is not displayed;
That is, the user holds the hand in the comfort zone in a comfortable posture and the virtual object is also in the comfort zone; the user can touch the virtual object simply by reaching out, so there is no need to display a virtual cursor.
S232, if the human hand is in the comfortable area and the virtual object is in the near field area or the far field area, displaying a virtual cursor;
That is, the user's hand is in the comfort zone in a comfortable posture while the virtual object is in the near field or far field region, so the user cannot touch the virtual object merely by extending the hand. The user then interacts with the virtual object through the virtual cursor, and the interaction of the virtual cursor with the virtual object replaces the interaction of the hand with the virtual object, reducing fatigue of the user's arm.
S233, if the human hand is in the near field region and the virtual object is in the comfort region or the near field region, the virtual cursor is not displayed;
That is, the user's hand is in the near field region in a comfortable posture and the virtual object is in the comfort region or the near field region; the user can touch the virtual object simply by stretching out the hand, so there is no need to display a virtual cursor.
S234, if the human hand is in the near field region and the virtual object is in the far field region, the virtual cursor is displayed.
That is, the user's hand is in the near field region in a comfortable posture and the virtual object is in the far field region; the user cannot touch the virtual object by extending the hand, so interaction through the virtual cursor is required, and the interaction of the virtual cursor with the virtual object replaces the interaction of the hand with the virtual object.
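The display rule of steps S231-S234 reduces to a reachability check: the cursor is shown only when the object lies beyond the region the hand can reach. A sketch under that reading (function and region names are illustrative, not from the patent):

```python
def show_cursor(hand_region, object_region):
    """Decide whether the virtual cursor is displayed (steps S231-S234).

    Hand in the comfort zone can reach comfort-zone objects; hand in the
    near field can reach comfort-zone and near-field objects. The cursor
    is displayed exactly when the object is outside that reachable set.
    """
    reachable = {"comfort": {"comfort"}, "near": {"comfort", "near"}}
    return object_region not in reachable.get(hand_region, set())
```

For example, `show_cursor("comfort", "far")` is true (S232) while `show_cursor("near", "comfort")` is false (S233).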
It should be noted that the virtual cursor displayed in the virtual field of view maintains a first preset distance from the virtual object. In step S232 and step S234, the virtual cursor is displayed in the virtual field of view while the user interacts with the virtual object. To ensure good interaction between the virtual cursor and the virtual object, the virtual cursor keeps a first preset distance (which can be set in advance) from the virtual object; that is, the virtual cursor lies on a sphere centered on the virtual object with a radius of the first preset distance.
It is understood that the virtual cursor is attached to the virtual object when the distance between the virtual cursor and the virtual object is smaller than the first preset distance.
That is, the distance between the virtual cursor and the virtual object is normally the first preset distance, but when the virtual cursor is triggered externally, for example by a pinch gesture of the user, so that the distance becomes smaller than the first preset distance, the virtual cursor is adsorbed onto the virtual object and moves along with it.
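The sphere-keeping and adsorption behavior described above might be sketched as follows; the function name and the snap-to-object-center simplification are assumptions, not the patent's exact geometry:

```python
import math

def cursor_position(cursor_pos, object_pos, first_preset_distance):
    """Keep the cursor on a sphere of radius `first_preset_distance`
    around the object; if an external trigger (e.g. a pinch gesture)
    brings it closer than that, snap ("adsorb") it onto the object so
    it then follows the object's motion."""
    d = math.dist(cursor_pos, object_pos)
    if d < first_preset_distance:
        return object_pos  # adsorbed: cursor moves with the object
    # otherwise project the cursor back onto the sphere around the object
    scale = first_preset_distance / d
    return tuple(o + (c - o) * scale
                 for c, o in zip(cursor_pos, object_pos))
```

A real system would snap to the contact point rather than the object center; the center is used here only to keep the sketch short.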
According to an embodiment of the present invention, before determining whether a virtual cursor provided in the virtual field of view is displayed in the virtual field of view according to positions of the virtual object and the human hand in the virtual field of view, respectively, further includes:
determining whether a virtual cursor arranged in the virtual field of view is displayed in the virtual field of view according to the content of the virtual object;
if the content of the virtual object has a feedback action on the human hand, the virtual cursor is displayed; if there is no feedback action, the step of determining whether the virtual cursor arranged in the virtual field of view is displayed is performed according to the positions of the virtual object and the human hand in the virtual field of view respectively.
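The content check described here runs before the region-based rule: objects with feedback always show the cursor, and only feedback-free objects fall through to steps S231-S234. A hedged sketch combining the two (all names illustrative):

```python
def decide_cursor_display(has_feedback, hand_region, object_region):
    """Content is checked first: an object whose content feeds back to
    the hand (e.g. recoil after firing) always shows the cursor.
    Otherwise fall through to the region-based rule of S231-S234."""
    if has_feedback:
        return True
    reachable = {"comfort": {"comfort"}, "near": {"comfort", "near"}}
    return object_region not in reachable.get(hand_region, set())
```

So a toy gun in the comfort zone still shows the cursor, while an inert cube in the comfort zone does not.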
It should be noted that a feedback action refers to feedback delivered to the user's hand during interaction between the virtual object and the hand. For example, in a gun-fight game the gun recoils backward after firing, giving the hand a tremor; in that case, regardless of which region the virtual gun is in and which region the hand is in, the virtual cursor is displayed so as to convey the tremor after firing. It is understood that if the virtual object is in the near field or far field region while the hand is in the comfort zone, or the virtual object is in the far field region while the hand is in the near field region, the virtual cursor displayed in the virtual field of view maintains the first preset distance from the virtual object; when the distance between the virtual cursor and the virtual object becomes smaller than the first preset distance, the virtual cursor is adsorbed onto the virtual object. That is, the virtual cursor is adsorbed onto the gun, and when the gun shakes after firing it drives the virtual cursor to shake as well, restoring the real recoil scene and improving the user experience. It should also be noted that if both the virtual object and the hand are in the comfort zone, or the virtual object is in the near field or comfort zone while the hand is in the near field region, the displayed virtual cursor coincides with the human hand.
For another example, the virtual object is a little rabbit and the user wants to experience feeding it. When the user holds out a carrot and the rabbit bites it, the user's hand trembles along with the carrot; the virtual cursor, adsorbed onto the carrot, can convey this trembling.
In another embodiment, the virtual cursor adsorbed on a virtual object also serves as a marker. For example, the virtual object is a water box filled with water; for a realistic experience the box is generally transparent. After playing with the water, the user needs to move the box but may not be able to see it. Since the virtual cursor is always displayed, it acts as a marker, and dragging the virtual cursor drags the water box.
For another example, there is a smart door two meters away with a handle on it. Through the Internet of Things and object recognition, the user can attach the virtual cursor to the physical handle and "pull" the handle via the virtual cursor. The handle is actually moved by a motor, but the experience is that of a comparable through-the-air manipulation.
Alternatively, the action of pushing an object may be defined as a one-handed nudge, similar to pushing a door. When the virtual cursor is attached to the door and replicates the nudging motion of the user's hand, the user can naturally relate the intent to push to the interaction that actually occurs. If the virtual cursor were only a single dot, the user could not relate the hand motion to the interaction about to take place.
According to one embodiment of the present invention, the method for virtual reality human-computer interaction further comprises:
acquiring the gesture of a human hand;
and adjusting one or more of the size, the moving speed and the shape of the virtual cursor displayed in the virtual field according to the gesture of the human hand.
The first and second embodiments explained the moving speed of the virtual cursor and the cases in which it is displayed.
When the moving speed of the virtual cursor is determined, the gesture of the human hand can also be acquired: for example, if two fingers keep spreading apart, the moving speed of the virtual cursor is continuously amplified; if they keep drawing together, the moving speed is continuously reduced.
The virtual cursor may be a 3D model of the palm, such as a human hand (as shown in fig. 15, the middle hand is the virtual cursor) or a cat or other animal claw, etc. During the user interaction with the virtual object, the action of the virtual cursor may be exactly the same as the action of a human hand.
In addition, in some embodiments the virtual object is small while the preset virtual cursor is displayed large; a gesture of the user's hand, such as folding the whole palm, can then be acquired to reduce the display size of the virtual cursor. Similarly, if the virtual object is large and the preset virtual cursor is displayed small, a gesture such as repeatedly unfolding the whole palm can be acquired to enlarge it. In this way the virtual cursor and the virtual object are better matched in size during the interaction between the user and the virtual object.
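The gesture adjustments in this and the preceding paragraphs amount to a simple mapping from gestures to attribute changes; the gesture labels and scale factors below are illustrative choices, not values from the patent:

```python
def adjust_cursor(cursor, gesture):
    """Adjust virtual-cursor attributes from a recognized hand gesture.

    cursor: dict with "speed" and "size" entries.
    gesture: hypothetical label from the hand tracker; the mapping
    follows the examples in the text (spreading fingers speeds the
    cursor up, drawing them together slows it, folding the palm shrinks
    the cursor, unfolding it enlarges it)."""
    cursor = dict(cursor)  # leave the caller's state untouched
    if gesture == "fingers_apart":
        cursor["speed"] *= 1.2
    elif gesture == "fingers_together":
        cursor["speed"] /= 1.2
    elif gesture == "palm_fold":
        cursor["size"] *= 0.8
    elif gesture == "palm_unfold":
        cursor["size"] *= 1.25
    return cursor
```

Repeated gestures compound, matching the "continuously amplified/reduced" behavior described above.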
Therefore, the movement speed, size, shape and so on of the preset virtual cursor can be adjusted according to the gesture of the human hand, meeting the user's requirements and improving the user experience. Ideally, the user should feel as if they have one more virtual hand that can be extended indefinitely, helping them interact with virtual objects in the near field and far field without impeding their interaction in the comfort zone.
Furthermore, this embodiment also proposes an index for evaluating interaction fatigue: the distance moved by the elbow per unit time, called the elbow average speed. When a user interacts comfortably in a sitting posture, the elbow does not move and the forearm makes a circular motion with the elbow as the fulcrum; the elbow average speed is then generally 1-2 cm/s. In a standing posture the forearm still pivots about the elbow, but the elbow moves moderately to balance the center of gravity, giving an average speed of generally 3-5 cm/s. When the user has to reach near-field objects, the elbow average speed is generally no slower than 10 cm/s and the user tires easily. In testing, clicking with a HoloLens 2 produced an elbow average speed of about 10 cm/s, which falls in the fatigue interval.
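The elbow average speed index is simply path length over elapsed time for tracked elbow positions. A sketch, assuming positions are sampled at a fixed interval and measured in centimeters:

```python
import math

def elbow_average_speed(positions, dt):
    """Elbow average speed in cm/s: total path length of consecutive
    tracked elbow positions (in cm) divided by elapsed time.

    positions: sequence of (x, y, z) samples in cm.
    dt: sampling interval in seconds (assumed constant).
    Per the text, ~1-2 cm/s sitting and 3-5 cm/s standing are
    comfortable; >= 10 cm/s (reaching into the near field) is the
    fatigue interval."""
    path = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    return path / (dt * (len(positions) - 1))
```

For instance, an elbow tracked over three samples 0.5 s apart that drifts 1 cm per sample averages 2 cm/s — the seated comfort range.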
Example III
Fig. 3 is a flowchart of a method for virtual reality human-computer interaction according to another embodiment of the invention. This embodiment is a further optimization of embodiment one and embodiment two, as shown in fig. 3, the method comprising:
s310, acquiring positions of a plurality of virtual objects and human hands in virtual fields of view respectively;
s320, screening virtual objects to be interacted with the human hand according to the positions of each virtual object and the human hand in the virtual view field respectively, and placing the virtual objects and the human hand in an interaction list;
s330, screening out virtual objects actually interacted with the human hand from the interaction list.
According to one embodiment of the invention, screening virtual objects to be interacted with a human hand according to the positions of each virtual object and the human hand in the virtual field of view respectively and placing the virtual objects and the human hand in an interaction list comprises:
s331, if the human hand is in the comfortable area and the virtual object is in the comfortable area, placing the virtual object in an interaction list;
s332, if the human hand is in the comfort zone and the virtual object is in the near field zone or the far field zone, a virtual ray is emitted; if the virtual ray surrounds the virtual object within a second preset distance range, the virtual object is placed in the interaction list;
s333, if the human hand is in the near field region and the virtual object is in the near field region, placing the virtual object in an interaction list;
S334, if the human hand is in the near field region and the virtual object is in the far field region, a virtual ray is emitted, the virtual ray surrounds the virtual object within a second preset distance range, and the virtual object is placed in the interaction list;
it should be noted that the virtual ray in step S332 and step S334 may be displayed or hidden; the user can set this according to the experience requirement.
For example, the plurality of virtual objects respectively comprise a toy gun, a cube, a cup, a mirror, an umbrella and a little rabbit, wherein the toy gun and the cube are arranged in a comfort zone; the cup and the mirror are arranged in the near field region; the umbrella and the little rabbit are arranged in the far field.
When the human hand is in the comfort zone, the toy gun and the cube in the comfort zone can interact with the user, so they are placed in the interaction list. In addition, when the hand is in the comfort zone, the user may also want to interact with a virtual object in the near field or far field region while maintaining a comfortable posture. This is judged by the virtual ray: when the virtual ray hits a virtual object, or passes within the second preset distance of it, that object is likely to interact with the human hand and is placed in the interaction list.
When the human hand is in the near field region, the user does not want to interact with virtual objects in the comfort zone, but may want to interact with those in the near field region, so the cup and the mirror are placed in the interaction list. In addition, when the hand is in the near field region, the user may also want to interact with a virtual object in the far field region; this is again judged by the virtual ray. When the virtual ray hits a virtual object, or passes within the second preset distance of it, that object is likely to interact with the human hand and is placed in the interaction list.
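Steps S331-S334 and the example above can be sketched as one screening function; the `ray_hits` input stands in for the virtual-ray test, which is an assumption about how that test would be exposed:

```python
def build_interaction_list(hand_region, objects, ray_hits):
    """Screen candidate virtual objects into the interaction list
    (steps S331-S334).

    objects: mapping name -> region ("comfort", "near", or "far").
    ray_hits: set of object names the virtual ray hits or passes
        within the second preset distance of.
    Hand in the comfort zone: comfort-zone objects enter directly,
    near/far objects enter only via the ray. Hand in the near field:
    near-field objects enter directly, far-field objects via the ray,
    and comfort-zone objects are excluded."""
    direct = {"comfort": "comfort", "near": "near"}[hand_region]
    out = []
    for name, region in objects.items():
        if region == direct or (region != "comfort" and name in ray_hits):
            out.append(name)
    return out
```

Applied to the example above (toy gun and cube in the comfort zone, cup and mirror in the near field, umbrella and rabbit in the far field), a hand in the comfort zone with the ray on the cup yields the toy gun, the cube, and the cup.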
Thus a list of virtual objects matching the user's interaction intent is obtained. The virtual object with the highest user intent is then screened out of the list as the object of actual interaction; this screening can be performed by deep learning over the user's habits, preferences, and so on. After the actual virtual object is determined, the moving speed of the virtual cursor, whether it is displayed, and its display size and shape can be determined from the position and content of the virtual object and the position of the user's hand. The user can then interact with the virtual object.
According to one embodiment of the present invention, as shown in fig. 4, the method includes:
s401, positions of a plurality of virtual objects and human hands in virtual fields of view are obtained;
s402, screening virtual objects to be interacted with human hands according to the positions of each virtual object and the human hands in the virtual view field respectively, and placing the virtual objects and the human hands in an interaction list;
s403, screening out virtual objects actually interacted with human hands from the interaction list;
s404, acquiring positions of a virtual object and a human hand in a virtual view field and contents of the virtual object respectively;
s405, determining the moving speed of a virtual cursor arranged in the virtual field of view according to the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
meanwhile, S406, determining whether a virtual cursor set in the virtual field of view is displayed in the virtual field of view according to positions of the virtual object and the human hand in the virtual field of view, respectively;
s407, acquiring the gesture of the human hand;
and S408, adjusting one or more of the size, the moving speed and the shape of the virtual cursor displayed in the virtual field according to the gesture of the human hand.
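The sequence S401-S408 can be strung together as a pipeline; every stage below is a deliberately simplified stub (the text leaves S403's intent ranking to deep learning over user habits, and the speed values here are placeholders):

```python
def interaction_pipeline(objects, hand_region, hand_pose, ray_hits):
    """End-to-end flow of S401-S408 with each stage reduced to a stub.
    All helper behavior here is illustrative, not the patented method."""
    # S401-S402: screen candidates into the interaction list
    candidates = [n for n, r in objects.items()
                  if r == hand_region or n in ray_hits]
    if not candidates:
        return None
    # S403: pick the actual interaction target (stub: first candidate;
    # the text suggests learning the user's intent here)
    target = candidates[0]
    # S404-S406: display decision and speed from regions/content (stubs)
    show = objects[target] != hand_region
    speed = 1.0 if not show else 1.5
    # S407-S408: adjust the cursor by the hand gesture
    if hand_pose == "fingers_apart":
        speed *= 1.2
    return {"target": target, "show_cursor": show, "speed": speed}
```

For example, a cup in the near field with the hand in the comfort zone and the ray on the cup yields a displayed cursor moving faster than the hand.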
Based on this, the scheme provides more far-field interaction possibilities, improves the accuracy of far-field interaction, adds feedback, and opens more application directions for gesture interaction, thereby improving the overall user experience of AR/VR/MR devices.
Example IV
Fig. 5 is a schematic block diagram of a virtual reality man-machine interaction device according to an embodiment of the invention. As shown in fig. 5, the apparatus 100 includes:
the screening module 110 is used for screening out virtual objects actually interacted with the human hand;
an obtaining module 120, configured to obtain positions of a virtual object and a human hand in a virtual field of view, and contents of the virtual object, respectively;
a moving speed determining module 130, configured to determine a moving speed of a virtual cursor set in a virtual field of view according to positions of a virtual object and a human hand in the virtual field of view, respectively, and contents of the virtual object;
the first moving speed determining submodule is used for acquiring the distance between the virtual object and the human hand in the virtual view field according to the positions of the virtual object and the human hand in the virtual view field respectively if the virtual object is in the far field region;
determining and setting the moving speed of the virtual cursor according to the distance and the first preset relation; the first preset relation is set corresponding to the content of the virtual object;
the far field region refers to a region where the arms of the user straighten and the hands of the user cannot reach.
According to an embodiment of the present invention, the determining a movement speed of a virtual cursor disposed in a virtual field of view according to positions of the virtual object and the human hand in the virtual field of view, respectively, contents of the virtual object includes:
A second movement speed determination sub-module, if the human hand is in a comfort zone and the virtual object is in a comfort zone, the movement speed is the same as the movement speed of the human hand;
a third movement speed determining sub-module, configured to obtain a distance between the virtual object and the human hand in the virtual field of view according to positions of the virtual object and the human hand in the virtual field of view, respectively, if the human hand is in the comfort zone and the virtual object is in the near field zone, and determine to set a movement speed of the virtual cursor according to the distance and a second preset relationship, where the second preset relationship is set corresponding to the content of the virtual object;
a fourth movement speed determination submodule, if the human hand is in a near field region and the virtual object is in a comfort region or a near field region, the movement speed is the same as the movement speed of the human hand;
The comfort zone is the region swept by the palm as it circles with the elbow as the pivot point; the near field region is the region swept by the palm with the arm straightened, excluding the comfort zone.
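The two region definitions above are geometric. A simplified classifier — treating the comfort zone as forearm reach and the near field as full-arm reach, a spherical approximation of the swept regions described in the text rather than the patent's exact geometry:

```python
import math

def classify_region(point, shoulder, forearm_len, arm_len):
    """Approximate region classification: comfort zone = within forearm
    reach, near field = within full-arm reach but beyond the comfort
    zone, far field = beyond full-arm reach. All arguments are in the
    same length unit; `point` and `shoulder` are (x, y, z) tuples."""
    d = math.dist(point, shoulder)
    if d <= forearm_len:
        return "comfort"
    if d <= arm_len:
        return "near"
    return "far"
```

A fuller model would track the elbow as the comfort-zone pivot; a single reference point keeps the sketch short.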
According to one embodiment of the present invention, the virtual reality human-computer interaction device further includes:
The virtual cursor display module is used for determining whether a virtual cursor arranged in the virtual view field is displayed in the virtual view field according to the positions of the virtual object and the human hand in the virtual view field respectively;
the first virtual cursor display sub-module is used for not displaying the virtual cursor if the human hand is in the comfort zone and the virtual object is in the comfort zone;
the second virtual cursor display sub-module is used for displaying the virtual cursor if the human hand is in the comfort zone and the virtual object is in the near field zone or the far field zone;
the third virtual cursor display sub-module is used for not displaying the virtual cursor if the human hand is in the near field region and the virtual object is in the comfort region or the near field region;
a fourth virtual cursor display sub-module, configured to display the virtual cursor if the human hand is in a near field region and the virtual object is in a far field region;
The comfort zone is the region swept by the palm as it circles with the elbow as the pivot point; the near field region is the region swept by the palm with the arm straightened, excluding the comfort zone.
According to one embodiment of the present invention, further comprising:
The virtual cursor special display module is used for determining whether the virtual cursor arranged in the virtual view field is displayed in the virtual view field or not according to the content of the virtual object;
if the content of the virtual object has a feedback action on the human hand, the virtual cursor is displayed; if there is no feedback action, operation proceeds according to the function of the virtual cursor display module.
According to one embodiment of the invention, the virtual cursor displayed in the virtual field of view is maintained a first preset distance from the virtual object.
According to one embodiment of the present invention, the virtual cursor is adsorbed on the virtual object when a distance between the virtual cursor and the virtual object is smaller than the first preset distance.
According to an embodiment of the present invention, the virtual reality human-computer interaction device further includes:
the gesture acquisition module is used for acquiring the gesture of the human hand;
and the adjusting module is used for adjusting one or more of the size, the moving speed and the shape of the virtual cursor displayed in the virtual field of view according to the gesture of the human hand.
According to one embodiment of the invention, the apparatus further comprises:
The first screening submodule is used for acquiring positions of a plurality of virtual objects and human hands in the virtual view fields respectively;
the second screening sub-module is used for screening the virtual objects to be interacted with the human hand according to the positions of each virtual object and the human hand in the virtual view field and placing the virtual objects in the interaction list;
and the third screening sub-module is used for screening the virtual object actually interacted with the human hand from the interaction list.
According to one embodiment of the invention, the second screening sub-module comprises:
a first screening subunit, configured to place the virtual object in the interaction list if the human hand is in a comfort zone and the virtual object is in a comfort zone;
a second screening subunit, configured to, if the human hand is in a comfort zone and the virtual object is in a near field zone or a far field zone, send out a virtual ray, where the virtual ray surrounds the virtual object within a second preset distance range, and place the virtual object in the interaction list;
a third screening subunit, configured to place the virtual object in the interaction list if the human hand is in a near field region and the virtual object is in a near field region;
A fourth screening subunit, configured to, if the human hand is in a near field region and the virtual object is in a far field region, send out a virtual ray, where the virtual ray surrounds the virtual object within a second preset distance range, and place the virtual object in the interaction list;
The comfort zone is the region swept by the palm as it circles with the elbow as the pivot point; the near field region is the region swept by the palm with the arm straightened, excluding the comfort zone.
According to one embodiment of the invention, the virtual cursor is a 3D model of the palm.
The embodiment is a part of the apparatus corresponding to the first to third embodiments, and the functions of the related modules are described in the first to third embodiments, which are not repeated here. The product can execute the method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method. It should be noted that, in the embodiment of the apparatus, each module included is only divided according to the functional logic, but not limited to the above division, so long as the corresponding function can be implemented; in addition, the specific names of the functional modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present invention.
Example five
Fig. 6 is a schematic block diagram of an electronic device with virtual reality man-machine interaction according to an embodiment of the invention. As shown in fig. 6, the electronic device 500 includes:
one or more processors 501;
a storage 502 for storing one or more programs;
the one or more programs, when executed by the one or more processors 501, cause the one or more processors 501 to implement the method of virtual reality human-machine interaction as described above.
The method comprises the following steps:
s110, screening out virtual objects actually interacted with a human hand;
s120, the positions of the virtual object and the human hand in the virtual view field and the content of the virtual object are obtained;
s130, determining the moving speed of a virtual cursor arranged in the virtual field of view according to the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
as shown in fig. 6, the electronic device 500 includes a processor 501, a storage device 502, an input device 503, and an output device 505; the number of processors 501 in the device may be one or more, one processor 501 being taken as an example in fig. 6; the processor 501, the storage 502, the input 503 and the output 505 of the apparatus may be connected by a bus or otherwise, in fig. 6 by way of example.
The storage device 502 is used as a computer readable storage medium, and can be used to store a software program, a computer executable program, and a module, such as program instructions corresponding to a virtual reality man-machine interaction method in an embodiment of the present invention. The processor 501 executes software programs, instructions and modules stored in the storage 502 to perform various functional applications and data processing of the device, i.e., to implement the above-described virtual reality human-machine interaction method.
The storage 502 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for functions; the storage data area may store data created according to the use of the terminal, etc. In addition, the storage 502 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the storage 502 may further include memory located remotely from the processor 501, which may be connected to the device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 503 may be used to receive input command requests and to generate key signal inputs related to user settings and function control of the device. The output device 505 may include a display device such as a display screen.
Example six
The embodiment of the present invention also proposes a computer readable storage medium, on which a computer program is stored, which when executed by the processor 501 implements a method of virtual reality human-machine interaction as described above.
The method comprises the following steps:
s110, screening out virtual objects actually interacted with a human hand;
s120, the positions of the virtual object and the human hand in the virtual view field and the content of the virtual object are obtained;
s130, determining the moving speed of a virtual cursor arranged in the virtual field of view according to the positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
that is, a storage medium containing computer executable instructions provided in an embodiment of the present invention may perform related operations in the virtual reality human-machine interaction method provided in any embodiment of the present invention.
From the above description of embodiments, it will be clear to a person skilled in the art that the present invention may be implemented by means of software and necessary general purpose hardware, but of course also by means of hardware, although in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a FLASH Memory (FLASH), a hard disk or an optical disk of a computer, etc., and include several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments of the present invention.
In summary, embodiments of the present invention provide a method, device, equipment and medium for virtual reality man-machine interaction, wherein the method comprises: screening out a virtual object actually interacted with a human hand; acquiring the positions of the virtual object and the human hand in a virtual field of view and the content of the virtual object; determining the moving speed of a virtual cursor arranged in the virtual field of view according to those positions and the content of the virtual object; if the virtual object is in the far field region, acquiring the distance between the virtual object and the human hand in the virtual field of view according to their respective positions, and determining the moving speed of the virtual cursor according to the distance and a first preset relation, the first preset relation being set corresponding to the content of the virtual object; wherein the far field region refers to the region that the human hand cannot reach even with the user's arm straightened. In this way, a person can precisely manipulate virtual objects in the virtual world while avoiding user fatigue during that precise manipulation.
Note that the above is only a preferred embodiment of the present invention and an illustration of the technical principle applied. Those skilled in the art will understand that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions may be made without departing from the scope of the invention. Therefore, although the invention has been described in connection with the above embodiments, it is not limited to them and may be embodied in many other equivalent forms without departing from its spirit or scope, which is defined by the following claims.

Claims (12)

1. A virtual reality human-machine interaction method, characterized by comprising the following steps:
screening out the virtual object with which a human hand actually interacts;
acquiring the respective positions of the virtual object and the human hand in a virtual field of view, and the content of the virtual object;
determining the moving speed of a virtual cursor provided in the virtual field of view according to the respective positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
if the virtual object is in the far-field region, acquiring the distance between the virtual object and the human hand in the virtual field of view according to their respective positions in the virtual field of view;
determining and setting the moving speed of the virtual cursor according to the distance and a first preset relation, the first preset relation being set in correspondence with the content of the virtual object;
wherein the far-field region is the region that the human hand cannot reach even with the user's arm straightened;
if the human hand is in the comfort region and the virtual object is in the comfort region, the moving speed is the same as the moving speed of the human hand;
if the human hand is in the comfort region and the virtual object is in the near-field region, acquiring the distance between the virtual object and the human hand in the virtual field of view according to their respective positions, and determining and setting the moving speed of the virtual cursor according to the distance and a second preset relation, the second preset relation being set in correspondence with the content of the virtual object;
if the human hand is in the near-field region and the virtual object is in the comfort region or the near-field region, the moving speed is the same as the moving speed of the human hand;
wherein the comfort region is the region swept by the palm pivoting about the elbow as a supporting point, and the near-field region is the region swept by the palm with the arm straightened, excluding the comfort region.
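Outside the claim language, the region model and speed rules of claim 1 can be sketched as follows. A simplified model is assumed in which each region is classified by distance from the body (forearm length for the comfort region, full arm length for the near-field region); the actual swept volumes are more complex, and the two preset relations are passed in as assumed callables.

```python
def classify_region(reach: float, forearm_len: float, arm_len: float) -> str:
    """Classify a point by distance from the user's body.  forearm_len is
    the elbow-to-palm length (radius of the comfort region swept about the
    elbow); arm_len is the straightened-arm length.  Spherical model assumed."""
    if reach <= forearm_len:
        return "comfort"
    if reach <= arm_len:
        return "near"       # reachable only with the arm straightened
    return "far"            # not reachable at all

def cursor_speed(hand_region: str, obj_region: str, hand_speed: float,
                 distance: float, first_rel, second_rel) -> float:
    """Speed rules of claim 1: 1:1 tracking while hand and object are both
    within reach, otherwise a preset relation of the hand-object distance."""
    if obj_region == "far":
        return first_rel(distance)       # first preset relation
    if hand_region == "comfort" and obj_region == "near":
        return second_rel(distance)      # second preset relation
    return hand_speed                    # same speed as the hand
```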
2. The virtual reality human-machine interaction method of claim 1, further comprising: determining whether the virtual cursor provided in the virtual field of view is displayed in the virtual field of view according to the respective positions of the virtual object and the human hand in the virtual field of view;
if the human hand is in the comfort region and the virtual object is in the comfort region, the virtual cursor is not displayed;
if the human hand is in the comfort region and the virtual object is in the near-field region or the far-field region, the virtual cursor is displayed;
if the human hand is in the near-field region and the virtual object is in the comfort region or the near-field region, the virtual cursor is not displayed;
if the human hand is in the near-field region and the virtual object is in the far-field region, the virtual cursor is displayed;
wherein the comfort region is the region swept by the palm pivoting about the elbow as a supporting point, and the near-field region is the region swept by the palm with the arm straightened, excluding the comfort region.
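The visibility rules of claim 2 reduce to a small decision function: the cursor is shown only when the object lies farther out than the hand's current reach. A minimal sketch (region names as assumed above):

```python
def show_cursor(hand_region: str, obj_region: str) -> bool:
    """Visibility rules of claim 2: show the cursor only when the object
    lies beyond the region the hand currently occupies."""
    if hand_region == "comfort":
        return obj_region in ("near", "far")
    if hand_region == "near":
        return obj_region == "far"
    return False  # combinations outside the claim's enumeration: hidden
```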
3. The virtual reality human-machine interaction method of claim 2, further comprising, before determining whether the virtual cursor provided in the virtual field of view is displayed according to the respective positions of the virtual object and the human hand in the virtual field of view:
determining whether the virtual cursor provided in the virtual field of view is displayed according to the content of the virtual object;
if the content of the virtual object has a feedback action with the human hand, displaying the virtual cursor; if there is no feedback action, performing the step of determining whether the virtual cursor provided in the virtual field of view is displayed according to the respective positions of the virtual object and the human hand in the virtual field of view.
4. The virtual reality human-machine interaction method according to claim 2 or 3, characterized in that the virtual cursor displayed in the virtual field of view keeps a first preset distance from the virtual object.
5. The method of claim 4, wherein the virtual cursor is attached to the virtual object when the distance between the virtual cursor and the virtual object is less than the first preset distance.
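The distance-keeping and snapping behaviour of claims 4 and 5 can be sketched as a placement rule over 3-D positions (represented here as tuples; the geometry and function names are illustrative assumptions):

```python
import math

def place_cursor(cursor_pos, object_pos, first_preset_distance):
    """Claims 4-5: the cursor normally keeps at least the first preset
    distance from the object; once it would come closer, it is attached
    (snapped) to the object's own position.  Positions are 3-D tuples."""
    if math.dist(cursor_pos, object_pos) < first_preset_distance:
        return tuple(object_pos)   # attach to the virtual object
    return tuple(cursor_pos)       # far enough: leave the cursor as placed
```

Snapping inside the preset distance gives the user a stable target instead of requiring pixel-exact alignment near the object.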
6. A method of virtual reality human-machine interaction according to claim 2 or 3, further comprising:
acquiring the gesture of the human hand;
and adjusting one or more of the size, the moving speed and the shape of the virtual cursor displayed in the virtual field of view according to the gesture of the human hand.
7. The virtual reality human-machine interaction method of claim 1, wherein screening out the virtual object with which the human hand actually interacts comprises:
acquiring the respective positions of a plurality of virtual objects and the human hand in the virtual field of view;
screening out the virtual objects to be interacted with by the human hand according to the respective positions of each virtual object and the human hand in the virtual field of view, and placing them in an interaction list;
and screening out the virtual object actually interacted with by the human hand from the interaction list.
8. The virtual reality human-machine interaction method of claim 7, wherein the screening out the virtual objects to be interacted with by the human hand according to the respective positions of each virtual object and the human hand in the virtual field of view, and placing them in an interaction list, comprises:
if the human hand is in the comfort region and the virtual object is in the comfort region, placing the virtual object in the interaction list;
if the human hand is in the comfort region and the virtual object is in the near-field region or the far-field region, emitting a virtual ray, and when the virtual ray comes within a second preset distance range of the virtual object, placing the virtual object in the interaction list;
if the human hand is in the near-field region and the virtual object is in the near-field region, placing the virtual object in the interaction list;
if the human hand is in the near-field region and the virtual object is in the far-field region, emitting a virtual ray, and when the virtual ray comes within the second preset distance range of the virtual object, placing the virtual object in the interaction list;
wherein the comfort region is the region swept by the palm pivoting about the elbow as a supporting point, and the near-field region is the region swept by the palm with the arm straightened, excluding the comfort region.
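The screening rules of claim 8 can be sketched as a filter over (object, region) pairs, with the virtual-ray proximity test abstracted into an assumed callback:

```python
def build_interaction_list(hand_region, objects, ray_reaches):
    """Claim 8: objects within the hand's own reach join the interaction
    list directly; farther objects join only when the emitted virtual ray
    comes within the second preset distance of them.  ray_reaches(name)
    stands in for that ray test and is an assumed callback."""
    interaction_list = []
    for name, obj_region in objects:
        direct = ((hand_region == "comfort" and obj_region == "comfort") or
                  (hand_region == "near" and obj_region == "near"))
        via_ray = ((hand_region == "comfort" and obj_region in ("near", "far"))
                   or (hand_region == "near" and obj_region == "far"))
        if direct or (via_ray and ray_reaches(name)):
            interaction_list.append(name)
    return interaction_list
```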
9. The method of claim 1, wherein the virtual cursor is a 3D model of a palm.
10. A virtual reality human-machine interaction device, characterized by comprising:
a screening module, configured to screen out the virtual object with which a human hand actually interacts;
an acquisition module, configured to acquire the respective positions of the virtual object and the human hand in a virtual field of view, and the content of the virtual object;
a moving speed determining module, configured to determine the moving speed of a virtual cursor provided in the virtual field of view according to the respective positions of the virtual object and the human hand in the virtual field of view and the content of the virtual object;
wherein if the virtual object is in the far-field region, the distance between the virtual object and the human hand in the virtual field of view is acquired according to their respective positions, and the moving speed of the virtual cursor is determined and set according to the distance and a first preset relation, the first preset relation being set in correspondence with the content of the virtual object; the far-field region is the region that the human hand cannot reach even with the user's arm straightened;
if the human hand is in the comfort region and the virtual object is in the comfort region, the moving speed is the same as the moving speed of the human hand;
if the human hand is in the comfort region and the virtual object is in the near-field region, the distance between the virtual object and the human hand in the virtual field of view is acquired according to their respective positions, and the moving speed of the virtual cursor is determined and set according to the distance and a second preset relation, the second preset relation being set in correspondence with the content of the virtual object;
if the human hand is in the near-field region and the virtual object is in the comfort region or the near-field region, the moving speed is the same as the moving speed of the human hand;
wherein the comfort region is the region swept by the palm pivoting about the elbow as a supporting point, and the near-field region is the region swept by the palm with the arm straightened, excluding the comfort region.
11. An electronic device for virtual reality human-machine interaction, the electronic device comprising:
one or more processors; and
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the virtual reality human-machine interaction method of any one of claims 1-9.
12. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the virtual reality human-machine interaction method of any one of claims 1-9.
CN202110972624.XA 2021-08-24 2021-08-24 Virtual reality man-machine interaction method, device, equipment and medium Active CN113703571B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110972624.XA CN113703571B (en) 2021-08-24 2021-08-24 Virtual reality man-machine interaction method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN113703571A CN113703571A (en) 2021-11-26
CN113703571B true CN113703571B (en) 2024-02-06

Family

ID=78654256

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115861581B (en) * 2023-02-08 2023-05-05 成都艺馨达科技有限公司 Mobile internet cloud service method and system based on mixed reality

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107077169A (en) * 2014-11-14 2017-08-18 高通股份有限公司 Spatial interaction in augmented reality
CN107077297A (en) * 2014-11-11 2017-08-18 高通股份有限公司 System and method for controlling cursor based on finger pressure and direction
CN109597544A (en) * 2018-11-23 2019-04-09 青岛海信电器股份有限公司 Input exchange method, device, equipment and storage medium
CN110075519A (en) * 2019-05-06 2019-08-02 网易(杭州)网络有限公司 Information processing method and device, storage medium and electronic equipment in virtual reality
CN111589113A (en) * 2020-04-28 2020-08-28 腾讯科技(深圳)有限公司 Virtual mark display method, device, equipment and storage medium
CN112650391A (en) * 2020-12-23 2021-04-13 网易(杭州)网络有限公司 Human-computer interaction method, device and equipment based on virtual reality and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180046352A1 (en) * 2016-08-09 2018-02-15 Matthew Johnson Virtual cursor movement
US11320957B2 (en) * 2019-01-11 2022-05-03 Microsoft Technology Licensing, Llc Near interaction mode for far virtual object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant