CN108227928B - Picking method and device in virtual reality scene


Info

Publication number
CN108227928B
CN108227928B (application CN201810021624.XA / CN201810021624A)
Authority
CN
China
Prior art keywords
picking
pick
pickup
geometry
length
Prior art date
Legal status
Active
Application number
CN201810021624.XA
Other languages
Chinese (zh)
Other versions
CN108227928A (en)
Inventor
姚巍
梁效富
张薇
Current Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center, Samsung Electronics Co Ltd filed Critical Samsung Electronics China R&D Center
Priority to CN201810021624.XA
Publication of CN108227928A
Application granted
Publication of CN108227928B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves

Abstract

The invention discloses a picking method and device in a virtual reality scene. Finger pressure data is received through a data acquisition module, position data and orientation data of a virtual hand in the virtual reality scene are acquired through an IMU (inertial measurement unit) module, a pick-up geometry is constructed through a physical computing module, and whether the bounding box of an object in the virtual reality scene intersects the position of the pick-up geometry is judged to determine whether the object is picked up. Compared with the prior art, the technical scheme of the invention only makes a pick-up judgment on the hand pressure at multiple contacts, without gesture recognition of each specific finger, which reduces the complexity of the software calculation and is more efficient than existing accurate pick-up judgment methods. The invention also improves the immersion of the user's virtual reality interaction and improves the user experience.

Description

Picking method and device in virtual reality scene
Technical Field
The invention relates to the field of virtual reality interaction design, and in particular to a method and a device for picking up objects in a virtual reality scene based on multi-contact pressure judgment.
Background
At present, existing virtual reality scene pick-up technology generally adopts the following modes:
(1) focusing on an object in the virtual scene using a control handle and confirming the pick-up using a button on the handle;
(2) using eyeball tracking to pick up an object according to the gaze direction;
(3) performing gesture recognition according to the positional relation of the fingers of the virtual hand to pick up an object.
Chinese patent application publication No. CN103473814A discloses a GPU-based three-dimensional geometric primitive picking method, which disables rasterization during pick rendering, transforms mouse position information and primitive vertex coordinates to a normalized device coordinate system or a viewport coordinate system, performs hit determination in the geometry processor by determining the relation between the projected two-dimensional primitive and the mouse position or selection box, and returns the picking result to the application program using transform feedback. That application mainly performs selection of small, pixel-level targets by mouse clicking and achieves real-time, accurate picking through GPU computation. In this technical scheme, the coupling between virtual reality and actual body motion is weak, and three-dimensional geometric primitives cannot be picked up by limb motion, in particular by hand pick-up motions.
Chinese patent application publication No. CN106406511A discloses a method for determining motion information based on point-contact sensing units, which includes an information acquisition method based on a point-contact sensing-unit data glove and a motion information judgment system, a point-contact sensing-unit data glove, an inductive point-contact sensing-unit data glove, a pressure-sensing point-contact sensing-unit data glove, and a hand-held auxiliary tool for the point-contact sensing-unit data glove; it addresses the problems that traditional data gloves are costly, complex to build, and occupy large computing resources in use. In that scheme, contact data at each main point is collected from the surface of the data acquisition equipment, and the features of the main point-contact data are compared with preset fixed motion data to obtain the expressed motion. The application mainly completes gesture motion recognition through feature comparison against fixed motion data; because it relies on a large amount of pre-recorded fixed motion data and empirical judgment, misjudgment easily occurs for unknown motions when only the fixed motion data features are relied upon.
Chinese patent application publication No. CN106445085A discloses an interaction control method and device based on virtual reality, which includes: when the gaze focus is detected to be in the control area corresponding to a control object, displaying the focus on the control object; and when the duration of the gaze focus in the control area reaches a first preset value, executing the operation corresponding to the control object. This interaction control method and device can lower the requirement on the user's operation precision and thereby improve the user experience. The application mainly judges the pick-up object through a focus point, and the pick-up is completed only after the focus has dwelled in the object's control area for a preset time. It cannot imitate a person's limb movement and cannot give the user a sense of being personally on the scene.
Chinese patent application publication No. CN106445176A discloses a human-computer interaction system and method based on virtual reality technology, including a virtual reality head-mounted display device, a controller, and a body-sensing recognition device. When a user needs to interact with a virtual scene, the body-sensing recognition device captures the user's body-sensing data and outputs it to the controller; the controller converts the acquired body-sensing data into a user motion data processing signal and outputs it to the virtual reality head-mounted display device, controlling the motion model in the displayed virtual scene to move synchronously and complete the human-computer interaction. The application mainly uses somatosensory data to display the user's motion in the virtual reality scene in real time and lacks interactive control between the user and objects in the virtual scene.
Chinese patent application publication No. CN103955295A discloses a real-time grasping method for a virtual hand based on a data glove and a physics engine, which includes: first, starting the computer, the data glove, and the position tracker; second, wearing the data glove and operating the computer to acquire the data sent by the data glove and the position tracker; third, bending the fingers and translating or rotating the palm while wearing the data glove, with the data glove and position tracker sending the position and change data of the fingers and palm to a program that drives the fingers and palm of the virtual hand to move accordingly; fourth, simulating a grabbing action with the real hand wearing the data glove to control the virtual hand in the computer to grab the object, with the program judging whether the virtual object is in a grabbed state; if the grab condition is not satisfied, returning to the third step and continuing to execute the third and fourth steps until the program exits; if the grab condition is satisfied, executing the fifth step; and fifth, controlling the change of the position and posture of the virtual object by calculating the change of the position and posture of the virtual hand, thereby controlling the movement and rotation of the virtual object. In this technical scheme, the data glove is only used to obtain the user's gesture to complete the pick-up action in the virtual reality scene; the application range is narrow, and without the data glove the user's gesture cannot be determined, so objects in the virtual scene cannot be grabbed.
As can be seen from the above patent publications and other prior-art documents, pick-up interaction in virtual reality scenes is mostly completed by controller selection, eyeball tracking, or gesture judgment. Controller selection and eyeball tracking cannot simulate real pick-up actions, while gesture judgment involves a large amount of calculation and is not suitable for real-time scenarios.
Existing pick-up technologies either are not close to natural operation, which reduces the realistic experience brought by virtual reality, or are costly and poorly applicable:
(1) using a control handle: the real pick-up operation cannot be simulated, misoperation is easily caused, and the sense of immersion is lacking;
(2) eyeball tracking: the response speed is low, the precision is not well guaranteed, and operation feedback cannot be provided;
(3) gesture judgment: the amount of calculation is large, and the method is not suitable for high-precision real-time pick-up judgment.
Therefore, there is still great room for improvement in virtual reality pick-up technology.
Disclosure of Invention
In view of this, the present invention provides a method and an apparatus for picking up in a virtual reality scene, so as to efficiently simulate real picking-up operations and improve the immersion of virtual reality user interaction.
The technical scheme of the invention is realized as follows:
a method of picking up in a virtual reality scene, comprising:
receiving finger pressure data;
constructing a pick-up geometry;
determining or modifying the spatial position and the spatial rotation degree of the picking geometric body according to the position data and the orientation data of the virtual hand in the virtual reality scene;
traversing bounding boxes of each object in the virtual reality scene;
determining an object as a pick-up object when its bounding box intersects the spatial location of the pick-up geometry;
modifying a size of the pick-up geometry according to the finger pressure data;
determining whether to pick up the picked-up object and when to release the picked-up object based on the degree of intersection of the pick-up geometry and the bounding box of the picked-up object.
Further, the method further comprises:
when the picking object is determined to be picked up, sending control information for picking up the picking object to the virtual reality equipment;
when it is determined that the pickup object is released, control information for canceling the pickup of the pickup object is transmitted to the virtual reality device.
Further, the finger pressure data is derived from a data glove, a flexible sensor or a pressure sensor mounted on a finger joint.
Further, constructing a pick-up geometry comprises:
acquiring a measurement parameter of the virtual hand;
constructing a picking coordinate system;
and determining the size of the picking geometric body according to the picking coordinate system and the measurement parameters of the virtual hand.
Further, the measurement parameters of the virtual hand comprise a palm length L of the virtual hand, a palm width W of the virtual hand and a thumb length R of the virtual hand;
under the state that the palm of the virtual hand is straightened and the thumb is perpendicular to the rest four fingers, the direction of the index finger of the virtual hand is taken as the positive direction of an x axis, the direction of the thumb is taken as the positive direction of a z axis, and the direction perpendicular to the palm and far away from the palm is taken as the positive direction of a y axis, so that the picking coordinate system is constructed;
the pick-up geometry is located on one side of the palm of the virtual hand.
Further, the geometric forms of the pick-up geometry include:
a pick-up line segment, a pick-up cylinder, a pick-up sphere, a pick-up prism table, and a pick-up cuboid.
Further, the geometry is determined based on the number of fingers measured and the user's selection.
Further, the number of measured fingers is at least two, wherein the measured fingers must include a thumb.
Further, when the number of measured fingers is two, the only user-selectable geometric form is the pick-up line segment;
when the number of measured fingers is three, the user-selectable geometric forms include a pickup line segment, a pickup cylinder, and a pickup sphere;
when the number of the measured fingers is four, the geometric forms selectable by the user comprise a pickup line segment, a pickup cylinder, a pickup ball, a pickup prism table and a pickup cuboid;
when the number of fingers measured is five, the user-selectable geometric forms include a pickup line segment, a pickup cylinder, a pickup sphere, a pickup prism, and a pickup cuboid.
Further, when the geometric form is a pick-up line segment:
the pick-up line segment is located in the plane formed by the x-axis and the z-axis;
the pick-up line segment has an initial length determined by the measurement parameters of the virtual hand, and the length of the pick-up line segment varies from 0 to this initial length.
Further, modifying a size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the length of the pickup line segment, and if the pressure value of the thumb is reduced, increasing the length of the pickup line segment;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the length of the pickup line segment if the maximum pressure value is increased, and increasing the length of the pickup line segment if the maximum pressure value is reduced.
Further, the length of the pickup line segment is increased or decreased according to the following formula:
Length_1' = Length_1 × α / F
where Length_1' is the modified length of the pick-up line segment, Length_1 is the length of the pick-up line segment before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
Further, when the geometric form is a pick-up cylinder:
the axis of the pick-up cylinder is parallel to the z-axis, the height of the pick-up cylinder is W, the pick-up cylinder has an initial diameter determined by the measurement parameters of the virtual hand, and the diameter of the pick-up cylinder varies from 0 to this initial diameter;
the pick-up cylinder is located between the virtual fingers of the virtual hand corresponding to the measured fingers.
Further, modifying a size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the diameter of the pickup cylinder, and if the pressure value of the thumb is reduced, increasing the diameter of the pickup cylinder;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the diameter of the pickup cylinder if the maximum pressure value is increased, and increasing the diameter of the pickup cylinder if the maximum pressure value is reduced.
Further, the diameter of the pick-up cylinder is increased or decreased according to the following formula:
Φ_1' = Φ_1 × α / F
where Φ_1' is the modified diameter of the pick-up cylinder, Φ_1 is the diameter of the pick-up cylinder before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
Further, when the geometric form is a pick-up sphere:
the pick-up sphere has an initial diameter determined by the measurement parameters of the virtual hand, and the diameter of the pick-up sphere varies from 0 to this initial diameter;
the pick-up sphere is located between the virtual fingers of the virtual hand corresponding to the measured fingers.
Further, modifying a size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the diameter of the picking ball, and if the pressure value of the thumb is reduced, increasing the diameter of the picking ball;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the diameter of the picking ball body if the maximum pressure value is increased, and increasing the diameter of the picking ball body if the maximum pressure value is reduced.
Further, the diameter of the pick up sphere is increased or decreased according to the following formula:
Φ_2' = Φ_2 × α / F
where Φ_2' is the modified diameter of the pick-up sphere, Φ_2 is the diameter of the pick-up sphere before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
Further, when the geometric form is a pick-up prism table:
the bottom surface of the pick-up prism table has a length of H and a width of W; the top surface of the pick-up prism table has an initial length of L, a length that varies from 0 to L, and a width of W; the pick-up prism table has an initial height of R, and its height varies from 0 to R.
Further, modifying a size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the length of the picking prism table in the y-axis direction, and if the pressure value of the thumb is reduced, increasing the length of the picking prism table in the y-axis direction;
acquiring the maximum pressure value of the pressure values of the fingers except the thumb, if the maximum pressure value is increased, reducing the length of the picking prismatic table in the x-axis direction, and if the maximum pressure value is reduced, increasing the length of the picking prismatic table in the x-axis direction.
Further, the length of the pickup prism in the x-axis and/or y-axis direction is increased or decreased according to the following formula:
Length_2' = Length_2 × α / F
where Length_2' is the modified length of the pick-up prism table in the x-axis and/or y-axis direction, Length_2 is the length of the pick-up prism table in the x-axis and/or y-axis direction before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
Further, when the geometry is a pick-up cuboid:
the initial length value of the picking cuboid in the x-axis direction is L, the initial length value of the picking cuboid in the y-axis direction is R, and the initial length value of the picking cuboid in the z-axis direction is W.
Further, modifying a size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of a thumb, if the pressure value of the thumb is increased, reducing the length of the picking cuboid in the y-axis direction, and if the pressure value of the thumb is reduced, increasing the length of the picking cuboid in the y-axis direction;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the length of the picking cuboid in the x-axis direction if the maximum pressure value is increased, and increasing the length of the picking cuboid in the x-axis direction if the maximum pressure value is reduced.
Further, the length of the pick-up geometry in the x-axis and/or y-axis direction is increased or decreased according to the following formula:
Length_3' = Length_3 × α / F
where Length_3' is the modified length of the pick-up cuboid in the x-axis and/or y-axis direction, Length_3 is the length of the pick-up cuboid in the x-axis and/or y-axis direction before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
Further, when the geometric form is a pick-up line segment, determining whether to pick up the pick-up object and when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object includes:
if the pick-up object is not in the pick-up state, calculating the length of the intersection of the pick-up line segment and the bounding box of the pick-up object and the length of the current pick-up line segment, and if the pick-up condition formula (1) is satisfied, determining the pick-up object to be in the pick-up state:
L_intersect ÷ L_pick > β (1)
where L_intersect is the length of the intersection of the pick-up line segment and the bounding box of the pick-up object, L_pick is the length of the current pick-up line segment, and β is the pick-up judgment threshold;
if the pick-up object is already in the pick-up state, calculating the length of the intersection of the pick-up line segment and the bounding box of the pick-up object and the length of the current pick-up line segment, and if the pick-up condition formula (1) is no longer satisfied and remains unsatisfied throughout a first delay period starting from the moment it is no longer satisfied, releasing the pick-up object and clearing its pick-up state.
Further, when the geometry is a pick cylinder, determining whether to pick up the pick up object and when to release the pick up object based on the degree of intersection of the pick up geometry and the bounding box of the pick up object, comprises:
if the picking object is not in the picking state, calculating the area of the intersection part of the cross section of the picking cylinder and the bounding box of the picking object, calculating the current cross section area of the picking cylinder, and if the picking condition formula (2) is met, determining the picking object as the picking state:
A_intersect1 ÷ A_pick1 > β (2)
where A_intersect1 is the area of the intersection of the cross-section of the pick-up cylinder with the bounding box of the pick-up object, A_pick1 is the area of the current cross-section of the pick-up cylinder, and β is the pick-up judgment threshold;
if the pickup object is already in the pickup state, calculating the area of the intersection part of the cross section of the pickup cylinder and the bounding box of the pickup object, and calculating the current cross section area of the pickup cylinder, if the pickup condition formula (2) is no longer satisfied and the pickup condition formula (2) is not satisfied within a first delay time period from the time when the pickup condition formula (2) is no longer satisfied, releasing the pickup object, and clearing the pickup state of the pickup object.
Further, when the geometry is a pick sphere, determining whether to pick up the pick object and when to release the pick object based on the degree of intersection of the pick geometry and the bounding box of the pick object, comprises:
if the picked object is not in the picking state, calculating the area of the intersection part of the cross section of the picked sphere and the bounding box of the picked object, calculating the current cross section area of the picked sphere, and if the picking condition formula (3) is met, determining the picked object as the picking state:
A_intersect2 ÷ A_pick2 > β (3)
where A_intersect2 is the area of the intersection of the cross-section of the pick-up sphere with the bounding box of the pick-up object, A_pick2 is the area of the current cross-section of the pick-up sphere, and β is the pick-up judgment threshold;
if the picking object is in the picking state, calculating the area of the intersection part of the cross section of the picking sphere and the bounding box of the picking object, calculating the current cross section area of the picking sphere, if the picking conditional formula (3) is not satisfied any more and the picking conditional formula (3) is not satisfied within a first delay time period from the moment when the picking conditional formula (3) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object.
Further, when the geometric form is a pick-up prism table, determining whether to pick up the pick-up object and when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object comprises:
if the pickup object is not in a pickup state, calculating a volume of an intersection of the pickup prism and a bounding box of the pickup object, and calculating a current volume of the pickup prism, and if a pickup condition formula (4) is satisfied, determining the pickup object as a pickup state:
V_intersect1 ÷ V_pick1 > β (4)
where V_intersect1 is the volume of the intersection of the pick-up prism table and the bounding box of the pick-up object, V_pick1 is the current volume of the pick-up prism table, and β is the pick-up judgment threshold;
if the pickup object is already in the pickup state, calculating the volume of the intersection of the pickup prism and the bounding box of the pickup object, and calculating the current volume of the pickup prism, if the pickup condition formula (4) is no longer satisfied and the pickup condition formula (4) is not satisfied within a first delay period from the time when the pickup condition formula (4) is no longer satisfied, releasing the pickup object, and clearing the pickup state of the pickup object.
Further, when the geometry is a pick-up cuboid, determining whether to pick up the pick-up object and when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object, comprises:
if the picking object is not in the picking state, calculating the volume of the intersection part of the picking rectangular parallelepiped and the bounding box of the picking object, and calculating the current volume of the picking rectangular parallelepiped, if the picking condition formula (5) is satisfied, determining the picking object as the picking state:
V_intersect2 ÷ V_pick2 > β (5)
where V_intersect2 is the volume of the intersection of the pick-up cuboid and the bounding box of the pick-up object, V_pick2 is the current volume of the pick-up cuboid, and β is the pick-up judgment threshold;
if the picking object is in the picking state, calculating the volume of the intersection part of the picking cuboid and the bounding box of the picking object, calculating the volume of the current picking cuboid, and if the picking condition formula (5) is not satisfied any more and within a first delay time period from the moment when the picking condition formula (5) is not satisfied any more, releasing the picking object and clearing the picking state of the picking object.
Further, β is 0.8 to 0.95.
Further, the duration of the first delay period is 0.2 s to 1.0 s.
Further, the method further comprises:
and further determining whether to pick the pick object according to the pick requirement of the pick object.
Further, the pick requirements for the pick object include: one-handed pick or two-handed pick.
Further, when the width of the picked-up object exceeds a pick-up size threshold, and/or when the weight of the picked-up object exceeds a pick-up weight threshold, the pick-up requirement of the picked-up object is a two-handed pick-up;
when the width of the picked-up object does not exceed the pick-up size threshold, and/or when the weight of the picked-up object does not exceed the pick-up weight threshold, the pick-up requirement of the picked-up object is a one-handed pick-up.
Further, the position data and orientation data of the virtual hand in the virtual reality scene are determined by the following method:
acquiring initial position data of a virtual hand in a virtual reality scene;
acquiring acceleration for controlling the virtual hand through an accelerometer;
acquiring an angular velocity for controlling the virtual hand through a gyroscope;
determining the speed of the virtual hand according to the acceleration and the refreshing time of the virtual reality scene;
acquiring new position data of the virtual hand in the virtual reality scene according to the speed of the virtual hand, the refreshing time of the virtual reality scene and the initial position data of the virtual hand in the virtual reality scene;
and carrying out data fusion on the acceleration and the angular speed to obtain orientation data of the virtual hand.
Further, the data fusion is performed by a complementary filtering method, a Kalman filtering method, and/or a gradient descent method.
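A minimal sketch of this position and orientation update over one scene-refresh interval, using a simple complementary filter for the orientation (one of the fusion options named above); gravity handling is reduced to two separate accelerometer inputs, the quaternion form normally used in practice is omitted, and all names are illustrative assumptions rather than names from the patent.

```python
import numpy as np


def update_hand_state(position, velocity, orientation,
                      lin_accel, raw_accel, gyro, dt, k=0.98):
    """Advance the virtual-hand position and orientation by one scene refresh.

    position, velocity: 3-vectors; orientation: (roll, pitch, yaw) in radians;
    lin_accel: acceleration with gravity removed, used for the motion update;
    raw_accel: raw accelerometer reading (gravity included), used for the tilt;
    gyro: angular velocity in rad/s; dt: scene refresh time in seconds;
    k: complementary-filter weight given to the integrated gyro angles.
    """
    lin_accel = np.asarray(lin_accel, float)
    raw_accel = np.asarray(raw_accel, float)
    gyro = np.asarray(gyro, float)
    orientation = np.asarray(orientation, float)

    # Velocity from acceleration, then new position from velocity.
    velocity = np.asarray(velocity, float) + lin_accel * dt
    position = np.asarray(position, float) + velocity * dt

    # Complementary filtering: blend the integrated gyro angles with the
    # tilt angles implied by the accelerometer (yaw is left to the gyro).
    roll_acc = np.arctan2(raw_accel[1], raw_accel[2])
    pitch_acc = np.arctan2(-raw_accel[0], np.hypot(raw_accel[1], raw_accel[2]))
    integrated = orientation + gyro * dt
    orientation = np.array([k * integrated[0] + (1 - k) * roll_acc,
                            k * integrated[1] + (1 - k) * pitch_acc,
                            integrated[2]])
    return position, velocity, orientation
```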
A pickup device in a virtual reality scene, comprising:
a data acquisition module for receiving finger pressure data;
an IMU module for acquiring position data and orientation data of a virtual hand in the virtual reality scene;
a physical computing module that performs computation according to the data provided by the IMU module to construct a pick-up geometry, determines or modifies the spatial position and spatial rotation of the pick-up geometry, traverses the bounding boxes of the objects in the virtual reality scene, determines an object as a pick-up object when its bounding box intersects the spatial position of the pick-up geometry, modifies the size of the pick-up geometry according to the finger pressure data, and determines whether to pick up the pick-up object and when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object.
Further, the apparatus further comprises:
the communication module is connected to the physical computing module and is communicated with the virtual reality equipment;
when the picking object is determined to be picked up, the physical computing module sends control information for picking up the picking object to the virtual reality device through the communication module;
when the picking object is determined to be loosened, the physical computing module sends control information for canceling the picking of the picking object to the virtual reality device through the communication module.
Further, the data acquisition module is connected to a data glove, a flexible sensor or a pressure sensor mounted on a finger joint, so as to acquire and receive finger pressure data through the data glove, the flexible sensor or the pressure sensor mounted on the finger joint.
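To make the module boundaries of the device concrete, the following is a minimal structural sketch only; the class and method names (DataAcquisitionModule, IMUModule, PhysicsComputingModule, CommunicationModule) are illustrative assumptions, not names defined by the patent.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class DataAcquisitionModule:
    """Receives finger pressure data from a data glove or pressure sensors."""
    pressures: Dict[str, float] = field(default_factory=dict)  # finger -> pressure


@dataclass
class IMUModule:
    """Provides position and orientation data of the virtual hand."""
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    orientation: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # e.g. Euler angles


class CommunicationModule:
    """Sends pick-up / release control information to the VR device."""

    def send(self, message: dict) -> None:
        print("to VR device:", message)


class PhysicsComputingModule:
    """Builds the pick-up geometry and performs the intersection judgment."""

    def __init__(self, acq: DataAcquisitionModule, imu: IMUModule,
                 comm: CommunicationModule) -> None:
        self.acq, self.imu, self.comm = acq, imu, comm

    def update(self, scene_objects: List[object]) -> None:
        # 1. place the pick-up geometry at the virtual-hand pose from self.imu;
        # 2. resize it from the finger pressures held by self.acq;
        # 3. intersect it with each object's bounding box and, on a pick or
        #    release decision, notify the VR device through self.comm.
        pass
```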
According to the picking method and device in a virtual reality scene provided by the invention, the picking up of objects in the virtual reality scene is judged based on multi-contact pressure, providing a realistic and efficient pick-up operation for current virtual reality equipment. Compared with the prior art, a pick-up judgment is made only on the hand pressure at multiple contacts, without gesture recognition of each specific finger, which reduces the complexity of the software calculation and is more efficient than existing accurate pick-up judgment methods. The invention also improves the immersion of the user's virtual reality interaction and improves the user experience.
Drawings
FIG. 1 is a flow chart of the steps of a pick-up method in a virtual reality scenario of the present invention;
FIG. 2 is a flow chart of the steps of the present invention to construct a pick geometry;
FIG. 3 is a schematic view of a virtual hand;
FIG. 4 is a schematic diagram of a pick up coordinate system constructed by a virtual hand;
FIG. 5 is a schematic view of a pickup line segment;
FIG. 6 is a schematic view of a pick-up cylinder;
FIG. 7 is a schematic view of a pick up sphere;
FIG. 8 is a schematic view of a pickup prism;
FIG. 9 is a schematic view of a pick-up cuboid;
FIG. 10 is another schematic view of a pick-up cuboid;
FIG. 11 is a flowchart of the steps for obtaining position data and orientation data of the virtual hand in a virtual reality scene;
FIG. 12 is a block diagram of a pick-up device in a virtual reality scenario according to the present invention;
FIG. 13 is a flowchart of one particular implementation of a method and apparatus embodying the present invention;
FIG. 14 is a flow chart illustrating updating the position of a virtual hand;
FIG. 15A is a schematic view of a virtual hand approaching a ball to be picked up in scene 1;
FIG. 15B is a schematic view of a user bending a finger in an attempt to pick up a ball in scenario 1;
FIG. 15C is a schematic view of a ball being picked up by a user in scene 1;
FIG. 16A is a diagram of a user-selected line segment as "pick geometry" in scenario 2;
FIG. 16B is a schematic view of a user beginning to pick up a match by bending their finger in scene 2;
FIG. 16C is a schematic illustration of a match being picked in scenario 2;
FIG. 17A is a diagram of a user selecting a cylinder as a "pick geometry" in scenario 3;
FIG. 17B is a schematic view of a user beginning to pick up a cup by bending a finger in scenario 3;
FIG. 17C is a schematic view of a cup being picked up in scenario 3;
FIG. 18A is a diagram of a user selecting a sphere as a "pick geometry" in scenario 4;
FIG. 18B is a diagram of a user beginning to pick up an apple by bending a finger in scene 4;
FIG. 18C is a schematic diagram of an apple being picked in scene 4;
FIG. 19A is a diagram of a user selecting a cuboid as a "pick-up geometry" in scenario 5;
FIG. 19B is a schematic view of a user beginning to pick up a pencil box by bending a finger in scenario 5;
FIG. 19C is a schematic view of a pencil case being picked up in scene 5;
FIG. 20A is a schematic view of a user selecting a prism table as a "pick-up geometry" in scene 6;
FIG. 20B is a schematic view of a user beginning to pick up a lunch box by bending a finger in scene 6;
FIG. 20C is a schematic view of a lunch box being picked in scenario 6;
FIG. 21A is a diagram of scenario 7 where a user has only one hand to satisfy a large case pick condition;
FIG. 21B is a diagram of scenario 7 where the user has both hands satisfying the large box pick condition;
FIG. 21C is a schematic view of scene 7 with a large box being picked;
FIG. 22A is a diagram of a scenario 8 in which a user has only one hand satisfying shot pick conditions;
FIG. 22B is a diagram illustrating a scenario 8 in which both hands of the user satisfy shot pick-up conditions;
FIG. 22C is a schematic view of a shot being picked in scene 8;
FIG. 23 is a schematic view of an interface window for a pick-up method in scenario 9, which asks the user whether to use existing gesture recognition;
FIG. 24A is a diagram of a pop-up message showing the currently employed pickup method when picking up an apple in scene 10;
FIG. 24B is a schematic diagram of a pop-up message showing the currently employed pickup method when a pencil box is picked up in scene 10;
FIG. 25 is a schematic diagram of a VR device in scene 11 entering a power saving mode and automatically switching to cup pickup via a pickup geometry according to an embodiment of the invention;
FIG. 26A is a schematic diagram of a pick-up method that automatically switches to an embodiment of the invention when the temperature in scene 12 rises above the warning line of a VR device;
FIG. 26B is a diagram illustrating an automatic switch to a pick-up method according to an embodiment of the invention when the frame count in scene 12 is insufficient;
FIG. 26C is a schematic diagram of automatic switching to the existing gesture recognition pickup method in scene 12.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and examples.
As shown in fig. 1, the method for picking up in a virtual reality scene provided by the present invention mainly includes the following steps:
step 1, receiving finger pressure data;
step 2, constructing a pickup geometry;
step 3, determining or modifying the spatial position and the spatial rotation degree of the picking-up geometric body according to the position data and the orientation data of the virtual hand in the virtual reality scene;
step 4, traversing bounding boxes of all objects in the virtual reality scene;
step 5, when the bounding box of an object intersects with the spatial position of the picking geometric body, determining the object as a picking object;
step 6, modifying the size of the picking geometric body according to the finger pressure data;
and 7, determining whether the picked object is picked up and when the picked object is released according to the intersection degree of the picking geometric body and the bounding box of the picked object.
In addition, the method for picking up in the virtual reality scene further comprises the following steps:
when the picking object is determined to be picked up, sending control information for picking up the picking object to the virtual reality equipment;
when it is determined that the pickup object is released, control information for canceling the pickup of the pickup object is transmitted to the virtual reality device.
In the embodiment of the invention, the finger pressure data is from a data glove, a flexible sensor or a pressure sensor arranged on a finger joint.
In the invention, the pick-up geometry is used to judge whether a pick-up object in the virtual space is within the space that can be grasped by the virtual hand. The pick-up geometry is located on the palm side of the virtual hand rather than on the back-of-hand side, and its size changes with the movement of the fingers of the virtual hand. When the bounding box of the pick-up object intersects the pick-up geometry, the pick-up object is within the space that can be grasped by the virtual hand, and whether to pick up the virtual object can then be further judged according to the degree of intersection between the bounding box and the pick-up geometry.
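Read as a per-frame loop, the overall flow of steps 1 to 7 can be sketched as follows. This is an assumed outline only: the helper names (set_pose, resize_by_pressure, intersects, intersection_ratio) and the picked flag are placeholders, and the release step is simplified to a single-frame check rather than the first delay period described later.

```python
def pick_update(frame, geometry, scene_objects, beta=0.9):
    """One pick-up judgment pass for a single refresh of the VR scene."""
    # Steps 1-3: finger pressures and the hand pose arrive with the frame.
    geometry.set_pose(frame.hand_position, frame.hand_orientation)
    # Step 6: the geometry shrinks or grows with the measured finger pressures.
    geometry.resize_by_pressure(frame.finger_pressures)

    for obj in scene_objects:                       # step 4: traverse objects
        if not geometry.intersects(obj.bounding_box):
            continue                                # step 5: not a pick-up object
        ratio = geometry.intersection_ratio(obj.bounding_box)   # step 7
        if not obj.picked and ratio > beta:
            obj.picked = True                       # send "pick up" control info
        elif obj.picked and ratio <= beta:
            obj.picked = False                      # release (delay omitted here)
```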
As shown in fig. 2, in step 2 of the present invention, the pick-up geometry is constructed by the following method:
step 21, obtaining measurement parameters of the virtual hand;
step 22, constructing a pickup coordinate system;
and step 23, determining the size of the picking geometric body according to the picking coordinate system and the measurement parameters of the virtual hand.
As shown in fig. 3, the measurement parameters of the virtual hand include a palm length L of the virtual hand, a palm width W of the virtual hand, and a thumb length R of the virtual hand. As shown in fig. 4, in a state where the palm of the virtual hand is straightened and the thumb is perpendicular to the remaining four fingers, the picking coordinate system is constructed with the index finger direction of the virtual hand as the positive x-axis direction, the thumb direction as the positive z-axis direction, and the direction perpendicular to the palm and away from the palm as the positive y-axis direction.
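A minimal sketch of constructing this pick-up coordinate system, assuming the index-finger and thumb directions of the virtual hand are available as world-space vectors; the function name build_pick_frame is illustrative, and whether the computed y axis points away from the palm depends on left- or right-handedness, so its sign may need to be flipped.

```python
import numpy as np


def build_pick_frame(index_dir, thumb_dir):
    """Return unit vectors (x, y, z) of the pick-up coordinate system.

    x: index-finger direction; z: thumb direction (re-orthogonalized);
    y: perpendicular to the palm, taken here as z cross x so that
    (x, y, z) form a right-handed frame.
    """
    x = np.asarray(index_dir, dtype=float)
    x /= np.linalg.norm(x)
    z = np.asarray(thumb_dir, dtype=float)
    z -= np.dot(z, x) * x          # remove any component along the index finger
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return x, y, z


# Example: index finger along world X, thumb roughly along world Z.
x_axis, y_axis, z_axis = build_pick_frame([1.0, 0.0, 0.0], [0.1, 0.0, 1.0])
```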
In an embodiment of the present invention, the pick-up geometry includes the following geometric forms: a pick-up line segment, a pick-up cylinder, a pick-up sphere, a pick-up prism table (a frustum), and a pick-up cuboid. According to the spirit of the present invention, adopting other similar geometric forms in addition to these is also within the technical scope of the present invention.
The construction of the pick-up line segment, the pick-up cylinder, the pick-up sphere, the pick-up prism table, and the pick-up cuboid is explained below.
In an embodiment of the invention, the geometrical form is determined according to the number of measured fingers and the selection of the user. Wherein, the number of the measured fingers is at least two, wherein, the measured fingers must include the thumb.
In the embodiment of the invention, when the number of the measured fingers is two, the geometric form selectable by a user is a pickup line segment; when the number of measured fingers is three, the user-selectable geometric forms include a pickup line segment, a pickup cylinder, and a pickup sphere; when the number of the measured fingers is four, the geometric forms selectable by the user comprise a pickup line segment, a pickup cylinder, a pickup ball, a pickup prism table and a pickup cuboid; when the number of fingers measured is five, the user-selectable geometric forms include a pickup line segment, a pickup cylinder, a pickup sphere, a pickup prism, and a pickup cuboid.
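This selection rule can be captured in a simple lookup from the number of measured fingers (which must include the thumb) to the selectable geometric forms, as in the following sketch; the identifiers are assumed names, not names from the patent.

```python
# Selectable geometric forms keyed by the number of measured fingers
# (the thumb must always be one of the measured fingers).
SELECTABLE_GEOMETRIES = {
    2: ["line_segment"],
    3: ["line_segment", "cylinder", "sphere"],
    4: ["line_segment", "cylinder", "sphere", "prism_table", "cuboid"],
    5: ["line_segment", "cylinder", "sphere", "prism_table", "cuboid"],
}


def choose_geometry(num_measured_fingers: int, user_choice: str) -> str:
    """Validate the user's choice against the number of measured fingers."""
    options = SELECTABLE_GEOMETRIES.get(num_measured_fingers, [])
    if user_choice not in options:
        raise ValueError(f"{user_choice!r} is not selectable with "
                         f"{num_measured_fingers} measured fingers")
    return user_choice
```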
(a) Picking up line segments
As shown in FIG. 5, when the geometric form is a pick-up line segment, the pick-up line segment is located in the plane formed by the x-axis and the z-axis; the pick-up line segment has an initial length determined by the measurement parameters of the virtual hand, and the length of the pick-up line segment varies from 0 to this initial length.
When the pressure values of only two fingers can be measured, the geometric form of the embodiment of the invention can only adopt a pickup line segment, and when the pressure values of more than two fingers can be measured, the pickup line segment can also be adopted according to the selection of a user and/or the setting of a system.
Further, in step 6, modifying the size of the pick-up geometry according to the finger pressure data, when the geometric form is a pick-up line segment, includes:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the length of the pickup line segment, and if the pressure value of the thumb is reduced, increasing the length of the pickup line segment;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the length of the pickup line segment if the maximum pressure value is increased, and increasing the length of the pickup line segment if the maximum pressure value is reduced.
Specifically, the length of the pickup line segment is increased or decreased according to the following formula:
Length_1' = Length_1 × α / F
where Length_1' is the modified length of the pick-up line segment, Length_1 is the length of the pick-up line segment before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
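A sketch of this update, clamped to the stated range of 0 to the initial length; the same Length' = Length × α / F scaling is reused below for the cylinder and sphere diameters and for the prism-table and cuboid edge lengths, so only this one sketch is given. The variable names and the example values of α and the initial length are assumptions.

```python
def scale_dimension(current: float, pressure: float, alpha: float,
                    initial: float) -> float:
    """Apply Length' = Length × α / F and clamp the result to [0, initial].

    A larger finger pressure F shrinks the dimension; as the pressure drops,
    the dimension grows back toward its initial value.
    """
    if pressure <= 0.0:
        return initial                       # no pressure: fully open geometry
    return max(0.0, min(current * alpha / pressure, initial))


# Example: with α = 1.0 the pick-up line segment shrinks as the grip tightens.
length = 0.12                                # assumed initial length in metres
for f in (1.0, 2.0, 4.0):                    # increasing thumb pressure
    length = scale_dimension(length, f, alpha=1.0, initial=0.12)
    print(round(length, 4))                  # 0.12, 0.06, 0.015
```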
(b) Pick-up cylinder
As shown in FIG. 6, when the geometric form is a pick-up cylinder, the axis of the pick-up cylinder is parallel to the z-axis, the height of the pick-up cylinder is W, the pick-up cylinder has an initial diameter determined by the measurement parameters of the virtual hand, and the diameter of the pick-up cylinder varies from 0 to this initial diameter; the pick-up cylinder is located between the virtual fingers of the virtual hand corresponding to the measured fingers.
Further, in step 6, modifying the size of the pick-up geometry according to the finger pressure data, when the geometric form is a pick-up cylinder, includes:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the diameter of the pickup cylinder, and if the pressure value of the thumb is reduced, increasing the diameter of the pickup cylinder;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the diameter of the pickup cylinder if the maximum pressure value is increased, and increasing the diameter of the pickup cylinder if the maximum pressure value is reduced.
In particular, the diameter of the pick-up cylinder is increased or decreased according to the following formula:
Φ_1' = Φ_1 × α / F
where Φ_1' is the modified diameter of the pick-up cylinder, Φ_1 is the diameter of the pick-up cylinder before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
(c) Picking up ball
As shown in FIG. 7, when the geometric form is a pick-up sphere, the pick-up sphere has an initial diameter determined by the measurement parameters of the virtual hand, and the diameter of the pick-up sphere varies from 0 to this initial diameter; the pick-up sphere is located between the virtual fingers of the virtual hand corresponding to the measured fingers.
Further, in step 6, modifying the size of the pick-up geometry according to the finger pressure data, when the geometric form is a pick-up sphere, includes:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the diameter of the picking ball, and if the pressure value of the thumb is reduced, increasing the diameter of the picking ball;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the diameter of the picking ball body if the maximum pressure value is increased, and increasing the diameter of the picking ball body if the maximum pressure value is reduced.
Specifically, the diameter of the pick up sphere is increased or decreased according to the following equation:
Φ_2' = Φ_2 × α / F
where Φ_2' is the modified diameter of the pick-up sphere, Φ_2 is the diameter of the pick-up sphere before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
(d) Picking-up prism table
As shown in FIG. 8, when the geometric form is a pick-up prism table, the length of the bottom surface of the pick-up prism table is H and its width is W, the initial length of the top surface of the pick-up prism table is L, the length of the top surface varies from 0 to L, the width of the top surface is W, the initial height of the pick-up prism table is R, and the height of the pick-up prism table varies from 0 to R.
Further, in step 6, modifying the size of the pick-up geometry according to the finger pressure data, when the geometric form is a pick-up prism table, includes:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the length of the picking prism table in the y-axis direction, and if the pressure value of the thumb is reduced, increasing the length of the picking prism table in the y-axis direction;
acquiring the maximum pressure value of the pressure values of the fingers except the thumb, if the maximum pressure value is increased, reducing the length of the picking prismatic table in the x-axis direction, and if the maximum pressure value is reduced, increasing the length of the picking prismatic table in the x-axis direction.
Specifically, the length of the pickup prism in the x-axis and/or y-axis direction is increased or decreased according to the following formula:
Length_2' = Length_2 × α / F
where Length_2' is the modified length of the pick-up prism table in the x-axis and/or y-axis direction, Length_2 is the length of the pick-up prism table in the x-axis and/or y-axis direction before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
(e) Pick up cuboid
As shown in fig. 9, when the geometry is a pick-up rectangular parallelepiped, the pick-up rectangular parallelepiped has an initial length value L in the x-axis direction, an initial length value R in the y-axis direction, and an initial length value W in the z-axis direction.
Further, in step 6, modifying the size of the pick-up geometry according to the finger pressure data, when the geometric form is a pick-up cuboid, includes:
acquiring a pressure value of a thumb, if the pressure value of the thumb is increased, reducing the length of the picking cuboid in the y-axis direction, and if the pressure value of the thumb is reduced, increasing the length of the picking cuboid in the y-axis direction;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the length of the picking cuboid in the x-axis direction if the maximum pressure value is increased, and increasing the length of the picking cuboid in the x-axis direction if the maximum pressure value is reduced.
Specifically, the length of the pickup cuboid in the x-axis and/or y-axis direction is increased or decreased according to the following formula:
Length_3' = Length_3 × α / F
where Length_3' is the modified length of the pick-up cuboid in the x-axis and/or y-axis direction, Length_3 is the length of the pick-up cuboid in the x-axis and/or y-axis direction before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
In addition, in a specific embodiment, for the picking-up rectangular parallelepiped, as shown in fig. 10, the maximum length value of the picking-up rectangular parallelepiped is the palm length L of the virtual hand, the width of the picking-up rectangular parallelepiped is the palm width W of the virtual hand, and the maximum height value of the picking-up rectangular parallelepiped is the thumb length R of the virtual hand. The pick-up cuboid is located on one side of the palm of the virtual hand and in the pick-up cuboid shown in fig. 10, the plane abgh is the plane next to the palm. When the virtual hand is stretched, the length of the picking cuboid reaches L.
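Under the axis conventions above, the initial pick-up cuboid can be written directly from the measurement parameters L, W, and R. The sketch below is an assumption about the sign conventions (in particular, which side of the palm corresponds to positive y) and uses a (min corner, max corner) box representation; it is reused by the intersection sketch at the end of this description.

```python
def initial_pick_cuboid(L: float, W: float, R: float):
    """Initial pick-up cuboid in the pick-up coordinate system.

    Length L along the x-axis (palm length), height R along the y-axis
    (thumb length, assumed on the palm side), width W along the z-axis
    (palm width). Returned as ((min_x, min_y, min_z), (max_x, max_y, max_z)).
    """
    return ((0.0, 0.0, 0.0), (L, R, W))


palm_box = initial_pick_cuboid(L=0.10, W=0.09, R=0.06)  # assumed hand sizes in metres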
As shown in FIG. 11, in step 3, the position data and orientation data of the virtual hand in the virtual reality scene are determined by the following steps:
step 31, acquiring initial position data of the virtual hand in the virtual reality scene;
step 32, acquiring the acceleration for controlling the virtual hand through an accelerometer;
step 33, acquiring the angular velocity for controlling the virtual hand through a gyroscope;
step 34, determining the speed of the virtual hand according to the acceleration and the refresh time of the virtual reality scene;
step 35, obtaining new position data of the virtual hand in the virtual reality scene according to the speed of the virtual hand, the refresh time of the virtual reality scene, and the initial position data of the virtual hand in the virtual reality scene;
and step 36, performing data fusion on the acceleration and the angular velocity to obtain the orientation data of the virtual hand, where the data fusion is performed by a complementary filtering method, a Kalman filtering method, and/or a gradient descent method.
In step 7, whether to pick up the pick-up object and when to release the pick-up object are determined according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object. The pick-up line segment, the pick-up cylinder, the pick-up sphere, the pick-up prism table, and the pick-up cuboid each use a different concrete form of this judgment, but the principle is the same, as detailed below.
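Before the per-form details below, the common structure of this judgment (an intersection ratio compared with β, and release only after the first delay period) can be sketched as follows for the line-segment case; the class and attribute names are assumptions, time.monotonic() stands in for whatever clock the VR runtime provides, and the same structure applies to formulas (2) to (5) with areas or volumes in place of lengths.

```python
import time


class LineSegmentPicker:
    """Pick/release state machine driven by L_intersect / L_pick vs. β."""

    def __init__(self, beta: float = 0.9, delay: float = 0.5):
        self.beta = beta            # pick-up judgment threshold (0.8 to 0.95)
        self.delay = delay          # first delay period in seconds (0.2 to 1.0)
        self.picked = False
        self._unsatisfied_since = None

    def update(self, l_intersect: float, l_pick: float) -> bool:
        """Return the new picked / not-picked state for the current frame."""
        satisfied = l_pick > 0 and (l_intersect / l_pick) > self.beta
        if not self.picked:
            if satisfied:
                self.picked = True
                self._unsatisfied_since = None
        else:
            if satisfied:
                self._unsatisfied_since = None
            else:
                now = time.monotonic()
                if self._unsatisfied_since is None:
                    self._unsatisfied_since = now
                elif now - self._unsatisfied_since >= self.delay:
                    self.picked = False          # release and clear the state
                    self._unsatisfied_since = None
        return self.picked
```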
(a) Picking up line segments
When the geometric form is a pick-up line segment, in step 7, determining whether to pick up the pick-up object and when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object includes:
if the pick-up object is not in the pick-up state, calculating the length of the intersection of the pick-up line segment and the bounding box of the pick-up object and the length of the current pick-up line segment, and if the pick-up condition formula (1) is satisfied, determining the pick-up object to be in the pick-up state:
L_intersect ÷ L_pick > β (1)
where L_intersect is the length of the intersection of the pick-up line segment and the bounding box of the pick-up object, L_pick is the length of the current pick-up line segment, and β is the pick-up judgment threshold;
if the pick-up object is already in the pick-up state, calculating the length of the intersection of the pick-up line segment and the bounding box of the pick-up object and the length of the current pick-up line segment, and if the pick-up condition formula (1) is no longer satisfied and remains unsatisfied throughout a first delay period starting from the moment it is no longer satisfied, releasing the pick-up object and clearing its pick-up state.
(b) Pick-up cylinder
When the geometry is a pick-up cylinder, determining whether to pick up the pick-up object and when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object in step 7 comprises:
if the picking object is not in the picking state, calculating the area of the intersection part of the cross section of the picking cylinder and the bounding box of the picking object, calculating the current cross section area of the picking cylinder, and if the picking condition formula (2) is met, determining the picking object as the picking state:
A_intersect1 ÷ A_pick1 > β (2)
where A_intersect1 is the area of the intersection of the cross-section of the pick-up cylinder with the bounding box of the pick-up object, A_pick1 is the area of the current cross-section of the pick-up cylinder, and β is the pick-up judgment threshold;
if the pickup object is already in the pickup state, calculating the area of the intersection part of the cross section of the pickup cylinder and the bounding box of the pickup object, and calculating the current cross section area of the pickup cylinder, if the pickup condition formula (2) is no longer satisfied and the pickup condition formula (2) is not satisfied within a first delay time period from the time when the pickup condition formula (2) is no longer satisfied, releasing the pickup object, and clearing the pickup state of the pickup object.
The cross section of the picking cylinder is any plane perpendicular to the axis of the cylinder.
(c) Picking up ball
When the geometry is a picking sphere, in step 7, determining whether to pick up the picking object and when to release the picking object according to the intersection degree of the picking geometry and the bounding box of the picking object includes:
if the picked object is not in the picking state, calculating the area of the intersection part of the cross section of the picked sphere and the bounding box of the picked object, calculating the current cross section area of the picked sphere, and if the picking condition formula (3) is met, determining the picked object as the picking state:
A_intersect2 ÷ A_pick2 > β (3)
wherein A_intersect2 is the area of the intersection of the cross section of the pick-up sphere with the bounding box of the pick-up object, A_pick2 is the area of the current cross section of the pick-up sphere, and β is the pick-up judgment threshold;
if the pick-up object is already in the pick-up state, calculating the area of the intersection of the cross section of the pick-up sphere with the bounding box of the pick-up object and the area of the current cross section of the pick-up sphere; if pick-up condition formula (3) is no longer satisfied, and remains unsatisfied throughout a first delay period starting from the moment it first ceased to be satisfied, releasing the pick-up object and clearing its pick-up state.
Here, the cross section of the pick-up sphere is taken in any plane passing through the center of the sphere.
(d) Pick-up prism table
When the geometry is a picking prism, in step 7, determining whether to pick up the picking object and when to release the picking object according to the intersection degree of the picking geometry and the bounding box of the picking object includes:
if the pickup object is not in a pickup state, calculating a volume of an intersection of the pickup prism and a bounding box of the pickup object, and calculating a current volume of the pickup prism, and if a pickup condition formula (4) is satisfied, determining the pickup object as a pickup state:
V_intersect1 ÷ V_pick1 > β (4)
wherein V_intersect1 is the volume of the intersection of the pick-up prism table with the bounding box of the pick-up object, V_pick1 is the current volume of the pick-up prism table, and β is the pick-up judgment threshold;
if the pick-up object is already in the pick-up state, calculating the volume of the intersection of the pick-up prism table with the bounding box of the pick-up object and the current volume of the pick-up prism table; if pick-up condition formula (4) is no longer satisfied, and remains unsatisfied throughout a first delay period starting from the moment it first ceased to be satisfied, releasing the pick-up object and clearing its pick-up state.
(e) Pick-up cuboid
When the geometry is a rectangular picking block, in step 7, determining whether to pick up the picking object and when to release the picking object according to the intersection degree of the picking geometry and the bounding box of the picking object includes:
if the picking object is not in the picking state, calculating the volume of the intersection part of the picking rectangular parallelepiped and the bounding box of the picking object, and calculating the current volume of the picking rectangular parallelepiped, if the picking condition formula (5) is satisfied, determining the picking object as the picking state:
V_intersect2 ÷ V_pick2 > β (5)
wherein V_intersect2 is the volume of the intersection of the pick-up cuboid with the bounding box of the pick-up object, V_pick2 is the current volume of the pick-up cuboid, and β is the pick-up judgment threshold;
if the pick-up object is already in the pick-up state, calculating the volume of the intersection of the pick-up cuboid with the bounding box of the pick-up object and the current volume of the pick-up cuboid; if pick-up condition formula (5) is no longer satisfied, and remains unsatisfied throughout a first delay period starting from the moment it first ceased to be satisfied, releasing the pick-up object and clearing its pick-up state.
In the above embodiments, β is 0.8 to 0.95, and the duration of the first delay time period is 0.2s to 1.0 s.
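As a concrete illustration of conditions (1) to (5) and of the delayed release, the following minimal Python sketch compares the ratio of the intersected measure (length, area or volume) to the current measure of the pick-up geometry against the threshold β, and releases the object only after the condition has stayed unsatisfied for the first delay period. All class, function and variable names are illustrative assumptions, not part of the patented implementation.

```python
# Minimal sketch of the pick-up / release decision described above.
# measure_intersect and measure_pick stand for L_intersect / L_pick,
# A_intersect / A_pick or V_intersect / V_pick depending on the geometry.

BETA = 0.9          # pick-up judgment threshold, 0.8-0.95
DELAY_S = 0.5       # first delay period, 0.2 s - 1.0 s


class PickState:
    def __init__(self):
        self.picked = False
        self.unsatisfied_since = None  # time the condition first failed

    def update(self, measure_intersect, measure_pick, now):
        """Update the pick-up state for one frame and return it."""
        satisfied = measure_pick > 0 and (measure_intersect / measure_pick) > BETA
        if not self.picked:
            if satisfied:                      # formulas (1)-(5)
                self.picked = True
            return self.picked
        # already picked: release only after DELAY_S of continuous failure
        if satisfied:
            self.unsatisfied_since = None
        elif self.unsatisfied_since is None:
            self.unsatisfied_since = now
        elif now - self.unsatisfied_since >= DELAY_S:
            self.picked = False
            self.unsatisfied_since = None
        return self.picked


if __name__ == "__main__":
    state = PickState()
    print(state.update(0.95, 1.0, now=0.0))   # True: ratio 0.95 > 0.9
    print(state.update(0.50, 1.0, now=0.2))   # still True: delay not elapsed
    print(state.update(0.50, 1.0, now=0.9))   # False: unsatisfied for 0.7 s
```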
Objects in a virtual scene differ in shape, volume and weight, and picking them up should feel as close to reality as possible. Just as some real objects, such as large or heavy ones, must be lifted with two hands, such objects should also require two hands in the virtual scene. Therefore, in an embodiment of the present invention, the method further includes:
and further determining whether to pick the pick object according to the pick requirement of the pick object.
Wherein the pick requirements of the pick object include: one-handed pick or two-handed pick.
Further:
when a width of the picked object exceeds a pickup volume threshold, and/or when a weight of the picked object exceeds a pickup weight threshold: the picking requirement of the picking object is two-hand picking;
the pick-up requirement for the picked-up object is a one-handed pick-up when the width of the picked-up object does not exceed a pick-up volume threshold, and/or when the weight of the picked-up object does not exceed a pick-up weight threshold.
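The one-hand / two-hand rule above can be expressed as a small decision function; the sketch below is illustrative only, and the threshold values and parameter names are assumptions not taken from the patent.

```python
# Hedged sketch: decide whether an object requires one or two hands.
# Threshold values are placeholders, not values given in the text.

PICK_VOLUME_THRESHOLD = 0.3   # width threshold (metres), illustrative
PICK_WEIGHT_THRESHOLD = 5.0   # weight threshold (kilograms), illustrative


def pick_requirement(width, weight):
    """Return 'two-handed' when width and/or weight exceed their thresholds."""
    if width > PICK_VOLUME_THRESHOLD or weight > PICK_WEIGHT_THRESHOLD:
        return "two-handed"
    return "one-handed"


print(pick_requirement(width=0.1, weight=0.2))   # one-handed (a small ball)
print(pick_requirement(width=0.6, weight=2.0))   # two-handed (a large box)
print(pick_requirement(width=0.1, weight=7.3))   # two-handed (a heavy shot)
```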
The invention also provides a pick-up device in a virtual reality scene, as shown in fig. 12, which includes a data acquisition module 1, an IMU (Inertial Measurement Unit) module 2, a physical computation module 3 and a communication module 4. The data acquisition module 1 receives finger pressure data. The IMU module 2 acquires position data and orientation data of a virtual hand in the virtual reality scene. The physical computation module 3 computes on the data provided by the IMU module 2 to construct a pick-up geometry, determines or modifies the spatial position and spatial rotation of the pick-up geometry, traverses the bounding boxes of the objects in the virtual reality scene, determines an object as the pick-up object when its bounding box intersects the spatial position of the pick-up geometry, modifies the size of the pick-up geometry according to the finger pressure data, and determines whether to pick up the pick-up object and when to release it according to the degree of intersection of the pick-up geometry with the bounding box of the pick-up object. The communication module 4 is connected to the physical computation module 3 and communicates with the virtual reality device. When it is determined that the pick-up object is picked up, the physical computation module 3 sends control information for picking up the object to the virtual reality device through the communication module 4; when it is determined that the pick-up object is released, the physical computation module 3 sends control information for cancelling the pick-up to the virtual reality device through the communication module 4.
The data acquisition module 1 is connected to a data glove, a flexible sensor or a pressure sensor mounted on a finger joint, so as to acquire and receive finger pressure data through the data glove, the flexible sensor or the pressure sensor mounted on the finger joint.
Hereinafter, the pick-up method and apparatus in a virtual reality scene according to the present invention are described in further detail.
Fig. 13 is a flowchart of a specific implementation process of the picking method and apparatus in a virtual reality scene according to the present invention, which includes the following steps.
Step a1, obtain the position information of the virtual hand through the IMU module, then execute step a2.
Step a2, the physical computation module judges whether any object in the virtual scene intersects the pick-up geometry; if yes, execute step a3, otherwise return to step a1.
Step a3, acquire the finger pressure data through the data acquisition module, then execute step a4.
Step a4, acquire the orientation of the virtual hand through the IMU module, then execute step a5.
Step a5, the physical computation module judges whether the object can be picked up; if yes, execute step a6, otherwise return to step a3.
Step a6, send pick-up control information through the communication module to make the virtual hand pick up the object, then execute step a7.
Step a7, acquire the finger pressure data through the data acquisition module, then execute step a8.
Step a8, acquire the orientation of the virtual hand through the IMU module, then execute step a9.
Step a9, the physical computation module judges whether the pick-up state of the object can be maintained; if yes, return to step a7, otherwise execute step a10.
In step a9, judging whether the pick-up state can be maintained means judging whether the intersection of the pick-up geometry with the bounding box of the pick-up object, compared with the pick-up geometry itself, still exceeds the pick-up judgment threshold, as described for formulas (1) to (5). If it exceeds the threshold, the pick-up state is maintained and step a7 follows; if it does not, the pick-up state cannot be maintained and step a10 follows.
Step a10, the physical computation module judges whether the duration for which the pick-up state cannot be maintained has exceeded the timeout; if yes, execute step a11, otherwise return to step a7.
Steps a9 and a10 together implement the step of determining whether to pick up the pick-up object and when to release it according to the degree of intersection of the pick-up geometry with the bounding box of the pick-up object.
Step a11, send release control information through the communication module to make the virtual hand release the object.
Steps a1 to a11 complete one pick-up and release cycle.
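The flow of steps a1 to a11 can be summarised by the loop below. The module objects (imu, sensors, physics, comm) and every method name on them are hypothetical stand-ins for the IMU, data acquisition, physical computation and communication modules; this is a sketch of the control flow, not the patented implementation.

```python
import time

# Hypothetical sketch of the a1-a11 control flow. A real system would run
# these checks once per rendered frame rather than in tight loops.

def pick_and_release_loop(imu, sensors, physics, comm, timeout_s=0.5):
    while True:                                   # a1-a2: wait for a candidate
        physics.update_hand_pose(imu.read_pose())
        obj = physics.find_intersecting_object()
        if obj is not None:
            break

    while True:                                   # a3-a5: wait until pickable
        physics.resize_geometry(sensors.read_pressures())
        physics.update_hand_pose(imu.read_pose())
        if physics.can_pick(obj):
            break

    comm.send_pick(obj)                           # a6: virtual hand picks up

    lost_since = None
    while True:                                   # a7-a10: hold or time out
        physics.resize_geometry(sensors.read_pressures())
        physics.update_hand_pose(imu.read_pose())
        if physics.can_hold(obj):                 # a9: state maintained
            lost_since = None
            continue
        now = time.monotonic()
        if lost_since is None:
            lost_since = now
        elif now - lost_since >= timeout_s:       # a10: timeout exceeded
            break

    comm.send_release(obj)                        # a11: virtual hand releases
```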
A data acquisition module:
the data acquisition module acquires the pressure data of each finger joint; the data source can be a data glove, a flexible sensor, or a system combining several pressure sensors.
An IMU module:
the IMU module comprises a gyroscope and an accelerometer and is used to acquire the position and orientation of the user's virtual hand in the virtual space.
I. Obtaining a position of a virtual hand
Fig. 14 is a flowchart of updating the position of the virtual hand, which includes the following steps:
and b1, acquiring the current position of the virtual hand, and then entering the step b 2.
The current position of the virtual hand is obtained from the program of the virtual reality scene.
And b2, acquiring accelerometer data, filtering to obtain the current acceleration, and then entering the step b 3.
The accelerometer data is acquired in the IMU module, and the current acceleration of the virtual hand can be obtained by filtering the accelerometer data.
Step b3, obtaining the current speed according to the current acceleration and the refresh time, and then entering step b 4.
In this step, the refresh time refers to the refresh time of the virtual reality scene. The current velocity of the virtual hand is obtained from
velocity = acceleration × refresh time.
And b4, acquiring a new position of the virtual hand according to the current position, the current speed, the refreshing time and the orientation of the virtual hand, and then entering the step b 5.
In this step, the new position of the virtual hand is obtained from
new position = current position + current velocity × refresh time × orientation.
(A minimal code sketch of this position update follows step b5.)
Step b5, the communication module transmits the new position information to the application program of the virtual reality scene to update the position of the virtual hand.
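Steps b1 to b5 amount to a simple dead-reckoning update. The sketch below follows the two relations given above; treating acceleration and velocity as scalars applied along the hand's orientation vector is a simplifying assumption of this sketch.

```python
# Minimal sketch of the virtual-hand position update (steps b1-b5):
#   speed = acceleration * refresh_time
#   new_position = position + speed * refresh_time * orientation

def update_hand_position(position, acceleration, orientation, refresh_time):
    speed = acceleration * refresh_time                     # step b3
    return tuple(p + speed * refresh_time * o               # step b4
                 for p, o in zip(position, orientation))


# Example: hand at the origin, accelerating at 2 m/s^2 along +x, 60 Hz refresh.
new_pos = update_hand_position((0.0, 0.0, 0.0), 2.0, (1.0, 0.0, 0.0), 1 / 60)
print(new_pos)   # roughly (0.00056, 0.0, 0.0)
```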
II. Obtaining orientation of virtual hand
The orientation of the virtual hand is obtained mainly by fusing the measurements of the accelerometer and the gyroscope; the data fusion can be performed with a complementary filter, a Kalman filter, or a gradient descent method.
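Of the three fusion options mentioned, a complementary filter is the simplest. The sketch below fuses a gyroscope-integrated pitch angle with an accelerometer-derived pitch angle for a single axis; restricting it to one axis and the choice of blend factor are assumptions made for brevity.

```python
import math

# One-axis complementary filter: blend the gyro-integrated angle (responsive
# but drifting) with the accelerometer angle (drift-free but noisy).

def complementary_filter(prev_angle, gyro_rate, accel_x, accel_z, dt, k=0.98):
    gyro_angle = prev_angle + gyro_rate * dt          # integrate angular rate
    accel_angle = math.atan2(accel_x, accel_z)        # gravity-based estimate
    return k * gyro_angle + (1.0 - k) * accel_angle


# Example: hand tilted ~0.1 rad, gyro reporting 0.5 rad/s, 100 Hz updates.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, 0.5, math.sin(0.1), math.cos(0.1), 0.01)
print(round(angle, 3))   # converges toward a blend of the two estimates (~0.3)
```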
A physical computation module:
the physical computation module performs the background computation of the pick-up process. The pick-up computation comprises three steps: determining, at initialization, the pick-up coordinate system and the pick-up geometry (including its shape) used for pick-up judgment; determining the pick-up object; and judging whether the object can be picked up.
1. Determining a pick-up coordinate system and a pick-up geometry
(1) The palm length L, thumb length R and palm width W of the virtual hand in virtual space are determined, as shown in fig. 3. The palm length is the distance from the root of the palm to the tip of the middle finger with the hand straightened; the palm width is the distance across the widest part of the palm with the hand straightened; and the thumb length is the distance from the root joint of the thumb to its tip with the thumb straightened.
(2) Determining a cartesian coordinate system of the virtual hand, and taking the direction of the index finger as the positive direction of the x axis, the direction of the thumb as the positive direction of the z axis and the direction perpendicular to the palm as the positive direction of the y axis in the state that the palm is stretched horizontally and the thumb is perpendicular to the rest four fingers, as shown in fig. 4.
(3) The geometry of the "pick geometry" is determined based on the number of fingers measured and the user's selection.
a. Only two finger pressures can be measured:
if only the pressure data of two fingers, one of which must be the thumb, can be measured, the "pick-up geometry" is a line segment, i.e., a pick-up line segment. The line segment lies in the XOZ plane, and its initial length is given by the formula shown in fig. 5 (formula image not reproduced here). Since the user's pick-up motion is typically initiated by the four fingers other than the thumb, the line segment can be treated as lying in the XOZ plane, and only the pressure change of one of those fingers needs to be monitored.
b. The pressure of three fingers can be measured:
if, in addition to the thumb, the pressure data of two of the other four fingers can be measured, the line segment of case a can still be selected as the "pick-up geometry"; alternatively, a cylinder or a sphere (i.e., a pick-up cylinder or a pick-up sphere) can be selected as the "pick-up geometry";
if the cylinder is selected as the "pick-up geometry", the cylinder height is W and its initial diameter is given by the formula shown in fig. 6 (formula image not reproduced here).
If the sphere is selected as the "pick-up geometry", its initial diameter is given by the formula shown in fig. 7 (formula image not reproduced here).
c. The pressure of more than three fingers can be measured:
if more finger pressure data can be measured, all of the geometries in a and b can be selected as "pick-up geometries", and in addition, a frustum (as shown in FIG. 8), a cuboid (as shown in FIG. 9), a pyramid, or other regular polyhedron can be selected as a "pick-up geometry".
If the frustum is selected as the "pick-up geometry", the bottom surface of the frustum has a constant length H and a constant width W; the top surface has an initial length L, varying from 0 to L, and a constant width W; and the frustum has an initial height R, varying from 0 to R. That is, it is initialized to length L in the x-axis direction, R in the y-axis direction and W in the z-axis direction, as shown in fig. 8.
If the cuboid is selected as the "pick-up geometry", it may be initialized to a length L in the x-axis direction, a length R in the y-axis direction, and a length W in the z-axis direction, as shown in fig. 10.
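The choice of available geometries as a function of how many finger pressures can be measured (cases a, b and c above) can be summarised as follows; the function name and the returned labels are illustrative only.

```python
# Hedged sketch of choosing the candidate "pick-up geometries" from the
# number of measured fingers (the thumb must always be among them).

def candidate_geometries(measured_fingers):
    if measured_fingers < 2:
        raise ValueError("at least two fingers, including the thumb, required")
    if measured_fingers == 2:
        return ["line segment"]
    if measured_fingers == 3:
        return ["line segment", "cylinder", "sphere"]
    # four or five fingers: all of the above plus frustum / cuboid
    # (a pyramid or other regular polyhedron could also be offered here)
    return ["line segment", "cylinder", "sphere", "prism table", "cuboid"]


print(candidate_geometries(2))   # ['line segment']
print(candidate_geometries(5))   # all five geometry types
```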
2. Determining a pickup object
(1) Before each frame of the virtual reality scene is refreshed, the spatial position and spatial rotation of the pick-up cuboid are modified according to the position and orientation of the virtual hand obtained by the IMU module.
Since the pickup rectangular parallelepiped is set with reference to the position and orientation of the virtual hand, the spatial position of the pickup rectangular parallelepiped changes in synchronization with the change in the position of the virtual hand, and similarly, the spatial rotation degree of the pickup rectangular parallelepiped rotates with the change in the orientation of the virtual hand.
(2) Traverse the bounding boxes of all objects in the scene; if an object's bounding box intersects the pick-up cuboid, that object is taken as the current pick-up object.
(3) If the bounding boxes of several objects intersect the pick-up cuboid, the object whose bounding box is closest to the center of the pick-up cuboid is taken as the current pick-up object.
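The selection rule in (2) and (3) — take an intersecting bounding box, preferring the one closest to the centre of the pick-up geometry — can be sketched with axis-aligned bounding boxes as follows; representing both the objects and the pick-up geometry by AABBs is an assumption of this sketch.

```python
# Hedged sketch: choose the current pick-up object among axis-aligned
# bounding boxes (AABBs) that intersect the pick-up geometry's own AABB.
# Each box is ((min_x, min_y, min_z), (max_x, max_y, max_z)).

def aabb_intersects(a, b):
    return all(a[0][i] <= b[1][i] and b[0][i] <= a[1][i] for i in range(3))


def aabb_center(box):
    return tuple((box[0][i] + box[1][i]) / 2.0 for i in range(3))


def choose_pick_object(pick_box, objects):
    """objects maps a name to its AABB; returns the name of the pick object."""
    pick_center = aabb_center(pick_box)
    best, best_d2 = None, float("inf")
    for name, box in objects.items():
        if not aabb_intersects(pick_box, box):
            continue
        c = aabb_center(box)
        d2 = sum((c[i] - pick_center[i]) ** 2 for i in range(3))
        if d2 < best_d2:
            best, best_d2 = name, d2
    return best


pick_box = ((0, 0, 0), (1, 1, 1))
objects = {"ball": ((0.4, 0.4, 0.4), (0.6, 0.6, 0.6)),
           "cup": ((0.9, 0.9, 0.9), (1.5, 1.5, 1.5)),
           "table": ((3, 3, 3), (4, 4, 4))}
print(choose_pick_object(pick_box, objects))   # 'ball': closest intersecting box
```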
3. Judging object pickup
(1) Before each frame of picture is refreshed, modifying the size of the picking geometric body according to the pressure value of each finger obtained by the data acquisition module, wherein the specific modification method comprises the following steps:
a. Using a line segment as the "pick-up geometry":
acquire the pressure data of the thumb:
if the pressure value increases, the length of the pick-up line segment is reduced;
if the pressure value decreases, the length of the pick-up line segment is increased;
acquire the pressure data of the other four fingers and select the maximum pressure value:
if that pressure value increases, the length of the pick-up line segment is reduced;
if that pressure value decreases, the length of the pick-up line segment is increased;
b. Using a cylinder or sphere as the "pick-up geometry":
acquire the pressure data of the thumb:
if the pressure value increases, the diameter of the cylinder or sphere is reduced;
if the pressure value decreases, the diameter of the cylinder or sphere is increased;
acquire the pressure data of the other four fingers and select the maximum pressure value:
if that pressure value increases, the diameter of the cylinder or sphere is reduced;
if that pressure value decreases, the diameter of the cylinder or sphere is increased;
c. other types of "pick-up geometry":
acquiring pressure data of the thumb:
if the pressure value is increased, the length in the Y direction is reduced;
if the pressure value is reduced, the length in the Y direction is increased;
acquiring pressure data of other four fingers, and selecting the maximum pressure data:
if the pressure value is increased, the length in the X direction is reduced;
if the pressure value decreases, the length in the X direction increases.
In a, b and c above, the amount by which the length is increased or decreased with pressure is calculated by the following formula:
Length_3' = Length_3 × α / F
wherein Length_3' is the modified length of the pick-up geometry in the x-axis and/or y-axis direction, Length_3 is the length before modification, α is an adjustment coefficient, and F is the pressure value of the finger. α is related to the sensitivity of the data acquisition module and takes a value in the range 0 to 1.
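The resizing rule — shrink along y with thumb pressure and along x with the strongest of the other fingers, using Length' = Length × α / F — might be sketched as below. Clamping the pressure away from zero is an assumption added here so the division is always defined; the constants are placeholders.

```python
# Hedged sketch of resizing the "pick-up geometry" from finger pressure,
# following Length' = Length * alpha / F.

ALPHA = 0.5          # adjustment coefficient, 0 < alpha <= 1
MIN_PRESSURE = 1e-3  # assumed guard against division by zero


def resized(length, pressure):
    return length * ALPHA / max(pressure, MIN_PRESSURE)


def resize_geometry(size_x, size_y, thumb_pressure, other_pressures):
    """Shrink y with thumb pressure and x with the strongest other finger."""
    strongest = max(other_pressures)
    return resized(size_x, strongest), resized(size_y, thumb_pressure)


# Example: a 0.10 m x 0.10 m cross-section squeezed by the fingers.
print(resize_geometry(0.10, 0.10, thumb_pressure=2.0,
                      other_pressures=[1.0, 4.0, 0.5]))   # (0.0125, 0.025)
```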
(2) If the pick-up object is not yet in the pick-up state, the pick-up judgment is carried out as follows:
a. using line segments as "pick-up geometry":
calculate the length of the line segment, denoted L_pick, and the length of the intersection of the line segment with the bounding box of the pick-up object, denoted L_intersect. Define β as the pick-up judgment threshold, with an initialization value of 0.9 and a value range of 0.8 to 0.95. The pick-up condition is then formula (1):
L_intersect ÷ L_pick > β (1)
b. Using a cylinder as the "pick-up geometry":
calculate the area of the cross section of the cylinder, denoted A_pick1, and the area of the intersection of that cross section with the bounding box of the pick-up object, denoted A_intersect1. Define β as the pick-up judgment threshold, with an initialization value of 0.9 and a value range of 0.8 to 0.95. The pick-up condition is then formula (2):
A_intersect1 ÷ A_pick1 > β (2)
c. Using a sphere as the "pick-up geometry":
calculate the area of the cross section of the sphere, denoted A_pick2, and the area of the intersection of that cross section with the bounding box of the pick-up object, denoted A_intersect2. Define β as the pick-up judgment threshold, with an initialization value of 0.9 and a value range of 0.8 to 0.95. The pick-up condition is then formula (3):
A_intersect2 ÷ A_pick2 > β (3)
d. Using the prism table as the "pick-up geometry":
calculate the volume of the prism table, denoted V_pick1, and the volume of the intersection of the prism table with the bounding box of the pick-up object, denoted V_intersect1. Define β as the pick-up judgment threshold, with an initialization value of 0.9 and a value range of 0.8 to 0.95. The pick-up condition is then formula (4):
V_intersect1 ÷ V_pick1 > β (4)
e. Using a cuboid as the "pick-up geometry":
calculate the volume of the cuboid, denoted V_pick2, and the volume of the intersection of the cuboid with the bounding box of the pick-up object, denoted V_intersect2. Define β as the pick-up judgment threshold, with an initialization value of 0.9 and a value range of 0.8 to 0.95. The pick-up condition is then formula (5):
V_intersect2 ÷ V_pick2 > β (5)
For other types of "picking geometry" such as pyramid or other regular polyhedron, the picking condition can be set by referring to the picking conditions of the above-mentioned frustum of prism and cuboid, and will not be described again.
For each type of "pick-up geometry", if the corresponding pick-up judgment condition is met, the state is changed to "object picked up"; otherwise, the judgment continues.
(3) If the object is already in the pick-up state, judge the intersection relationship between the "pick-up geometry" and the bounding box of the pick-up object. If the corresponding pick-up condition among formulas (1) to (5) in a, b, c, d and e of step (2) is no longer satisfied, go to step (4); otherwise, continue judging.
(4) Within a time window t, check in every frame whether the "pick-up geometry" satisfies the corresponding pick-up condition among formulas (1) to (5). If the condition is not satisfied in any frame within the window, release the object and clear the pick-up state; otherwise, restart the timing and continue judging. The time t is configurable by the application and ranges from 0.2 s to 1.0 s.
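As one concrete example of evaluating condition (1), the length of the part of a pick-up line segment lying inside a bounding box can be computed by clipping the segment against the box with a standard slab test; treating the bounding box as axis-aligned is an assumption of this sketch.

```python
import math

# Hedged sketch: length of the intersection of a pick-up line segment with an
# axis-aligned bounding box, then the pick-up condition L_intersect/L_pick > beta.

def segment_aabb_intersection_length(p0, p1, box_min, box_max):
    """Clip segment p0->p1 against the AABB and return the clipped length."""
    t_enter, t_exit = 0.0, 1.0
    for i in range(3):
        d = p1[i] - p0[i]
        if abs(d) < 1e-12:
            if p0[i] < box_min[i] or p0[i] > box_max[i]:
                return 0.0                       # parallel to and outside the slab
            continue
        t0 = (box_min[i] - p0[i]) / d
        t1 = (box_max[i] - p0[i]) / d
        t0, t1 = min(t0, t1), max(t0, t1)
        t_enter, t_exit = max(t_enter, t0), min(t_exit, t1)
        if t_enter > t_exit:
            return 0.0                           # no overlap with the box
    return (t_exit - t_enter) * math.dist(p0, p1)


def segment_pick_condition(p0, p1, box_min, box_max, beta=0.9):
    l_pick = math.dist(p0, p1)
    l_intersect = segment_aabb_intersection_length(p0, p1, box_min, box_max)
    return l_pick > 0 and l_intersect / l_pick > beta


# A 10 cm segment with 9.5 cm inside the box satisfies condition (1).
print(segment_pick_condition((0.005, 0, 0), (0.105, 0, 0),
                             (0.01, -1, -1), (1, 1, 1)))   # True
```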
A communication module:
the communication module is, for example, a Bluetooth module. It transmits the control state to the virtual reality device (i.e., the device running the virtual reality scene software, in other words the hardware platform carrying the virtual reality scene), mainly over the Bluetooth protocol: when switching into the pick-up state it sends a message that the object has been picked up, and when the pick-up state is cancelled it sends a message that the pick-up has been cancelled.
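The communication module therefore only needs two kinds of control message. A transport-agnostic sketch is shown below; the JSON message format and the `send` callback are assumptions of the sketch, and no specific Bluetooth API is implied.

```python
import json

# Hedged sketch of the two control messages sent by the communication module.
# `send` is any byte-oriented transport (for example a Bluetooth serial
# channel); its existence and this message format are assumptions only.

def send_pick_state(send, object_id, picked):
    message = {"type": "pick" if picked else "cancel_pick", "object": object_id}
    send(json.dumps(message).encode("utf-8"))


# Example with a stand-in transport that just prints the bytes.
send_pick_state(print, object_id="ball_07", picked=True)
send_pick_state(print, object_id="ball_07", picked=False)
```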
The embodiments of the picking method and the picking device in the virtual reality scene are used for realizing the following picking scenes in the virtual reality scene.
Scene 1:
the user picks a single ball from a pile of balls:
(1) the physical computation module initializes a 'pick geometry' according to the size of the virtual hand;
(2) the virtual hand of the user approaches the ball to be picked up, as shown in fig. 15A, the IMU module acquires the position of the virtual hand in the virtual scene;
(3) the physical computation module detects that a ball is intersected with the picking geometry, and selects the ball closest to the picking geometry as a picking object;
(4) the user bends the fingers to try to pick up the ball, as shown in fig. 15B, the data acquisition module acquires pressure data of each finger;
(5) the physical computation module uses the pressure data to modify the size of the "pick-up geometry";
(6) the physical calculation module judges whether the ball can be picked up according to the current 'picking geometry';
(7) when the 'picking geometry' is small enough, the physical calculation module judges that the ball can be picked and modifies the ball into a picking state;
(8) the communication module sends the pick-up status to the VR app (virtual reality application, running on the virtual reality device), and the virtual scene refreshes and displays that the ball was picked up by the user, as shown in fig. 15C.
Scene 2:
the pressure of only two fingers is monitored, and a line segment is selected as the "pick-up geometry".
(1) The user selects a line segment as a "pick-up geometry", and as shown in fig. 16A, the physical computation module initializes the "pick-up geometry" as a line segment;
(2) the user tries to pick up a match and moves the hand; the IMU module obtains the position of the virtual hand in the virtual scene;
(3) the physical computation module detects that the line segment is intersected with the match, and the match is selected as a pickup object;
(4) the user bends the fingers to start picking up matches, and as shown in fig. 16B, the data acquisition module acquires pressure data of each finger;
(5) the physical calculation module modifies the length of the line segment according to the pressure data;
(6) the physical calculation module calculates the length of the current line segment and the length of the intersection part and judges whether matches can be picked up or not;
(7) when the intersected part is long enough, the physical computing module judges that matches can be picked up, and modifies the picking state to be that matches are picked up;
(8) the communication module sends the pick-up status to the VR app, and the virtual scene refreshes and shows that the match was picked up by the user, as shown in fig. 16C.
Scene 3:
the pressure of the three fingers was monitored and the cylinder was chosen as the "pick-up geometry".
(1) The user selects a cylinder as a "pick-up geometry", and as shown in fig. 17A, the physical computation module initializes the "pick-up geometry" as the cylinder;
(2) a user tries to pick up a water cup and moves a hand, and the IMU module acquires the position of a virtual hand in a virtual scene;
(3) the physical computation module detects that the cylinder is intersected with the water cup, and the water cup is selected as a pickup object;
(4) when the user bends the fingers and starts to pick up the water cup, as shown in fig. 17B, the data acquisition module acquires pressure data of each finger;
(5) the physical calculation module modifies the diameter of the cylinder according to the pressure data;
(6) the physical calculation module calculates the area of the cross section of the current cylinder and the area of the cross section at the intersection part and judges whether the cup can be picked up or not;
(7) when the intersected part is large enough, the physical computing module judges that the cup can be picked up, and modifies the picking state to that the cup is picked up;
(8) the communication module sends the pick-up status to the VR app, and the virtual scene refreshes and shows that the cup was picked up by the user, as shown in fig. 17C.
Scene 4:
the pressure of three fingers was monitored, and the sphere was chosen as the "pick-up geometry":
(1) the user selects a sphere as a "pick-up geometry", and as shown in fig. 18A, the physical computation module initializes the "pick-up geometry" as the sphere;
(2) a user tries to pick up an apple and moves a hand, and the IMU module acquires the position of a virtual hand in a virtual scene;
(3) the physical computation module detects that the sphere is intersected with the apple, and the apple is selected as a pickup object;
(4) the user bends the fingers to start picking up the apples, and as shown in fig. 18B, the data acquisition module acquires pressure data of each finger;
(5) the physical calculation module modifies the diameter of the sphere according to the pressure data;
(6) the physical calculation module calculates the area of the cross section of the current sphere and the area of the cross section at the intersection part and judges whether the apple can be picked up or not;
(7) when the intersected part is large enough, the physical computing module judges that the apple can be picked up, and modifies the picking state to be that the apple is picked up;
(8) the communication module sends a pick-up status to the VR app, and the virtual scene refreshes and shows that the apple was picked up by the user, as shown in fig. 18C.
Scene 5:
monitoring the pressure of more than three fingers, and selecting a cuboid as a pickup geometry:
(1) the user selects a cuboid as a "pick-up geometry", and as shown in fig. 19A, the physical computation module initializes the "pick-up geometry" as a cuboid;
(2) a user tries to pick up a pencil box and moves a hand, and the IMU module acquires the position of a virtual hand in a virtual scene;
(3) the physical calculation module detects that the cuboid is intersected with the pencil box, and the pencil box is selected as a pickup object;
(4) when the user bends the fingers and starts to pick up the pencil box, as shown in fig. 19B, the data acquisition module acquires pressure data of each finger;
(5) the physical calculation module modifies the size of the cuboid according to the pressure data;
(6) the physical calculation module calculates the volume of the current cuboid and the volume of the intersection part and judges whether the pencil box can be picked up or not;
(7) when the intersected part is large enough, the physical computing module judges that the pencil box can be picked up, and modifies the picking state into that the pencil box is picked up;
(8) the communication module sends the pick status to the VR app, the virtual scene refreshes and shows that the pencil box is picked up by the user, as shown in fig. 19C.
Scene 6:
monitoring the pressure of more than three fingers, and selecting a prism table as a 'picking geometry':
(1) the user selects the frustum as the "picking geometry", and as shown in fig. 20A, the physical computation module initializes the "picking geometry" as the frustum;
(2) a user tries to pick up a lunch box and moves a hand, and the IMU module acquires the position of a virtual hand in a virtual scene;
(3) the physical calculation module detects that the prismatic table is intersected with the lunch box, and selects the lunch box as a pickup object;
(4) when the user bends the fingers and starts to pick up the lunch box, as shown in fig. 20B, the data acquisition module acquires pressure data of each finger;
(5) the physical calculation module modifies the size of the prism table according to the pressure data;
(6) the physical calculation module calculates the volume of the current prismatic table and the volume of the intersection part and judges whether the lunch box can be picked up or not;
(7) when the intersected part is large enough, the physical computation module judges that the lunch box can be picked up, and modifies the picking state into that the lunch box is picked up;
(8) the bluetooth module sends the pick-up status to the VR app, and the virtual scene refreshes and shows that the lunch box is picked up by the user, as shown in fig. 20C.
Scene 7:
the user attempts to pick up a large-volume object in the virtual scene.
(1) a large box in the virtual scene is defined as requiring two hands to pick up;
(2) the physical computation module initializes the "pick geometry" for each hand;
(3) a user tries to pick up a large box and moves a hand, and the IMU module acquires the position of a virtual hand in a virtual scene;
(4) the physical computation module detects that a 'picking geometric body' of one hand is intersected with the large box, and the large box is selected as a picking object;
(5) the data acquisition module acquires pressure data of each finger;
(6) the physical calculation module calculates a 'picking geometric body' for each current hand and judges whether a large box meets the picking condition of each hand;
(7) when the "pick geometry" of only one hand satisfies the pick condition, the large box is not picked up, as shown in fig. 21A;
(8) when the "pick geometry" of both hands meets the pick condition, as shown in FIG. 21B, the physical computing module modifies the state that the large box is picked;
(9) the communication module sends the pick status to the VR app, the virtual scene refreshes and shows that the large box is picked up by the user with both hands, as shown in fig. 21C.
Scene 8:
the user attempts to pick up a heavy object in the virtual scene.
(1) A heavy shot in the virtual scene is defined as having to be picked up with two hands.
(2) The physical computation module initializes the "pick geometry" for each hand;
(3) the user tries to pick up a shot and move the hand, and the IMU module acquires the position of the virtual hand in the virtual scene;
(4) the physical calculation module detects that a 'picking geometry' of one hand is intersected with the shot, and the shot is selected as a picking object;
(5) the data acquisition module acquires pressure data of each finger;
(6) the physical calculation module calculates a 'picking geometry' for each current hand and judges whether a shot meets the picking condition of each hand;
(7) when the "pick geometry" of one hand satisfies the pick condition, the shot will not be picked up, as shown in FIG. 22A;
(8) when the "pick geometry" of both hands meets the pick condition, as shown in FIG. 22B, the physical computing module modifies the state that the shot is picked;
(9) the communication module sends a pick status to the VR app, the virtual scene refreshes and shows that the shot was picked up by the user with both hands, as shown in fig. 22C.
Scene 9:
in this scenario, if a VR app requires that the existing gesture-recognition pick-up method must be used, then before the VR app starts it asks the user whether to use the existing gesture-recognition method to obtain higher accuracy, for example through an interface window such as the one shown in fig. 23; if the VR app has no strict requirement, the user is not asked whether to use the existing gesture recognition.
Scene 10:
in this scenario, the method of the embodiment of the present invention may automatically switch between "pick geometry" and existing gesture recognition (if the interactive device supports the existing gesture recognition method).
(1) Detecting the interactive equipment, and if the existing gesture recognition method is supported, entering the step (2);
(2) detect the chip model of the interactive device to determine its computing power; if the computing power is strong enough, the existing gesture recognition is automatically selected as the pick-up method, otherwise the method of the invention, which realizes pick-up through a "pick-up geometry", is selected;
(3) the currently employed pickup method is displayed using, for example, a pop-up message, as shown in fig. 24A, 24B.
Scene 11:
the pick-up method of embodiments of the present invention may also automatically switch between the "pick-up geometry" and the existing gesture recognition based on the battery state of the VR device (if the interactive device supports the existing gesture recognition method).
(1) The VR app is launched. If the VR device is in power-saving mode, the pick-up method of the invention, which realizes pick-up through a "pick-up geometry", is selected;
(2) if the VR app is using the existing gesture recognition as the pick-up method and the VR device enters power-saving mode, it automatically switches to the pick-up method of the invention, which realizes pick-up through the "pick-up geometry", as shown in fig. 25.
Scene 12:
in this scenario, the pick-up method according to the embodiment of the present invention may be automatically switched to the existing gesture recognition according to the operating state of the VR device (if the interactive device supports the existing gesture recognition method).
(1) Starting a VR app, recording a pickup method used during starting, and entering (2) and (3) if the pickup method is an existing gesture recognition method;
(2) if the temperature rises above the warning line of the VR device, automatically switching to the pickup method of the embodiment of the present invention, as shown in fig. 26A, and entering (4);
(3) if the frame rate of the picture is insufficient, automatically switch to the pick-up method of the embodiment of the invention, as shown in fig. 26B, and go to (5);
(4) when the temperature returns to normal and stays normal for a period of time, automatically switch back to the existing gesture-recognition pick-up method, as shown in fig. 26C;
(5) when the frame rate returns to a smooth level and stays there for a period of time, automatically switch back to the existing gesture-recognition pick-up method, as shown in fig. 26C.
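Scenes 10 to 12 describe automatic switching between the pressure-based method and conventional gesture recognition based on device capability and runtime state. A possible decision function is sketched below; every field name, threshold value and the priority ordering are illustrative assumptions rather than details taken from the patent.

```python
from dataclasses import dataclass

# Hedged sketch of choosing the pick-up method from device state (scenes 10-12).
# The DeviceState fields and threshold values are illustrative assumptions.

@dataclass
class DeviceState:
    supports_gesture_recognition: bool
    chip_is_fast: bool          # stand-in for checking the chip model
    power_saving: bool
    temperature_c: float
    fps: float


def choose_pick_method(state, temp_warning_c=45.0, min_fps=60.0):
    if not state.supports_gesture_recognition or not state.chip_is_fast:
        return "pick-up geometry"
    if state.power_saving or state.temperature_c > temp_warning_c or state.fps < min_fps:
        return "pick-up geometry"      # fall back to the lighter-weight method
    return "gesture recognition"


print(choose_pick_method(DeviceState(True, True, False, 38.0, 72.0)))  # gesture recognition
print(choose_pick_method(DeviceState(True, True, False, 51.0, 72.0)))  # pick-up geometry
```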
With the pick-up method and device in a virtual reality scene described above, picking up an object in the virtual reality scene is judged from multi-contact pressure, giving current virtual reality equipment a realistic and efficient pick-up operation. Compared with the prior art, the pick-up judgment only requires the multi-contact hand pressure and does not require gesture recognition of each individual finger, which reduces the software computation complexity and is more efficient than existing precise pick-up judgment methods. The invention also improves the immersion of the user's virtual reality interaction and the user experience.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (33)

1. A method of picking up in a virtual reality scene, comprising:
receiving finger pressure data;
constructing a pick-up geometry;
determining or modifying the spatial position and the spatial rotation degree of the picking geometric body according to the position data and the orientation data of the virtual hand in the virtual reality scene;
traversing bounding boxes of each object in the virtual reality scene;
determining an object as a pick-up object when its bounding box intersects the spatial location of the pick-up geometry;
modifying a size of the pick-up geometry according to the finger pressure data;
determining whether to pick up the picked-up object and when to release the picked-up object based on the degree of intersection of the picking geometry and the bounding box of the picked-up object;
wherein:
the geometry of the pick-up geometry is a pick-up line segment, and the determining whether to pick up the pick-up object and the determining when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object comprises:
if the picking object is not in the picking state, calculating the length of the intersection part of the picking line segment and the bounding box of the picking object, calculating the length of the current picking line segment, and if the picking condition formula (1) is satisfied, determining the picking object as the picking state:
L_intersect ÷ L_pick > β (1)
wherein L_intersect is the length of the intersection of the pick-up line segment with the bounding box of the pick-up object, L_pick is the length of the current pick-up line segment, and β is the pick-up judgment threshold;
if the picking object is in a picking state, calculating the length of the intersection part of the picking line segment and the bounding box of the picking object, and calculating the length of the current picking line segment, if the picking conditional formula (1) is not satisfied any more and the picking conditional formula (1) is not satisfied within a first delay time period from the moment when the picking conditional formula (1) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object;
alternatively,
the geometry of the pick-up geometry is a pick-up cylinder, and the determining whether to pick up the pick-up object and the determining when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object comprises:
if the picking object is not in the picking state, calculating the area of the intersection part of the cross section of the picking cylinder and the bounding box of the picking object, calculating the current cross section area of the picking cylinder, and if the picking condition formula (2) is met, determining the picking object as the picking state:
A_intersect1 ÷ A_pick1 > β (2)
wherein A_intersect1 is the area of the intersection of the cross section of the pick-up cylinder with the bounding box of the pick-up object, A_pick1 is the area of the current cross section of the pick-up cylinder, and β is the pick-up judgment threshold;
if the picking object is in the picking state, calculating the area of the intersection part of the cross section of the picking cylinder and the bounding box of the picking object, calculating the area of the cross section of the current picking cylinder, if the picking condition formula (2) is not satisfied any more and the picking condition formula (2) is not satisfied within a first delay time period from the moment when the picking condition formula (2) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object;
alternatively,
the geometry of the pick-up geometry is a pick-up sphere, and the determining whether to pick up the pick-up object and when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object comprises:
if the picked object is not in the picking state, calculating the area of the intersection part of the cross section of the picked sphere and the bounding box of the picked object, calculating the current cross section area of the picked sphere, and if the picking condition formula (3) is met, determining the picked object as the picking state:
A_intersect2 ÷ A_pick2 > β (3)
wherein A_intersect2 is the area of the intersection of the cross section of the pick-up sphere with the bounding box of the pick-up object, A_pick2 is the area of the current cross section of the pick-up sphere, and β is the pick-up judgment threshold;
if the picking object is in a picking state, calculating the area of the intersection part of the cross section of the picking sphere and the bounding box of the picking object, and calculating the current cross section area of the picking sphere, if the picking conditional formula (3) is not satisfied any more and the picking conditional formula (3) is not satisfied within a first delay time period from the moment when the picking conditional formula (3) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object;
or, the geometry of the picking geometry is a picking prism, and the determining whether to pick the picking object and the determining when to release the picking object according to the intersection degree of the picking geometry and the bounding box of the picking object comprises:
if the pickup object is not in a pickup state, calculating a volume of an intersection of the pickup prism and a bounding box of the pickup object, and calculating a current volume of the pickup prism, and if a pickup condition formula (4) is satisfied, determining the pickup object as a pickup state:
V_intersect1 ÷ V_pick1 > β (4)
wherein V_intersect1 is the volume of the intersection of the pick-up prism with the bounding box of the pick-up object, V_pick1 is the current volume of the pick-up prism, and β is the pick-up judgment threshold;
if the picking object is in a picking state, calculating the volume of the intersection part of the picking prism and the bounding box of the picking object, and calculating the volume of the current picking prism, if the picking condition formula (4) is not satisfied any more and the picking condition formula (4) is not satisfied within a first delay time period from the moment when the picking condition formula (4) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object;
alternatively,
the geometry of the picking geometry is a picking cuboid, and the determining whether to pick the picking object and the determining when to release the picking object according to the intersection degree of the picking geometry and the bounding box of the picking object comprises:
if the picking object is not in the picking state, calculating the volume of the intersection part of the picking rectangular parallelepiped and the bounding box of the picking object, and calculating the current volume of the picking rectangular parallelepiped, if the picking condition formula (5) is satisfied, determining the picking object as the picking state:
V_intersect2 ÷ V_pick2 > β (5)
wherein V_intersect2 is the volume of the intersection of the pick-up cuboid with the bounding box of the pick-up object, V_pick2 is the current volume of the pick-up cuboid, and β is the pick-up judgment threshold;
if the picking object is in the picking state, calculating the volume of the intersection part of the picking cuboid and the bounding box of the picking object, calculating the volume of the current picking cuboid, and if the picking condition formula (5) is not satisfied any more and within a first delay time period from the moment when the picking condition formula (5) is not satisfied any more, releasing the picking object and clearing the picking state of the picking object.
2. The method of claim 1, further comprising:
when the picking object is determined to be picked up, sending control information for picking up the picking object to the virtual reality equipment;
when it is determined that the pickup object is released, control information for canceling the pickup of the pickup object is transmitted to the virtual reality device.
3. Pick-up method in a virtual reality scene according to claim 1, characterized in that:
the finger pressure data is derived from a data glove, a flexible sensor or a pressure sensor mounted on a finger joint.
4. The method for picking up in the virtual reality scene according to claim 1, wherein constructing a picking geometry comprises:
acquiring a measurement parameter of the virtual hand;
constructing a picking coordinate system;
and determining the size of the picking geometric body according to the picking coordinate system and the measurement parameters of the virtual hand.
5. The method of claim 4, wherein:
the measurement parameters of the virtual hand comprise the palm length L of the virtual hand, the palm width W of the virtual hand and the thumb length R of the virtual hand;
under the state that the palm of the virtual hand is straightened and the thumb is perpendicular to the rest four fingers, the direction of the index finger of the virtual hand is taken as the positive direction of an x axis, the direction of the thumb is taken as the positive direction of a z axis, and the direction perpendicular to the palm and far away from the palm is taken as the positive direction of a y axis, so that the picking coordinate system is constructed;
the pick-up geometry is located on one side of the palm of the virtual hand.
6. Pick-up method in a virtual reality scene according to claim 5, characterized in that:
the geometry is determined based on the number of fingers measured and the user's selection.
7. The method of claim 6, wherein:
the number of measured fingers is at least two, wherein the measured fingers must include a thumb.
8. Pick-up method in a virtual reality scene according to claim 7, characterized in that:
when the number of the measured fingers is two, the geometric form selectable by the user is a line segment picking-up form;
when the number of measured fingers is three, the user-selectable geometric forms include a pickup line segment, a pickup cylinder, and a pickup sphere;
when the number of the measured fingers is four, the geometric forms selectable by the user comprise a pickup line segment, a pickup cylinder, a pickup ball, a pickup prism table and a pickup cuboid;
when the number of fingers measured is five, the user-selectable geometric forms include a pickup line segment, a pickup cylinder, a pickup sphere, a pickup prism, and a pickup cuboid.
9. The method for picking up in a virtual reality scene according to claim 8, wherein when the geometric form is a picked-up line segment:
the pickup line segment is located in a plane formed by the x axis and the z axis;
the pick-up line segment has an initial length of
Figure FDA0002749246610000041
The variation range of the length of the picking line section is 0 to
Figure FDA0002749246610000042
10. The method of claim 9, wherein modifying the size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the length of the pickup line segment, and if the pressure value of the thumb is reduced, increasing the length of the pickup line segment;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the length of the pickup line segment if the maximum pressure value is increased, and increasing the length of the pickup line segment if the maximum pressure value is reduced.
11. The method of claim 10, wherein the length of the picked line segment is increased or decreased according to the following formula:
Length_1' = Length_1 × α / F
wherein Length_1' is the modified length of the pick-up line segment, Length_1 is the length of the pick-up line segment before modification, α is an adjustment coefficient, and F is the pressure value of the finger.
12. The method for picking up in a virtual reality scene according to claim 8, wherein when the geometric form is a picking cylinder:
the axis of the pick-up cylinder is parallel to the z-axis; the height of the pick-up cylinder is W; the initial diameter of the pick-up cylinder is given by the formula shown in the accompanying figure (formula image not reproduced here);
the diameter of the pick-up cylinder varies from 0 up to the value given by the formula shown in the accompanying figure (formula image not reproduced here);
and the pick-up cylinder is positioned between the virtual fingers of the virtual hand corresponding to the measured fingers.
13. The method of claim 12, wherein modifying the size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the diameter of the pickup cylinder, and if the pressure value of the thumb is reduced, increasing the diameter of the pickup cylinder;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the diameter of the pickup cylinder if the maximum pressure value is increased, and increasing the diameter of the pickup cylinder if the maximum pressure value is reduced.
14. Pick-up method in a virtual reality scene according to claim 13, characterized in that the diameter of the pick-up cylinder is increased or decreased according to the following formula:
Φ_1' = Φ_1 × α / F
wherein Φ_1' is the modified diameter of the pick-up cylinder, Φ_1 is the diameter of the pick-up cylinder before modification, α is the adjustment coefficient, and F is the pressure value of the finger.
15. The method for picking up in a virtual reality scene according to claim 8, wherein when the geometrical shape is a picked up sphere:
the pick-up sphere has an initial diameter given by the formula shown in the accompanying figure (formula image not reproduced here);
the diameter of the pick-up sphere varies from 0 up to the value given by the formula shown in the accompanying figure (formula image not reproduced here);
and the pick-up sphere is positioned between the virtual fingers of the virtual hand corresponding to the measured fingers.
16. The method of claim 15, wherein modifying the size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the diameter of the picking ball, and if the pressure value of the thumb is reduced, increasing the diameter of the picking ball;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the diameter of the picking ball body if the maximum pressure value is increased, and increasing the diameter of the picking ball body if the maximum pressure value is reduced.
17. The method of claim 16, wherein the diameter of the ball is increased or decreased according to the following formula:
Φ_2' = Φ_2 × α / F
wherein Φ_2' is the modified diameter of the pick-up sphere, Φ_2 is the diameter of the pick-up sphere before modification, α is the adjustment coefficient, and F is the pressure value of the finger.
18. The method for picking up in a virtual reality scene according to claim 8, wherein when the geometric form is a pick-up prism table:
the length of the bottom surface of the pick-up prism table is H and its width is W; the initial length of the top surface of the pick-up prism table is L, and the length of the top surface varies from 0 to L; the width of the top surface of the pick-up prism table is W; the initial height of the pick-up prism table is R, and the height varies from 0 to R.
19. The method of claim 18, wherein modifying the size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of the thumb, if the pressure value of the thumb is increased, reducing the length of the picking prism table in the y-axis direction, and if the pressure value of the thumb is reduced, increasing the length of the picking prism table in the y-axis direction;
acquiring the maximum pressure value of the pressure values of the fingers except the thumb, if the maximum pressure value is increased, reducing the length of the picking prismatic table in the x-axis direction, and if the maximum pressure value is reduced, increasing the length of the picking prismatic table in the x-axis direction.
20. The method of claim 19, wherein the length of the picking prism in the x-axis and/or y-axis direction is increased or decreased according to the following formula:
Length_2' = Length_2 × α / F
wherein Length_2' is the modified length of the pick-up prism table in the x-axis and/or y-axis direction, Length_2 is the length of the pick-up prism table before modification in the x-axis and/or y-axis direction, α is an adjustment coefficient, and F is the pressure value of the finger.
21. The method for picking up in a virtual reality scene according to claim 8, wherein when the geometrical shape is a picked-up cuboid:
the initial length value of the picking cuboid in the x-axis direction is L, the initial length value of the picking cuboid in the y-axis direction is R, and the initial length value of the picking cuboid in the z-axis direction is W.
22. The method of claim 21, wherein modifying the size of the pick-up geometry based on the finger pressure data comprises:
acquiring a pressure value of a thumb, if the pressure value of the thumb is increased, reducing the length of the picking cuboid in the y-axis direction, and if the pressure value of the thumb is reduced, increasing the length of the picking cuboid in the y-axis direction;
and acquiring the maximum pressure value of the pressure values of the fingers except the thumb, reducing the length of the picking cuboid in the x-axis direction if the maximum pressure value is increased, and increasing the length of the picking cuboid in the x-axis direction if the maximum pressure value is reduced.
23. The method for picking up in the virtual reality scene according to claim 22, wherein the length of the picking-up cuboid in the x-axis and/or y-axis direction is increased or decreased according to the following formula:
Length3' = Length3 × α / F
where Length3' is the modified length of the picking cuboid in the x-axis and/or y-axis direction, Length3 is the length of the picking cuboid before modification in the x-axis and/or y-axis direction, α is an adjustment coefficient, and F is the pressure value of the finger.
24. Pick-up method in a virtual reality scene according to claim 1, characterized in that:
the pickup judgment threshold β is 0.8 to 0.95.
25. Pick-up method in a virtual reality scene according to claim 1, characterized in that:
the duration of the first delay time period is 0.2 s to 1.0 s.
26. The method of claim 1, further comprising:
and further determining whether to pick up the pickup object according to the pickup requirement of the pickup object.
27. The method of claim 26, wherein the picking requirement of the object comprises: one-handed pick or two-handed pick.
28. Pick-up method in a virtual reality scene according to claim 27, characterized in that:
when a width of the picked object exceeds a pickup volume threshold, and/or when a weight of the picked object exceeds a pickup weight threshold: the picking requirement of the picking object is two-hand picking;
the pick-up requirement for the picked-up object is a one-handed pick-up when the width of the picked-up object does not exceed a pick-up volume threshold, and/or when the weight of the picked-up object does not exceed a pick-up weight threshold.
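A minimal sketch of the one-handed / two-handed decision of claims 26 to 28 follows; the threshold values, the enum, and all identifiers are assumptions, since the claims leave them unspecified.

from enum import Enum

class PickRequirement(Enum):
    ONE_HANDED = 1
    TWO_HANDED = 2

PICK_VOLUME_THRESHOLD = 0.5   # assumed; claim 28 compares the object's width against this "pickup volume threshold"
PICK_WEIGHT_THRESHOLD = 5.0   # assumed pickup weight threshold

def pick_requirement(width, weight):
    # Two-handed pickup when either measure exceeds its threshold (claim 28).
    if width > PICK_VOLUME_THRESHOLD or weight > PICK_WEIGHT_THRESHOLD:
        return PickRequirement.TWO_HANDED
    return PickRequirement.ONE_HANDED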
29. The method of claim 1, wherein the position data and orientation data of the virtual hand in the virtual reality scene are determined by:
acquiring initial position data of a virtual hand in a virtual reality scene;
acquiring acceleration for controlling the virtual hand through an accelerometer;
acquiring an angular velocity for controlling the virtual hand through a gyroscope;
determining the speed of the virtual hand according to the acceleration and the refreshing time of the virtual reality scene;
acquiring new position data of the virtual hand in the virtual reality scene according to the speed of the virtual hand, the refreshing time of the virtual reality scene and the initial position data of the virtual hand in the virtual reality scene;
and carrying out data fusion on the acceleration and the angular speed to obtain orientation data of the virtual hand.
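The position update of claim 29 amounts to dead reckoning over one scene refresh interval; a minimal sketch, with hypothetical names and an assumed 90 Hz refresh rate, is:

import numpy as np

def update_hand_position(position, velocity, acceleration, dt):
    # Claim 29: the speed is obtained from the acceleration and the refresh
    # time, and the new position from the speed, the refresh time and the
    # initial position. position, velocity and acceleration are 3-vectors.
    velocity = velocity + acceleration * dt
    position = position + velocity * dt
    return position, velocity

# Usage: advance the virtual hand once per frame.
p = np.zeros(3)
v = np.zeros(3)
p, v = update_hand_position(p, v, np.array([0.0, 0.1, 0.0]), dt=1.0 / 90.0)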
30. Pick-up method in a virtual reality scene according to claim 29, characterized in that:
the data fusion is performed by complementary filtering, kalman filtering, and/or gradient descent methods.
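For the data fusion of claim 30, a complementary filter is the simplest of the listed options; the single-axis sketch below is an illustration only (a full implementation would fuse all three axes, for example on quaternions), and the gain K is an assumed value, not taken from the patent.

import math

K = 0.98  # assumed filter gain: trust the gyroscope short-term, the accelerometer long-term

def fuse_orientation(angle, gyro_rate, accel_x, accel_z, dt):
    gyro_angle = angle + gyro_rate * dt         # short-term: integrate angular velocity
    accel_angle = math.atan2(accel_x, accel_z)  # long-term: tilt from the gravity vector
    return K * gyro_angle + (1.0 - K) * accel_angle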
31. A pickup apparatus in a virtual reality scene, comprising:
a data acquisition module for receiving finger pressure data;
an IMU module for acquiring position data and orientation data of a virtual hand in the virtual reality scene;
a physical computing module for performing computation according to the data provided by the IMU module to construct a picking geometry, determining or modifying the spatial position and the spatial rotation degree of the picking geometry, traversing the bounding boxes of the objects in the virtual reality scene, determining an object as a pickup object when the bounding box of the object intersects with the spatial position of the picking geometry, modifying the size of the picking geometry according to the finger pressure data, and determining whether to pick up the pickup object and when to release the pickup object according to the degree of intersection of the picking geometry and the bounding box of the pickup object; wherein,
the geometry of the pick-up geometry is a pick-up line segment, and the determining whether to pick up the pick-up object and the determining when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object comprises:
if the picking object is not in the picking state, calculating the length of the intersection part of the picking line segment and the bounding box of the picking object, calculating the length of the current picking line segment, and if the picking condition formula (1) is satisfied, determining the picking object as the picking state:
L_intersect ÷ L_pick > β    (1)
where L_intersect is the length of the intersection of the pickup line segment and the bounding box of the pickup object, L_pick is the length of the current pickup line segment, and β is a pickup judgment threshold;
if the picking object is in a picking state, calculating the length of the intersection part of the picking line segment and the bounding box of the picking object, and calculating the length of the current picking line segment, if the picking conditional formula (1) is not satisfied any more and the picking conditional formula (1) is not satisfied within a first delay time period from the moment when the picking conditional formula (1) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object;
alternatively,
the geometry of the pick-up geometry is a pick-up cylinder, and the determining whether to pick up the pick-up object and the determining when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object comprises:
if the picking object is not in the picking state, calculating the area of the intersection part of the cross section of the picking cylinder and the bounding box of the picking object, calculating the current cross section area of the picking cylinder, and if the picking condition formula (2) is met, determining the picking object as the picking state:
A_intersect1 ÷ A_pick1 > β    (2)
where A_intersect1 is the area of the intersection of the cross section of the pick-up cylinder with the bounding box of the pickup object, A_pick1 is the area of the cross section of the current pick-up cylinder, and β is a pickup judgment threshold;
if the picking object is in the picking state, calculating the area of the intersection part of the cross section of the picking cylinder and the bounding box of the picking object, calculating the area of the cross section of the current picking cylinder, if the picking condition formula (2) is not satisfied any more and the picking condition formula (2) is not satisfied within a first delay time period from the moment when the picking condition formula (2) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object;
alternatively,
the geometry of the pick-up geometry is a pick-up sphere, and the determining whether to pick up the pick-up object and when to release the pick-up object according to the degree of intersection of the pick-up geometry and the bounding box of the pick-up object comprises:
if the picked object is not in the picking state, calculating the area of the intersection part of the cross section of the picked sphere and the bounding box of the picked object, calculating the current cross section area of the picked sphere, and if the picking condition formula (3) is met, determining the picked object as the picking state:
A_intersect2 ÷ A_pick2 > β    (3)
where A_intersect2 is the area of the intersection of the cross section of the picking sphere with the bounding box of the pickup object, A_pick2 is the area of the cross section of the current picking sphere, and β is a pickup judgment threshold;
if the picking object is in a picking state, calculating the area of the intersection part of the cross section of the picking sphere and the bounding box of the picking object, and calculating the current cross section area of the picking sphere, if the picking conditional formula (3) is not satisfied any more and the picking conditional formula (3) is not satisfied within a first delay time period from the moment when the picking conditional formula (3) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object;
or, the geometry of the picking geometry is a picking prism, and the determining whether to pick the picking object and the determining when to release the picking object according to the intersection degree of the picking geometry and the bounding box of the picking object comprises:
if the pickup object is not in a pickup state, calculating a volume of an intersection of the pickup prism and a bounding box of the pickup object, and calculating a current volume of the pickup prism, and if a pickup condition formula (4) is satisfied, determining the pickup object as a pickup state:
V_intersect1 ÷ V_pick1 > β    (4)
where V_intersect1 is the volume of the intersection of the picking prism and the bounding box of the pickup object, V_pick1 is the volume of the current picking prism, and β is a pickup judgment threshold;
if the picking object is in a picking state, calculating the volume of the intersection part of the picking prism and the bounding box of the picking object, and calculating the volume of the current picking prism, if the picking condition formula (4) is not satisfied any more and the picking condition formula (4) is not satisfied within a first delay time period from the moment when the picking condition formula (4) is not satisfied any more, releasing the picking object, and clearing the picking state of the picking object;
alternatively,
the geometry of the picking geometry is a picking cuboid, and the determining whether to pick the picking object and the determining when to release the picking object according to the intersection degree of the picking geometry and the bounding box of the picking object comprises:
if the picking object is not in the picking state, calculating the volume of the intersection part of the picking rectangular parallelepiped and the bounding box of the picking object, and calculating the current volume of the picking rectangular parallelepiped, if the picking condition formula (5) is satisfied, determining the picking object as the picking state:
V_intersect2 ÷ V_pick2 > β    (5)
where V_intersect2 is the volume of the intersection of the picking cuboid and the bounding box of the pickup object, V_pick2 is the volume of the current picking cuboid, and β is a pickup judgment threshold;
if the picking object is in the picking state, calculating the volume of the intersection part of the picking cuboid and the bounding box of the picking object, calculating the volume of the current picking cuboid, and if the picking condition formula (5) is not satisfied any more and within a first delay time period from the moment when the picking condition formula (5) is not satisfied any more, releasing the picking object and clearing the picking state of the picking object.
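The pick and release logic that claim 31 states for each geometry follows one pattern: pick when the intersection ratio of formulas (1) to (5) exceeds β, and release only after the condition has failed continuously for the first delay time period. A minimal sketch of that state machine, with assumed values for β and the delay and hypothetical class and method names, is:

import time

BETA = 0.9           # pickup judgment threshold, within the 0.8-0.95 range of claim 24
RELEASE_DELAY = 0.5  # first delay time period in seconds (claim 25: 0.2 s to 1.0 s)

class PickStateMachine:
    def __init__(self):
        self.picked = False
        self.fail_since = None  # time when the pick condition first stopped holding

    def update(self, intersect_measure, pick_measure):
        # intersect_measure / pick_measure is the length, area or volume ratio
        # of formulas (1)-(5), depending on the pickup geometry in use.
        condition = pick_measure > 0 and intersect_measure / pick_measure > BETA
        if not self.picked:
            if condition:
                self.picked = True      # enter the picking state
                self.fail_since = None
        else:
            if condition:
                self.fail_since = None                # condition holds again
            elif self.fail_since is None:
                self.fail_since = time.monotonic()    # start the delay timer
            elif time.monotonic() - self.fail_since >= RELEASE_DELAY:
                self.picked = False                   # release and clear the picking state
                self.fail_since = None
        return self.picked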
32. A pick-up device in a virtual reality scene according to claim 31, wherein the device further comprises:
a communication module connected to the physical computing module and communicating with the virtual reality device;
when the picking object is determined to be picked up, the physical computing module sends control information for picking up the picking object to the virtual reality device through the communication module;
when the picking object is determined to be loosened, the physical computing module sends control information for canceling the picking of the picking object to the virtual reality device through the communication module.
33. The device of claim 31, wherein:
the data acquisition module is connected to a data glove, a flexible sensor or a pressure sensor arranged on a finger joint, so that finger pressure data can be acquired and received through the data glove, the flexible sensor or the pressure sensor arranged on the finger joint.
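As a hypothetical sketch of how the modules of claims 31 to 33 could be wired together once per frame (the patent defines the modules, not this Python interface; all class and method names are assumptions):

class PickupDevice:
    def __init__(self, data_acquisition, imu, physics, communication):
        self.data_acquisition = data_acquisition   # finger pressure source (claim 33)
        self.imu = imu                             # position / orientation data (claim 31)
        self.physics = physics                     # pick geometry and intersection tests
        self.communication = communication         # link to the VR device (claim 32)

    def step(self):
        pressures = self.data_acquisition.read_pressures()
        pose = self.imu.read_pose()
        self.physics.update_geometry(pose, pressures)
        picked, released = self.physics.evaluate_intersections()
        for obj in picked:
            self.communication.send_pick(obj)      # claim 32: control information to pick up
        for obj in released:
            self.communication.send_release(obj)   # claim 32: control information to cancel the pickup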
CN201810021624.XA 2018-01-10 2018-01-10 Picking method and device in virtual reality scene Active CN108227928B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810021624.XA CN108227928B (en) 2018-01-10 2018-01-10 Picking method and device in virtual reality scene

Publications (2)

Publication Number Publication Date
CN108227928A CN108227928A (en) 2018-06-29
CN108227928B true CN108227928B (en) 2021-01-29

Family

ID=62640585

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810021624.XA Active CN108227928B (en) 2018-01-10 2018-01-10 Picking method and device in virtual reality scene

Country Status (1)

Country Link
CN (1) CN108227928B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111784850B (en) * 2020-07-03 2024-02-02 深圳市瑞立视多媒体科技有限公司 Object grabbing simulation method based on illusion engine and related equipment
CN114082158B (en) * 2021-11-18 2022-05-20 南京医科大学 Upper limb rehabilitation training system for stroke patient
CN114063780A (en) * 2021-11-18 2022-02-18 兰州乐智教育科技有限责任公司 Method and device for determining user concentration degree, VR equipment and storage medium
CN114546555A (en) * 2022-02-14 2022-05-27 同恩(上海)工程技术有限公司 Method, device and medium for picking up graph based on space geometry

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016186932A1 (en) * 2015-05-20 2016-11-24 Sony Interactive Entertainment Inc. Electromagnet-laden glove for haptic pressure feedback
CN106371604A (en) * 2016-09-18 2017-02-01 Tcl集团股份有限公司 Interactive control gloves, virtual reality system and application method of virtual reality system
CN106648116A (en) * 2017-01-22 2017-05-10 隋文涛 Virtual reality integrated system based on action capture
CN106993181A (en) * 2016-11-02 2017-07-28 大辅科技(北京)有限公司 Many VR/AR equipment collaborations systems and Synergistic method
CN107132917A (en) * 2017-04-25 2017-09-05 腾讯科技(深圳)有限公司 For the hand-type display methods and device in virtual reality scenario
CN107229393A (en) * 2017-06-02 2017-10-03 三星电子(中国)研发中心 Real-time edition method, device, system and the client of virtual reality scenario
CN107272907A (en) * 2017-07-10 2017-10-20 三星电子(中国)研发中心 A kind of scene breaks marching method and device based on virtual reality technology

Also Published As

Publication number Publication date
CN108227928A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108227928B (en) Picking method and device in virtual reality scene
US20220326829A1 (en) Systems and methods for manipulating a virtual environment
CN108983978B (en) Virtual hand control method and device
WO2023056670A1 (en) Mechanical arm autonomous mobile grabbing method under complex illumination conditions based on visual-tactile fusion
US9367136B2 (en) Holographic object feedback
KR101738569B1 (en) Method and system for gesture recognition
US8998718B2 (en) Image generation system, image generation method, and information storage medium
CN104769522B (en) The remote controllers with gesture identification function are pointed to 3D
US11409357B2 (en) Natural human-computer interaction system based on multi-sensing data fusion
CN1304931C (en) Head carried stereo vision hand gesture identifying device
JP2020522795A (en) Eye tracking calibration technology
US20180225837A1 (en) Scenario extraction method, object locating method and system thereof
JP2019517049A (en) Interaction with 3D virtual objects using pose and multiple DOF controllers
EP1927384A1 (en) Method of determining operation input using game controller including acceleration detector, and method of calculating moving path of the game controller
US20100328319A1 (en) Information processor and information processing method for performing process adapted to user motion
CN105107200B (en) Face Changing system and method based on real-time deep body feeling interaction and augmented reality
CN110211661B (en) Hand function training system based on mixed reality and data processing method
CN111966217A (en) Unmanned aerial vehicle control method and system based on gestures and eye movements
US20220134218A1 (en) System and method for virtual character animation using motion capture
Cho et al. Motion recognition with smart phone embedded 3-axis accelerometer sensor
JP6519075B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
Sreejith et al. Real-time hands-free immersive image navigation system using Microsoft Kinect 2.0 and Leap Motion Controller
WO2023227072A1 (en) Virtual cursor determination method and apparatus in virtual reality scene, device, and medium
CN109960404A (en) A kind of data processing method and device
EP3700641B1 (en) Methods and systems for path-based locomotion in virtual reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant