CN114035677A - Universal interface implementation method for interaction between both hands and virtual glove peripherals - Google Patents


Info

Publication number
CN114035677A
CN114035677A (Application CN202111242604.3A)
Authority
CN
China
Prior art keywords
virtual
glove
hand
action
virtual hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111242604.3A
Other languages
Chinese (zh)
Inventor
吴超
王宇轩
何晓丹
孔吉宏
袁怀月
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhongye Zhicheng Wuhan Engineering Technology Co ltd
Original Assignee
Zhongye Zhicheng Wuhan Engineering Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhongye Zhicheng Wuhan Engineering Technology Co., Ltd.
Priority to CN202111242604.3A
Publication of CN114035677A
Legal status: Pending


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 — Hand-worn input/output arrangements, e.g. data gloves
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • G06T19/006 — Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a universal-interface implementation method for interaction between both hands and a virtual glove peripheral, comprising the following steps: S1, obtaining action instructions from the hardware abstraction layer through the APIs provided by each VR glove manufacturer, parsing and classifying the action instructions in the VR glove universal interface, and converting them into data objects defined in the universal interface according to the corresponding relation; S2, constructing a virtual hand action predefined library, pre-defining and encapsulating the various actions of the virtual hand in the universal interface, and storing the encapsulated data in the predefined library; and S3, upon receiving an action instruction from the VR glove, matching the corresponding action from the virtual hand action predefined library and executing it. By separating the interactive behaviors carried in the action instructions of each manufacturer's hardware abstraction layer from the manufacturer's SDK, the invention greatly reduces the coupling between the hardware and the software system and greatly improves the development efficiency of VR-glove-based virtual reality systems.

Description

Universal interface implementation method for interaction between both hands and virtual glove peripherals
Technical Field
The invention belongs to the technical field of virtual reality, and particularly relates to a universal interface implementation method for interaction between hands and a virtual glove peripheral.
Background
The VR glove is an advanced VR peripheral whose main function is to mediate interaction between the real world and the virtual world; it is currently a mainstream device on the market. As a VR peripheral receiving great attention, the VR glove delivers more realistic tactile feedback and excellent per-finger tracking, providing a higher-quality sense of immersion than an ordinary handheld controller.
Although the VR gloves of the major manufacturers each have distinctive features, they are all built on the same general interactive functions, such as picking up and putting down. Each manufacturer's SDK exposes only an API and does not elaborate on the working principle or implementation of the underlying hardware abstraction layer, so developers take many detours while interpreting the standard and carrying out concrete development. Because there is no unified industry standard across VR gloves, developing for a particular glove currently means following that manufacturer's SDK; every hardware change therefore requires integrating and developing against a new SDK, and the resulting development cost, among other human factors, greatly limits market growth and development convenience.
Chinese patent publication No. CN108958479A discloses a universal real-time interaction method for three-dimensional virtual scenes based on a data glove. Sensors in the data glove acquire the curvature of the finger joints and the degree of opening and closing between fingers, while a positioning device matched with the glove acquires its spatial position and azimuth parameters. According to the interaction configuration defined by the three-dimensional virtual-scene software, these position, azimuth and sensor parameters are combined with parameters such as the viewpoint in the scene and converted into simulated mouse/keyboard input; the application programming interface of the operating system is then called to convert the simulated input-state information into real interaction-device information, driving real-time interaction with the scene. In this way, interactions such as real-time rotation and roaming of the virtual scene are controlled by the data glove within three-dimensional visualization software, and the glove can be integrated with other three-dimensional visualization software to improve its three-dimensional interaction capability. However, that patent only provides a virtual-scene interaction method based on a specific data glove: when the glove must be replaced in practical applications, a new SDK still has to be integrated against, which is inefficient and costly.
Disclosure of Invention
The object of the invention is to provide, in view of the above problems in the prior art, a universal-interface implementation method for interaction between both hands and a virtual glove peripheral.
To achieve this object, the invention adopts the following technical scheme:
a universal interface implementation method for interaction between both hands and a virtual glove peripheral comprises the following steps:
s1, obtaining action instructions in a hardware abstraction layer through API provided by each VR glove manufacturer, analyzing the action instructions in a VR glove universal interface, judging glove brands corresponding to the action instructions, and converting the action instructions into data objects defined in the universal interface according to corresponding relations between the action instructions and the glove brands;
s2, constructing a virtual manual operation predefined library, carrying out predefined encapsulation on various actions of a virtual hand in a universal interface, and storing encapsulated data into the virtual manual operation predefined library;
and S3, matching the action corresponding to the action command from the virtual manual action predefined library and executing after receiving the action command of the VR glove.
Specifically, in step S1, the data objects defined in the universal interface comprise the following instructions: movement instructions (move forward, move backward, move left, move right, move up and move down), a grab instruction, a fist instruction, an open instruction, a press instruction and a stop instruction.
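The generic instruction set and its per-brand conversion can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the vendor names and raw instruction codes are hypothetical, since the patent does not specify any manufacturer's actual API.

```python
from enum import Enum, auto

class GloveAction(Enum):
    """Generic action data objects defined by the universal interface."""
    MOVE_FORWARD = auto()
    MOVE_BACKWARD = auto()
    MOVE_LEFT = auto()
    MOVE_RIGHT = auto()
    MOVE_UP = auto()
    MOVE_DOWN = auto()
    GRAB = auto()
    FIST = auto()
    OPEN = auto()
    PRESS = auto()
    STOP = auto()

# Hypothetical per-brand mapping tables: each vendor SDK reports its own
# raw instruction codes, which the universal interface translates into
# the generic data objects above.
VENDOR_MAPS = {
    "vendor_a": {0x01: GloveAction.GRAB, 0x02: GloveAction.OPEN},
    "vendor_b": {"grip": GloveAction.GRAB, "release": GloveAction.OPEN},
}

def to_generic(vendor: str, raw_code) -> GloveAction:
    """Convert a vendor-specific raw instruction into a generic data object."""
    return VENDOR_MAPS[vendor][raw_code]
```

Adding support for a new glove brand then amounts to registering one more mapping table, leaving application code untouched.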
Specifically, in step S2, the virtual hand action predefined library is constructed as follows: first, a three-dimensional model of the virtual hand is built in 3ds Max and every joint of the model is numbered; the model is then textured in Photoshop, i.e. colors are added to it; finally, a skeleton and skin are created for the model and bound together, the weights with which the bones influence the skin are adjusted, and gesture skeletal animations are produced for the defined gesture actions.
Further, the virtual hand interacts with the human hand as follows: sensors in the VR glove acquire the motion information of each finger-segment joint of the human hand in real time; the joint angles of the corresponding finger segments of the virtual hand are set from this motion information; and finally matrix translation and rotation operations are applied to the virtual hand in the virtual environment to realize its change of position and orientation.
Furthermore, during the interaction between the virtual hand and the human hand, the virtual hand is constrained according to the specific application scenario, limiting its degrees of freedom.
Specifically, in step S3, when the action instruction received from the VR glove is a finger-bending instruction and a collision between the virtual hand and a virtual object is detected, the virtual hand is judged to be grabbing the virtual object. The node of the virtual object is placed below the node of the virtual hand through a matrix operation and the gravity effect on the object is cancelled, making the object a child of the virtual hand; the relative coordinates of the object and the hand are then frozen, so the object translates and rotates with the hand.
When the action instruction from the VR glove changes to an open instruction, the virtual hand is judged to have released the virtual object; the interaction task is complete, the object is detached from the virtual hand, and its gravity effect is restored.
Compared with the prior art, the invention has the following beneficial effects. By developing a VR glove universal interface, the action instructions in the hardware abstraction layer of each VR glove manufacturer are obtained and the interactive behaviors carried in those instructions are stripped out of the manufacturer's SDK, greatly reducing the coupling between the hardware and the software system. When different VR glove devices must be supported, no separate integration and development effort is needed for each SDK, which greatly improves the development efficiency of VR-glove-based virtual reality systems.
Drawings
FIG. 1 is a schematic flow diagram of the universal-interface implementation method for interaction between both hands and a virtual glove peripheral according to the present invention.
FIG. 2 is a flowchart illustrating a virtual hand interaction operation according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating several gestures defined in a predefined library of virtual hand movements according to an embodiment of the present invention.
FIG. 4 is a schematic diagram illustrating interaction behavior constraints of a human hand and a virtual hand according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it is obvious that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, this embodiment establishes a unified set of VR glove interaction standards and specifications, defining unified data structures, input/output formats, and so on, so that every VR-glove-based virtual reality system can be integrated via the universal interface. Applying the VR glove universal interface greatly improves the development efficiency of such systems and at the same time allows the general service functions of VR gloves on the market to be managed and maintained uniformly, forming a set of interaction standard specifications. The method specifically comprises the following steps:
s1, based on the design idea of object-oriented, reasonable boundary division is carried out on the interaction system of the VR glove, interaction logic is stripped from the SDK complicated by manufacturers, and interaction between a virtual hand and a virtual object is realized in a universal data interface;
s2, analyzing various interactive behaviors in different forms, abstracting common interactive behaviors, extracting a program interface, and decoupling the common interactive behaviors from concrete interactive behaviors through the program interface to reduce the influence on an application program when VR glove equipment is replaced as much as possible;
and S3, combing the business logics of interaction, collision, state conversion and the like again, and determining a uniform interactive interface by combining the basic interactive logics of all the existing VR gloves so as to realize that the universal interface of the VR gloves can be simultaneously butted with the VR gloves of various categories.
Specifically, in step S1, the interaction functions of each manufacturer's VR gloves are summarized and analyzed; the action instructions in the hardware abstraction layer are obtained through the APIs provided by each manufacturer, parsed and classified in the VR glove universal data interface, and converted into the data objects defined in the universal interface according to the corresponding relation, so that the data are unified at the level of operation objects.
Further, after the action instructions in the hardware abstraction layer are obtained from each VR glove manufacturer's API, an action-instruction state-set base class is abstracted, and a member of this class is instantiated in the universal interface to record action-instruction data, such as movement instructions (move forward, move backward, move left, move right, move up and move down), a grab instruction, a fist instruction, an open instruction, a press instruction, a stop instruction, and so on.
When a data instruction from the VR glove is received, the data processing module (a processing module built into the virtual reality system) first makes a logical judgment to determine the type of hardware device, then calls the SetState function of the action-instruction state set to execute the corresponding action.
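A minimal sketch of the state-set base class and the SetState dispatch described above, written in Python rather than the C#-style signatures used elsewhere in the description; the class and method names (`ActionStateSet`, `dispatch`, the registry) are illustrative assumptions, not the patent's actual identifiers.

```python
from abc import ABC, abstractmethod

class ActionStateSet(ABC):
    """Base class recording the latest action instruction."""

    def __init__(self):
        self.current_action = None

    def set_state(self, action):
        # Record the instruction, then delegate to the device-specific handler.
        self.current_action = action
        self._execute(action)

    @abstractmethod
    def _execute(self, action):
        """Device-specific execution, implemented once per glove brand."""

class LoggingStateSet(ActionStateSet):
    """Hypothetical concrete state set that simply logs executed actions."""

    def __init__(self):
        super().__init__()
        self.executed = []

    def _execute(self, action):
        self.executed.append(action)

def dispatch(registry, device_type, action):
    # Logical judgment on the hardware type, then SetState on its state set.
    registry[device_type].set_state(action)
```

Because the application only ever calls `dispatch`, swapping glove hardware means registering a different `ActionStateSet` subclass rather than rewriting application code.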
Specifically, in step S2, the universal interface uniformly invokes the action instructions issued in step S1 and then performs business operations such as logical judgment, calculation and assignment. The various actions of the virtual hand are predefined and encapsulated in the universal interface; each time an action instruction is received, it is matched against the predefined action library (the virtual hand action predefined library) to determine the current operation.
Furthermore, the virtual hand action predefined library is the basis for realizing virtual interaction: if it is defined too narrowly, interaction tasks cannot be completed effectively; if too broadly, system resources become redundant. Analyzing the interaction operations commonly used in virtual interaction, the universal interface defines the following gesture actions (as shown in fig. 2):
1) releasing;
2) making a fist;
3) pressing;
4) a holding action 1;
5) a holding action 2.
Further, the virtual hand action predefined library is constructed as follows: first, a three-dimensional model of the virtual hand is built in 3ds Max and every joint of the model is numbered; the model is then textured in Photoshop, i.e. colors are added to increase its realism; finally, bones and skin are created for the model and bound together, the weights with which the bones influence the skin are adjusted, and gesture skeletal animations are produced for the defined gesture actions.
Further, the interaction method of the virtual hand and the human hand is as follows:
the mapping of the behavior of the human hand in the virtual environment is essentially that a tracker (a position sensor carried by a VR glove) tracks the position of the motion of the human hand to control the motion of the virtual hand in the virtual scene, so that the virtual hand reflects the intention of the human hand on the control of a virtual object in the virtual environment and simulates complex and various wrist movement actions.
To faithfully reflect the position of the virtual hand in three-dimensional space, the interface acquires the data in the VR glove sensors in real time: the joint motion of each finger segment of the human hand is converted into sensor measurements, and the interface reads and analyzes these input values to calculate the angle of each joint of the human hand. The mapping process from the human hand to the virtual hand is as follows:
1) Setting the joint angle of each finger segment of the virtual hand:
protected void SetGloveFingerAngle(int fingerIndex, float angle);
// Traverse the joints by their numbers, find the corresponding joint, and assign it the angle value.
2) Applying matrix translation and rotation operations to the joint angles defined above in the virtual environment, realizing the position and orientation transformation of the virtual hand:
protected void ChangeFingerPosition(int fingerIndex, Vector3 fingerPosition, Vector3 fingerRotation);
// Assign the calculated position and orientation to each joint.
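The two mapping steps above can be sketched in Python as follows. This is a simplified illustration under stated assumptions: 15 finger-segment joints (three per finger), angles in degrees, and a single-axis yaw rotation standing in for the full matrix rotation; the method names mirror the C#-style signatures above but are not the patent's exact API.

```python
import math

class VirtualHand:
    """Sketch of a virtual hand: 15 finger-segment joints, 3 per finger."""

    def __init__(self, joint_count=15):
        self.joint_angles = [0.0] * joint_count  # degrees, indexed by joint number
        self.position = (0.0, 0.0, 0.0)
        self.yaw = 0.0  # rotation about the vertical axis, degrees

    def set_glove_finger_angle(self, joint_index, angle):
        # Find the joint by its number and assign the measured angle.
        self.joint_angles[joint_index] = angle

    def change_hand_pose(self, dx, dy, dz, yaw_deg):
        # Translate the hand, then rotate it about the vertical axis.
        x, y, z = self.position
        self.position = (x + dx, y + dy, z + dz)
        self.yaw = (self.yaw + yaw_deg) % 360.0

    def forward_vector(self):
        # Direction the hand faces in the horizontal plane, from the yaw angle.
        r = math.radians(self.yaw)
        return (math.cos(r), 0.0, math.sin(r))
```

In a real engine the per-joint angles would drive the skeletal animation made in step S2, and the pose change would be a full 4x4 transform rather than a single yaw angle.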
Specifically, in step S3, a mapping is established between the collected action instructions and the virtual hand in the scene so that the virtual hand can manipulate virtual objects in the virtual environment; contact between the virtual hand and a virtual object is then detected using the collision detection algorithm encapsulated in the universal interface, completing the interaction.
The grabbing-judgment algorithm decides whether the virtual hand has a stable grip on an object. It must not be oversimplified, or false grabs will occur; nor overly complex, or the computation becomes excessive and time-consuming, causing serious operational lag. The grabbing judgment adopted in this embodiment is as follows:
when the received action instruction of the VR glove is a finger action bending instruction and the collision between the virtual hand and the virtual object is detected, judging that the virtual hand is performing grabbing operation on the virtual object at the moment; the nodes of the virtual object are arranged below the nodes of the virtual hand through matrix operation, and the gravity effect of the virtual object is cancelled, so that the virtual object becomes a sub-object of the virtual hand; at the moment, the relative coordinate positions of the virtual object and the virtual hand are solidified, and the virtual object translates and rotates along with the virtual hand;
when the action instruction of the VR glove is changed into the opening instruction, the virtual hand is judged to release the virtual object, the virtual hand interaction task is completed, the virtual object is separated from the virtual hand, and the gravity effect of the virtual object is recovered to enable the virtual object to be restricted by the physical law, so that the interactive operation of the glove and the operation object is realized, and the method is shown in figure 3.
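The grab/release judgment described above can be sketched as follows. This is a hedged Python sketch with hypothetical names (`try_grab`, `release`, string instruction tags); a real implementation would operate on scene-graph nodes and a physics engine rather than plain attributes.

```python
class VirtualObject:
    """Sketch of a grabbable object in the scene graph."""

    def __init__(self):
        self.gravity = True        # whether physics gravity applies
        self.parent = None         # parent node; set to the hand while held
        self.local_offset = None   # frozen pose relative to the hand

def try_grab(hand_node, obj, instruction, colliding):
    """Grab when a finger-bend instruction coincides with a hand-object collision."""
    if instruction == "bend" and colliding:
        obj.parent = hand_node            # attach below the virtual hand's node
        obj.gravity = False               # cancel gravity while held
        obj.local_offset = (0.0, 0.0, 0.0)  # freeze the relative coordinates
        return True
    return False

def release(obj, instruction):
    """Release when the instruction changes to an open instruction."""
    if instruction == "open" and obj.parent is not None:
        obj.parent = None      # detach from the virtual hand
        obj.gravity = True     # restore the gravity effect
        obj.local_offset = None
        return True
    return False
```

While `obj.parent` points at the hand node, the object inherits the hand's translation and rotation for free from the scene graph, which is exactly the effect the matrix reparenting in the text achieves.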
The interactive functions encapsulated with traditional VR gloves are fairly basic: only picking up, putting down, pressing and other simple operations. This embodiment encapsulates a variety of common grasping rules of different forms, so that a VR glove can complete complex simulated interaction tasks simply by adding the execution scripts developed by the invention and making minor parameter modifications, and the interaction process becomes more natural and efficient.
Specifically, while interacting with the human hand, the virtual hand is constrained according to the specific application scenario, limiting its degrees of freedom. In the constrained state the displacement freedom of the manipulated object is limited: the interaction object cannot fully follow the hand's motion in space, and some objects can only move horizontally along an axis, for example when pulling a drawer.
Next, taking the interaction of the virtual hand moving a drawer as an example, the constraint on the interaction between the human hand and the virtual hand in this embodiment is described:
as shown in fig. 4, first, a virtual hand and a three-dimensional coordinate before the displacement of the human hand are recorded (the coordinate before the displacement of the human hand is an a-point coordinate), when the three-dimensional coordinate of the human hand is changed, the three-dimensional coordinate after the change of the human hand (namely, a B-point coordinate) is recorded, two coordinate points before and after the movement of the human hand are connected, the position relationship of the two coordinate points in a space coordinate system is judged, and a displacement vector AB of the human hand is calculated; obtaining a displacement vector AB of a virtual hand according to the displacement vector of the hand, projecting the displacement vector of the virtual hand onto a horizontal plane (assuming that the drawer moves along the horizontal direction) to obtain a first-order displacement projection vector AC of the virtual hand, and projecting the first-order displacement projection vector onto the movement direction AE of the drawer to obtain a second-order displacement projection vector AD of the virtual hand, wherein the second-order displacement projection vector AD is coincident with the movement direction AE of the drawer; and finally, controlling the virtual hand to shift according to the second-order displacement projection vector of the virtual hand.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (6)

1. A universal interface implementation method for interaction between both hands and a virtual glove peripheral is characterized by comprising the following steps:
s1, obtaining action instructions in a hardware abstraction layer through API provided by each VR glove manufacturer, analyzing the action instructions in a VR glove universal interface, judging glove brands corresponding to the action instructions, and converting the action instructions into data objects defined in the universal interface according to corresponding relations between the action instructions and the glove brands;
s2, constructing a virtual manual operation predefined library, carrying out predefined encapsulation on various actions of a virtual hand in a universal interface, and storing encapsulated data into the virtual manual operation predefined library;
and S3, matching the action corresponding to the action command from the virtual manual action predefined library and executing after receiving the action command of the VR glove.
2. The method of claim 1, wherein in step S1, the data objects defined in the universal interface comprise: movement instructions, a grab instruction, a fist instruction, an open instruction, a press instruction and a stop instruction.
3. The method of claim 1, wherein in step S2, the virtual hand action predefined library is constructed as follows: first, a three-dimensional model of the virtual hand is built in 3ds Max and every joint of the model is numbered; the model is then textured in Photoshop, i.e. colors are added to it; finally, a skeleton and skin are created for the model and bound together, the weights with which the bones influence the skin are adjusted, and gesture skeletal animations are produced for the defined gesture actions.
4. The method of claim 3, wherein the virtual hand interacts with the human hand as follows: sensors in the VR glove acquire the motion information of each finger-segment joint of the human hand in real time; the joint angles of the corresponding finger segments of the virtual hand are set from this motion information; and finally matrix translation and rotation operations are applied to the set joint angles of the virtual hand in the virtual environment to realize the position and orientation transformation of the virtual hand.
5. The method of claim 4, wherein during the interaction with the human hand the virtual hand is constrained according to the specific application scenario, so that its degrees of freedom are limited.
6. The method of claim 1, wherein in step S3, when the action instruction received from the VR glove is a finger-bending instruction and a collision between the virtual hand and a virtual object is detected, the virtual hand is judged to be grabbing the virtual object; the node of the virtual object is placed below the node of the virtual hand through a matrix operation and the gravity effect on the object is cancelled, making the object a child of the virtual hand; the relative coordinates of the object and the hand are then frozen, so the object translates and rotates with the hand;
and when the action instruction from the VR glove changes to an open instruction, the virtual hand is judged to have released the virtual object, the interaction task is complete, the object is detached from the virtual hand, and its gravity effect is restored.
CN202111242604.3A 2021-10-25 2021-10-25 Universal interface implementation method for interaction between both hands and virtual glove peripherals Pending CN114035677A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111242604.3A CN114035677A (en) 2021-10-25 2021-10-25 Universal interface implementation method for interaction between both hands and virtual glove peripherals

Publications (1)

Publication Number Publication Date
CN114035677A 2022-02-11

Family

ID=80135291

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111242604.3A Pending CN114035677A (en) 2021-10-25 2021-10-25 Universal interface implementation method for interaction between both hands and virtual glove peripherals

Country Status (1)

Country Link
CN (1) CN114035677A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080013793A1 (en) * 2006-07-13 2008-01-17 Northrop Grumman Corporation Gesture recognition simulation system and method
CN103955295A (en) * 2014-04-17 2014-07-30 北京航空航天大学 Real-time grabbing method of virtual hand based on data glove and physical engine
CN108958471A (en) * 2018-05-17 2018-12-07 中国航天员科研训练中心 The emulation mode and system of virtual hand operation object in Virtual Space
CN108958479A (en) * 2018-06-14 2018-12-07 南京师范大学 Real-time interactive method for universal three-dimensional virtual scene based on data glove
RU187548U1 (en) * 2017-10-27 2019-03-12 Федоров Александр Владимирович VIRTUAL REALITY GLOVE
CN110568929A (en) * 2019-09-06 2019-12-13 诺百爱(杭州)科技有限责任公司 Virtual scene interaction method and device and electronic equipment
CN112835449A (en) * 2021-02-03 2021-05-25 青岛航特教研科技有限公司 Virtual reality and somatosensory device interaction-based safety somatosensory education system

Similar Documents

Publication Publication Date Title
Xiang et al. Sapien: A simulated part-based interactive environment
CN108582068A (en) A method of to breaker put together machines people carry out virtual emulation
Popović et al. A strategy for grasping unknown objects based on co-planarity and colour information
CN109816773A (en) A kind of driving method, plug-in unit and the terminal device of the skeleton model of virtual portrait
CN111667560B (en) Interaction structure and interaction method based on VR virtual reality role
CN111191322B (en) Virtual maintainability simulation method based on depth perception gesture recognition
Gu et al. Maniskill2: A unified benchmark for generalizable manipulation skills
JPH11191061A (en) Device and method for generating computer program
CN109697002A (en) A kind of method, relevant device and the system of the object editing in virtual reality
CN102629388A (en) Mechanical equipment simulation system generating method
CN202159302U (en) Augment reality system with user interaction and input functions
CN109732593A (en) A kind of far-end control method of robot, device and terminal device
Nam et al. A software architecture for service robots manipulating objects in human environments
US20220402125A1 (en) System and method for determining a grasping hand model
CN102214365A (en) Skeletal animation theory-based universal virtual person simulation technology
CA2305095C (en) Swept volume model
Ikeuchi et al. Applying learning-from-observation to household service robots: three common-sense formulation
CN114035677A (en) Universal interface implementation method for interaction between both hands and virtual glove peripherals
CN111590560A (en) Method for remotely operating manipulator through camera
Shi et al. Grasping 3d objects with virtual hand in vr environment
Li et al. 3D hand reconstruction from a single image based on biomechanical constraints
CN112181139B (en) Cooperative control interaction method for virtual reality and mixed reality
Yu et al. Real-time multitask multihuman–robot interaction based on context awareness
Varga et al. Survey and investigation of hand motion processing technologies for compliance with shape conceptualization
Hwang et al. Primitive object grasping for finger motion synthesis

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination