Hand motion tracking method in a virtual driving environment
Technical Field
The invention relates to the field of virtual driving simulation, in particular to a method for tracking hand movement in a virtual driving environment.
Background
In traditional computer graphics, the field of view is changed with a mouse or keyboard, so the user's visual system and motion-perception system are decoupled. With the development of virtual reality technology, many virtual reality solutions have appeared that change the viewing angle of the image using head tracking, reconnecting the user's visual and motion-perception systems and making the experience more vivid. The user perceives the environment through binocular stereo vision and observes it by moving the head. Such schemes focus on solving the environment-simulation problem in virtual reality: by wearing the corresponding equipment, the user can view a computer-generated three-dimensional image in a wide-angle, real-time, stereoscopic way. Interaction in virtual reality, however, still relies on traditional handheld controllers, and the user must complete interaction with the virtual world through button presses.
In the prior art, because of limitations of capture equipment and technology, only the pose of the hand itself can be acquired; pose data for parts of the body other than the hand cannot be acquired, and not all pose features of the real hand can be accurately reproduced. When the driver turns the steering wheel, the hands may be occluded, so the device cannot capture the correct hand position and pose and gesture recognition fails. In addition, the pipeline of capturing hand motion data, returning it to the virtual world, applying it to the model, and driving the model animation introduces latency, so the virtual hand position cannot match the real hand position, reducing the realism of the simulation experience.
To achieve a more vivid virtual reality experience, interacting with the virtual world through controllers and buttons is not the best scheme; virtual reality technology in fact spans environment simulation, perception, natural skills, sensing equipment, and other aspects. The invention provides a method for tracking hand movement in a virtual driving environment that realizes interaction with the virtual world by capturing real-world character pose data.
Disclosure of Invention
The invention provides a method for tracking hand movement in a virtual driving environment, which lets the user see the operation of their hands on objects such as the steering wheel in the virtual world, enhancing the realism and immersion of the simulation experience.
In order to achieve the purpose, the invention provides the following technical scheme: a method for tracking hand movement in a virtual driving environment, comprising the steps of:
step one: acquiring the two-hand pose data captured by the device through the Leap Motion API;
step two: hand posture data acquired in real time through an API of the Leap Motion is assigned to a virtual character hand model skeleton in real time through an unknown Engine blueprint, so that natural gesture operation tracking operation is realized; which comprises the following steps: IK calculation is carried out according to a TwoBoneIK node in an unknown Engine animation blueprint, and natural control of virtual character arms is realized; and (II) animation mixing is carried out according to Layeredblockverbenone nodes in an unknown Engine animation blueprint, when the hands of characters in the real world approach the steering wheel, the characters in the virtual world can be automatically switched to the state of playing the animation of the steering wheel operated by the driver, and the animation of the steering wheel tightly held by the driver at different positions can be played according to the positions of the real hands.
Leap Motion is a motion-sensing controller for PCs and Macs released by the motion-controller manufacturer Leap Motion, Inc., used to capture the motion of human hands.
The model animation uses three-dimensional modeling software to build a virtual character model whose external features, including the four limbs, resemble those of a real character.
The captured data is transmitted back to a virtual character in the Unreal Engine 4; the character drives the animation with an animation blueprint, using only the Transform (Modify) Bone node in the animation blueprint.
When the Leap Motion is not started or activated, the virtual character animation automatically switches, through the Layered Blend Per Bone node in the Unreal Engine animation blueprint, to playing the animation of the driver operating the steering wheel according to the rotation state of the real-world steering wheel, and the virtual character's two-hand model rotates correspondingly with the steering wheel's rotation angle.
When the Leap Motion is started and activated, if the captured data is lost due to occlusion or other reasons, the Layered Blend Per Bone node in the Unreal Engine animation blueprint automatically switches to playing the animation of the driver operating the steering wheel, avoiding abnormal driver behavior in the virtual world.
The invention has the beneficial effects that:
1. According to the invention, interaction with the virtual world is realized by capturing real-world character pose data, and the user can see the operation of their hands on objects such as the steering wheel in the virtual world, enhancing the realism and immersion of the simulation experience.
2. The hand pose data captured by the Leap Motion device is used for accurate IK calculation, realizing motion simulation of the whole arm.
3. In the virtual driving environment, when the real character's hands touch the steering wheel, the virtual-world character automatically switches to playing the animation, and the playback position is calculated from the Leap Motion capture data, so that while both hands grip the steering wheel, the positions of the virtual hands on the wheel change with the real hand positions. This realizes a faithful simulation of gripping and operating the steering wheel in the virtual world.
4. Using preset animations, playback automatically switches to the preset animation when Leap Motion tracking is lost or capture errors occur, avoiding unrealistic virtual-world simulation caused by lost hand data or latency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention.
In the drawings:
FIG. 1 is a flow chart of capturing data according to the present invention;
FIG. 2 is a flow chart of hand motion tracking according to the present invention.
Detailed Description
The preferred embodiments of the present invention will be described in conjunction with the accompanying drawings, and it will be understood that they are described herein for the purpose of illustration and explanation and not limitation.
A method for tracking hand movement in a virtual driving environment comprises the following steps:
step one: referring to FIG. 1, the two-hand pose data captured by the device, including position and rotation data for each finger, is acquired through the Leap Motion API. The acquired data is transmitted back to the bones of the virtual character's hand model in the Unreal Engine; the virtual character's actions are driven with an animation blueprint, in which the Transform (Modify) Bone node controls the model's skeletal animation in real time, and the two-hand pose data captured by the Leap Motion, including the position and rotation data of each finger, is assigned to the corresponding bones of the virtual character's hand model.
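The per-frame assignment described in step one can be sketched as follows. This is an illustrative sketch only: the data structures, field names, and bone names are assumptions, not the actual Leap Motion or Unreal Engine API.

```python
from dataclasses import dataclass, field

@dataclass
class FingerPose:
    position: tuple  # (x, y, z), e.g. in centimetres
    rotation: tuple  # (pitch, yaw, roll) in degrees

@dataclass
class HandFrame:
    """One captured frame for a single hand (illustrative, not the Leap API)."""
    palm: FingerPose
    fingers: dict = field(default_factory=dict)  # finger name -> FingerPose

def apply_to_skeleton(frame, bone_transforms, bone_prefix="hand_r"):
    """Copy the captured poses onto the character's bone map each tick --
    the role the Transform (Modify) Bone node plays in the animation blueprint."""
    bone_transforms[bone_prefix] = (frame.palm.position, frame.palm.rotation)
    for name, pose in frame.fingers.items():
        bone_transforms[f"{bone_prefix}_{name}"] = (pose.position, pose.rotation)
    return bone_transforms
```

In the actual system this assignment runs once per engine tick, so the virtual skeleton follows the captured hand in real time.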
Step two: referring to FIG. 2, the hand pose data acquired in real time through the Leap Motion API is assigned to the skeleton of the virtual character's hand model in real time through an Unreal Engine animation blueprint, thereby realizing natural tracking of gesture operations;
which comprises the following steps:
(a) Controlling arm movement: the pose of the hand root is acquired in real time through the Leap Motion API, and IK calculation is performed with the Two Bone IK node in the Unreal Engine animation blueprint, so that the hand-root pose inversely drives the movement of the whole arm, realizing natural control of the virtual character's arm;
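The core of such a two-bone IK calculation is the law of cosines. The following is a minimal planar sketch of the idea, not the Unreal Engine implementation (which additionally resolves the elbow's pole vector in 3D):

```python
import math

def two_bone_ik(upper_len, lower_len, target_dist):
    """Analytic two-bone IK: given upper-arm and forearm lengths and the
    shoulder-to-hand distance, return (shoulder_angle, elbow_angle) in
    radians. shoulder_angle is measured from the shoulder-to-target line;
    elbow_angle is the interior elbow angle (pi = arm fully straight)."""
    # Clamp the target to the reachable range so the solver never fails.
    d = max(abs(upper_len - lower_len), min(upper_len + lower_len, target_dist))
    # Interior elbow angle from the law of cosines.
    cos_elbow = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder offset from the shoulder-to-target line.
    cos_shoulder = (upper_len**2 + d**2 - lower_len**2) / (2 * upper_len * d)
    shoulder = math.acos(max(-1.0, min(1.0, cos_shoulder)))
    return shoulder, elbow
```

With the hand root (the IK target) supplied by the tracker each frame, these two angles are enough to pose the whole arm, which is why only the hand pose needs to be captured.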
(b) Interaction with the steering wheel: animation blending is performed with the Layered Blend Per Bone node in the Unreal Engine animation blueprint, so that when the real-world character's hands approach the steering wheel, the virtual-world character automatically switches to playing the animation of the driver operating the steering wheel, and the animation of the driver gripping the steering wheel at different positions is played according to the real hand positions.
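The distance-driven switch can be expressed as a blend weight fed to the blend node: 0 keeps the tracked hand animation, 1 plays the preset grip animation. A sketch follows; the thresholds are illustrative assumptions, not values from the invention:

```python
import math

def grip_blend_weight(hand_pos, wheel_center, far=30.0, near=10.0):
    """Blend weight in [0, 1]: 0 when the real hand is far from the wheel
    (tracked animation), 1 at the rim (preset grip animation), ramping
    linearly in between. Distances are illustrative, e.g. centimetres."""
    dist = math.dist(hand_pos, wheel_center)
    if dist >= far:
        return 0.0
    if dist <= near:
        return 1.0
    return (far - dist) / (far - near)
```

A smooth ramp rather than a hard on/off threshold avoids visible popping when the hand hovers near the wheel.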
(c) Handling loss, abnormality, or failure of recognition:
When the capture device Leap Motion is not started or activated, the virtual character animation automatically switches, through the Layered Blend Per Bone node in the Unreal Engine animation blueprint, to playing the animation of the driver operating the steering wheel according to the rotation state of the real-world steering wheel, and the virtual character's two-hand model rotates correspondingly with the steering wheel's rotation angle.
When the Leap Motion is started and activated, if the captured data is lost due to occlusion or other reasons, the Layered Blend Per Bone node in the Unreal Engine animation blueprint automatically switches to playing the animation of the driver operating the steering wheel. This avoids abnormal driver behavior in the virtual world.
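The fallback logic of step (c) reduces to a per-tick selection of the animation source. A sketch, under the assumption that the system exposes a sensor-active flag and a per-frame validity flag (names are illustrative):

```python
from enum import Enum, auto

class HandAnimSource(Enum):
    TRACKED = auto()         # drive bones directly from the captured data
    GRIP_ANIMATION = auto()  # preset driver-grips-wheel animation

def select_anim_source(leap_active, frame_valid):
    """Choose the animation source each tick: track the real hands only
    when the sensor is on AND the current frame survived occlusion;
    otherwise fall back to the preset grip animation."""
    if leap_active and frame_valid:
        return HandAnimSource.TRACKED
    return HandAnimSource.GRIP_ANIMATION
```

Because both the sensor-off and data-lost cases fall through to the same preset animation, the virtual driver never appears with frozen or misplaced hands.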
Compared with the prior art, through preset animations, playback automatically switches to the preset animation when Leap Motion tracking is lost or capture errors occur, avoiding unrealistic virtual-world simulation caused by lost hand data or latency; and by capturing real-world character pose data, interaction with the virtual world is realized, so the user can see the operation of their hands on objects such as the steering wheel in the virtual world, enhancing the realism and immersion of the simulation experience.
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.