CN112562071A - Method, device and equipment for calculating motion difference and storage medium - Google Patents


Info

Publication number
CN112562071A
CN112562071A (application CN202011567709.1A)
Authority
CN
China
Prior art keywords: motion data, joint point, calculating, joint, vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011567709.1A
Other languages
Chinese (zh)
Inventor
刘思阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing IQIYI Science and Technology Co Ltd
Original Assignee
Beijing IQIYI Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing IQIYI Science and Technology Co Ltd
Priority to CN202011567709.1A
Publication of CN112562071A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts


Abstract

The embodiments of the invention provide a method, an apparatus, a device and a storage medium for calculating a motion difference degree. The method comprises: acquiring first motion data, second motion data and a bone vector of a three-dimensional human body model; calculating, for each joint point, the offset between the rotation matrices of the first motion data and the second motion data; calculating a weight for each joint point from the bone vector, wherein the weight is inversely proportional to the distance between the joint point and the root joint point; and calculating a weighted average of the offsets of the joint points according to the weights to obtain the difference degree between the first motion data and the second motion data. In this way, the difference degree is computed directly from the acquired bone vector, first motion data and second motion data, with no need for manual observation, which improves efficiency and reduces cost.

Description

Method, device and equipment for calculating motion difference and storage medium
Technical Field
The present invention relates to the field of three-dimensional model processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for calculating an action disparity.
Background
Motion simulation based on three-dimensional human body models is widely used in scenes such as animation production and movie special effects. Typically, actors wearing motion capture equipment demonstrate various motions, the motion capture equipment collects the motion data, and a processing device then applies the motion data to the three-dimensional human body model so that its motions match those of the actors. However, because the body proportions of the actor differ from those of the three-dimensional human body model, the model may exhibit motions that violate physical constraints; for example, as shown in fig. 1, the arm of the model passes through another part of the body, i.e., a "through-mold" (clipping) artifact is produced.
In this case, to make the motion of the three-dimensional human body model physically plausible, the through-mold motion needs to be redirected. It will be appreciated that, while remaining physically plausible, the redirected motion must not differ too much from the original motion; otherwise, the accuracy of the motion simulation of the three-dimensional human body model is reduced.
At present, manual observation is usually used to judge whether the difference degree between the redirected motion and the original motion meets the requirement, but the cost of manual observation keeps rising as the real-time requirements and the data volume of the motion simulation process increase. A method for automatically calculating the motion difference is therefore needed.
Disclosure of Invention
The embodiments of the present invention aim to provide a method, an apparatus, a device and a storage medium for calculating a motion difference degree, so as to automatically calculate the difference between a redirected motion and the original motion. The specific technical solution is as follows:
in a first aspect of the present invention, there is provided a motion disparity calculation method, including:
acquiring first motion data, second motion data and a bone vector of a three-dimensional human body model, wherein a plurality of joint points of the three-dimensional human body model comprise a root joint point, the first motion data and the second motion data are composed of a rotation matrix of each joint point relative to the root joint point, and the bone vector comprises the distance between each joint point and the root joint point;
calculating the offset of the first motion data and the second motion data at the rotation matrix of each joint point;
calculating a weight for each joint point from the bone vector, wherein the weight is inversely proportional to the distance of each joint point from the root joint point;
and calculating a weighted average value of the offset of each joint point according to the weight to obtain the difference between the first motion data and the second motion data.
Optionally, the acquiring the first motion data, the second motion data and the bone vector of the three-dimensional human body model includes:
acquiring first motion data and a skeleton vector of a three-dimensional human body model acquired by motion capture equipment;
and inputting the first action data and the bone vector into a preset redirection model, and redirecting the first action data to obtain second action data.
Optionally, the calculating, for each joint point, an offset of the rotation matrix of the joint point of the first motion data and the second motion data includes:
calculating the absolute value of the difference between the first motion data and the second motion data in the rotation matrix of the joint point to obtain a difference matrix;
and calculating the sum of the elements in the difference matrix to obtain the offset of the first motion data and the second motion data at the joint point.
Optionally, the following formula is adopted, and for each joint point, the offset of the rotation matrix of the joint point of the first motion data and the second motion data is calculated:
ΔR_i = SUM(|R_i − R′_i|)
where R_i is the rotation matrix of the i-th joint point in the first motion data relative to the root joint point, R′_i is the rotation matrix of the i-th joint point in the second motion data relative to the root joint point, SUM denotes summation over all matrix elements, and ΔR_i denotes the offset between the first motion data and the second motion data at the i-th joint point.
Optionally, said calculating a weight for each joint point from said bone vector comprises:
calculating the sum of a preset empirical value and the maximum distance in the bone vector;
and, for each joint point, calculating the difference between that sum and the distance between the joint point and the root joint point, and taking the ratio of this difference to the sum of these differences over all joint points to obtain the weight of the joint point.
Optionally, the weight of each joint point is calculated from the bone vector using the following formula:
w_i = (d_set + d_max − d_i) / Σ_{j=1}^{K} (d_set + d_max − d_j)
where w_i represents the weight of the i-th joint point, d_set is the preset empirical value, d_max is the maximum distance in the bone vector, d_i is the distance between the i-th joint point and the root joint point in the bone vector, K is the number of joint points in the bone vector, and d_j is the distance between the j-th joint point and the root joint point in the bone vector.
Optionally, a weighted average of the offset of each joint point is calculated according to the weight by using the following formula, so as to obtain a difference between the first motion data and the second motion data:
S = Σ_{i=1}^{K} w_i · ΔR_i
where w_i represents the weight of the i-th joint point, ΔR_i represents the offset between the first motion data and the second motion data at the i-th joint point, and K represents the number of joint points in the bone vector.
In a second aspect of the present invention, there is also provided an action disparity calculating apparatus including:
the system comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring first motion data, second motion data and a bone vector of a three-dimensional human body model, a plurality of joint points of the three-dimensional human body model comprise a root joint point, the first motion data and the second motion data are composed of a rotation matrix of each joint point relative to the root joint point, and the bone vector comprises the distance between each joint point and the root joint point;
the first calculation module is used for calculating the offset of the first motion data and the second motion data in a rotation matrix of each joint point;
the second calculation module is used for calculating the weight corresponding to each joint point according to the skeleton vector, wherein the weight is in inverse proportion to the distance between the joint point and the root joint point;
and the third calculation module is used for calculating a weighted average value of the offset of each joint point according to the weight to obtain the difference degree between the first motion data and the second motion data.
Optionally, the obtaining module is specifically configured to obtain first motion data and a bone vector of the three-dimensional human body model acquired by the motion capture device; and inputting the first action data and the bone vector into a preset redirection model, and redirecting the first action data to obtain second action data.
Optionally, the first calculating module is specifically configured to calculate an absolute value of a difference between the first motion data and the second motion data at the rotation matrix of the joint point, so as to obtain a difference matrix; and calculating the sum of each element in the difference matrix to obtain the offset of the first motion data and the second motion data in the rotation matrix of the joint point.
Optionally, the first calculating module is specifically configured to calculate, for each joint point, an offset of the rotation matrix of the joint point between the first motion data and the second motion data by using the following formula:
ΔR_i = SUM(|R_i − R′_i|)
where R_i is the rotation matrix of the i-th joint point in the first motion data relative to the root joint point, R′_i is the rotation matrix of the i-th joint point in the second motion data relative to the root joint point, SUM denotes summation over all matrix elements, and ΔR_i denotes the offset between the first motion data and the second motion data at the i-th joint point.
Optionally, the second calculating module is specifically configured to calculate the sum of a preset empirical value and the maximum distance in the bone vector; and, for each joint point, calculate the difference between that sum and the distance between the joint point and the root joint point, and take the ratio of this difference to the sum of these differences over all joint points to obtain the weight of the joint point.
Optionally, the second calculating module is specifically configured to calculate a weight of each joint point according to the bone vector by using the following formula:
w_i = (d_set + d_max − d_i) / Σ_{j=1}^{K} (d_set + d_max − d_j)
where w_i represents the weight of the i-th joint point, d_set is the preset empirical value, d_max is the maximum distance in the bone vector, d_i is the distance between the i-th joint point and the root joint point in the bone vector, K is the number of joint points in the bone vector, and d_j is the distance between the j-th joint point and the root joint point in the bone vector.
Optionally, the third calculating module is specifically configured to calculate a weighted average of the offsets of each joint point according to the weights by using the following formula, so as to obtain a difference between the first motion data and the second motion data:
S = Σ_{i=1}^{K} w_i · ΔR_i
where w_i represents the weight of the i-th joint point, ΔR_i represents the offset between the first motion data and the second motion data at the i-th joint point, and K represents the number of joint points in the bone vector.
In another aspect of the present invention, there is also provided an electronic device, including a processor, a communication interface, a memory and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
and a processor for implementing any one of the above motion difference calculation methods when executing the program stored in the memory.
In yet another aspect of the present invention, there is also provided a computer-readable storage medium having stored therein instructions, which when executed on a computer, cause the computer to execute any one of the above-described motion disparity calculation methods.
In yet another aspect of the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform any one of the above motion difference calculation methods.
The method, apparatus, device and storage medium for calculating the motion difference degree provided by the embodiments of the invention comprise the following steps: first, acquiring first motion data, second motion data and a bone vector of a three-dimensional human body model; then, calculating, for each joint point, the offset between the rotation matrices of the first motion data and the second motion data; calculating a weight for each joint point based on the bone vector, wherein the weight is inversely proportional to the distance of the joint point from the root joint point; and further, according to the weights, calculating a weighted average of the offsets of the joint points to obtain the difference degree between the first motion data and the second motion data. In this way, the difference degree between the first motion data and the second motion data is obtained by calculation from the acquired bone vector, first motion data and second motion data, without manual observation, so that efficiency can be improved and cost can be reduced.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below.
FIG. 1 is a schematic diagram of a three-dimensional mannequin showing a mold-piercing action;
FIG. 2 is a flow chart illustrating steps of a method for calculating motion variance according to the present application;
FIG. 3 is a schematic diagram of a three-dimensional human body model according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating the parent-child relationship between the joint points of the three-dimensional human body model shown in FIG. 3;
FIG. 5 is a schematic diagram of the joint points of a three-dimensional human body model;
FIG. 6 is a schematic diagram showing three actions of the three-dimensional mannequin shown in FIG. 5;
FIG. 7 is a block diagram of an operation difference calculation apparatus according to an embodiment of the present invention;
fig. 8 is a schematic diagram of an electronic device according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention.
In the related art, in the motion simulation process based on the three-dimensional human body model, the motion of the actor is mapped onto the three-dimensional human body model by relying on the motion capture equipment, but the actor and the three-dimensional human body model have different figures, so that the three-dimensional human body model can show some motions which do not conform to the physical principle. At present, the mode of manual correction can be adopted to redirect the through-die action, and then an operator can directly adopt the mode of manual observation to judge whether the action after redirection meets the requirements. However, the manual operation method is time-consuming and costly, and with the development of the three-dimensional human body model motion redirection algorithm, the redirected motion cannot be judged in time.
In order to solve the above problem, an embodiment of the present invention provides a method for calculating a motion difference, and the method for calculating a motion difference provided by an embodiment of the present invention is generally described below, and includes the following steps:
acquiring first motion data, second motion data and a bone vector of a three-dimensional human body model, wherein a plurality of joint points of the three-dimensional human body model comprise a root joint point, the first motion data and the second motion data are composed of a rotation matrix of each joint point relative to the root joint point, and the bone vector comprises the distance between each joint point and the root joint point;
calculating the offset of the first motion data and the second motion data in the rotation matrix of each joint point;
calculating a weight for each joint point based on the bone vector, wherein the weight is inversely proportional to a distance of each joint point from the root joint point;
and calculating the weighted average of the offset of each joint point according to the weight to obtain the difference between the first motion data and the second motion data.
As can be seen from the above, the motion difference calculation method provided by the embodiment of the present invention can obtain the difference between the first motion data and the second motion data by calculating the obtained bone vector, the first motion data, and the second motion data, and does not need to perform manual observation, so that the efficiency can be improved, and the cost can be reduced.
The motion difference calculation method provided in the embodiment of the present invention is described in detail below by way of a specific embodiment.
Referring to fig. 2, a flowchart illustrating the steps of a motion difference calculation method according to the present application is shown, which may specifically include the following steps:
s201: and acquiring first motion data, second motion data and a bone vector of the three-dimensional human body model.
The plurality of joint points of the three-dimensional human body model include a root joint point; the first motion data and the second motion data are composed of a rotation matrix of each joint point relative to the root joint point, and the bone vector comprises the distance between each joint point and the root joint point. The three-dimensional human body model may be designed manually by a designer or generated directly from a human skeleton.
For example, fig. 3 is a schematic diagram of a three-dimensional human body model provided by an embodiment of the present invention, in which the left side is a skeleton diagram of the model and the right side is a virtual idol generated from it. The three-dimensional human body model can be understood as a shell; to move the shell, the rotation angle of each joint must be driven. The joint points of the model may include a person's elbow, shoulder and knee joints, among others, and the model is driven by controlling these joint points.
The first motion data and the second motion data can each be represented by a matrix M of dimension (J−1)×3, where J is the number of joint points in the three-dimensional human body model. The motion matrix M does not include rotation information for the preset root joint point, and each 3-dimensional row vector represents the axis-angle information of the corresponding joint point relative to its parent node. The embodiment of the invention processes a single independent pose rather than a motion sequence, so the global coordinates of the root joint point need not be acquired, i.e., they need not be modified.
Fig. 4 is a schematic diagram of the parent-child relationships between the joint points of the three-dimensional human body model shown in fig. 3. The model includes 17 joint points, so the dimension of the rotation matrix R is 16×3. The connection between joint points is a bone rod of fixed length, and the bone vector S has dimension 16×1. The joint points form a hierarchical parent-child relationship, in which the preset root joint point is the top-level parent node, the joint points connected to it are its child nodes, and so on.
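For concreteness, the data layout just described can be sketched with NumPy arrays; the names below (`motion`, `bone_vector`) are illustrative, not taken from the patent:

```python
import numpy as np

# 17-joint skeleton as in fig. 4: joint 0 is the root joint point,
# which carries no rotation row of its own.
NUM_JOINTS = 17
K = NUM_JOINTS - 1  # number of non-root joint points

# Motion data M: one 3-D axis-angle row per non-root joint -> dimension 16x3.
motion = np.zeros((K, 3))

# Bone vector S: distance of each non-root joint from the root -> dimension 16x1.
bone_vector = np.zeros((K, 1))
```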
In this step, the first motion data and the second motion data are motion data for the three-dimensional human body model, where the first motion data and the second motion data may be respectively collected by a motion capture device, or the second motion data may be obtained by performing redirection processing on the first motion data. The redirection processing is performed on the first action data, which may be manual redirection, or the redirection processing may be performed on the first action data through a preset redirection model.
For example, first motion data and a bone vector of a three-dimensional human body model collected by a motion capture device may be obtained, and then the first motion data and the bone vector are input into a preset redirection model to redirect the first motion data to obtain second motion data. The preset redirection model may be a model of any network, which is not limited in the embodiment of the present invention. In this way, by calculating the difference between the first motion data and the second motion data obtained by different motion redirection algorithms, the optimal solution for performing motion redirection on the first motion data can be determined.
S202: and calculating the offset of the first motion data and the second motion data at the rotation matrix of each joint point.
In the embodiment of the present invention, the following steps may be adopted to calculate the offset of the rotation matrix of the first motion data and the second motion data at each joint point:
firstly, calculating the absolute value of the difference between the first motion data and the second motion data in the rotation matrix of the joint point to obtain a difference matrix, and further calculating the sum of each element in the difference matrix to obtain the offset of the first motion data and the second motion data in the joint point.
For example, the following formula may be adopted, and for each joint point, the offset of the rotation matrix of the joint point of the first motion data and the second motion data is calculated:
ΔR_i = SUM(|R_i − R′_i|)
where R_i is the rotation matrix of the i-th joint point relative to the root joint point in the first motion data, R′_i is the rotation matrix of the i-th joint point relative to the root joint point in the second motion data, SUM represents summation over all matrix elements, and ΔR_i represents the offset between the first motion data and the second motion data at the i-th joint point.
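A minimal NumPy sketch of this per-joint offset (the function name and array layout are illustrative; each motion is a (K, 3) axis-angle array as described for the matrix M above):

```python
import numpy as np

def joint_offsets(first_motion: np.ndarray, second_motion: np.ndarray) -> np.ndarray:
    """Compute the offset DeltaR_i = SUM(|R_i - R'_i|) for every joint point.

    Both arguments are (K, 3) arrays with one axis-angle row per non-root
    joint point; the result is a length-K vector of offsets.
    """
    # Difference matrix |R_i - R'_i|, then the sum of its elements per joint.
    return np.abs(first_motion - second_motion).sum(axis=1)
```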
In one implementation, each element in the rotation data of a joint point is Euler angle data. In this case, the Euler angles may first be converted into quaternions, and the absolute value of the difference between the first motion data and the second motion data at the joint point may then be calculated, which makes the calculation simpler.
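The Euler-to-quaternion conversion mentioned above can be done with a standard textbook formula; the intrinsic x-y-z convention and the (w, x, y, z) ordering below are assumptions, since the patent does not fix either:

```python
import math

def euler_to_quaternion(roll: float, pitch: float, yaw: float) -> tuple:
    """Convert Euler angles (radians, x-y-z convention) to a unit quaternion.

    Returns the quaternion as (w, x, y, z).
    """
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (w, x, y, z)
```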
S203: a weight is calculated for each joint point based on the bone vector, wherein the weight is inversely proportional to the distance of each joint point from the root joint point.
Compared with other joint points, rotating a terminal joint point of the three-dimensional human body model, such as a hand, foot or head joint point, by a given angle changes the overall pose less. For example, after rotating the wrist joint point and the waist joint point by the same angle, the model with the rotated wrist is more similar to the model before rotation than the model with the rotated waist. Therefore, when calculating the difference between the first motion data and the second motion data, joint points farther from the root joint point have less influence on the motion similarity of the whole model and can be tolerated to a higher degree. In other words, joint points farther from the root joint point are assigned smaller weights than closer ones.
In an embodiment of the present invention, a method for calculating a weight of each joint point according to a bone vector may include the following steps:
first, calculating the sum of the preset empirical value and the maximum distance in the bone vector; then, for each joint point, calculating the difference between that sum and the distance between the joint point and the root joint point; and finally, taking the ratio of this difference to the sum of these differences over all joint points to obtain the weight of the joint point.
For example, the following formula can be used to calculate the weight of each joint point according to the bone vector:
w_i = (d_set + d_max − d_i) / Σ_{j=1}^{K} (d_set + d_max − d_j)
where w_i is the weight of the i-th joint point, d_set is the preset empirical value, d_max is the maximum distance in the bone vector, d_i is the distance between the i-th joint point and the root joint point in the bone vector, K is the number of joint points in the bone vector, and d_j is the distance between the j-th joint point and the root joint point in the bone vector.
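A sketch of this weighting in NumPy, under one plausible reading of the formula: the denominator normalises over the same per-joint differences so that the weights sum to 1. The function name and the default value of d_set are placeholders, not taken from the patent:

```python
import numpy as np

def joint_weights(bone_vector: np.ndarray, d_set: float = 0.1) -> np.ndarray:
    """Weight each joint point inversely to its distance from the root.

    w_i = (d_set + d_max - d_i) / sum_j (d_set + d_max - d_j), where d_i are
    the root distances stored in the bone vector and d_set is the preset
    empirical value.
    """
    d = np.asarray(bone_vector, dtype=float).ravel()
    raw = d_set + d.max() - d  # larger for joint points nearer the root
    return raw / raw.sum()     # normalise so the weights sum to 1
```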
S204: and calculating the weighted average of the offset of each joint point according to the weight to obtain the difference between the first motion data and the second motion data.
For example, the following formula can be used to calculate a weighted average of the offset of each joint point according to the weight, and obtain the difference between the first motion data and the second motion data:
S = Σ_{i=1}^{K} w_i · ΔR_i
where w_i represents the weight of the i-th joint point, ΔR_i is the offset between the first motion data and the second motion data at the i-th joint point, and K represents the number of joint points in the bone vector.
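Putting S202 through S204 together, the whole difference computation can be sketched end to end (again with illustrative names and an arbitrary placeholder for the empirical value d_set):

```python
import numpy as np

def motion_difference(first_motion, second_motion, bone_vector, d_set=0.1):
    """Difference degree S = sum_i w_i * DeltaR_i between two motions.

    first_motion / second_motion: (K, 3) axis-angle arrays; bone_vector:
    length-K distances from the root; d_set: preset empirical value.
    """
    # S202: per-joint offsets.
    offsets = np.abs(np.asarray(first_motion, dtype=float)
                     - np.asarray(second_motion, dtype=float)).sum(axis=1)
    # S203: weights, inversely related to the distance from the root.
    d = np.asarray(bone_vector, dtype=float).ravel()
    raw = d_set + d.max() - d
    weights = raw / raw.sum()
    # S204: weighted average of the offsets.
    return float((weights * offsets).sum())
```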
For example, as shown in fig. 5, a schematic diagram of joint points of a three-dimensional human body model is shown, wherein the joint points included in the three-dimensional human body model and their names are shown in table 1, table 2, and table 3. As shown in fig. 6, the three motions of the three-dimensional human model shown in fig. 5 are C1, C2, and C3 in this order from left to right. By calculating the degree of difference in motion data between C1 and C2, and between C1 and C3, a motion closer to C1 can be determined.
TABLE 1. Joint point data of the C1 model (provided as an image in the source; values not recoverable)

TABLE 2. Joint point data of the C2 model (provided as an image in the source; values not recoverable)

TABLE 3. Joint point data of the C3 model (provided as an image in the source; values not recoverable)
Using the difference calculation method provided by the embodiment of the present invention, the differences between C1 and C2 and between C1 and C3 are calculated as S_{C1-C2} = 2.9856785743834755 and S_{C1-C3} = 3.594991123434131. That is, the difference degree between C1 and C2 is smaller than that between C1 and C3, and therefore the motion similarity between C1 and C2 is higher.
As can be seen from the above, the motion difference calculation method provided by the embodiment of the present invention obtains the difference degree between the first motion data and the second motion data by calculation from the acquired bone vector, first motion data and second motion data, without manual observation, so that efficiency can be improved and cost can be reduced.
Referring to fig. 7, a block diagram of a motion disparity calculation apparatus according to the present application is shown, and the apparatus may specifically include the following modules:
an obtaining module 701, configured to obtain first motion data, second motion data, and a bone vector of a three-dimensional human body model, where a plurality of joint points of the three-dimensional human body model include a root joint point, the first motion data and the second motion data are formed by a rotation matrix of each joint point relative to the root joint point, and the bone vector includes a distance between each joint point and the root joint point;
a first calculating module 702, configured to calculate, for each joint point, an offset of the rotation matrix of the joint point between the first motion data and the second motion data;
a second calculating module 703, configured to calculate, according to the bone vector, a weight corresponding to each joint point, where the weight is inversely proportional to a distance between the joint point and a root joint point;
and a third calculating module 704, configured to calculate a weighted average of the offset of each joint point according to the weight, so as to obtain a difference between the first motion data and the second motion data.
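Putting the four modules together, the computation they describe can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function names, the use of NumPy 3x3 rotation matrices, the default empirical value `d_set`, and all numeric values are assumptions for the sake of the example.

```python
import numpy as np

def rotation_offsets(motion_a, motion_b):
    """Per-joint offset: sum of the element-wise absolute difference of the
    two rotation matrices (one 3x3 matrix per joint in each motion)."""
    return [float(np.abs(ra - rb).sum()) for ra, rb in zip(motion_a, motion_b)]

def joint_weights(bone_dists, d_set=0.5):
    """Weight per joint, decreasing with distance from the root joint;
    d_set is an assumed preset empirical value."""
    d_max = max(bone_dists)
    raw = [d_set + d_max - d for d in bone_dists]
    total = sum(raw)
    return [r / total for r in raw]

def motion_disparity(motion_a, motion_b, bone_dists, d_set=0.5):
    """Weighted average of the per-joint offsets -> scalar degree of difference."""
    offsets = rotation_offsets(motion_a, motion_b)
    weights = joint_weights(bone_dists, d_set)
    return sum(w * o for w, o in zip(weights, offsets))
```

With two identical motions the disparity is 0, and because the weights are normalized to sum to 1, the result is a true weighted average of the per-joint offsets.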
In one implementation, the obtaining module 701 is specifically configured to obtain the first motion data and the bone vector of the three-dimensional human body model collected by a motion capture device; and to input the first motion data and the bone vector into a preset redirection model and redirect the first motion data to obtain the second motion data.
In one implementation, the first calculating module 702 is specifically configured to calculate an absolute value of a difference between the first motion data and the second motion data at the rotation matrix of the joint point, so as to obtain a difference matrix; and calculating the sum of each element in the difference matrix to obtain the offset of the first motion data and the second motion data at the joint point.
In one implementation, the first calculating module 702 is specifically configured to calculate, for each joint point, an offset of the first motion data and the second motion data at the joint point by using the following formula:
ΔR_i = SUM(|R_i - R'_i|)
where R_i is the rotation matrix of the ith joint point relative to the root joint point in the first motion data, R'_i is the rotation matrix of the ith joint point relative to the root joint point in the second motion data, SUM represents a matrix summation operation over all elements, and ΔR_i is the offset between the first motion data and the second motion data at the ith joint point.
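As a numeric illustration of this offset formula, consider a single joint whose rotation differs by a quarter turn about the z-axis between the two motions; the matrices below are made-up example values, not data from the patent.

```python
import numpy as np

# R_i: joint i's rotation in the first motion (identity, i.e. no rotation).
R_i = np.eye(3)
# R'_i: the same joint's rotation in the second motion (90 degrees about z).
R_prime_i = np.array([[0., -1., 0.],
                      [1.,  0., 0.],
                      [0.,  0., 1.]])

diff_matrix = np.abs(R_i - R_prime_i)   # element-wise |R_i - R'_i|
delta_R_i = float(diff_matrix.sum())    # sum of all elements: the offset ΔR_i
```

Identical rotations give an offset of 0; the larger the rotational disagreement, the larger the sum.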
In one implementation, the second calculating module 703 is specifically configured to calculate the sum of a preset empirical value and the maximum distance in the bone vector; and to calculate the difference between that sum and the distance between the joint point and the root joint point, and calculate the ratio of the difference to the sum of the distances in the bone vector, to obtain the weight of the joint point.
In one implementation, the second calculating module 703 is specifically configured to calculate the weight of each joint point according to the bone vector by using the following formula:
w_i = (d_set + d_max - d_i) / Σ_{j=1..K} (d_set + d_max - d_j)
where w_i is the weight of the ith joint point, d_set is the preset empirical value, d_max is the maximum distance in the bone vector, d_i is the distance between the ith joint point and the root joint point in the bone vector, K is the number of joint points in the bone vector, and d_j is the distance between the jth joint point and the root joint point in the bone vector.
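A worked example of this weight computation; the empirical value d_set = 0.5 and the joint-to-root distances are assumed illustrative numbers, not values from the patent.

```python
# Bone vector: distance d_i from each of K = 4 joint points to the root.
d_set = 0.5                    # assumed preset empirical value
dists = [0.0, 1.0, 2.0, 3.0]   # made-up joint-to-root distances
d_max = max(dists)

# Numerator per joint: d_set + d_max - d_i (larger when closer to the root).
numerators = [d_set + d_max - d for d in dists]
# Normalize over j = 1..K so the weights form a proper weighted average.
weights = [n / sum(numerators) for n in numerators]
```

The weights shrink as a joint moves farther from the root and sum to 1, matching the inverse relationship between weight and root distance described above.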
In one implementation, the third calculating module 704 is specifically configured to calculate a weighted average of the offset of each joint according to the weight by using the following formula, so as to obtain a difference between the first motion data and the second motion data:
S = Σ_{i=1..K} w_i · ΔR_i
where w_i represents the weight of the ith joint point, ΔR_i represents the offset of the first motion data and the second motion data at the ith joint point, and K represents the number of joint points in the bone vector.
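A small numeric sketch of this weighted average; the weights and offsets below are illustrative values (the weights are already normalized to sum to 1, so the weighted average reduces to a dot product).

```python
# w_i: per-joint weights, summing to 1 (illustrative values).
weights = [0.5, 0.3, 0.2]
# ΔR_i: per-joint offsets between the two motions (illustrative values).
offsets = [4.0, 2.0, 1.0]

# Degree of difference S between the first and second motion data.
S = sum(w * dr for w, dr in zip(weights, offsets))
```

Here S = 0.5·4.0 + 0.3·2.0 + 0.2·1.0 = 2.8, so offsets at joints near the root (higher weight) dominate the score.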
As can be seen from the above, the motion difference calculation device provided in the embodiment of the present invention obtains the degree of difference between the first motion data and the second motion data by computation from the acquired bone vector, first motion data and second motion data, with no need for manual observation, which improves efficiency and reduces cost.
An embodiment of the present invention further provides an electronic device, as shown in fig. 8, which includes a processor 801, a communication interface 802, a memory 803, and a communication bus 804, where the processor 801, the communication interface 802, and the memory 803 communicate with one another through the communication bus 804,
a memory 803 for storing a computer program;
the processor 801 is configured to implement the following steps when executing the program stored in the memory 803:
acquiring first motion data, second motion data and a bone vector of a three-dimensional human body model, wherein a plurality of joint points of the three-dimensional human body model comprise a root joint point, the first motion data and the second motion data are composed of a rotation matrix of each joint point relative to the root joint point, and the bone vector comprises the distance between each joint point and the root joint point;
calculating the offset of the first motion data and the second motion data at each joint point;
calculating a weight for each joint point based on the bone vector, wherein the weight is inversely proportional to a distance of each joint point from the root joint point;
and calculating the weighted average of the offset of each joint point according to the weight to obtain the difference between the first motion data and the second motion data.
The communication bus mentioned above for the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the terminal and other equipment.
The Memory may include a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In still another embodiment of the present invention, there is also provided a computer-readable storage medium having stored therein instructions, which when run on a computer, cause the computer to execute the method for calculating a difference degree of motion in any one of the above embodiments.
In yet another embodiment, a computer program product containing instructions is provided, which when run on a computer, causes the computer to execute the method for calculating a difference in an action according to any one of the above embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the invention are brought about in whole or in part when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A motion disparity calculation method, characterized by comprising:
acquiring first motion data, second motion data and a bone vector of a three-dimensional human body model, wherein a plurality of joint points of the three-dimensional human body model comprise a root joint point, the first motion data and the second motion data are composed of a rotation matrix of each joint point relative to the root joint point, and the bone vector comprises the distance between each joint point and the root joint point;
calculating the offset of the first motion data and the second motion data at the rotation matrix of each joint point;
calculating a weight for each joint point from the bone vector, wherein the weight is inversely proportional to the distance of each joint point from the root joint point;
and calculating a weighted average value of the offset of each joint point according to the weight to obtain the difference between the first motion data and the second motion data.
2. The method of claim 1, wherein the obtaining the first motion data, the second motion data, and the bone vectors of the three-dimensional human model comprises:
acquiring the first motion data and the bone vector of the three-dimensional human body model collected by a motion capture device;
and inputting the first motion data and the bone vector into a preset redirection model, and redirecting the first motion data to obtain the second motion data.
3. The method of claim 1, wherein calculating, for each joint, an offset of the first motion data from the second motion data at a rotation matrix of the joint comprises:
calculating the absolute value of the difference between the first motion data and the second motion data in the rotation matrix of the joint point to obtain a difference matrix;
and calculating the sum of each element in the difference matrix to obtain the offset of the first motion data and the second motion data at the joint.
4. The method of claim 3, wherein the offset of the rotation matrix of the first motion data from the second motion data at each joint is calculated for that joint using the following formula:
ΔR_i = SUM(|R_i - R'_i|)
wherein R_i is the rotation matrix of the ith joint point in the first motion data relative to the root joint point, R'_i is the rotation matrix of the ith joint point in the second motion data relative to the root joint point, SUM represents a matrix summation operation, and ΔR_i represents the offset between the first motion data and the second motion data at the ith joint point.
5. The method of claim 1, wherein said calculating a weight for each joint point from said bone vectors comprises:
calculating the sum of a preset empirical value and the maximum distance in the skeletal vector;
and calculating the difference between the sum and the distance between the joint point and the root joint point and calculating the ratio of the difference to the sum of the distances in the bone vectors to obtain the weight of the joint point.
6. The method of claim 5, wherein the weight for each joint point is calculated from the bone vector using the formula:
w_i = (d_set + d_max - d_i) / Σ_{j=1..K} (d_set + d_max - d_j)
wherein w_i represents the weight of the ith joint point, d_set is the preset empirical value, d_max is the maximum distance in the bone vector, d_i is the distance between the ith joint point and the root joint point in the bone vector, K is the number of joint points in the bone vector, and d_j is the distance between the jth joint point and the root joint point in the bone vector.
7. The method of claim 1, wherein the weighted average of the offset for each joint is calculated from the weights using the following formula to obtain the difference between the first motion data and the second motion data:
S = Σ_{i=1..K} w_i · ΔR_i
wherein w_i represents the weight of the ith joint point, ΔR_i represents the offset of the first motion data and the second motion data at the ith joint point, and K represents the number of joint points in the bone vector.
8. An action disparity calculation apparatus, characterized in that the apparatus comprises:
the system comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring first motion data, second motion data and a bone vector of a three-dimensional human body model, a plurality of joint points of the three-dimensional human body model comprise a root joint point, the first motion data and the second motion data are composed of a rotation matrix of each joint point relative to the root joint point, and the bone vector comprises the distance between each joint point and the root joint point;
the first calculation module is used for calculating the offset of the first motion data and the second motion data in a rotation matrix of each joint point;
the second calculation module is used for calculating the weight corresponding to each joint point according to the skeleton vector, wherein the weight is in inverse proportion to the distance between the joint point and the root joint point;
and the third calculation module is used for calculating a weighted average value of the offset of each joint point according to the weight to obtain the difference degree between the first motion data and the second motion data.
9. An electronic device, comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 1 to 7 when executing a program stored in the memory.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-7.
CN202011567709.1A 2020-12-25 2020-12-25 Method, device and equipment for calculating motion difference and storage medium Pending CN112562071A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011567709.1A CN112562071A (en) 2020-12-25 2020-12-25 Method, device and equipment for calculating motion difference and storage medium

Publications (1)

Publication Number Publication Date
CN112562071A 2021-03-26

Family

ID=75033147

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011567709.1A Pending CN112562071A (en) 2020-12-25 2020-12-25 Method, device and equipment for calculating motion difference and storage medium

Country Status (1)

Country Link
CN (1) CN112562071A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107833271A (en) * 2017-09-30 2018-03-23 中国科学院自动化研究所 A kind of bone reorientation method and device based on Kinect
CN109308727A (en) * 2018-09-07 2019-02-05 腾讯科技(深圳)有限公司 Virtual image model generating method, device and storage medium
CN111681303A (en) * 2020-06-10 2020-09-18 北京中科深智科技有限公司 Method and system for extracting key frame from captured data and reconstructing motion

Non-Patent Citations (1)

Title
LI SHUNYI: "Research on Character Animation Synthesis Based on Motion Capture Data", China Masters' Theses Full-text Database, Information Science and Technology series, no. 9, 15 September 2014 (2014-09-15), pages 2 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN113129414A (en) * 2021-04-12 2021-07-16 北京爱奇艺科技有限公司 Hand motion repairing method, device, equipment and storage medium
CN113129414B (en) * 2021-04-12 2024-04-12 北京爱奇艺科技有限公司 Hand motion restoration method, device, equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination