CN111353543A - Motion capture data similarity measurement method, device and system - Google Patents
- Publication number: CN111353543A (application CN202010142555.5A)
- Authority
- CN
- China
- Prior art keywords: motion capture, capture data, matched, calculating, equal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
Abstract
The invention discloses a method, a device and a system for measuring the similarity of motion capture data. The method comprises the following steps: acquiring template motion capture data Mt and motion capture data Mg to be matched, and performing principal component analysis on each to obtain the feature subspaces St and Sg; mapping the motion capture data to be matched Mg to the feature subspace St of Mt and recovering to obtain Mg'; mapping the template motion capture data Mt to the feature subspace Sg of Mg and recovering to obtain Mt'; calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc(Mg, Mt), and further obtaining the similarity of Mg and Mt. Aiming at the high complexity of motion capture data in time and space, the invention maps the template motion and the motion to be matched to each other's feature subspace and then reconstructs them, which exploits the intrinsic features of the data and accurately reflects the similarity between action sequences.
Description
Technical Field
The invention belongs to the field of computer graphics and somatosensory man-machine interaction, and particularly relates to a method, a device and a system for measuring motion capture data similarity.
Background
Motion capture is a technology for tracking and recording the motion trajectories of human joints, and it is widely applied in film and television, animation, entertainment, and interaction. Similarity measurement of motion capture data calculates or evaluates the degree of similarity between human actions; it is a fundamental problem in motion capture data processing and is important for applications of motion capture data such as character-driven animation, motion capture data retrieval, and somatosensory human-computer interaction. However, due to the high complexity of motion capture data, measuring this similarity is often not easy. The difficulty has two main aspects: first, similar actions may differ greatly in space and time, for example in duration or amplitude; second, different actions may still be highly similar in space-time, for example when a segment of the action data is similar, or when the actions contain the same set of postures in a different order.
In view of the above problems, a great deal of research has been conducted in academia and industry. Dynamic Time Warping (DTW) is one of the most widely used methods for calculating the similarity of time series data. Based on a dynamic programming algorithm, it can handle matches between action sequences of different durations. When the durations of the action sequences do not differ much, DTW can accurately reflect the similarity between them; however, it is not suitable when the durations differ significantly. Uniform Scaling (US) is another commonly used time series similarity method: it interpolates time series of different durations to a common duration and then computes the overall similarity by frame-by-frame comparison. The US method generally has low time complexity, but like DTW it is not applicable when the durations of the action sequences differ greatly. In addition, Piecewise Aggregate Approximation (PAA) can also be applied to the motion capture data similarity metric. It divides the time series into equal-width segments, each represented by its mean value. PAA can absorb the spatial differences between similar motion capture sequences to some extent, but it does not handle similarity between action sequences of different durations well, and it is hard to apply in practice because the number of segments is difficult to choose.
Disclosure of Invention
In view of the above problems, the present invention provides a method, an apparatus, and a system for measuring the similarity of motion capture data which, in the face of the high temporal and spatial complexity of motion capture data, can more accurately reflect the similarity between action sequences.
To achieve the above technical purpose and technical effect, the invention is realized through the following technical scheme:
in a first aspect, the present invention provides a method for measuring similarity of motion capture data, comprising:
acquiring template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] containing n frames of data, where Mt_i represents the i-th frame of Mt;
acquiring motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m] containing m frames of data, where Mg_j represents the j-th frame of Mg;
performing principal component analysis on the template motion capture data Mt and the motion capture data Mg to be matched respectively to obtain feature subspaces St and Sg;
mapping the motion capture data Mg to be matched to a feature subspace St of the template motion capture data Mt, and then recovering to obtain Mg';
mapping the template motion capture data Mt to a feature subspace Sg of the motion capture data Mg to be matched, and recovering to obtain Mt';
calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc of Mg and Mt, and further obtaining the similarity of the two, completing a subspace-mapping-based motion capture data similarity metric.
Optionally, the calculation method of the feature subspaces St and Sg includes:
calculating the mean vectors of the template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] and the motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m], namely μt = (1/n)·Σ_{i=1..n} Mt_i and μg = (1/m)·Σ_{j=1..m} Mg_j, where Mt_i and Mg_j are d-dimensional column vectors, 1 ≤ i ≤ n and 1 ≤ j ≤ m;
zero-centering the template motion capture data Mt and the motion capture data Mg to be matched by subtracting the corresponding mean vector from every frame, obtaining Xt = [Xt_1, Xt_2, …, Xt_n] and Xg = [Xg_1, Xg_2, …, Xg_m], where Xt_i = Mt_i − μt and Xg_j = Mg_j − μg;
calculating the covariance matrices Ct and Cg respectively, where Ct = (1/n)·(Xt)(Xt)^T and Cg = (1/m)·(Xg)(Xg)^T;
calculating the eigenvalues and eigenvectors of Ct and Cg respectively, taking the eigenvectors corresponding to the k largest eigenvalues and arranging them as the columns of a matrix to obtain the feature subspaces St = [St_1, St_2, …, St_k] and Sg = [Sg_1, Sg_2, …, Sg_k], where St_i and Sg_j are d-dimensional column vectors, 1 ≤ i ≤ k, 1 ≤ j ≤ k, and k is a user-specified parameter.
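For illustration only (not part of the claimed disclosure), the subspace computation described above can be sketched in Python with NumPy; the function name and the (d, n) column-per-frame array layout are assumptions:

```python
import numpy as np

def feature_subspace(M, k):
    """PCA feature subspace of motion data M, stored as one
    d-dimensional column vector per frame (shape (d, n)).

    Returns (S, mean): S is (d, k), its columns the eigenvectors of
    the covariance matrix for the k largest eigenvalues."""
    mean = M.mean(axis=1, keepdims=True)   # mean vector over frames
    X = M - mean                           # zero-center every frame
    C = (X @ X.T) / M.shape[1]             # covariance matrix (d, d)
    w, v = np.linalg.eigh(C)               # eigenvalues in ascending order
    order = np.argsort(w)[::-1][:k]        # indices of the k largest
    return v[:, order], mean
```

Here k is the user-specified parameter of the method; `np.linalg.eigh` is used because the covariance matrix is symmetric, so its eigenvectors form an orthonormal set.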
Optionally, the mapping the to-be-matched motion capture data Mg to the feature subspace St of the template motion capture data Mt, and then recovering to obtain Mg', includes the following steps:
calculating the projection of Mg in the feature subspace St of Mt and reconstructing the motion Mg' = [Mg'_1, Mg'_2, …, Mg'_m], where Mg'_j = St·St^T·(Mg_j − μt) + μt and μt denotes the mean vector of Mt;
The method for mapping the template motion capture data Mt to the feature subspace Sg of the motion capture data Mg to be matched and then recovering to obtain the Mt' comprises the following steps:
calculating the post-projection reconstruction motion Mt' = [Mt'_1, Mt'_2, …, Mt'_n] in the feature subspace Sg of Mg, the specific calculation formula being Mt'_i = Sg·Sg^T·(Mt_i − μg) + μg, where μg denotes the mean vector of Mg.
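As a sketch of the map-and-recover step (assuming orthonormal subspace columns and the mean-centered reconstruction used above; the function name is an assumption), both directions reduce to the same operation:

```python
import numpy as np

def project_reconstruct(M, S, mean):
    """Map each frame (column) of M into the subspace spanned by the
    orthonormal columns of S and recover it:
    M'_j = S S^T (M_j - mean) + mean."""
    return S @ (S.T @ (M - mean)) + mean
```

Calling it with (Mg, St, μt) yields Mg', and calling it with (Mt, Sg, μg) yields Mt'.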
Optionally, calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc of Mg and Mt, and further obtaining the similarity of the two, comprises the following steps:
calculating the error D(Mg, Mg') = max_{j∈[1,m]} ||Mg'_j − Mg_j||, where max denotes taking the maximum value and ||·|| denotes the vector norm;
calculating the error D(Mt, Mt') = max_{i∈[1,n]} ||Mt'_i − Mt_i||;
calculating the matching cost Dc(Mg, Mt) = max(D(Mg, Mg'), D(Mt, Mt'));
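The error and matching-cost computation above amounts to a max over column-wise norms; a minimal illustrative sketch in Python (the function name and array layout are assumptions, not part of the disclosure):

```python
import numpy as np

def matching_cost(Mg, Mg_rec, Mt, Mt_rec):
    """Dc(Mg, Mt) = max(D(Mg, Mg'), D(Mt, Mt')), where each D is the
    largest per-frame (per-column) reconstruction error."""
    d_g = np.linalg.norm(Mg_rec - Mg, axis=0).max()   # D(Mg, Mg')
    d_t = np.linalg.norm(Mt_rec - Mt, axis=0).max()   # D(Mt, Mt')
    return max(d_g, d_t)
```

A smaller matching cost corresponds to a higher similarity between the two action sequences.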
In a second aspect, the present invention provides a motion capture data similarity measurement apparatus, comprising:
a first acquisition unit configured to acquire template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] containing n frames of data, where Mt_i represents the i-th frame of Mt;
a second acquisition unit configured to acquire motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m] containing m frames of data, where Mg_j represents the j-th frame of Mg;
the first computing unit is used for respectively executing principal component analysis on the template motion capture data Mt and the motion capture data Mg to be matched to obtain feature subspaces St and Sg;
the second computing unit is used for mapping the motion capture data to be matched Mg to the feature subspace St of the template motion capture data Mt and then recovering to obtain Mg';
the third computing unit is used for mapping the template motion capture data Mt to a feature subspace Sg of the motion capture data Mg to be matched and then recovering to obtain Mt';
a fourth calculating unit, configured to calculate the per-frame differences between Mg and Mg' and between Mt and Mt', take the maximum difference as the matching cost Dc of Mg and Mt, and further obtain the similarity of the two, completing a subspace-mapping-based motion capture data similarity metric.
Optionally, the calculation method of the feature subspaces St and Sg includes:
calculating the mean vectors of the template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] and the motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m], namely μt = (1/n)·Σ_{i=1..n} Mt_i and μg = (1/m)·Σ_{j=1..m} Mg_j, where Mt_i and Mg_j are d-dimensional column vectors, 1 ≤ i ≤ n and 1 ≤ j ≤ m;
zero-centering the template motion capture data Mt and the motion capture data Mg to be matched by subtracting the corresponding mean vector from every frame, obtaining Xt = [Xt_1, Xt_2, …, Xt_n] and Xg = [Xg_1, Xg_2, …, Xg_m], where Xt_i = Mt_i − μt and Xg_j = Mg_j − μg;
calculating the covariance matrices Ct and Cg respectively, where Ct = (1/n)·(Xt)(Xt)^T and Cg = (1/m)·(Xg)(Xg)^T;
calculating the eigenvalues and eigenvectors of Ct and Cg respectively, taking the eigenvectors corresponding to the k largest eigenvalues and arranging them as the columns of a matrix to obtain the feature subspaces St = [St_1, St_2, …, St_k] and Sg = [Sg_1, Sg_2, …, Sg_k], where St_i and Sg_j are d-dimensional column vectors, 1 ≤ i ≤ k, 1 ≤ j ≤ k, and k is a user-specified parameter.
Optionally, the mapping the to-be-matched motion capture data Mg to the feature subspace St of the template motion capture data Mt, and then recovering to obtain Mg', includes the following steps:
calculating the projection of Mg in the feature subspace St of Mt and reconstructing the motion Mg' = [Mg'_1, Mg'_2, …, Mg'_m], where Mg'_j = St·St^T·(Mg_j − μt) + μt and μt denotes the mean vector of Mt;
The method for mapping the template motion capture data Mt to the feature subspace Sg of the motion capture data Mg to be matched and then recovering to obtain the Mt' comprises the following steps:
calculating the post-projection reconstruction motion Mt' = [Mt'_1, Mt'_2, …, Mt'_n] in the feature subspace Sg of Mg, the specific calculation formula being Mt'_i = Sg·Sg^T·(Mt_i − μg) + μg, where μg denotes the mean vector of Mg.
Optionally, calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc of Mg and Mt, and further obtaining the similarity of the two, comprises the following steps:
calculating the error D(Mg, Mg') = max_{j∈[1,m]} ||Mg'_j − Mg_j||, where max denotes taking the maximum value and ||·|| denotes the vector norm;
calculating the error D(Mt, Mt') = max_{i∈[1,n]} ||Mt'_i − Mt_i||;
calculating the matching cost Dc(Mg, Mt) = max(D(Mg, Mg'), D(Mt, Mt'));
In a third aspect, the present invention provides a motion capture data similarity metric system comprising a storage medium and a processor;
the storage medium is used for storing instructions;
the processor is configured to operate in accordance with the instructions to perform the steps of the method according to any one of the first aspects.
The invention has the beneficial effects that:
The invention provides a method, a device and a system for measuring the similarity of motion capture data. Aiming at the high complexity of motion capture data in time and space, the template action and the action to be matched are mapped to each other's feature subspace and then reconstructed, so that the intrinsic features of the data are well exploited and the similarity between action sequences is accurately reflected.
Drawings
FIG. 1 is a flowchart illustrating a method for similarity measurement of motion capture data according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The following detailed description of the principles of the invention is provided in connection with the accompanying drawings.
Example 1
An embodiment of the present invention provides a method for measuring similarity of motion capture data, as shown in fig. 1, including the following steps:
(1) acquiring template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] containing n frames of data and motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m] containing m frames of data, where Mt_i and Mg_j are d-dimensional column vectors (1 ≤ i ≤ n, 1 ≤ j ≤ m), Mt_i represents the i-th frame of Mt, and Mg_j represents the j-th frame of Mg;
(2) respectively performing principal component analysis on the template motion capture data Mt and the motion capture data Mg to be matched to obtain the feature subspaces St = [St_1, St_2, …, St_k] and Sg = [Sg_1, Sg_2, …, Sg_k], where St_i and Sg_j are d-dimensional column vectors (1 ≤ i ≤ k, 1 ≤ j ≤ k);
(3) mapping the motion capture data to be matched Mg to the feature subspace St of the template motion capture data Mt and recovering to obtain Mg' = [Mg'_1, Mg'_2, …, Mg'_m];
(4) mapping the template motion capture data Mt to the feature subspace Sg of the motion capture data to be matched Mg and recovering to obtain Mt' = [Mt'_1, Mt'_2, …, Mt'_n];
(5) calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc of Mg and Mt, and further obtaining the similarity of the two, completing a subspace-mapping-based motion capture data similarity metric.
In a specific implementation manner of the embodiment of the present invention, the method for calculating the feature subspaces St and Sg includes:
calculating the mean vectors of the template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] and the motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m], namely μt = (1/n)·Σ_{i=1..n} Mt_i and μg = (1/m)·Σ_{j=1..m} Mg_j, where Mt_i and Mg_j are d-dimensional column vectors, 1 ≤ i ≤ n and 1 ≤ j ≤ m;
zero-centering the template motion capture data Mt and the motion capture data Mg to be matched by subtracting the corresponding mean vector from every frame, obtaining Xt = [Xt_1, Xt_2, …, Xt_n] and Xg = [Xg_1, Xg_2, …, Xg_m], where Xt_i = Mt_i − μt and Xg_j = Mg_j − μg;
calculating the covariance matrices Ct and Cg respectively, where Ct = (1/n)·(Xt)(Xt)^T and Cg = (1/m)·(Xg)(Xg)^T;
calculating the eigenvalues and eigenvectors of Ct and Cg respectively, taking the eigenvectors corresponding to the k largest eigenvalues and arranging them as the columns of a matrix to obtain the feature subspaces St = [St_1, St_2, …, St_k] and Sg = [Sg_1, Sg_2, …, Sg_k], where St_i and Sg_j are d-dimensional column vectors, 1 ≤ i ≤ k, 1 ≤ j ≤ k, and k is a user-specified parameter.
In a specific implementation manner of the embodiment of the present invention, mapping the motion capture data to be matched Mg to the feature subspace St of the template motion capture data Mt and then recovering to obtain Mg' = [Mg'_1, Mg'_2, …, Mg'_m] comprises the following steps:
calculating the projection of Mg in the feature subspace St of Mt and reconstructing the motion Mg' = [Mg'_1, Mg'_2, …, Mg'_m], where Mg'_j = St·St^T·(Mg_j − μt) + μt and μt denotes the mean vector of Mt;
The method for mapping the template motion capture data Mt to the feature subspace Sg of the motion capture data Mg to be matched and then recovering to obtain the Mt' comprises the following steps:
calculating the post-projection reconstruction motion Mt' = [Mt'_1, Mt'_2, …, Mt'_n] in the feature subspace Sg of Mg, the specific calculation formula being Mt'_i = Sg·Sg^T·(Mt_i − μg) + μg, where μg denotes the mean vector of Mg.
In a specific implementation manner of the embodiment of the present invention, calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc of Mg and Mt, and further obtaining the similarity of the two, comprises the following steps:
calculating the error D(Mg, Mg') = max_{j∈[1,m]} ||Mg'_j − Mg_j||, where max denotes taking the maximum value and ||·|| denotes the vector norm;
calculating the error D(Mt, Mt') = max_{i∈[1,n]} ||Mt'_i − Mt_i||;
calculating the matching cost Dc(Mg, Mt) = max(D(Mg, Mg'), D(Mt, Mt'));
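Putting steps (1) to (5) of this embodiment together, an end-to-end sketch in Python (illustrative only; the function names, the (d, frames) array layout, and the default k are assumptions, not taken from the patent):

```python
import numpy as np

def feature_subspace(M, k):
    """Top-k PCA subspace of M (one d-dimensional column per frame)."""
    mean = M.mean(axis=1, keepdims=True)
    X = M - mean
    w, v = np.linalg.eigh((X @ X.T) / M.shape[1])
    return v[:, np.argsort(w)[::-1][:k]], mean

def matching_cost(Mt, Mg, k=3):
    """Cross-subspace matching cost Dc(Mg, Mt) of steps (1)-(5)."""
    St, mt = feature_subspace(Mt, k)        # step (2), template data
    Sg, mg = feature_subspace(Mg, k)        # step (2), data to be matched
    Mg_rec = St @ (St.T @ (Mg - mt)) + mt   # step (3): Mg -> St -> Mg'
    Mt_rec = Sg @ (Sg.T @ (Mt - mg)) + mg   # step (4): Mt -> Sg -> Mt'
    d_g = np.linalg.norm(Mg_rec - Mg, axis=0).max()
    d_t = np.linalg.norm(Mt_rec - Mt, axis=0).max()
    return max(d_g, d_t)                    # step (5)
```

A sequence matched against itself with k equal to the full dimension d reconstructs exactly, so the cost is zero; in general a smaller cost means a higher similarity.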
Example 2
Based on the same inventive concept as embodiment 1, an embodiment of the present invention provides a motion capture data similarity measurement apparatus, including:
a first acquisition unit configured to acquire template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] containing n frames of data, where Mt_i represents the i-th frame of Mt;
a second acquisition unit configured to acquire motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m] containing m frames of data, where Mg_j represents the j-th frame of Mg;
the first computing unit is used for respectively executing principal component analysis on the template motion capture data Mt and the motion capture data Mg to be matched to obtain feature subspaces St and Sg;
the second computing unit is used for mapping the motion capture data to be matched Mg to the feature subspace St of the template motion capture data Mt and then recovering to obtain Mg';
the third computing unit is used for mapping the template motion capture data Mt to a feature subspace Sg of the motion capture data Mg to be matched and then recovering to obtain Mt';
a fourth calculating unit, configured to calculate the per-frame differences between Mg and Mg' and between Mt and Mt', take the maximum difference as the matching cost Dc of Mg and Mt, and further obtain the similarity of the two, completing a subspace-mapping-based motion capture data similarity metric.
In a specific implementation manner of the embodiment of the present invention, the method for calculating the feature subspaces St and Sg includes:
calculating the mean vectors of the template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] and the motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m], namely μt = (1/n)·Σ_{i=1..n} Mt_i and μg = (1/m)·Σ_{j=1..m} Mg_j, where Mt_i and Mg_j are d-dimensional column vectors, 1 ≤ i ≤ n and 1 ≤ j ≤ m;
zero-centering the template motion capture data Mt and the motion capture data Mg to be matched by subtracting the corresponding mean vector from every frame, obtaining Xt = [Xt_1, Xt_2, …, Xt_n] and Xg = [Xg_1, Xg_2, …, Xg_m], where Xt_i = Mt_i − μt and Xg_j = Mg_j − μg;
calculating the covariance matrices Ct and Cg respectively, where Ct = (1/n)·(Xt)(Xt)^T and Cg = (1/m)·(Xg)(Xg)^T;
calculating the eigenvalues and eigenvectors of Ct and Cg respectively, taking the eigenvectors corresponding to the k largest eigenvalues and arranging them as the columns of a matrix to obtain the feature subspaces St = [St_1, St_2, …, St_k] and Sg = [Sg_1, Sg_2, …, Sg_k], where St_i and Sg_j are d-dimensional column vectors, 1 ≤ i ≤ k, 1 ≤ j ≤ k, and k is a user-specified parameter.
In a specific implementation manner of the embodiment of the present invention, the mapping the to-be-matched motion capture data Mg to the feature subspace St of the template motion capture data Mt, and then recovering to obtain Mg', includes the following steps:
calculating the projection of Mg in the feature subspace St of Mt and reconstructing the motion Mg' = [Mg'_1, Mg'_2, …, Mg'_m], where Mg'_j = St·St^T·(Mg_j − μt) + μt and μt denotes the mean vector of Mt;
The method for mapping the template motion capture data Mt to the feature subspace Sg of the motion capture data Mg to be matched and then recovering to obtain the Mt' comprises the following steps:
calculating the post-projection reconstruction motion Mt' = [Mt'_1, Mt'_2, …, Mt'_n] in the feature subspace Sg of Mg, the specific calculation formula being Mt'_i = Sg·Sg^T·(Mt_i − μg) + μg, where μg denotes the mean vector of Mg.
In a specific implementation manner of the embodiment of the present invention, calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc of Mg and Mt, and further obtaining the similarity of the two, comprises the following steps:
calculating the error D(Mg, Mg') = max_{j∈[1,m]} ||Mg'_j − Mg_j||, where max denotes taking the maximum value and ||·|| denotes the vector norm;
calculating the error D(Mt, Mt') = max_{i∈[1,n]} ||Mt'_i − Mt_i||;
calculating the matching cost Dc(Mg, Mt) = max(D(Mg, Mg'), D(Mt, Mt'));
Example 3
Based on the same inventive concept as embodiment 1, an embodiment of the present invention provides a motion capture data similarity measurement system, including a storage medium and a processor;
the storage medium is used for storing instructions;
the processor is configured to operate in accordance with the instructions to perform the steps of the method according to embodiment 1.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The present invention is not limited to the above embodiments, and any modifications, equivalent replacements, improvements, etc. made within the spirit and principle of the present invention are included in the scope of the claims of the present invention which are filed as the application.
Claims (9)
1. A method for motion capture data similarity measurement, comprising:
acquiring template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] containing n frames of data, where Mt_i represents the i-th frame of Mt;
acquiring motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m] containing m frames of data, where Mg_j represents the j-th frame of Mg;
performing principal component analysis on the template motion capture data Mt and the motion capture data Mg to be matched respectively to obtain feature subspaces St and Sg;
mapping the motion capture data Mg to be matched to a feature subspace St of the template motion capture data Mt, and then recovering to obtain Mg';
mapping the template motion capture data Mt to a feature subspace Sg of the motion capture data Mg to be matched, and recovering to obtain Mt';
calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc of Mg and Mt, and further obtaining the similarity of the two, completing a subspace-mapping-based motion capture data similarity metric.
2. A method for motion capture data similarity measurement according to claim 1, wherein: the calculation method of the feature subspaces St and Sg includes:
calculating the mean vectors of the template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] and the motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m], namely μt = (1/n)·Σ_{i=1..n} Mt_i and μg = (1/m)·Σ_{j=1..m} Mg_j, where Mt_i and Mg_j are d-dimensional column vectors, 1 ≤ i ≤ n and 1 ≤ j ≤ m;
zero-centering the template motion capture data Mt and the motion capture data Mg to be matched by subtracting the corresponding mean vector from every frame, obtaining Xt = [Xt_1, Xt_2, …, Xt_n] and Xg = [Xg_1, Xg_2, …, Xg_m], where Xt_i = Mt_i − μt and Xg_j = Mg_j − μg;
calculating the covariance matrices Ct and Cg respectively, where Ct = (1/n)·(Xt)(Xt)^T and Cg = (1/m)·(Xg)(Xg)^T;
calculating the eigenvalues and eigenvectors of Ct and Cg respectively, taking the eigenvectors corresponding to the k largest eigenvalues and arranging them as the columns of a matrix to obtain the feature subspaces St = [St_1, St_2, …, St_k] and Sg = [Sg_1, Sg_2, …, Sg_k], where St_i and Sg_j are d-dimensional column vectors, 1 ≤ i ≤ k, 1 ≤ j ≤ k, and k is a user-specified parameter.
3. A method of motion capture data similarity metric according to claim 2, characterized by: mapping the to-be-matched motion capture data Mg to a feature subspace St of the template motion capture data Mt, and then recovering to obtain Mg', the method comprises the following steps:
calculating the projection of Mg in the feature subspace St of Mt and reconstructing the motion Mg' = [Mg'_1, Mg'_2, …, Mg'_m], where Mg'_j = St·St^T·(Mg_j − μt) + μt and μt denotes the mean vector of Mt;
mapping the template motion capture data Mt to the feature subspace Sg of the motion capture data Mg to be matched and recovering to obtain Mt' = [Mt'_1, Mt'_2, …, Mt'_n], where Mt'_i = Sg·Sg^T·(Mt_i − μg) + μg and μg denotes the mean vector of Mg.
4. A method of motion capture data similarity metric according to claim 3, characterized in that calculating the per-frame differences between Mg and Mg' and between Mt and Mt', taking the maximum difference as the matching cost Dc of Mg and Mt, and further obtaining the similarity of the two, comprises the following steps:
calculating the error D(Mg, Mg') = max_{j∈[1,m]} ||Mg'_j − Mg_j||, where max denotes taking the maximum value and ||·|| denotes the vector norm;
calculating the error D(Mt, Mt') = max_{i∈[1,n]} ||Mt'_i − Mt_i||;
calculating the matching cost Dc(Mg, Mt) = max(D(Mg, Mg'), D(Mt, Mt'));
5. A motion capture data similarity metric apparatus, comprising:
a first acquisition unit configured to acquire template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] containing n frames of data, where Mt_i represents the i-th frame of Mt;
a second acquisition unit configured to acquire motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m] containing m frames of data, where Mg_j represents the j-th frame of Mg;
the first computing unit is used for respectively executing principal component analysis on the template motion capture data Mt and the motion capture data Mg to be matched to obtain feature subspaces St and Sg;
the second computing unit is used for mapping the motion capture data to be matched Mg to the feature subspace St of the template motion capture data Mt and then recovering to obtain Mg';
the third computing unit is used for mapping the template motion capture data Mt to a feature subspace Sg of the motion capture data Mg to be matched and then recovering to obtain Mt';
a fourth calculating unit, configured to calculate the per-frame differences between Mg and Mg' and between Mt and Mt', take the maximum difference as the matching cost Dc of Mg and Mt, and further obtain the similarity of the two.
6. The motion capture data similarity metric apparatus of claim 5, wherein: the calculation method of the feature subspaces St and Sg includes:
calculating the mean vectors of the template motion capture data Mt = [Mt_1, Mt_2, …, Mt_n] and the motion capture data to be matched Mg = [Mg_1, Mg_2, …, Mg_m], namely μt = (1/n)·Σ_{i=1..n} Mt_i and μg = (1/m)·Σ_{j=1..m} Mg_j, where Mt_i and Mg_j are d-dimensional column vectors, 1 ≤ i ≤ n and 1 ≤ j ≤ m;
zero-centering the template motion capture data Mt and the motion capture data Mg to be matched by subtracting the corresponding mean vector from every frame, obtaining Xt = [Xt_1, Xt_2, …, Xt_n] and Xg = [Xg_1, Xg_2, …, Xg_m], where Xt_i = Mt_i − μt and Xg_j = Mg_j − μg;
calculating the covariance matrices Ct and Cg respectively, where Ct = (1/n)·(Xt)(Xt)^T and Cg = (1/m)·(Xg)(Xg)^T;
calculating the eigenvalues and eigenvectors of Ct and Cg respectively, taking the eigenvectors corresponding to the k largest eigenvalues and arranging them as the columns of a matrix to obtain the feature subspaces St = [St_1, St_2, …, St_k] and Sg = [Sg_1, Sg_2, …, Sg_k], where St_i and Sg_j are d-dimensional column vectors, 1 ≤ i ≤ k, 1 ≤ j ≤ k, and k is a user-specified parameter.
7. The motion capture data similarity metric apparatus of claim 6, wherein: mapping the to-be-matched motion capture data Mg to a feature subspace St of the template motion capture data Mt, and then recovering to obtain Mg', the method comprises the following steps:
calculating the projection of Mg in the feature subspace St of Mt and then reconstructing the motion Mg' = [Mg'1, Mg'2, …, Mg'm], where Mg'j = St(St)T(Mgj − μt) + μt;
the method for mapping the template motion capture data Mt to the feature subspace Sg of the motion capture data to be matched Mg and then recovering to obtain Mt' comprises the following steps: calculating the projection of Mt in the feature subspace Sg of Mg and then reconstructing the motion Mt' = [Mt'1, Mt'2, …, Mt'n], where Mt'i = Sg(Sg)T(Mti − μg) + μg.
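The map-and-recover step of claim 7 amounts to the standard PCA project-then-reconstruct operation: each frame is centred with the *other* sequence's mean, projected onto its subspace, and mapped back. A minimal sketch under those assumptions (the helper name `map_and_recover` is hypothetical):

```python
import numpy as np

def map_and_recover(M, S, mu):
    """Project the frames of M onto the subspace S and reconstruct.

    M:  d x f data matrix (one frame per column)
    S:  d x k orthonormal basis of the other sequence's feature subspace
    mu: d x 1 mean vector of the other sequence

    Returns M' = S S^T (M - mu) + mu, the reconstruction of M
    through the other sequence's feature subspace.
    """
    return S @ (S.T @ (M - mu)) + mu
```

Frames already lying in the span of S are recovered exactly; components outside the subspace are discarded, which is what the reconstruction error in claim 8 measures.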
8. The motion capture data similarity metric apparatus of claim 7, wherein: the method for calculating the difference of each frame of Mg, Mg', Mt and Mt', taking the maximum difference as the matching cost Dc(Mg, Mt) of Mg and Mt, and further obtaining the similarity of Mg and Mt, comprises the following steps:
calculating the error D(Mg, Mg') of Mg and Mg': D(Mg, Mg') = maxj∈[1,m] ‖Mg'j − Mgj‖, where max is the maximum-value operation and ‖·‖ is the vector norm;
calculating the error D(Mt, Mt') of Mt and Mt': D(Mt, Mt') = maxi∈[1,n] ‖Mt'i − Mti‖;
Calculating the matching cost Dc (Mg, Mt) of Mg and Mt as max (D (Mg, Mg '), D (Mt, Mt'));
9. A motion capture data similarity metric system comprising a storage medium and a processor;
the storage medium is used for storing instructions;
the processor is configured to operate in accordance with the instructions to perform the steps of the method according to any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010142555.5A CN111353543B (en) | 2020-03-04 | 2020-03-04 | Motion capture data similarity measurement method, device and system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010142555.5A CN111353543B (en) | 2020-03-04 | 2020-03-04 | Motion capture data similarity measurement method, device and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111353543A true CN111353543A (en) | 2020-06-30 |
CN111353543B CN111353543B (en) | 2020-09-11 |
Family
ID=71194331
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010142555.5A Active CN111353543B (en) | 2020-03-04 | 2020-03-04 | Motion capture data similarity measurement method, device and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111353543B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4016536A1 (en) * | 2020-12-16 | 2022-06-22 | Polar Electro Oy | Biomechanical modelling of motion measurements |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101101666A (en) * | 2007-08-09 | 2008-01-09 | 中国科学院计算技术研究所 | Dummy role movement synthesis method based on movement capturing data |
US20100272356A1 (en) * | 2008-05-27 | 2010-10-28 | Li Hong | Device and method for estimating whether an image is blurred |
CN101989326A (en) * | 2009-07-31 | 2011-03-23 | 三星电子株式会社 | Human posture recognition method and device |
US20120033856A1 (en) * | 2006-12-19 | 2012-02-09 | Matthew Flagg | System and method for enabling meaningful interaction with video based characters and objects |
CN102508867A (en) * | 2011-10-09 | 2012-06-20 | 南京大学 | Human-motion diagram searching method |
CN103310463A (en) * | 2013-06-18 | 2013-09-18 | 西北工业大学 | On-line target tracking method based on probabilistic principal component analysis and compressed sensing |
CN106127803A (en) * | 2016-06-17 | 2016-11-16 | 北京交通大学 | Human body motion capture data behavior dividing method and system |
CN109284006A (en) * | 2018-11-09 | 2019-01-29 | 中科数字健康科学研究院(南京)有限公司 | A kind of human motion capture device and method |
CN109993818A (en) * | 2017-12-31 | 2019-07-09 | 中国移动通信集团辽宁有限公司 | Three-dimensional (3 D) manikin moves synthetic method, device, equipment and medium |
- 2020-03-04 CN CN202010142555.5A patent/CN111353543B/en active Active
Non-Patent Citations (2)
Title |
---|
NAOKI NUMAGUCHI et al.: "A Puppet Interface for Retrieval of Motion Capture Data", Proceedings of the 2011 ACM SIGGRAPH/Eurographics Symposium on Computer Animation * |
HU XIAOYAN et al.: "PCA-based Similarity Computation for Motion Data", China Computer Graphics Conference * |
Also Published As
Publication number | Publication date |
---|---|
CN111353543B (en) | 2020-09-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Melekhov et al. | Image-based localization using hourglass networks | |
Kim et al. | Recurrent temporal aggregation framework for deep video inpainting | |
Vemulapalli et al. | R3DG features: Relative 3D geometry-based skeletal representations for human action recognition | |
US10659670B2 (en) | Monitoring system and control method thereof | |
Ding et al. | A rank minimization approach to video inpainting | |
EP3391289A1 (en) | Systems and methods for providing an image classifier | |
CN110060348B (en) | Face image shaping method and device | |
CN108124489B (en) | Information processing method, apparatus, cloud processing device and computer program product | |
CN108876826A (en) | A kind of image matching method and system | |
Zhang et al. | Joint voxel and coordinate regression for accurate 3d facial landmark localization | |
CN111353543B (en) | Motion capture data similarity measurement method, device and system | |
Jin et al. | Towards stabilizing facial landmark detection and tracking via hierarchical filtering: A new method | |
CN110009662A (en) | Method, apparatus, electronic equipment and the computer readable storage medium of face tracking | |
Li et al. | Localization with sampling-argmax | |
Chang et al. | Visual tracking in high-dimensional state space by appearance-guided particle filtering | |
Zhao et al. | Fm-3dfr: Facial manipulation-based 3-d face reconstruction | |
EP2237227A1 (en) | Video sequence processing method and system | |
Li et al. | Adaptive weighted CNN features integration for correlation filter tracking | |
Jing et al. | A novel 3D reconstruction algorithm of motion-blurred CT image | |
Ihm et al. | Low-cost depth camera pose tracking for mobile platforms | |
Zhu et al. | Motion equivariant networks for event cameras with the temporal normalization transform | |
Dawoud et al. | Fast template matching method based on optimized metrics for face localization | |
CN111523345B (en) | Real-time human face tracking system and method | |
Ionita et al. | Real time feature point tracking with automatic model selection | |
WO2023088074A1 (en) | Face tracking method and apparatus, and storage medium and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||