CN112288838A - Data processing method and device - Google Patents

Data processing method and device Download PDF

Info

Publication number
CN112288838A
Authority
CN
China
Prior art keywords
motion data, frame, original motion, original, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011166428.5A
Other languages
Chinese (zh)
Inventor
刘思阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing IQIYI Science and Technology Co Ltd
Original Assignee
Beijing IQIYI Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing IQIYI Science and Technology Co Ltd filed Critical Beijing IQIYI Science and Technology Co Ltd
Priority to CN202011166428.5A priority Critical patent/CN112288838A/en
Publication of CN112288838A publication Critical patent/CN112288838A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/20 - 3D [Three Dimensional] animation
    • G06T13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention provides a data processing method and device, and relates to the technical field of data processing. The method comprises the following steps: calculating an acceleration parameter of each frame of original motion data according to motion data of a plurality of joint points respectively contained in each frame of original motion data in an original motion sequence; wherein the original motion sequence comprises a plurality of frames of original motion data; respectively judging whether the acceleration parameter of each frame of original motion data is larger than a preset acceleration threshold value, and selecting the original motion data of which the acceleration parameter is larger than the preset acceleration threshold value from the original motion sequence as the motion data of the key frame. According to the invention, the key data with large influence in the whole action process is automatically selected as the key frame through the acceleration parameters, so that the processing efficiency is improved, and the selection accuracy is improved.

Description

Data processing method and device
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a data processing method and apparatus.
Background
3D virtual characters have many applications in animation and movie special effects. The general flow is to capture the various motions of an actor through motion-capture technology and then transfer those motions to a specified 3D model. However, many problems arise during this migration. For example, the actor and the 3D model to be driven may have different body proportions, so that the 3D model makes actions that do not conform to physical principles, such as a hand passing through another limb, which is known in the industry as "mold penetration".
To solve the mold-penetration problem, the motion data driving the model is corrected, that is, the rotation angles of the joints are corrected. In traditional manual correction, the corrected result can be inspected visually after correction is finished, so that the corrected action remains almost the same as the original action. During correction, not every frame of the action needs to be corrected: the data of non-key frames can be interpolated from the key frames.
Key frames need to be set manually during manual repair, but their accuracy cannot be guaranteed, because manual operation may lead to misjudgment due to differences in personal skill, visual fatigue and the like. Moreover, every frame of data needs to be checked manually when key frames are set by hand, so this approach is time-consuming and incurs high labor cost.
Disclosure of Invention
The invention provides a data processing method and a data processing apparatus, solving the prior-art problems that manually setting key frames is time-consuming and its accuracy cannot be guaranteed.
In a first aspect of the present invention, there is provided a data processing method, including:
calculating an acceleration parameter of each frame of original motion data according to motion data of a plurality of joint points respectively contained in each frame of original motion data in an original motion sequence; wherein the original motion sequence comprises a plurality of frames of original motion data;
respectively judging whether the acceleration parameter of each frame of original motion data is larger than a preset acceleration threshold value, and selecting the original motion data of which the acceleration parameter is larger than the preset acceleration threshold value from the original motion sequence as the motion data of the key frame.
Preferably, the step of calculating the acceleration parameter of each frame of original motion data according to the motion data of a plurality of joint points respectively contained in each frame of original motion data in the original motion sequence includes:
calculating the acceleration of each joint point corresponding to each frame of original motion data according to the motion data of a plurality of joint points contained in each frame of original motion data in the original motion sequence;
calculating the weighted average value of the accelerations of all the joint points corresponding to each frame of original motion data according to the acceleration of each joint point corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point, and determining the acceleration parameter of each frame of original motion data according to the average value.
Preferably, the step of calculating the acceleration of each joint point corresponding to each frame of original motion data according to the motion data of the plurality of joint points included in each frame of original motion data in the original motion sequence includes:
subtracting the motion data of the corresponding joint point of the previous frame of original motion data from the motion data of each joint point of each frame of original motion data to obtain the motion velocity of each joint point of each frame of original motion data;
and subtracting the motion velocity of each joint point of each frame of original motion data from the motion velocity of the corresponding joint point of the next frame of original motion data to obtain the acceleration of each joint point of each frame of original motion data.
Preferably, before subtracting the motion data of the corresponding joint point of the previous frame from the motion data of each joint point of each frame of original motion data to obtain the motion velocity of each joint point of each frame of original motion data, the method further includes:
and copying the first frame of original motion data in the original motion sequence into the previous frame of original motion data of the first frame of original motion data.
Preferably, the motion data of each joint point of each frame of original motion data respectively comprises motion data of a plurality of dimensional directions;
the step of calculating the acceleration of each joint point corresponding to each frame of original motion data according to the motion data of a plurality of joint points contained in each frame of original motion data in the original motion sequence comprises the following steps:
calculating the acceleration of each joint point corresponding to each frame of original motion data in a plurality of dimension directions according to the motion data of the plurality of joint points contained in each frame of original motion data;
the step of calculating the weighted average value of the accelerations of all the joint points corresponding to each frame of original motion data according to the acceleration of each joint point corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point, and determining the acceleration parameter of each frame of original motion data according to the average value comprises the following steps:
calculating the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data according to the accelerations of each joint point in the plurality of dimension directions corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point;
and determining the acceleration parameter of each frame of original motion data according to the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data.
Preferably, the step of determining the acceleration parameter of each frame of original motion data according to the weighted average value of the accelerations of all the joints in each dimension direction corresponding to each frame of original motion data includes:
and calculating the sum of squares of the weighted average values in the plurality of dimensional directions corresponding to each frame of original motion data, and taking the sum of squares as the acceleration parameter of each frame of original motion data.
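As an illustrative sketch of this step (the helper name is hypothetical, not code from the patent), the acceleration parameter of a frame is simply the sum of squares of its per-dimension weighted-average accelerations:

```python
def acceleration_parameter(avg_per_dim):
    """avg_per_dim: the weighted-average acceleration of all joint points in
    each dimensional direction, e.g. (a_x, a_y, a_z) for a 3D coordinate system.
    Returns the sum of squares, used as the frame's acceleration parameter."""
    return sum(a * a for a in avg_per_dim)

param = acceleration_parameter((1.0, -2.0, 0.5))  # 1 + 4 + 0.25 = 5.25
```

Squaring makes the parameter insensitive to the sign of each direction's average, so opposite accelerations in different dimensions cannot cancel each other out.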
Preferably, the step of calculating, according to the motion data of the plurality of joint points included in each frame of original motion data, the acceleration of each joint point corresponding to each frame of original motion data in the directions of the plurality of dimensions includes:
respectively converting the motion data of a plurality of joint points contained in each frame of original motion data into a multi-dimensional motion data matrix;
and acquiring a multidimensional acceleration matrix of each frame of original motion data according to the multidimensional motion data matrix of each frame of original motion data, wherein the multidimensional acceleration matrix comprises the acceleration of each joint point in a plurality of dimensional directions.
Preferably, the sum of the weight coefficients corresponding to all the joint points is 1.
Preferably, after the selecting the original motion data with the acceleration parameter greater than the preset acceleration threshold as the motion data of the key frame, the method further includes:
judging whether the distance between any two adjacent key frames in all the key frame motion data is greater than a preset distance threshold value or not;
and when the distance between two adjacent key frames is greater than the preset distance threshold, selecting the original motion data whose distance from the earlier of the two adjacent key frames equals the preset distance threshold, and supplementing it as motion data of a key frame.
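The supplementation rule above can be sketched as follows. The function name is hypothetical, and the assumption that the rule is applied repeatedly until no gap exceeds the threshold is an illustrative reading, not stated by the patent:

```python
def supplement_key_frames(key_indices, d):
    """key_indices: sorted frame indices of the selected key frames.
    d: the preset distance threshold.
    Whenever two adjacent key frames are more than d frames apart, insert an
    extra key frame exactly d frames after the earlier one, repeating until
    no gap exceeds d."""
    result = [key_indices[0]]
    for nxt in key_indices[1:]:
        while nxt - result[-1] > d:         # gap still too large
            result.append(result[-1] + d)   # supplement at distance d
        result.append(nxt)
    return result

supplement_key_frames([1, 9], 3)  # -> [1, 4, 7, 9]
```

This keeps interpolation spans short, so repaired non-key frames never sit more than d frames from a repaired key frame.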
In a second aspect of the present invention, there is also provided a data processing apparatus comprising:
the first calculation module is used for calculating the acceleration parameter of each frame of original motion data according to the motion data of a plurality of joint points contained in each frame of original motion data in the original motion sequence; wherein the original motion sequence comprises a plurality of frames of original motion data;
and the key frame selection module is used for respectively judging whether the acceleration parameter of each frame of original motion data is greater than a preset acceleration threshold value, and selecting the original motion data of which the acceleration parameter is greater than the preset acceleration threshold value from the original motion sequence as the motion data of the key frame.
In a third aspect of the present invention, there is also provided an electronic device, including: a processor, a communication interface, a memory, and a communication bus; the processor, the communication interface and the memory complete mutual communication through a communication bus;
a memory for storing a computer program;
a processor for implementing the steps of the data processing method as described in any one of the above when executing the program stored in the memory.
In a fourth aspect implemented by the present invention, there is also provided a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the data processing method as described in any one of the above.
Aiming at the prior art, the invention has the following advantages:
in the embodiment of the invention, an acceleration parameter of each frame of original motion data is first calculated according to the motion data of the plurality of joint points respectively contained in each frame of original motion data in an original motion sequence, where the original motion sequence comprises a plurality of frames of original motion data. It is then judged, for each frame, whether its acceleration parameter is greater than a preset acceleration threshold, and the original motion data whose acceleration parameter is greater than the preset acceleration threshold is selected from the original motion sequence as the motion data of key frames. In this way, the key data with large influence on the whole action process is automatically selected as key frames through the acceleration parameters, realizing automatic batch processing of key-frame selection without manually checking each frame of data, which improves processing efficiency. Because the key frames are selected automatically according to the acceleration parameters, misjudgment caused by differences in personal skill, visual fatigue and the like is avoided, which improves the accuracy of key-frame selection.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments will be briefly described below.
Fig. 1 is a schematic flow chart of a data processing method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a model joint according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of repaired motion data provided by an embodiment of the present invention;
FIG. 4 is a flowchart illustrating supplementing key frames according to an embodiment of the present invention;
FIG. 5 is a schematic block diagram of a data processing apparatus provided by an embodiment of the present invention;
fig. 6 is a schematic block diagram of an electronic device provided in an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
Fig. 1 is a schematic flow chart of a data processing method according to an embodiment of the present invention, and referring to fig. 1, the data processing method includes:
step 11: calculating an acceleration parameter of each frame of original motion data according to motion data of a plurality of joint points respectively contained in each frame of original motion data in an original motion sequence; wherein the original motion sequence comprises a plurality of frames of original motion data.
Here, the acceleration parameter of each frame of original motion data is calculated, so that the key frame is selected based on the acceleration parameter, the key frame calculation is converted into the acceleration parameter calculation, and the calculation process is simplified.
The original motion sequence is a group of original motion data which are sequentially arranged according to a time sequence and obtained by capturing motion information of prototypes such as human bodies or animals in a period of time T. The frequency of capturing motion information may be set according to a requirement, for example, the motion information is captured at a frequency of 50 frames per second, and if T is 10s, 500 frames of raw motion data may be captured within the time period T, where the raw motion sequence includes the 500 frames of raw motion data arranged in order according to a time sequence.
Assume the original motion sequence includes t frames of original motion data in total, where t is an integer greater than or equal to 1. The original motion sequence may be denoted as M = {m_1, m_2, m_3, …, m_i, …, m_t}, where m_i represents the i-th frame of original motion data, 1 ≤ i ≤ t.
It is assumed that each frame of raw motion data includes motion data of k joint points respectively, k being an integer greater than or equal to 1. The i-th frame of raw motion data can be recorded as m_i = {p_i^1, p_i^2, …, p_i^k}, where p_i^j denotes the motion data of the j-th joint point of the i-th frame of original motion data, 1 ≤ j ≤ k.
For example, as shown in fig. 2, assuming a total of 34 joint points, each frame of raw motion data includes motion data of 34 joint points, respectively.
Step 12: respectively judging whether the acceleration parameter of each frame of original motion data is larger than a preset acceleration threshold value, and selecting the original motion data of which the acceleration parameter is larger than the preset acceleration threshold value from the original motion sequence as the motion data of the key frame.
The acceleration parameters enable automatic selection of key frames: the original motion data whose acceleration parameter is greater than the preset acceleration threshold, that is, the frames with larger motion, are taken as key frames. In this way, the key data with larger influence on the whole action process is selected accurately, which improves both processing efficiency and accuracy.
For example, as shown in fig. 3, assume that for 6 frames of original motion data selected from an original motion sequence, the acceleration parameter of each frame is acquired through step 11, and it is determined through step 12 that only the acceleration parameters of the 1st and 6th frames of original motion data are greater than the preset acceleration threshold. Then the 1st and 6th frames of original motion data are used as key frames, and the other frames are used as non-key frames. During restoration, the 1st and 6th frames of original motion data can be repaired directly, the repaired data of the 2nd to 5th frames is obtained by interpolating between the repaired data of the 1st and 6th frames, and finally the repaired motion sequence shown in fig. 3 is obtained.
The preset acceleration threshold may be set to any magnitude according to a requirement, and is not limited herein.
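As a minimal sketch of steps 11 and 12 (hypothetical names, assuming the per-frame acceleration parameters have already been computed), thresholding reduces to a simple filter:

```python
def select_key_frames(accel_params, threshold):
    """accel_params: acceleration parameter of each frame, in sequence order.
    threshold: the preset acceleration threshold.
    Returns the 0-based indices of frames selected as key frames."""
    return [i for i, a in enumerate(accel_params) if a > threshold]

# Toy sequence matching the fig. 3 example: only frames 1 and 6
# (indices 0 and 5) exceed the threshold.
select_key_frames([5.0, 0.2, 0.1, 0.3, 0.2, 4.0], 1.0)  # -> [0, 5]
```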
According to the data processing method, the key data with large influence on the whole action process is automatically selected as key frames through the acceleration parameters. This realizes automatic batch processing of key-frame selection without manually checking each frame of data, which improves processing efficiency, and it avoids the misjudgment caused by differences in personal skill, visual fatigue and the like, which improves the accuracy of key-frame selection.
Preferably, the step 11 includes:
step 111: and calculating the acceleration of each joint point corresponding to each frame of original motion data according to the motion data of a plurality of joint points contained in each frame of original motion data in the original motion sequence.
Here, the acceleration of each joint point corresponding to each frame of original motion data can be accurately calculated using the motion data of the plurality of joint points included in each frame of original motion data to determine a final acceleration parameter using the acceleration of each joint point.
Step 112: calculating the weighted average value of the accelerations of all the joint points corresponding to each frame of original motion data according to the acceleration of each joint point corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point, and determining the acceleration parameter of each frame of original motion data according to the average value.
The acceleration parameters are determined according to the weighted average value of the accelerations of all the joint points corresponding to each frame of original motion data, so that the acceleration parameters integrate the motion conditions of all the joint points, the difference between different joint points is considered, the key frames are selected according to the difference as a standard, and the key data which have large influence on the motion can be selected more accurately.
The weight coefficient corresponding to each joint point can be set according to requirements, and the weight of each joint point is not limited in the embodiment of the invention.
However, in order to ensure that the weights of all the joint points are distributed reasonably, it is preferable that the weight coefficients corresponding to all the joint points sum to 1. That is, assuming w_l is the weight coefficient of the l-th joint point, then

Σ_{l=1}^{k} w_l = 1, 1 ≤ l ≤ k.
Of course, the above-mentioned determination method of the acceleration parameter is only a preferred implementation method, and the acceleration parameter may also be determined in other manners, for example, the average value of the accelerations of all the joints corresponding to each frame of the original motion data is directly calculated without considering the difference between different joints, and the acceleration parameter is determined according to the average value, which is not described herein.
Preferably, the step 111 includes:
step 1111: subtracting the motion data of the corresponding joint point of the previous frame of original motion data from the motion data of each joint point of each frame of original motion data, to obtain the motion velocity of each joint point of each frame of original motion data.
Here, to determine the acceleration, the velocity is first determined by subtracting the motion data of the joint point corresponding to the original motion data of the previous frame of each frame of original motion data from the motion data of each joint point of each frame of original motion data to obtain the motion velocity of each joint point of each frame of original motion data.
For example, taking the i-th frame of original motion data m_i = {p_i^1, p_i^2, …, p_i^k} as an example, the motion data of the corresponding joint point of the (i-1)-th frame of original motion data is subtracted from the motion data of each joint point of the i-th frame to obtain the motion velocity of each joint point of the i-th frame of original motion data, expressed by the formula:

v_i^j = p_i^j - p_{i-1}^j, 1 ≤ j ≤ k.
step 1112: subtracting the motion velocity of each joint point of each frame of original motion data from the motion velocity of the corresponding joint point of the next frame of original motion data, to obtain the acceleration of each joint point of each frame of original motion data.
After the velocity is determined, the acceleration is determined: the motion velocity of each joint point of each frame of original motion data is subtracted from the motion velocity of the corresponding joint point of the next frame of original motion data, so as to obtain the acceleration of each joint point of each frame of original motion data.
For example, taking the motion velocity v_i^j of each joint point of the i-th frame of original motion data as an example, the motion velocity of the corresponding joint point of the i-th frame is subtracted from the motion velocity of each joint point of the (i+1)-th frame of original motion data to obtain the acceleration of each joint point of the i-th frame, expressed by the formula:

a_i^j = v_{i+1}^j - v_i^j, 1 ≤ j ≤ k.
After the acceleration of each joint point of the i-th frame of original motion data is obtained, the weighted average of the accelerations of all the joint points corresponding to the i-th frame may be calculated in step 112, expressed by the formula:

a̅_i = Σ_{l=1}^{k} w_l · a_i^l,

where w_l is the weight coefficient of the l-th joint point, 1 ≤ l ≤ k.
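The per-frame pipeline of steps 1110 to 1112 and step 112 can be sketched as follows. For simplicity each joint's motion data is treated as a single scalar, and the handling of the last frame (repeating its velocity so its acceleration is zero) is an assumption the patent leaves open; the function name is illustrative:

```python
def frame_acceleration_params(seq, weights):
    """seq: list of frames; each frame is a list of k per-joint motion values.
    weights: k weight coefficients (preferably summing to 1).
    Step 1110: duplicate the first frame so v_1 can be formed.
    Step 1111: v_i = m_i - m_{i-1}.  Step 1112: a_i = v_{i+1} - v_i.
    Step 112: each frame's parameter is sum_l w_l * a_i^l."""
    padded = [list(seq[0])] + [list(f) for f in seq]   # step 1110
    vel = [[c - p for c, p in zip(cur, prev)]
           for prev, cur in zip(padded, padded[1:])]   # step 1111: one v per frame
    vel.append(vel[-1][:])   # assumption: repeat last velocity for v_{t+1}
    acc = [[n - c for c, n in zip(cur, nxt)]
           for cur, nxt in zip(vel, vel[1:])]          # step 1112
    return [sum(w * a for w, a in zip(weights, frame))
            for frame in acc]                          # step 112

frame_acceleration_params([[0.0], [1.0], [3.0]], [1.0])  # -> [1.0, 1.0, 0.0]
```

With a single joint of weight 1, positions 0, 1, 3 give velocities 0, 1, 2, hence accelerations 1, 1 and (by the padding assumption) 0 for the final frame.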
Preferably, before step 1111, the method further comprises:
step 1110: and copying the first frame of original motion data in the original motion sequence into the previous frame of original motion data of the first frame of original motion data.
Here, since the movement velocity of each joint point of each frame of original motion data is determined based on the previous frame of original motion data, by copying the first frame of original motion data in the original motion sequence to the previous frame of original motion data of the first frame of original motion data, an error is prevented from occurring in calculating the first frame of original motion data.
In order to avoid errors in calculating the first frame of original motion data, other methods may also be used, for example, skipping the first frame directly, starting the calculation from the second frame, and setting the motion velocity of each joint point of the first frame of original motion data to zero.
Preferably, the motion data of each joint point of each frame of original motion data respectively comprises motion data of a plurality of dimensional directions; the step 111 includes:
step 1113: and calculating the acceleration of each joint point corresponding to each frame of original motion data in a plurality of dimension directions according to the motion data of the plurality of joint points contained in each frame of original motion data.
Here, when the motion data of each joint point includes motion data in a plurality of dimensional directions, it is necessary to calculate the acceleration of each joint point in the plurality of dimensional directions to analyze the motion data of all joint points in all directions, ensuring accuracy.
Generally, when calculating model motion data, the motion sequence is analyzed in a three-dimensional coordinate system, and each joint point correspondingly includes motion data in three dimensional directions. If the three-dimensional coordinate system includes X, Y and Z coordinate axes, each joint point includes motion data in the X, Y and Z directions. For example, the motion data of the j-th joint point of the i-th frame of original motion data can be written as p_i^j = (x_i^j, y_i^j, z_i^j), and the acceleration of the j-th joint point of the i-th frame of raw motion data needs to be calculated in the three directions X, Y and Z: a_i^j = (a_{x,i}^j, a_{y,i}^j, a_{z,i}^j).
for convenience of calculation, it is preferable that step 1113 includes:
step 11131: and respectively converting the motion data of a plurality of joint points contained in each frame of original motion data into a multi-dimensional motion data matrix.
Here, the calculation is performed by the conversion matrix, so that the calculation method can be simplified and the efficiency can be improved.
For example, taking the i-th frame of original motion data m_i as an example, when each joint point includes motion data in the three directions X, Y and Z, this step converts m_i into a three-dimensional motion data matrix:

P_i = [ x_i^1 x_i^2 … x_i^k ; y_i^1 y_i^2 … y_i^k ; z_i^1 z_i^2 … z_i^k ],

where each row holds the components of all k joint points in one dimensional direction.
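A minimal sketch of this conversion (hypothetical function name, assuming each joint point supplies an (x, y, z) tuple):

```python
def to_dim_matrix(frame):
    """frame: list of k joint points, each an (x, y, z) tuple.
    Returns a 3 x k matrix: row 0 holds the X components of all joints,
    row 1 the Y components, row 2 the Z components."""
    return [list(axis) for axis in zip(*frame)]

# k = 3 hypothetical joints:
to_dim_matrix([(1, 4, 7), (2, 5, 8), (3, 6, 9)])
# -> [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

Arranging the data this way lets the later velocity and acceleration steps operate on whole matrices instead of looping over joints one at a time.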
step 11132: and acquiring a multidimensional acceleration matrix of each frame of original motion data according to the multidimensional motion data matrix of each frame of original motion data, wherein the multidimensional acceleration matrix comprises the acceleration of each joint point in a plurality of dimensional directions.
Here, a multi-dimensional acceleration matrix of each frame of the original motion data is further acquired based on the multi-dimensional motion data matrix of each frame of the original motion data to determine the acceleration parameter.
In this step, based on the manner of steps 1111 to 1112, the motion data of the corresponding joint point of the previous frame is first subtracted from the motion data of each joint point of each frame of original motion data, to obtain the multidimensional motion velocity matrix of each frame of original motion data; then the motion velocity of the corresponding joint point of each frame is subtracted from the motion velocity of each joint point of the next frame of original motion data, to obtain the multidimensional acceleration matrix of each frame of original motion data.
For example, again taking the i-th frame of original motion data m_i, with each joint point including motion data in the X, Y and Z directions, this step first obtains the three-dimensional motion velocity matrix of m_i from the three-dimensional motion data matrix M_i of m_i and the matrix M_{i-1} of the previous frame m_{i-1}:

V_i = M_i − M_{i−1}

and then obtains the three-dimensional acceleration matrix of m_i from the three-dimensional motion velocity matrix V_i of m_i and the matrix V_{i+1} of the next frame m_{i+1}:

A_i = V_{i+1} − V_i
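The two differencing steps above can be sketched together, assuming the whole original motion sequence has been stacked into an array of shape (T, n_joints, 3). Duplicating the first frame gives frame 0 a "previous frame" (so its velocity is zero), as described earlier for steps 1111-1112; setting the last frame's acceleration to zero is my assumption, since the passage does not specify a successor for it.

```python
import numpy as np

# Sketch of steps 1111-1112 / 11132: V_i = M_i - M_{i-1}, A_i = V_{i+1} - V_i.
# `frames` is assumed to have shape (T, n_joints, 3).
def acceleration_matrices(frames):
    padded = np.concatenate([frames[:1], frames], axis=0)  # copy first frame
    velocity = padded[1:] - padded[:-1]                    # V_i = M_i - M_{i-1}
    accel = np.zeros_like(velocity)
    accel[:-1] = velocity[1:] - velocity[:-1]              # A_i = V_{i+1} - V_i
    return velocity, accel                                 # last-frame A is 0 (assumed)

v, a = acceleration_matrices(np.array([[[0.0, 0.0, 0.0]],
                                       [[1.0, 0.0, 0.0]],
                                       [[3.0, 0.0, 0.0]]]))
print(a[:, 0, 0])  # per-frame x-acceleration: [1. 1. 0.]
```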
on the basis of obtaining the multidimensional acceleration matrix of each frame of original motion data, the step 112 includes:
step 1121: and calculating the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data according to the accelerations of each joint point in the plurality of dimension directions corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point.
Here, the weighted average value of the accelerations of all the joint points in each dimension direction is calculated according to the accelerations of each joint point in the plurality of dimension directions and the corresponding weight coefficients, so that the motion conditions of all the joint points in each dimension direction are integrated, the difference between different joint points is considered for calculation, and the accuracy of the acceleration parameters is improved.
As mentioned above, other methods may also be adopted, for example, the average value of the accelerations of all the joints in each dimension direction corresponding to each frame of original motion data is directly calculated without considering the difference between different joints.
For example, again taking the i-th frame of original motion data m_i, with each joint point including motion data in the X, Y and Z directions, this step calculates the weighted averages of the accelerations of all n joint points in the X, Y and Z directions:

ā_x = Σ w_l·a_{l,x},  ā_y = Σ w_l·a_{l,y},  ā_z = Σ w_l·a_{l,z}  (l = 1, …, n)

where w_l is the weight coefficient of the l-th joint point and a_{l,x}, a_{l,y}, a_{l,z} are the accelerations of the l-th joint point in the X, Y and Z directions respectively.
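The weighted average above is a single matrix-vector product. A minimal sketch, assuming the per-frame acceleration matrix and the weight vector are NumPy arrays:

```python
import numpy as np

# Sketch of step 1121: accel is the (n_joints, 3) acceleration matrix of one
# frame; weights is the (n_joints,) vector of per-joint weight coefficients
# (assumed to sum to 1). weights @ accel yields (a_x, a_y, a_z), the weighted
# average acceleration of all joints in each dimension direction.
def weighted_mean_acceleration(accel, weights):
    return weights @ accel

accel = np.array([[1.0, 2.0, 3.0],
                  [3.0, 4.0, 5.0]])
weights = np.array([0.25, 0.75])
print(weighted_mean_acceleration(accel, weights))  # [2.5 3.5 4.5]
```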
Step 1122: and determining the acceleration parameter of each frame of original motion data according to the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data.
The determined acceleration parameters can reflect the motion condition of each frame of original motion data more comprehensively based on the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data, so that the accuracy of selecting the key frame is improved.
Preferably, step 1122 includes:

step 11221: calculating the sum of squares of the weighted average values in the plurality of dimension directions corresponding to each frame of original motion data, and taking the sum of squares as the acceleration parameter of each frame of original motion data.
For example, again taking the i-th frame of original motion data m_i, with each joint point including motion data in the X, Y and Z directions, this step calculates the sum of squares of the average values in the X, Y and Z directions as the final acceleration parameter, formulated as follows:

a_i = ā_x² + ā_y² + ā_z²
At this time, the integrated acceleration parameter a_i is used to judge whether m_i should be taken as a key frame. Specifically, step 12 judges whether a_i is greater than a preset acceleration threshold a_set; if a_i > a_set, the i-th frame m_i is determined to be a key frame and may be added to the initial key frame list K.
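Steps 1122 and 12 together reduce to a sum of squares and a threshold comparison. A minimal sketch with illustrative names (`is_key_frame`, and `a_set` passed as a plain float):

```python
# Sketch of steps 1122 and 12: combine the weighted average accelerations in
# each dimension direction into one scalar acceleration parameter (a sum of
# squares) and compare it against the preset acceleration threshold a_set.
def is_key_frame(mean_accel, a_set):
    param = sum(a * a for a in mean_accel)  # a_x^2 + a_y^2 + a_z^2
    return param > a_set

print(is_key_frame((1.0, 2.0, 2.0), a_set=8.0))  # 1 + 4 + 4 = 9 > 8 -> True
```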
Of course, the above manner of using the sum of squares as the acceleration parameter is only a preferred implementation. The embodiment of the present invention may also determine the acceleration parameter in other manners, for example, by calculating the sum of squares of the weighted averages of all the joint accelerations in the multiple dimension directions and then taking its square root as the acceleration parameter, which is not detailed here.
After the key frame is selected from the original motion sequence through the foregoing process, in order to ensure the consistency of the motion, preferably, after the original motion data with the acceleration parameter greater than the preset acceleration threshold is selected as the motion data of the key frame in step 12, the method further includes:
step 13: and judging whether the distance between any two adjacent key frames in the motion data of all the key frames is greater than a preset distance threshold value.
Step 14: and when the distance between two adjacent key frames is greater than the preset distance threshold, selecting original motion data, the distance between the original motion data and the key frame of the previous frame of the two adjacent key frames is the preset distance threshold, and supplementing the original motion data into the motion data of the key frame.
At the moment, when the distance between any two adjacent key frames is greater than the preset distance threshold, one frame of original motion data which is in the original motion sequence and is away from the previous frame of key frame by the preset distance threshold is supplemented into key frame data, so that the phenomenon that the distance between two adjacent key frames is too large is avoided, the accuracy of key frame selection is further improved, and the continuous and natural action is ensured.
Here, after step 14, it is continuously determined whether the distance between the supplemented target key frame and the key frame of the next frame adjacent to the target key frame is greater than the preset distance threshold through step 13, and if so, the original motion data whose distance from the target key frame is the preset distance threshold is continuously selected from the original motion sequence and supplemented as the motion data of the key frame.
At this time, a plurality of frames of key frames may be supplemented between two adjacent frames of key frames, so as to ensure that the distance between any two adjacent frames of key frames is not too large, and ensure the continuity and naturalness of the action.
The preset distance threshold can be set to any value as required. For example, if the preset distance threshold is 6 frames, the distance between any two adjacent key frames cannot exceed 6 frames; if it does, the frame of motion data that is 6 frames away from the previous key frame is supplemented as a key frame.
A specific application flow of supplementing the key frame according to the embodiment of the present invention is illustrated as follows.
Assuming that after key frames are selected from the original motion sequence through steps 11-12, an initial key frame list K is obtained:

K = {k_1, k_2, k_3, …}

K is an ordered list, and k_1, k_2, k_3 are the frame indices of the key frames in K. The values of k_1, k_2, k_3, etc. may not be consecutive. For example, assume the 1st key frame is the 3rd frame of original motion data m_3 in the original motion sequence, the 2nd key frame is the 5th frame m_5, and the 3rd key frame is the 8th frame m_8; then the values of k_1, k_2 and k_3 are 3, 5 and 8 in sequence.
As shown in fig. 4, the supplementary key frame specifically includes:
S41: Start.
S42: Set o = 0 and q = q_set, where q_set is the preset distance threshold and o is the index of the current key frame in the initial key frame list K.
S43: Set o = o + 1.
S44: Set S = k_o.
S45: and judging whether o +1 is larger than len (K), if so, jumping to S410, and otherwise, jumping to S46.
Here, it is judged whether o + 1 is greater than the total number of key frames len(K) in the list K. If so, the key frame currently being judged is the last one in K, so supplementation of all widely separated adjacent key frames in K is complete and the flow jumps to S410 to end; otherwise the flow jumps to S46 to continue.
S46: Set E = k_{o+1}.
S47: and judging whether the E-S is larger than q, if so, jumping to S48, and otherwise, returning to S43.
Here, it is judged whether the distance between the next key frame and the previous key frame is greater than q, which equals the preset distance threshold. If so, the flow jumps to S48 to supplement a key frame; otherwise it returns to S43 to continue judging with the next key frame in K.
S48: and supplementing the S + q frame original motion data in the original motion sequence into the motion data of the key frame.
Here, assume o = 3 and S = k_3 = 8, with q = q_set = 6. Then the (S + q) = 14th frame of original motion data m_14 in the original motion sequence is supplemented as a key frame.
S49: Set S = S + q, and then return to S45.
Here S is set to the supplemented key frame, and the flow returns to S45 to judge whether supplementation is complete. If not, S46-S47 continue to judge whether the distance between the supplemented key frame and its next adjacent key frame is greater than q, supplementing another key frame if so and otherwise moving on to the next key frame in the list K. This repeats until all widely separated adjacent key frames in the list K have been supplemented.
S410: End.
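The S41-S410 flow can be sketched as a single loop over the initial key frame list. The function name and the use of 0-based Python indexing (rather than the patent's 1-based k_o) are my choices; the example input extends the patent's 3, 5, 8 list with a hypothetical fourth key frame at index 20 so that a supplementation actually occurs.

```python
# Sketch of the S41-S410 flow: K holds the frame indices of the selected key
# frames in order; q_set is the preset distance threshold. Whenever the gap
# between the current key frame S and the next key frame E exceeds q_set, the
# frame at S + q_set is supplemented, and checking resumes from that new frame.
def supplement_key_frames(K, q_set):
    result = [K[0]]
    o, S = 0, K[0]
    while o + 1 < len(K):          # S45: any next key frame left to check?
        E = K[o + 1]               # S46
        if E - S > q_set:          # S47: gap too large?
            S = S + q_set          # S48/S49: supplement frame S + q_set
            result.append(S)
        else:
            o += 1                 # S43/S44: advance to the next key frame
            S = E
            result.append(S)
    return result

print(supplement_key_frames([3, 5, 8, 20], q_set=6))  # [3, 5, 8, 14, 20]
```

With q_set = 6 this reproduces the worked example above: the gap from key frame 8 to key frame 20 exceeds 6, so frame m_14 is supplemented, after which the remaining gap (14 to 20) is exactly 6 and no further supplementation is needed.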
At the moment, the distance between any two adjacent key frames can not be overlarge by supplementing the key frames, so that the accuracy of selecting the key frames is further improved, and the continuity and naturalness of actions are ensured.
According to the data processing method above, key data that strongly influences the whole action process is automatically selected as key frames by means of the acceleration parameter. This enables automatic batch selection of key frames without manually checking every frame of data, which improves the efficiency of key frame selection. Because the selection is driven by the acceleration parameter rather than human judgment, misjudgments caused by differences in personal skill, visual fatigue and similar problems are also avoided, which improves the accuracy of key frame selection.
Referring to fig. 5, an embodiment of the present invention further provides a data processing apparatus 500, including:
the first calculating module 501 is configured to calculate an acceleration parameter of each frame of original motion data according to motion data of a plurality of joint points included in each frame of original motion data in an original motion sequence; wherein the original motion sequence comprises a plurality of frames of original motion data;
a key frame selecting module 502, configured to respectively determine whether an acceleration parameter of each frame of original motion data is greater than a preset acceleration threshold, and select, from the original motion sequence, an original motion data whose acceleration parameter is greater than the preset acceleration threshold as a motion data of a key frame.
According to the data processing apparatus 500, key data that strongly influences the whole action process is automatically selected as key frames by means of the acceleration parameter. This enables automatic batch selection of key frames without manually checking every frame of data, which improves the efficiency of key frame selection. Because the selection is driven by the acceleration parameter rather than human judgment, misjudgments caused by differences in personal skill, visual fatigue and similar problems are also avoided, which improves the accuracy of key frame selection.
Preferably, the first calculating module 501 includes:
the first calculation submodule is used for calculating the acceleration of each joint point corresponding to each frame of original motion data according to the motion data of a plurality of joint points contained in each frame of original motion data in the original motion sequence;
and the second calculation submodule is used for calculating the weighted average value of the accelerations of all the joint points corresponding to each frame of original motion data according to the acceleration of each joint point corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point, and determining the acceleration parameter of each frame of original motion data according to the average value.
Preferably, the first calculation submodule includes:
the first processing unit is used for subtracting the motion data of each joint point of each frame of original motion data from the motion data of the joint point corresponding to the previous frame of original motion data of each frame of original motion data to obtain the motion speed of each joint point of each frame of original motion data;
and the second processing unit is used for subtracting the movement speed of the joint point corresponding to each frame of original movement data from the movement speed of each joint point of the next frame of original movement data of each frame of original movement data, so as to obtain the acceleration of each joint point of each frame of original movement data.
Preferably, the apparatus further comprises:
and the copying module is used for copying the first frame of original motion data in the original motion sequence into the previous frame of original motion data of the first frame of original motion data.
Preferably, the motion data of each joint point of each frame of original motion data respectively comprises motion data of a plurality of dimensional directions;
the first computation submodule includes:
the first calculation unit is used for calculating the acceleration of each joint point corresponding to each frame of original motion data in a plurality of dimension directions according to the motion data of the plurality of joint points contained in each frame of original motion data;
the second calculation submodule includes:
the second calculation unit is used for calculating the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data according to the accelerations of all the joint points in the dimension directions corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point;
and the first determining unit is used for determining the acceleration parameter of each frame of original motion data according to the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data.
Preferably, the first determining unit is specifically configured to:
and calculating the square sum of the average values in a plurality of dimensional directions corresponding to each frame of original motion data, and taking the square sum as the acceleration parameter of each frame of original motion data.
Preferably, the first calculating unit is specifically configured to:
respectively converting the motion data of a plurality of joint points contained in each frame of original motion data into a multi-dimensional motion data matrix; and acquiring a multidimensional acceleration matrix of each frame of original motion data according to the multidimensional motion data matrix of each frame of original motion data, wherein the multidimensional acceleration matrix comprises the acceleration of each joint point in a plurality of dimensional directions.
Preferably, the sum of the weight coefficients corresponding to all the joint points is 1.
Preferably, the apparatus further comprises:
the judging module is used for judging whether the distance between any two adjacent key frames in all the key frame motion data is greater than a preset distance threshold value or not;
and the supplementing module is used for selecting original motion data, the distance between the original motion data and a key frame of the previous frame of the two adjacent key frames is the preset distance threshold value, and supplementing the original motion data into the motion data of the key frame in the original motion sequence when the distance between the two adjacent key frames is larger than the preset distance threshold value.
For the above device embodiments, since they are basically similar to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points.
The embodiment of the invention also provides an electronic device, which may be a mobile terminal. As shown in fig. 6, the electronic device comprises a processor 601, a communication interface 602, a memory 603 and a communication bus 604, where the processor 601, the communication interface 602 and the memory 603 communicate with each other through the communication bus 604.
A memory 603 for storing a computer program.
When the processor 601 is used to execute the program stored in the memory 603, the following steps are implemented:
calculating an acceleration parameter of each frame of original motion data according to motion data of a plurality of joint points respectively contained in each frame of original motion data in an original motion sequence; wherein the original motion sequence comprises a plurality of frames of original motion data;
respectively judging whether the acceleration parameter of each frame of original motion data is larger than a preset acceleration threshold value, and selecting the original motion data of which the acceleration parameter is larger than the preset acceleration threshold value from the original motion sequence as the motion data of the key frame.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a non-volatile Memory (non-volatile Memory), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In yet another embodiment provided by the present invention, a computer-readable storage medium is further provided, which stores instructions that, when executed on a computer, cause the computer to execute the data processing method described in the above embodiment.
In a further embodiment of the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform the data processing method described in the above embodiment.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When loaded and executed on a computer, cause the processes or functions described in accordance with the embodiments of the invention to occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website site, computer, server, or data center to another website site, computer, server, or data center via wired (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more of the available media. The usable medium may be a magnetic medium (e.g., floppy Disk, hard Disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present invention are included in the protection scope of the present invention.

Claims (10)

1. A data processing method, comprising:
calculating an acceleration parameter of each frame of original motion data according to motion data of a plurality of joint points respectively contained in each frame of original motion data in an original motion sequence; wherein the original motion sequence comprises a plurality of frames of original motion data;
respectively judging whether the acceleration parameter of each frame of original motion data is larger than a preset acceleration threshold value, and selecting the original motion data of which the acceleration parameter is larger than the preset acceleration threshold value from the original motion sequence as the motion data of the key frame.
2. The data processing method of claim 1, wherein the step of calculating the acceleration parameter of each frame of original motion data according to the motion data of a plurality of joint points respectively contained in each frame of original motion data in the original motion sequence comprises:
calculating the acceleration of each joint point corresponding to each frame of original motion data according to the motion data of a plurality of joint points contained in each frame of original motion data in the original motion sequence;
calculating the weighted average value of the accelerations of all the joint points corresponding to each frame of original motion data according to the acceleration of each joint point corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point, and determining the acceleration parameter of each frame of original motion data according to the average value.
3. The data processing method according to claim 2, wherein the step of calculating the acceleration of each joint point corresponding to each frame of original motion data according to the motion data of a plurality of joint points included in each frame of original motion data in the original motion sequence comprises:
subtracting the motion data of each joint point of each frame of original motion data from the motion data of the joint point corresponding to the previous frame of original motion data of each frame of original motion data to obtain the motion speed of each joint point of each frame of original motion data;
and subtracting the movement speed of the joint point corresponding to each frame of original movement data from the movement speed of each joint point of the next frame of original movement data of each frame of original movement data, so as to obtain the acceleration of each joint point of each frame of original movement data.
4. The data processing method of claim 3, wherein before the motion data of each joint point of each frame of raw motion data is subtracted from the motion data of the joint point corresponding to the previous frame of raw motion data of each frame of raw motion data to obtain the motion speed of each joint point of each frame of raw motion data, the method further comprises:
and copying the first frame of original motion data in the original motion sequence into the previous frame of original motion data of the first frame of original motion data.
5. The data processing method of claim 2, wherein the motion data of each joint point of each frame of original motion data respectively comprises motion data of a plurality of dimensional directions;
the step of calculating the acceleration of each joint point corresponding to each frame of original motion data according to the motion data of a plurality of joint points contained in each frame of original motion data in the original motion sequence comprises the following steps:
calculating the acceleration of each joint point corresponding to each frame of original motion data in a plurality of dimension directions according to the motion data of the plurality of joint points contained in each frame of original motion data;
the step of calculating the weighted average value of the accelerations of all the joint points corresponding to each frame of original motion data according to the acceleration of each joint point corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point, and determining the acceleration parameter of each frame of original motion data according to the average value comprises the following steps:
calculating the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data according to the accelerations of each joint point in the plurality of dimension directions corresponding to each frame of original motion data and the weight coefficient corresponding to each joint point;
and determining the acceleration parameter of each frame of original motion data according to the weighted average value of the accelerations of all the joint points in each dimension direction corresponding to each frame of original motion data.
6. The data processing method of claim 5, wherein the step of determining the acceleration parameter of each frame of raw motion data according to the weighted average of the accelerations of all the joints in each dimension direction corresponding to each frame of raw motion data comprises:
and calculating the square sum of the average values in a plurality of dimensional directions corresponding to each frame of original motion data, and taking the square sum as the acceleration parameter of each frame of original motion data.
7. The data processing method according to claim 5, wherein the step of calculating, based on the motion data of a plurality of joint points included in each frame of original motion data, the acceleration of each joint point corresponding to each frame of original motion data in a plurality of dimensional directions comprises:
respectively converting the motion data of a plurality of joint points contained in each frame of original motion data into a multi-dimensional motion data matrix;
and acquiring a multidimensional acceleration matrix of each frame of original motion data according to the multidimensional motion data matrix of each frame of original motion data, wherein the multidimensional acceleration matrix comprises the acceleration of each joint point in a plurality of dimensional directions.
8. The data processing method according to claim 2, wherein the sum of the weight coefficients corresponding to all the joint points is 1.
9. The data processing method according to claim 1, wherein after selecting the original motion data with the acceleration parameter greater than the preset acceleration threshold as the motion data of the key frame, the method further comprises:
judging whether the distance between any two adjacent key frames in all the key frame motion data is greater than a preset distance threshold value or not;
and when the distance between two adjacent key frames is greater than the preset distance threshold, selecting original motion data, the distance between the original motion data and the key frame of the previous frame of the two adjacent key frames is the preset distance threshold, and supplementing the original motion data into the motion data of the key frame.
10. A data processing apparatus, comprising:
the first calculation module is used for calculating the acceleration parameter of each frame of original motion data according to the motion data of a plurality of joint points contained in each frame of original motion data in the original motion sequence; wherein the original motion sequence comprises a plurality of frames of original motion data;
and the key frame selection module is used for respectively judging whether the acceleration parameter of each frame of original motion data is greater than a preset acceleration threshold value, and selecting the original motion data of which the acceleration parameter is greater than the preset acceleration threshold value from the original motion sequence as the motion data of the key frame.
CN202011166428.5A 2020-10-27 2020-10-27 Data processing method and device Pending CN112288838A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011166428.5A CN112288838A (en) 2020-10-27 2020-10-27 Data processing method and device

Publications (1)

Publication Number Publication Date
CN112288838A true CN112288838A (en) 2021-01-29

Family

ID=74372263

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011166428.5A Pending CN112288838A (en) 2020-10-27 2020-10-27 Data processing method and device

Country Status (1)

Country Link
CN (1) CN112288838A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103533237A (en) * 2013-09-29 2014-01-22 清华大学 Method for extracting video key frame from video
CN105912985A (en) * 2016-04-01 2016-08-31 上海理工大学 Human skeleton joint point behavior motion expression method based on energy function
CN106780681A (en) * 2016-12-01 2017-05-31 北京像素软件科技股份有限公司 A kind of role action generation method and device
CN109190474A (en) * 2018-08-01 2019-01-11 南昌大学 Human body animation extraction method of key frame based on posture conspicuousness
JP2019102025A (en) * 2017-12-08 2019-06-24 株式会社スクウェア・エニックス Animation data compression program, animation data restoration program, animation data compression device, and animation data compression method
CN111681303A (en) * 2020-06-10 2020-09-18 北京中科深智科技有限公司 Method and system for extracting key frame from captured data and reconstructing motion

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114248266A (en) * 2021-09-17 2022-03-29 之江实验室 Anthropomorphic action track generation method and device for double-arm robot and electronic equipment
CN114248266B (en) * 2021-09-17 2024-03-26 之江实验室 Anthropomorphic action track generation method and device of double-arm robot and electronic equipment

Similar Documents

Publication Publication Date Title
CN111163338A (en) Video definition evaluation model training method, video recommendation method and related device
CN110910322B (en) Picture processing method and device, electronic equipment and computer readable storage medium
CN111985414B (en) Joint position determining method and device
CN113421242B (en) Welding spot appearance quality detection method and device based on deep learning and terminal
CN111915567A (en) Image quality evaluation method, device, equipment and medium
CN112562072A (en) Action redirection method, device, equipment and storage medium
CN111932451A (en) Method and device for evaluating repositioning effect, electronic equipment and storage medium
CN110969100A (en) Human body key point identification method and device and electronic equipment
CN111031359B (en) Video playing method and device, electronic equipment and computer readable storage medium
CN112288838A (en) Data processing method and device
CN107844593B (en) Video data distribution method and device in distributed computing platform
CN114298923A (en) Lens evaluation and image restoration method for machine vision measurement system
CN108520532B (en) Method and device for identifying motion direction of object in video
CN110472143A (en) A kind of information-pushing method, device, readable storage medium storing program for executing and terminal device
CN110366029B (en) Method and system for inserting image frame between videos and electronic equipment
CN110956131B (en) Single-target tracking method, device and system
CN111325832A (en) Modeling method, modeling device and electronic equipment
CN112258609A (en) Data matching method and device, electronic equipment and storage medium
CN114820755B (en) Depth map estimation method and system
CN111353597B (en) Target detection neural network training method and device
CN113014928B (en) Compensation frame generation method and device
CN115063713A (en) Training method of video generation model, video generation method and device, electronic equipment and readable storage medium
CN113420604A (en) Multi-person posture estimation method and device and electronic equipment
CN112686977B (en) Human model action redirection method, device, electronic equipment and storage medium
CN114254757A (en) Distributed deep learning method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination