CN110188688B - Posture evaluation method and device - Google Patents

Posture evaluation method and device

Info

Publication number
CN110188688B
Authority
CN
China
Prior art keywords
matrix
coordinates
posture information
posture
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910463521.3A
Other languages
Chinese (zh)
Other versions
CN110188688A (en)
Inventor
袁燚
徐榆
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201910463521.3A
Publication of CN110188688A
Application granted
Publication of CN110188688B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/60 Rotation of whole images or parts thereof
    • G06T 3/604 Rotation of whole images or parts thereof using coordinate rotation digital computer [CORDIC] devices
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition
    • G06V 40/23 Recognition of whole body movements, e.g. for sport training

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention provides a posture assessment method and a posture assessment device, wherein the method comprises the following steps: acquiring first posture information of a first object in a first image and second posture information of a second object in a second image, wherein the first posture information comprises coordinates of at least two parts of the first object, and the second posture information comprises coordinates of at least two parts of the second object; performing scaling processing and/or rotation processing on the coordinates in the first posture information to obtain third posture information, wherein the size of the object indicated by the third posture information is the same as that of the second object, and the at least two parts of the object indicated by the third posture information correspond to the at least two parts of the second object; and determining the posture matching degree of the first object and the second object according to the third posture information and the second posture information. This improves the accuracy of determining the posture matching degree.

Description

Posture evaluation method and device
Technical Field
The embodiment of the invention relates to the field of image processing, in particular to a posture assessment method and device.
Background
A motion posture estimation method may be used to estimate the similarity between the postures of the persons in two images.
Currently, an action posture estimation method includes: acquiring coordinate matrixes of human key points of people in the two images, determining Euclidean distances between coordinates of the human key points at corresponding positions according to the coordinate matrixes of the human key points in the two images, and determining the similarity of the postures of the people in the two images according to the Euclidean distances.
In the above process, when the body types of the persons in the two images differ greatly even though their postures are similar, determining the Euclidean distance as the similarity of the postures yields a low posture similarity, so the accuracy of the posture similarity is poor.
Disclosure of Invention
The embodiment of the invention provides a posture assessment method and a posture assessment device, which are used for improving the accuracy of determining the posture matching degree.
In a first aspect, an embodiment of the present invention provides a posture assessment method, including:
acquiring first posture information of a first object in a first image and second posture information of a second object in a second image, wherein the first posture information comprises coordinates of at least two parts of the first object, and the second posture information comprises coordinates of at least two parts of the second object;
performing scaling processing and/or rotation processing on the coordinates in the first posture information to obtain third posture information, wherein the size of the object indicated by the third posture information is the same as that of the second object, and at least two parts of the object indicated by the third posture information correspond to at least two parts of the second object;
and determining the posture matching degree of the first object and the second object according to the third posture information and the second posture information.
In one possible embodiment, scaling and/or rotating the coordinates in the first pose information to obtain third pose information includes:
determining an adjustment parameter according to the first posture information and the second posture information, wherein the adjustment parameter comprises a scaling parameter and/or a rotation angle;
and determining the third posture information according to the first posture information and the adjusting parameter.
In another possible embodiment, determining an adjustment parameter based on the first posture information and the second posture information includes:
acquiring a first matrix corresponding to the first posture information, wherein the first matrix comprises coordinates of at least two parts of the first object, and the coordinates of the at least two parts of the first object are stored in the first matrix according to a preset rule;
acquiring a second matrix corresponding to the second posture information, wherein the second matrix comprises coordinates of at least two parts of the second object, and the coordinates of the at least two parts of the second object are stored in the second matrix according to the preset rule;
and determining the adjustment parameter according to the Euclidean distance between the first matrix and the second matrix.
In another possible embodiment, determining the adjustment parameter according to the Euclidean distance between the first matrix and the second matrix includes:
performing a matrix processing operation, the matrix processing operation comprising: obtaining a third matrix according to the initial parameters and the first matrix;
performing a parameter update operation, the parameter update operation including updating the initial parameters according to the Euclidean distance between the third matrix and the second matrix;
and repeatedly executing the matrix processing operation and the parameter updating operation until the Euclidean distance between the third matrix and the second matrix is smaller than or equal to a distance threshold value, or the number of times of executing the matrix processing operation is larger than or equal to a preset number of times, and determining the initial parameters as the adjustment parameter.
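The repeated matrix-processing and parameter-update operations described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes numpy, a scale-plus-rotation parameterization of the adjustment parameter, and a numerical-gradient update rule; the learning rate, distance threshold, and iteration budget are illustrative values.

```python
import numpy as np

def apply_params(P, s, theta):
    """Matrix processing operation: scale Nx2 coordinates by s, rotate by theta (radians)."""
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return s * P @ rot.T

def fit_adjustment(P1, P2, dist_threshold=1e-3, max_iters=2000, lr=0.05):
    """Repeat the matrix processing and parameter update operations until the
    Euclidean distance between the third matrix and the second matrix is at most
    the threshold, or the operation count reaches the preset number of times."""
    s, theta = 1.0, 0.0                      # initial parameters
    for _ in range(max_iters):
        P3 = apply_params(P1, s, theta)      # third matrix
        dist = np.linalg.norm(P3 - P2)       # Euclidean (Frobenius) distance
        if dist <= dist_threshold:
            break
        # parameter update: numerical-gradient descent step on (s, theta)
        eps = 1e-6
        g_s = (np.linalg.norm(apply_params(P1, s + eps, theta) - P2) - dist) / eps
        g_t = (np.linalg.norm(apply_params(P1, s, theta + eps) - P2) - dist) / eps
        s, theta = s - lr * g_s, theta - lr * g_t
    return s, theta
```

Given keypoint matrices P1 and P2 for the two objects, the returned (s, theta) would play the role of the adjustment parameter that is finally applied to the first matrix.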
In another possible implementation, determining the third posture information according to the first posture information and the adjustment parameter includes:
and multiplying the adjustment parameter by the coordinate in the first posture information to obtain the third posture information.
In another possible implementation, the determining the degree of gesture matching between the first object and the second object according to the third gesture information and the second gesture information includes:
acquiring a second matrix corresponding to the second posture information;
acquiring a fourth matrix corresponding to the third posture information;
and determining the gesture matching degree of the first object and the second object according to the second matrix and the fourth matrix.
In another possible embodiment, the determining the gesture matching degree of the first object and the second object according to the second matrix and the fourth matrix includes:
acquiring Euclidean distances of coordinates of corresponding positions in the second matrix and the fourth matrix;
and determining the gesture matching degree of the first object and the second object according to the Euclidean distance of the coordinates of the corresponding positions.
In another possible embodiment, determining the gesture matching degree of the first object and the second object according to the Euclidean distance of the coordinates of the corresponding positions includes:
determining an average value of Euclidean distances of corresponding coordinates in the second matrix and the fourth matrix according to the Euclidean distances of the corresponding coordinates;
and determining the absolute value of the difference between the preset threshold and the average value as the posture matching degree of the first object and the second object.
In a second aspect, an embodiment of the present invention provides a posture assessment apparatus, including: an acquisition module, a processing module and a determination module, wherein,
the acquiring module is configured to acquire first pose information of a first object in a first image and second pose information of a second object in a second image, where the first pose information includes coordinates of at least two parts of the first object, and the second pose information includes coordinates of at least two parts of the second object;
the processing module is configured to perform scaling processing and/or rotation processing on the coordinates in the first pose information to obtain third pose information, where a size of an object indicated by the third pose information is the same as a size of the second object, and at least two parts of the object indicated by the third pose information correspond to at least two parts of the second object;
the determining module is configured to determine a gesture matching degree of the first object and the second object according to the third gesture information and the second gesture information.
In a possible implementation, the processing module is specifically configured to:
determining an adjustment parameter according to the first posture information and the second posture information, wherein the adjustment parameter comprises a scaling parameter and/or a rotation angle;
and determining the third posture information according to the first posture information and the adjusting parameter.
In another possible implementation manner, the processing module is specifically configured to:
acquiring a first matrix corresponding to the first posture information, wherein the first matrix comprises coordinates of at least two parts of the first object, and the coordinates of the at least two parts of the first object are stored in the first matrix according to a preset rule;
acquiring a second matrix corresponding to the second posture information, wherein the second matrix comprises coordinates of at least two parts of the second object, and the coordinates of the at least two parts of the second object are stored in the second matrix according to the preset rule;
and determining the adjustment parameter according to the Euclidean distance between the first matrix and the second matrix.
In another possible implementation manner, the processing module is specifically configured to:
performing a matrix processing operation, the matrix processing operation comprising: obtaining a third matrix according to the initial parameters and the first matrix;
performing a parameter update operation, the parameter update operation including updating the initial parameters according to the Euclidean distance between the third matrix and the second matrix;
and repeatedly executing the matrix processing operation and the parameter updating operation until the Euclidean distance between the third matrix and the second matrix is smaller than or equal to a distance threshold value, or the number of times of executing the matrix processing operation is larger than or equal to a preset number of times, and determining the initial parameters as the adjustment parameter.
In another possible implementation manner, the determining module is specifically configured to:
and multiplying the adjustment parameter by the coordinate in the first posture information to obtain the third posture information.
In another possible implementation manner, the determining module is specifically configured to:
acquiring a second matrix corresponding to the second posture information;
acquiring a fourth matrix corresponding to the third posture information;
and determining the gesture matching degree of the first object and the second object according to the second matrix and the fourth matrix.
In another possible implementation manner, the determining module is specifically configured to:
acquiring Euclidean distances of coordinates of corresponding positions in the second matrix and the fourth matrix;
and determining the posture matching degree of the first object and the second object according to the Euclidean distance of the coordinates of the corresponding positions.
In another possible implementation manner, the determining module is specifically configured to:
determining a Euclidean distance average value according to the Euclidean distances of the coordinates of the corresponding positions;
and determining the absolute value of the difference between the preset threshold and the Euclidean distance average value as the posture matching degree of the first object and the second object.
In a third aspect, an embodiment of the present invention provides a posture assessment apparatus, including: a processor coupled with a memory;
the memory is used for storing a computer program;
the processor is configured to execute the computer program stored in the memory to cause the posture estimation apparatus to perform the posture estimation method according to any one of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a readable storage medium, which includes a program or instructions, and when the program or instructions are run on a computer, the posture estimation method according to any one of the above first aspects is performed.
The method obtains first posture information of a first object in a first image and second posture information of a second object in a second image, the first posture information comprises coordinates of at least two parts of the first object, the second posture information comprises coordinates of at least two parts of the second object, scaling processing and/or rotation processing are carried out on the coordinates in the first posture information to obtain third posture information, the size of the object indicated by the third posture information is the same as that of the second object, the at least two parts of the object indicated by the third posture information correspond to the at least two parts of the second object, and the posture matching degree of the first object and the second object is determined according to the third posture information and the second posture information. In the above process, the coordinates in the first posture information are scaled and/or rotated to obtain third posture information, and the posture matching degrees of the first object and the second object are determined according to the third posture information and the second posture information, so that when the postures of the objects in the two images are similar to each other, a high posture matching degree is obtained, and the accuracy of the determined posture matching degree is high.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is an application scenario diagram of a posture estimation method provided in an embodiment of the present invention;
FIG. 2 is a first flowchart illustrating a posture estimation method according to an embodiment of the present invention;
FIG. 3 is a second flowchart illustrating a posture estimation method according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a posture estimation device provided in an embodiment of the present invention;
fig. 5 is a schematic diagram of a hardware structure of a posture estimation apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is an application scenario diagram of a posture estimation method according to an embodiment of the present invention. Referring to fig. 1, a first object 110 is included in the first image 11, and a second object 120 is included in the second image 12, wherein the first object 110 and the second object 120 may have the same or different gestures. In practical applications, when the gesture matching degrees of the first object 110 and the second object 120 need to be evaluated, the first object 110 may be scaled and rotated to obtain the first object 130, and the gesture matching degrees of the first object 130 and the second object 120 may be determined as the gesture matching degrees of the first object 110 and the second object 120.
In the above process, the first object 110 is scaled and rotated, so that the obtained first object 130 and the second object 120 have the same size, and the limb part of the first object 130 corresponds to the limb part of the second object 120 (for example, the head of the first object 130 is at the highest position of the first image 11, the head of the second object 120 is at the highest position of the second image 12, the left hand of the first object 130 is at the leftmost position of the first image 11, and the left hand of the second object 120 is at the leftmost position of the second image 12), so that the gesture matching degree of the first object 130 and the second object 120 is determined as the gesture matching degree of the first object 110 and the second object 120, and the accuracy of the obtained gesture matching degree can be improved.
It should be noted that the posture estimation method disclosed in the present application can be applied to an electronic device, such as a dancing machine, and the dancing machine can score the dancing posture of the user during the dancing process of the user.
In practical application, during the user's dance, a first image of the user is acquired; the first image includes a dance posture image showing the user's dance posture. The second image may be a standard image preset in the dancing machine, which shows a standard dance posture. The dancing machine can determine the matching degree between the user's dance posture and the standard dance posture through the posture evaluation method provided by the application, and can then score the user's dance posture according to the matching degree, which improves the scoring accuracy.
The technical means shown in the present application will be described in detail below with reference to specific examples. It should be noted that the following embodiments may be combined with each other, and the description of the same or similar contents in different embodiments is not repeated.
Fig. 2 is a first flowchart illustrating a posture estimation method according to an embodiment of the present invention. Referring to fig. 2, the posture estimation method includes:
s201: first pose information of a first object in a first image and second pose information of a second object in a second image are acquired, the first pose information including coordinates of at least two parts of the first object, and the second pose information including coordinates of at least two parts of the second object.
Optionally, the execution subject of the embodiment of the present invention may be a posture evaluation device. Alternatively, the posture estimation device may be implemented by a combination of software and/or hardware.
Alternatively, the first object and the second object may be the same person or different persons, and the persons in the two images may have the same or different postures.
It is noted that the at least two parts of the first object and the at least two parts of the second object may be at least two of: the head, shoulders, left arm, right arm, torso, left leg and right leg.
Optionally, at least one keypoint may be included on each site. For example, the head may include key points for the left eye, right eye, nose, etc., and the left arm may include key points for the left hand, left elbow, etc.
In practical applications, the coordinates of the at least two parts in the first pose information are the coordinates of the keypoints on each part of the first object in the first image, and the coordinates of the at least two parts in the second pose information are the coordinates of the keypoints on each part of the second object in the second image.
Optionally, the coordinates may be two-dimensional coordinates or three-dimensional coordinates.
Alternatively, the first image and the second image may be the same size or different sizes.
When the sizes of the first image and the second image are different, the first image and/or the second image need to be cropped so that the two images have the same size.
It should be noted that, the key point detection network may be used to perform key point detection processing on the first object in the first image to obtain coordinates of key points on each part of the first object in the first image, and perform key point detection processing on the second object in the second image to obtain coordinates of key points on each part of the second object in the second image.
Optionally, the first posture information and the second posture information may further include identification information of each part.
For example, the first posture information may be represented as

    [ 1He1   (a1, b1)   ]
    [ 1He2   (a'1, b'1) ]
    [ ...    ...        ]

In the first posture information, the elements in the first column identify key points on different parts of the first object, and the elements in the second column represent the coordinates corresponding to those key points.
For example, 1He1 may identify the left-eye key point of the head of the first object, and 1He2 may identify the right-eye key point of the head of the first object; (a1, b1) is the coordinate of the left eye of the first object in the first image, and (a'1, b'1) is the coordinate of the right eye of the first object in the first image.
For example, the second posture information may be represented as

    [ 2He1   (c1, d1)   ]
    [ 2He2   (c'1, d'1) ]
    [ ...    ...        ]

It should be noted that, in the second posture information, the elements in the first column identify key points on different parts of the second object, and the elements in the second column represent the coordinates corresponding to those key points.
For example, 2He1 may identify the left-eye key point of the head of the second object, and 2He2 may identify the right-eye key point of the head of the second object; (c1, d1) is the coordinate of the left eye of the second object in the second image, and (c'1, d'1) is the coordinate of the right eye of the second object in the second image.
S202: and performing scaling processing and/or rotation processing on the coordinates in the first posture information to obtain third posture information, wherein the size of the object indicated by the third posture information is the same as that of the second object, and at least two parts of the object indicated by the third posture information correspond to at least two parts of the second object.
Optionally, a first matrix may be obtained according to the first posture information, a second matrix may be obtained according to the second posture information, an adjustment parameter is determined according to the first matrix and the second matrix, and the adjustment parameter is multiplied by the first matrix, so as to implement scaling and/or rotation processing on the coordinates of the first posture information.
Optionally, the third posture information is obtained according to the coordinates after the scaling processing and/or the rotation processing and the identification of the key point corresponding to the coordinates.
For example, the third posture information may be represented as

    [ 3He1   (A1, B1) ]
    [ 3Ah1   (A2, B2) ]
    [ ...    ...      ]

It should be noted that, in the third posture information, the elements in the first column identify key points on different parts of the object obtained by scaling and/or rotating the first object (referred to as the third object, for example, the first object 130 in fig. 1), and the elements in the second column represent the coordinates corresponding to the key points of the third object.
For example, 3He1 may identify the left-eye key point of the head of the third object, and 3Ah1 may identify the left-hand key point on the arm of the third object; (A1, B1) is the coordinate of the left eye of the third object in the first image, and (A2, B2) is the coordinate of the left hand of the third object in the first image.
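As a concrete sketch of S202, the scaling and rotation can be applied to every coordinate of the first posture information with a single linear map. The function name, scale, and angle below are arbitrary illustrative values, not values from the patent:

```python
import numpy as np

def apply_adjustment(coords, scale, angle_rad):
    """Scale and rotate an Nx2 array of keypoint coordinates in one step."""
    rot = np.array([[np.cos(angle_rad), -np.sin(angle_rad)],
                    [np.sin(angle_rad),  np.cos(angle_rad)]])
    return scale * coords @ rot.T

# Hypothetical first-pose coordinates (e.g. left-eye and right-eye key points).
first_pose = np.array([[2.0, 4.0],
                       [3.0, 4.0]])
# Halve the size without rotating: the relative layout of the key points is
# preserved, so only the object's size (not its posture) changes.
third_pose = apply_adjustment(first_pose, scale=0.5, angle_rad=0.0)
```

Because the map is linear, distances between key points shrink by the same factor everywhere, which is what allows two similar postures of differently sized bodies to be aligned before comparison.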
S203: and determining the posture matching degree of the first object and the second object according to the third posture information and the second posture information.
Optionally, a fourth matrix may be obtained according to the third posture information, a second matrix may be obtained according to the second posture information, and the posture matching degree between the first object and the second object may be determined according to the second matrix and the fourth matrix.
The fourth matrix is determined based on the coordinates in the third posture information, and the second matrix is determined based on the coordinates in the second posture information.
Optionally, after obtaining the Euclidean distances of the coordinates of the corresponding positions in the second matrix and the fourth matrix, the posture matching degree of the first object and the second object is determined according to a difference between a preset threshold and a sum of the Euclidean distances.
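A minimal sketch of this matching-degree computation, assuming the variant in which the mean per-keypoint Euclidean distance is subtracted from a preset threshold; the default threshold value of 100.0 is hypothetical, since the patent leaves it as a preset:

```python
import numpy as np

def pose_match_degree(second_matrix, fourth_matrix, preset_threshold=100.0):
    """Compute per-keypoint Euclidean distances between corresponding rows of
    the two keypoint matrices, then return |preset_threshold - mean distance|
    as the posture matching degree."""
    dists = np.linalg.norm(second_matrix - fourth_matrix, axis=1)
    return abs(preset_threshold - dists.mean())
```

Identical matrices give a mean distance of 0 and hence the maximum matching degree, equal to the preset threshold; larger keypoint discrepancies lower the score.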
In the posture evaluation method provided by the embodiment of the invention, first posture information of a first object in a first image and second posture information of a second object in a second image are acquired, the first posture information comprises coordinates of at least two parts of the first object, the second posture information comprises coordinates of at least two parts of the second object, scaling processing and/or rotation processing are carried out on the coordinates in the first posture information to obtain third posture information, the size of the object indicated by the third posture information is the same as that of the second object, the at least two parts of the object indicated by the third posture information correspond to the at least two parts of the second object, and the posture matching degree of the first object and the second object is determined according to the third posture information and the second posture information. In the above process, the coordinates in the first posture information are scaled and/or rotated to obtain third posture information, and the posture matching degrees of the first object and the second object are determined according to the third posture information and the second posture information, so that when the postures of the objects in the two images are similar to each other, a high posture matching degree is obtained, and the accuracy of the determined posture matching degree is high.
On the basis of any one of the above embodiments, a posture estimation method provided by the embodiment of the present invention is further described in detail below with reference to fig. 3, specifically, refer to fig. 3.
Fig. 3 is a flowchart illustrating a second method for posture assessment according to an embodiment of the present invention. Referring to fig. 3, the posture estimation method includes:
s301: first pose information of a first object in a first image and second pose information of a second object in a second image are acquired, the first pose information including coordinates of at least two parts of the first object, and the second pose information including coordinates of at least two parts of the second object.
It should be noted that the execution process of S301 is the same as the execution process of S201, and here, the execution method of S301 is not described again.
S302: and acquiring a first matrix corresponding to the first posture information, wherein the first matrix comprises coordinates of at least two parts of the first object, and the coordinates of the at least two parts of the first object are stored in the first matrix according to a preset rule.
Optionally, the coordinates of the key points on the at least two portions of the first object may be acquired from the first posture information, and the coordinates of the key points on the at least two portions of the first object may be stored in the first matrix R according to a preset rule.
Optionally, the preset rule may be that the coordinates of the key points of each part are stored in the first matrix R in top-to-bottom, left-to-right order of the parts.
For example, if the at least two parts of the first object are the head and the shoulders, where the head comprises left-eye, right-eye and nose key points and the shoulders comprise left-shoulder and right-shoulder key points, the first matrix R may be

R = [(x_1, y_1), (x_2, y_2), (x_3, y_3), (x_4, y_4), (x_5, y_5)]

wherein the first element (x_1, y_1) is the left-eye key point coordinate of the first object, the second element (x_2, y_2) is the right-eye key point coordinate, the third element (x_3, y_3) is the nose key point coordinate, the fourth element (x_4, y_4) is the left-shoulder key point coordinate, and the fifth element (x_5, y_5) is the right-shoulder key point coordinate.
S303: and acquiring a second matrix corresponding to the second posture information, wherein the second matrix comprises coordinates of at least two parts of the second object, and the coordinates of the at least two parts of the second object are stored in the second matrix according to a preset rule.
Optionally, the second matrix S corresponding to the second posture information may be obtained according to the method of obtaining the first matrix corresponding to the first posture information in S302, and here, the implementation process in S303 is not described again.
It should be noted that the elements at corresponding positions in the second matrix S and the first matrix R represent the same key point type (for example, both first elements are left-eye coordinates), although their coordinate values generally differ.
For example, corresponding to the first matrix R above, the second matrix S may be

S = [(m_1, n_1), (m_2, n_2), (m_3, n_3), (m_4, n_4), (m_5, n_5)]

wherein the first element (m_1, n_1) is the left-eye key point coordinate of the second object, the second element (m_2, n_2) is the right-eye key point coordinate, the third element (m_3, n_3) is the nose key point coordinate, the fourth element (m_4, n_4) is the left-shoulder key point coordinate, and the fifth element (m_5, n_5) is the right-shoulder key point coordinate.
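The matrix construction in S302 and S303 can be sketched in code. This is an illustrative sketch, not part of the patent text: the dictionary keys, the part ordering, and the example coordinates are all assumptions.

```python
# Assumed preset rule: parts ordered top-to-bottom, left-to-right,
# mirroring the head/shoulder example in the text.
KEYPOINT_ORDER = ["left_eye", "right_eye", "nose", "left_shoulder", "right_shoulder"]

def pose_to_matrix(pose):
    """Stack the key point coordinates of a pose into an ordered list of (x, y) rows."""
    return [pose[name] for name in KEYPOINT_ORDER]

# Hypothetical first and second posture information
first_pose = {"left_eye": (1.0, 5.0), "right_eye": (3.0, 5.0), "nose": (2.0, 4.0),
              "left_shoulder": (0.0, 2.0), "right_shoulder": (4.0, 2.0)}
second_pose = {"left_eye": (2.0, 10.0), "right_eye": (6.0, 10.0), "nose": (4.0, 8.0),
               "left_shoulder": (0.0, 4.0), "right_shoulder": (8.0, 4.0)}
R = pose_to_matrix(first_pose)   # first matrix
S = pose_to_matrix(second_pose)  # second matrix
```

Because both matrices use the same preset rule, the i-th rows of R and S always describe the same key point type.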
S304: performing a matrix processing operation, the matrix processing operation comprising: obtaining a third matrix according to the initial parameters and the first matrix.
Optionally, an initial vector may be determined according to the first matrix R and the second matrix S, and an initial parameter may be determined according to the initial vector.
Alternatively, the initial vector may be determined from the first matrix R and the second matrix S by the following feasible formula one:

p = Σ_{i=1}^{s} (x_i·m_i + y_i·n_i) / Σ_{i=1}^{s} (x_i² + y_i²),  q = Σ_{i=1}^{s} (x_i·n_i - y_i·m_i) / Σ_{i=1}^{s} (x_i² + y_i²)    (formula one)

wherein (p, q) is the initial vector, s is the number of elements in the first matrix R (and in the second matrix S), x_i and y_i are the coordinate values of the i-th element in the first matrix R, forming the i-th coordinate (x_i, y_i), and m_i and n_i are the coordinate values of the i-th element in the second matrix S, forming the i-th coordinate (m_i, n_i). This is the least-squares fit of a scaling-rotation transform mapping R onto S.

Optionally, according to the initial vector, the determined initial parameter K may be the 2×2 matrix

K = [[p, -q], [q, p]]

where p = k·cos(θ) and q = k·sin(θ), k being the scaling parameter and θ the rotation angle.
Optionally, the initial parameter K is multiplied by the first matrix R to obtain a third matrix T.
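The application of the initial parameter K to the first matrix R in S304 can be illustrated as follows. This sketch is not from the patent: the 2×2 form K = [[p, -q], [q, p]] is inferred from p = k·cos(θ) and q = k·sin(θ), and the example values are assumptions.

```python
import math

def apply_similarity(R, p, q):
    """Multiply each (x, y) row of R by K = [[p, -q], [q, p]].
    With p = k*cos(theta) and q = k*sin(theta), this scales by k and rotates by theta."""
    return [(p * x - q * y, q * x + p * y) for (x, y) in R]

# Example: scale by k = 2 and rotate by 90 degrees
k, theta = 2.0, math.pi / 2
p, q = k * math.cos(theta), k * math.sin(theta)
T = apply_similarity([(1.0, 0.0)], p, q)  # third matrix: (1, 0) maps to approximately (0, 2)
```

The same row-by-row multiplication applies unchanged to a full five-key-point matrix.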
S305: performing a parameter updating operation, wherein the parameter updating operation comprises updating the initial parameter according to the Procrustes distance between the third matrix T and the second matrix S.
Optionally, the Procrustes distance may be obtained from the third matrix T and the second matrix S by using the following feasible formula two:

P_d = Σ_{i=1}^{s} √((e_i - m_i)² + (f_i - n_i)²)    (formula two)

wherein P_d is the Procrustes distance, e_i and f_i are the coordinate values of the i-th element in the third matrix T, forming the i-th coordinate (e_i, f_i), and m_i and n_i are the coordinate values of the i-th element in the second matrix S, forming the i-th coordinate (m_i, n_i).
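A direct reading of formula two, the Procrustes distance as a sum of pointwise Euclidean distances, can be coded as below. Since the published formula image is illegible, this pointwise-sum form is an assumption rather than a verbatim transcription.

```python
import math

def procrustes_distance(T, S):
    """Sum over i of the Euclidean distance between (e_i, f_i) in T and (m_i, n_i) in S."""
    return sum(math.hypot(e - m, f - n) for (e, f), (m, n) in zip(T, S))

# First pair of points is 5 apart (3-4-5 triangle), second pair coincides
d = procrustes_distance([(0.0, 0.0), (1.0, 0.0)], [(3.0, 4.0), (1.0, 0.0)])
```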
Optionally, when the Procrustes distance is greater than the distance threshold, or the number of times the matrix processing operation has been performed is less than the preset number of times, the initial parameter K is updated.
Alternatively, the distance threshold may be 0.05, 0.06, etc.
Optionally, the preset number of times may be 10 times, 20 times, and the like.
S306: judging whether the Procrustes distance between the third matrix and the second matrix is smaller than or equal to a distance threshold, or whether the number of times the matrix processing operation has been executed is greater than or equal to a preset number of times.
If yes, go to step S307.
If not, go to step S304.
In practical applications, S304 to S306 need to be repeatedly executed.

It should be noted that, after each execution of S305, it is determined whether the Procrustes distance is less than or equal to the distance threshold, or whether the number of times the matrix processing operation (S304) has been executed is greater than or equal to the preset number of times; if so, S307 is executed (please refer to S307), and if not, execution returns to S304.
S307: the initial parameter is determined as the adjustment parameter.
It should be noted that the initial parameter after the last update is determined as the adjustment parameter K'.
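The loop of S304 to S307 can be sketched end to end. This is an illustrative sketch only: the patent does not legibly specify the update rule of S305, so a gradient step on the summed squared error is used here purely as an assumed stand-in, and the initial parameter, learning rate, and example poses are likewise assumptions.

```python
import math

def estimate_alignment(R, S, dist_threshold=0.05, max_iters=20, lr=0.1):
    """Repeat S304-S306: transform R by (p, q), measure the Procrustes distance to S,
    and update (p, q) until a stopping condition is met; return the result as in S307.
    The update rule is an assumption (gradient step on the summed squared error)."""
    p, q = 1.0, 0.0  # assumed initial parameter: the identity transform
    for _ in range(max_iters):
        T = [(p * x - q * y, q * x + p * y) for (x, y) in R]            # S304
        dist = sum(math.hypot(e - m, f - n) for (e, f), (m, n) in zip(T, S))
        if dist <= dist_threshold:                                      # S306
            break
        # S305 (assumed): gradient of the sum of squared residuals w.r.t. p and q
        gp = sum(2 * ((e - m) * x + (f - n) * y) for (e, f), (m, n), (x, y) in zip(T, S, R))
        gq = sum(2 * (-(e - m) * y + (f - n) * x) for (e, f), (m, n), (x, y) in zip(T, S, R))
        p, q = p - lr * gp, q - lr * gq
    return p, q  # S307: adjustment parameter K'

# A pose and the same pose scaled by 2: the recovered p approaches 2, q stays near 0
p, q = estimate_alignment([(1.0, 0.0), (0.0, 1.0)], [(2.0, 0.0), (0.0, 2.0)])
```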
S308: and multiplying the adjustment parameter by the coordinate in the first posture information to obtain third posture information.
Optionally, the adjustment parameter K' may be multiplied by the first matrix R obtained from the first posture information to obtain the coordinates in the third posture information, and the third posture information may then be obtained from these coordinates together with the key point identifiers of the different parts.
S309: and acquiring a fourth matrix corresponding to the third posture information.
Optionally, coordinate extraction processing may be performed on the third posture information to obtain a fourth matrix U.
For example, the fourth matrix U may be

U = [(u_1, v_1), (u_2, v_2), (u_3, v_3), (u_4, v_4), (u_5, v_5)]

wherein the elements in the fourth matrix U correspond to the elements in S202. For example, (u_1, v_1) corresponds to (A_1, B_1), where u_1 is equal to A_1 and v_1 is equal to B_1; (u_2, v_2) corresponds to (A'_1, B'_1), where u_2 is equal to A'_1 and v_2 is equal to B'_1.
S310: and acquiring Euclidean distances of coordinates of corresponding positions in the second matrix and the fourth matrix.
Optionally, the Euclidean distance D_i of the coordinates at the corresponding position i in the second matrix and the fourth matrix may be determined by the following formula three:

D_i = √((u_i - m_i)² + (v_i - n_i)²)    (formula three)
S311: and determining the average value of Euclidean distances of the corresponding coordinates according to the Euclidean distances of the coordinates of the corresponding positions.
Alternatively, the Euclidean distance average value D may be determined according to the following feasible formula four:

D = (1/s) · Σ_{i=1}^{s} D_i    (formula four)
S312: and determining the absolute value of the difference between the preset threshold and the Euclidean distance average value as the posture matching degree of the first object and the second object.
Optionally, the preset threshold may be M, where a value of M may be 100, or may be another value.
Alternatively, the posture matching degree X of the first object and the second object may be determined according to the following feasible formula five:

X = |M - D|    (formula five)
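Steps S309 to S312 can be combined into one scoring function. The code below is illustrative, not from the patent: M = 100 follows the example value in the text, and the input matrices are hypothetical.

```python
import math

def pose_match_score(U, S, M=100.0):
    """Formulas three to five: per-point Euclidean distances D_i, their average D,
    and the posture matching degree X = |M - D|."""
    dists = [math.hypot(u - m, v - n) for (u, v), (m, n) in zip(U, S)]
    D = sum(dists) / len(dists)
    return abs(M - D)

# Identical aligned poses: D = 0, so the score equals M
score = pose_match_score([(1.0, 2.0), (3.0, 4.0)], [(1.0, 2.0), (3.0, 4.0)])
```

Larger average distances pull the score down toward zero, matching the intuition that a score near M indicates closely matching postures.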
Fig. 4 is a schematic structural diagram of a posture evaluation device according to an embodiment of the present invention. Referring to fig. 4, the posture evaluation device 10 includes: an acquisition module 11, a processing module 12 and a determination module 13, wherein,
the acquiring module 11 is configured to acquire first pose information of a first object in a first image and second pose information of a second object in a second image, where the first pose information includes coordinates of at least two parts of the first object, and the second pose information includes coordinates of at least two parts of the second object;
the processing module 12 is configured to perform scaling processing and/or rotation processing on the coordinates in the first pose information to obtain third pose information, where a size of an object indicated by the third pose information is the same as a size of the second object, and at least two parts of the object indicated by the third pose information correspond to the at least two parts of the second object;
the determining module 13 is configured to determine a gesture matching degree of the first object and the second object according to the third gesture information and the second gesture information.
The posture estimation device provided by the embodiment of the invention can execute the technical scheme shown in the method embodiment, the implementation principle and the beneficial effect are similar, and the details are not repeated here.
In a possible implementation, the processing module 12 is specifically configured to:
determining an adjustment parameter according to the first posture information and the second posture information, wherein the adjustment parameter comprises a scaling parameter and/or a rotation angle;
and determining the third posture information according to the first posture information and the adjusting parameter.
In another possible implementation, the processing module 12 is specifically configured to:
acquiring a first matrix corresponding to the first posture information, wherein the first matrix comprises coordinates of at least two parts of the first object, and the coordinates of the at least two parts of the first object are stored in the first matrix according to a preset rule;
acquiring a second matrix corresponding to the second posture information, wherein the second matrix comprises coordinates of at least two parts of the second object, and the coordinates of the at least two parts of the second object are stored in the second matrix according to the preset rule;
and determining the adjustment parameter according to the Procrustes distance between the first matrix and the second matrix.
In another possible implementation, the processing module 12 is specifically configured to:
performing a matrix processing operation, the matrix processing operation comprising: obtaining a third matrix according to the initial parameters and the first matrix;
performing a parameter update operation, the parameter update operation including updating the initial parameters according to the Procrustes distance between the third matrix and the second matrix;
and repeatedly executing the matrix processing operation and the parameter updating operation until the Procrustes distance between the third matrix and the second matrix is smaller than or equal to a distance threshold, or the number of times of executing the matrix processing operation is larger than or equal to a preset number of times, and determining the initial parameter as the adjustment parameter.
In another possible implementation, the determining module 13 is specifically configured to:
and multiplying the adjustment parameter by the coordinate in the first posture information to obtain the third posture information.
In another possible implementation, the determining module 13 is specifically configured to:
acquiring a second matrix corresponding to the second posture information;
acquiring a fourth matrix corresponding to the third posture information;
and determining the gesture matching degree of the first object and the second object according to the second matrix and the fourth matrix.
In another possible implementation, the determining module 13 is specifically configured to:
acquiring Euclidean distances of coordinates of corresponding positions in the second matrix and the fourth matrix;
and determining the posture matching degree of the first object and the second object according to the Euclidean distance of the coordinates of the corresponding positions.
In another possible implementation, the determining module 13 is specifically configured to:
determining an Euclidean distance average value according to the Euclidean distance of the coordinates of the corresponding position;
and determining the absolute value of the difference between the preset threshold and the Euclidean distance average value as the posture matching degree of the first object and the second object.
The posture estimation device provided by the embodiment of the invention can execute the technical scheme shown in the method embodiment, the implementation principle and the beneficial effect are similar, and the details are not repeated here.
Fig. 5 is a schematic diagram of a hardware structure of a posture estimation apparatus according to an embodiment of the present invention. Referring to fig. 5, the posture evaluating apparatus 20 includes: at least one processor 21 and a memory 22. Optionally, the posture estimation device 20 further includes a communication section 23. The processor 21, the memory 22, and the communication unit 23 are connected by a bus 24.
In particular implementations, at least one processor 21 executes computer-executable instructions stored by the memory 22 to cause the at least one processor 21 to perform the gesture assessment method as described above.
For a specific implementation process of the processor 21, reference may be made to the above method embodiments, which implement similar principles and technical effects, and this embodiment is not described herein again.
In the embodiment shown in fig. 5, it should be understood that the Processor may be a Central Processing Unit (CPU), other general-purpose processors, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor, or in a combination of the hardware and software modules within the processor.
The memory may comprise high speed RAM memory and may also include non-volatile storage NVM, such as at least one disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
The present application also provides a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the pose estimation method as described above.
The computer-readable storage medium may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an Application Specific Integrated Circuit (ASIC). Of course, the processor and the readable storage medium may also reside as discrete components in the apparatus.
The division of the units is only a logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention. Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.

Claims (9)

1. A method of pose estimation, comprising:
acquiring first posture information of a first object in a first image and second posture information of a second object in a second image, wherein the first posture information comprises coordinates of at least two parts of the first object, and the second posture information comprises coordinates of at least two parts of the second object;
performing scaling processing and/or rotation processing on the coordinates in the first posture information to obtain third posture information, wherein the size of the object indicated by the third posture information is the same as that of the second object, and at least two parts of the object indicated by the third posture information correspond to at least two parts of the second object;
determining a posture matching degree of the first object and the second object according to the third posture information and the second posture information;
performing scaling processing and/or rotation processing on the coordinates in the first posture information to obtain third posture information, including:
acquiring a first matrix corresponding to the first posture information, wherein the first matrix comprises coordinates of at least two parts of the first object, and the coordinates of the at least two parts of the first object are stored in the first matrix according to a preset rule;
acquiring a second matrix corresponding to the second posture information, wherein the second matrix comprises coordinates of at least two parts of the second object, and the coordinates of the at least two parts of the second object are stored in the second matrix according to the preset rule;
determining an adjustment parameter according to a Procrustes distance between the first matrix and the second matrix, wherein the adjustment parameter comprises a scaling parameter and/or a rotation angle;
and determining the third posture information according to the first posture information and the adjusting parameter.
2. The method of claim 1, wherein determining the adjustment parameter based on a Procrustes distance between the first matrix and the second matrix comprises:
performing a matrix processing operation, the matrix processing operation comprising: obtaining a third matrix according to the initial parameters and the first matrix;
performing a parameter update operation, the parameter update operation including updating the initial parameters according to the Procrustes distance between the third matrix and the second matrix;
and repeatedly executing the matrix processing operation and the parameter updating operation until the Procrustes distance between the third matrix and the second matrix is smaller than or equal to a distance threshold, or the number of times of executing the matrix processing operation is larger than or equal to a preset number of times, and determining the initial parameter as the adjustment parameter.
3. The method of claim 2, wherein determining the third pose information based on the first pose information and the adjustment parameters comprises:
and multiplying the adjustment parameter by the coordinate in the first posture information to obtain the third posture information.
4. The method according to any one of claims 1-3, wherein determining the degree of gesture match of the first object and the second object based on the third gesture information and the second gesture information comprises:
acquiring a second matrix corresponding to the second posture information;
acquiring a fourth matrix corresponding to the third posture information;
and determining the gesture matching degree of the first object and the second object according to the second matrix and the fourth matrix.
5. The method of claim 4, wherein determining the degree of gesture match for the first object and the second object based on the second matrix and the fourth matrix comprises:
acquiring Euclidean distances of coordinates of corresponding positions in the second matrix and the fourth matrix;
and determining the posture matching degree of the first object and the second object according to the Euclidean distance of the coordinates of the corresponding positions.
6. The method of claim 5, wherein determining the degree of pose matching of the first object and the second object based on the Euclidean distance of the coordinates of the corresponding position comprises:
determining an Euclidean distance average value according to the Euclidean distance of the coordinates of the corresponding position;
and determining the absolute value of the difference between a preset threshold and the Euclidean distance average value as the posture matching degree of the first object and the second object.
7. A posture assessment apparatus, comprising: an acquisition module, a processing module and a determination module, wherein,
the acquiring module is configured to acquire first pose information of a first object in a first image and second pose information of a second object in a second image, where the first pose information includes coordinates of at least two parts of the first object, and the second pose information includes coordinates of at least two parts of the second object;
the processing module is configured to perform scaling processing and/or rotation processing on the coordinates in the first pose information to obtain third pose information, where a size of an object indicated by the third pose information is the same as a size of the second object, and at least two parts of the object indicated by the third pose information correspond to at least two parts of the second object;
the determining module is configured to determine a pose matching degree of the first object and the second object according to the third pose information and the second pose information;
the processing module is specifically configured to:
acquiring a first matrix corresponding to the first posture information, wherein the first matrix comprises coordinates of at least two parts of the first object, and the coordinates of the at least two parts of the first object are stored in the first matrix according to a preset rule;
acquiring a second matrix corresponding to the second posture information, wherein the second matrix comprises coordinates of at least two parts of the second object, and the coordinates of the at least two parts of the second object are stored in the second matrix according to the preset rule;
determining an adjustment parameter according to a Procrustes distance between the first matrix and the second matrix, wherein the adjustment parameter comprises a scaling parameter and/or a rotation angle;
and determining the third posture information according to the first posture information and the adjusting parameter.
8. A posture assessment apparatus, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executing the memory-stored computer-executable instructions cause the at least one processor to perform the pose assessment method of any of claims 1 to 6.
9. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the pose assessment method of any one of claims 1 to 6.
CN201910463521.3A 2019-05-30 2019-05-30 Posture evaluation method and device Active CN110188688B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910463521.3A CN110188688B (en) 2019-05-30 2019-05-30 Posture evaluation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910463521.3A CN110188688B (en) 2019-05-30 2019-05-30 Posture evaluation method and device

Publications (2)

Publication Number Publication Date
CN110188688A CN110188688A (en) 2019-08-30
CN110188688B true CN110188688B (en) 2021-12-14

Family

ID=67718924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910463521.3A Active CN110188688B (en) 2019-05-30 2019-05-30 Posture evaluation method and device

Country Status (1)

Country Link
CN (1) CN110188688B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110782482A (en) * 2019-10-21 2020-02-11 深圳市网心科技有限公司 Motion evaluation method and device, computer equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657713A (en) * 2015-02-09 2015-05-27 浙江大学 Three-dimensional face calibrating method capable of resisting posture and facial expression changes
CN109508656A (en) * 2018-10-29 2019-03-22 重庆中科云丛科技有限公司 A kind of dancing grading automatic distinguishing method, system and computer readable storage medium
CN109614899A (en) * 2018-11-29 2019-04-12 重庆邮电大学 A kind of human motion recognition method based on Lie group feature and convolutional neural networks

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IN2014MN00128A (en) * 2011-07-22 2015-06-12 Fujifilm Corp
US8945328B2 (en) * 2012-09-11 2015-02-03 L.I.F.E. Corporation S.A. Methods of making garments having stretchable and conductive ink
CN109684911B (en) * 2018-10-30 2021-05-11 百度在线网络技术(北京)有限公司 Expression recognition method and device, electronic equipment and storage medium
CN109190607A (en) * 2018-10-30 2019-01-11 维沃移动通信有限公司 A kind of motion images processing method, device and terminal

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104657713A (en) * 2015-02-09 2015-05-27 浙江大学 Three-dimensional face calibrating method capable of resisting posture and facial expression changes
CN109508656A (en) * 2018-10-29 2019-03-22 重庆中科云丛科技有限公司 A kind of dancing grading automatic distinguishing method, system and computer readable storage medium
CN109614899A (en) * 2018-11-29 2019-04-12 重庆邮电大学 A kind of human motion recognition method based on Lie group feature and convolutional neural networks

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Master OpenCV... Reading Notes": Non-rigid Face Tracking II; 如梦如幻2015; https://blog.csdn.net/qqh19910525/article/details/52287739; 20160803; sections 1-2 *
Research on Moving Human Body Detection Technology Based on Kinect; Wang Fuqiang (王富强); China Masters' Theses Full-text Database, Information Science and Technology; 20150315 (No. 03, 2015); I138-1975 *

Also Published As

Publication number Publication date
CN110188688A (en) 2019-08-30

Similar Documents

Publication Publication Date Title
CN108875524B (en) Sight estimation method, device, system and storage medium
US11380017B2 (en) Dual-view angle image calibration method and apparatus, storage medium and electronic device
EP2819098B1 (en) Methods and systems for generating a three dimentional representation of a subject
JP6528764B2 (en) Face matching device, method, and recording medium
WO2020029554A1 (en) Augmented reality multi-plane model animation interaction method and device, apparatus, and storage medium
KR20180107085A (en) How to influence virtual objects in augmented reality
CN108229301B (en) Eyelid line detection method and device and electronic equipment
CN110675487A (en) Three-dimensional face modeling and recognizing method and device based on multi-angle two-dimensional face
CN111459269B (en) Augmented reality display method, system and computer readable storage medium
CN109919971B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN111860489A (en) Certificate image correction method, device, equipment and storage medium
CN111582220A (en) Skeleton point behavior identification system based on shift diagram convolution neural network and identification method thereof
CN110188688B (en) Posture evaluation method and device
CN110032941B (en) Face image detection method, face image detection device and terminal equipment
CN111681302A (en) Method and device for generating 3D virtual image, electronic equipment and storage medium
CN109740511B (en) Facial expression matching method, device, equipment and storage medium
CN111460937B (en) Facial feature point positioning method and device, terminal equipment and storage medium
CN111353325A (en) Key point detection model training method and device
CN113570725A (en) Three-dimensional surface reconstruction method and device based on clustering, server and storage medium
CN115393487B (en) Virtual character model processing method and device, electronic equipment and storage medium
CN113870190A (en) Vertical line detection method, device, equipment and storage medium
CN111259703B (en) Face inclination angle detection method and device
JP7177280B2 (en) Image recognition device, image recognition method, and image recognition program
CN111462337B (en) Image processing method, device and computer readable storage medium
CN113223103A (en) Method, device, electronic device and medium for generating sketch

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant