CN111291656B - Human body trunk posture matching method in measurement 2d image

Info

Publication number: CN111291656B (application CN202010071078.8A)
Authority: CN (China)
Prior art keywords: trunk, image, point, key points, human body
Legal status: Active (granted)
Inventor: 石克阳
Applicant and current assignee: Hangzhou Weier Network Technology Co., Ltd.
Filing and priority date: 2020-01-21
Other versions: CN111291656A, published 2020-06-16; granted as CN111291656B on 2023-06-02
Original language: Chinese (zh)


Classifications

    • G06F18/22 — Pattern recognition; matching criteria, e.g. proximity measures
    • G06F18/23213 — Pattern recognition; non-hierarchical clustering using statistics or function optimisation, with a fixed number of clusters, e.g. K-means clustering
    • G06V40/103 — Recognition of human or animal bodies in image or video data; static body considered as a whole, e.g. static pedestrian or occupant recognition
    • Y02D10/00 — Energy efficient computing, e.g. low power processors, power management or thermal management


Abstract

The invention discloses a method for matching the human torso posture in 2D e-commerce images, and relates to the technical field of image matching. The method comprises the following steps: constructing a posture library; for each posture image in the library, extracting the key points of the human torso and calculating its degrees of rotation; estimating the probability of rotation of the upper torso and of the lower torso in each of three dimensions; extracting the torso key points of the image to be matched and calculating its degrees of rotation; taking the weighted sum of the absolute differences in rotation between a sample and the image to be matched over the three dimensions, with the rotation probabilities as weights, as a first metric; taking the sum of the distances between corresponding key points of the sample and the image to be matched as a second metric; summing the first and second metrics to obtain a similarity index for the upper torso and, likewise, for the lower torso; and summing the two similarity indices, the sample with the minimum sum being the match. The invention effectively ensures the accuracy of approximate matching of the human torso posture.

Description

Human body trunk posture matching method in measurement 2d image
Technical Field
The invention relates to the technical field of image matching, and in particular to a method for matching the human torso posture in 2D e-commerce images.
Background
In an e-commerce scene, the pictures uploaded by merchants are typically 2D still images. Existing methods for measuring the similarity of human torso postures in 2D images fall into appearance-template methods, detector-array methods, nonlinear-regression methods, and the like; they compare the image against templates and decide whether the torso postures in two images are similar from the matching degree of the key points. However, these methods consider only the similarity of the relative degrees of torso rotation in the three dimensions and ignore the absolute degrees of rotation, so their accuracy is limited. In an e-commerce scene the torso postures of models are varied and the differences between postures are small, so a torso-posture similarity measure with higher accuracy is needed.
Disclosure of Invention
The invention aims to provide a method for matching the human torso posture in 2D e-commerce images that effectively ensures the accuracy of approximate torso-posture matching.
In order to achieve the above purpose, the present invention provides the following technical solutions:
The method for matching the human torso posture in a 2D e-commerce image comprises the following steps:
S1, selecting images of different human torso postures as samples to construct a posture library;
S2, for each posture image in the library, extracting the key points of the human torso and calculating the degrees of rotation of the upper torso and of the lower torso in three dimensions;
S3, clustering the posture images in the library to obtain the probability of rotation of the upper torso and of the lower torso in each of the three dimensions;
S4, extracting the torso key points of the image to be matched and calculating the degrees of rotation of its upper torso and lower torso in the three dimensions;
S5, calculating the absolute differences in upper-torso rotation between a posture image and the image to be matched in the three dimensions, and taking their weighted sum, with the rotation probabilities as weights, as a first metric;
S6, determining the corresponding upper-torso key points of the posture image and the image to be matched, and taking the sum of the distances between each pair of corresponding key points as a second metric;
S7, summing the first metric and the second metric to obtain the similarity index of the upper torso;
S8, repeating steps S5 to S7 for the lower torso to obtain the similarity index of the lower torso;
and S9, summing the upper-torso and lower-torso similarity indices; the posture image with the minimum sum is the match for the image to be matched.
Further, in step S2, the AlphaPose algorithm is used to extract the key points of the human torso.
Further, in step S2,
the degrees of rotation of the upper torso in the three dimensions are calculated as follows: the ratio of the distances from the left and right shoulder points to the neck point is the degree of rotation of the upper torso in the Yaw direction; the ratio of the horizontal distance to the vertical distance between the left and right shoulder points is the degree of rotation of the upper torso in the Roll direction; the ratio Pa of the left-shoulder-to-left-hand distance to the neck-to-left-waist distance is calculated, then the ratio Pb of the arm length to the upper-body height, and Pa/Pb is the degree of rotation of the upper torso in the Pitch direction;
the degrees of rotation of the lower torso in the three dimensions are calculated as follows: the ratio of the distances from the left and right waist points to the neck point is the degree of rotation in the Yaw direction; the ratio of the horizontal distance to the vertical distance between the left waist point and the left knee point is the degree of rotation of the lower torso in the Roll direction; the ratio of the thigh length to the calf length is the degree of rotation of the lower torso in the Pitch direction.
Further, in step S3, K-means is used for the clustering.
Further, in step S6, the distance is the Euclidean distance.
Further, the corresponding upper-torso key points are then determined as follows: for any upper-torso key point of the image to be matched, the Euclidean distance to each upper-torso key point in the posture image is calculated, and the pair of key points with the minimum Euclidean distance is taken as a pair of corresponding key points.
Further, in step S6, the distance may instead be the Hamming distance.
Further, the corresponding upper-torso key points are then determined as follows: for any upper-torso key point of the image to be matched, the Hamming distance to each upper-torso key point in the posture image is calculated, and the pair of key points with the minimum Hamming distance is taken as a pair of corresponding key points.
Further, in step S6, the distance may be the sum of the Euclidean distance and the Hamming distance.
Further, the corresponding upper-torso key points are then determined as follows: for any upper-torso key point of the image to be matched, the sum of the Euclidean distance and the Hamming distance to each upper-torso key point in the posture image is calculated, and the pair of key points with the minimum sum is taken as a pair of corresponding key points.
Compared with the prior art, the invention has the following beneficial effect:
the invention divides the human torso into an upper body and a lower body, and uses the weighted degrees of rotation in three dimensions together with the distances between corresponding key points as the metrics for similarity judgement, which greatly improves the accuracy of human torso posture matching.
Drawings
Fig. 1 is an overall flow chart of the present invention.
Fig. 2 is a schematic diagram of extracting key points of a human torso according to an embodiment of the present invention.
Detailed Description
The following describes the technical solutions in the embodiments of the present invention clearly and completely; the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art on the basis of these embodiments without inventive effort fall within the scope of protection of the invention.
Referring to fig. 1, the invention provides a method for matching the human torso posture in a 2D e-commerce image, comprising the following steps:
S1, selecting images of different human torso postures as samples to construct a posture library. In this embodiment the human torso is divided into an upper torso and a lower torso: the upper torso may turn to the left or right, lean to the left or right, and pitch forward or backward; the lower torso may squat, step out to the left or right, and kick to the left or right. A combination of an upper-torso state and a lower-torso state forms a complete torso posture, and about 1,000 sample images are collected for each posture.
S2, for each posture image S in the library, extracting the key points PointS of the human torso and their positions, and calculating the degrees of rotation Ys, Rs, and Ps of the upper torso and of the lower torso in the three dimensions Yaw, Roll, and Pitch.
The key points of the human torso are extracted with the AlphaPose model, a deep-learning model that identifies the position of the human body in the image and its main key points through three modules, STN + SPPE + SDTN. As shown in fig. 2, the model yields 18 key points of the human body: chin point 0, neck point 1, left shoulder point 2, left elbow point 3, left hand point 4, right shoulder point 5, right elbow point 6, right hand point 7, left waist point 8, left knee point 9, left foot point 10, right waist point 11, right knee point 12, right foot point 13, left eye point 14, right eye point 15, left ear point 16, and right ear point 17. The chin point 0, neck point 1, shoulder points 2 and 5, elbow points 3 and 6, and hand points 4 and 7 are the key points of the upper torso; the waist points 8 and 11, knee points 9 and 12, foot points 10 and 13, eye points 14 and 15, and ear points 16 and 17 are grouped with the lower torso.
The degrees of rotation Ys, Rs, Ps of the upper torso are calculated as follows:
The ratio of the distances from the left shoulder point 2 and the right shoulder point 5 to the neck point 1 gives the degree of rotation Ys1 of the upper torso in the Yaw direction: with D12 the distance from point 2 to point 1 and D15 the distance from point 5 to point 1, Ys1 = D12/D15. When Ys1 is smaller than 1 the upper torso is turned to the left, and when Ys1 is larger than 1 it is turned to the right; each ratio corresponds to a rotation angle. Similarly, the ratio of the horizontal distance W25 to the vertical distance H25 between the shoulder points 2 and 5 is the degree of rotation Rs1 of the upper torso in the Roll direction, Rs1 being the tangent of the rotation angle. For the Pitch direction, the ratio Pa = D24/D18 is calculated first, where D24 is the distance from the left shoulder point 2 to the left hand point 4 and D18 the distance from the neck point 1 to the left waist point 8. The left-arm length is D1 = |D23| + |D34| (shoulder to elbow plus elbow to hand) and the right-arm length is D2 = |D56| + |D67|; their average L1 = (D1 + D2)/2 is the arm length. The upper-body height L2 is the vertical distance between the shoulder line and the waist line, i.e. the difference between the average height Yh of the shoulder points and the average height Yw of the waist points 8 and 11. With Pb = L1/L2, the degree of rotation of the upper torso in the Pitch direction is Ps1 = Pa/Pb. When Pa = Pb the upper torso has no rotation in the Pitch direction; when they differ, the upper body is pitched forward or backward, each ratio corresponding to a rotation angle.
The degrees of rotation Ys, Rs, Ps of the lower torso are calculated as follows: the ratio of the horizontal distances from the left waist point 8 and the right waist point 11 to the neck point 1 is the degree of rotation Ys2 of the lower torso in the Yaw direction. The ratio of the horizontal distance W89 to the vertical distance H89 between the left waist point 8 and the left knee point 9 is the degree of rotation Rs2 of the lower torso in the Roll direction. The average D1 = (D89 + D1112)/2 of the waist-to-knee distances D89 and D1112 is the thigh length, the average D2 = (D910 + D1213)/2 of the knee-to-foot distances D910 and D1213 is the calf length, and the ratio D1/D2 is the degree of rotation Ps2 of the lower torso in the Pitch direction.
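The two calculations above can be sketched in Python as follows. This is a minimal sketch under stated assumptions: kp is a list of 18 (x, y) key points in the index order above, and reading the upper-body height L2 as the vertical gap between the shoulder line and the waist line is an interpretation of the passage, not a quotation of it:

    import math

    def _dist(kp, a, b):
        # Euclidean distance between the key points with indices a and b.
        return math.hypot(kp[a][0] - kp[b][0], kp[a][1] - kp[b][1])

    def upper_torso_rotation(kp):
        """Return (Ys1, Rs1, Ps1) for a list kp of 18 (x, y) key points."""
        ys1 = _dist(kp, 2, 1) / _dist(kp, 5, 1)          # Yaw: D12 / D15
        w25 = abs(kp[2][0] - kp[5][0])                   # horizontal shoulder gap
        h25 = abs(kp[2][1] - kp[5][1])                   # vertical shoulder gap
        rs1 = w25 / max(h25, 1e-6)                       # Roll (guard: level shoulders)
        pa = _dist(kp, 2, 4) / _dist(kp, 1, 8)           # Pa = D24 / D18
        l1 = (_dist(kp, 2, 3) + _dist(kp, 3, 4)          # arm length: average of
              + _dist(kp, 5, 6) + _dist(kp, 6, 7)) / 2.0 # left and right arms
        l2 = abs((kp[2][1] + kp[5][1]) / 2.0             # upper-body height (assumed:
                 - (kp[8][1] + kp[11][1]) / 2.0)         # shoulder line to waist line)
        return ys1, rs1, pa / (l1 / l2)                  # Pitch: Pa / Pb

    def lower_torso_rotation(kp):
        """Return (Ys2, Rs2, Ps2) for a list kp of 18 (x, y) key points."""
        ys2 = abs(kp[8][0] - kp[1][0]) / abs(kp[11][0] - kp[1][0])   # Yaw
        w89 = abs(kp[8][0] - kp[9][0])
        h89 = abs(kp[8][1] - kp[9][1])
        rs2 = w89 / max(h89, 1e-6)                                   # Roll
        thigh = (_dist(kp, 8, 9) + _dist(kp, 11, 12)) / 2.0          # D1
        calf = (_dist(kp, 9, 10) + _dist(kp, 12, 13)) / 2.0          # D2
        return ys2, rs2, thigh / calf                                # Pitch: D1 / D2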
S3, clustering all the posture images in the library with K-means to obtain the probabilities W = [W1, W2, W3] of rotation of the upper torso (and, separately, of the lower torso) in the three dimensions Yaw, Roll, and Pitch; these probabilities serve as the weights when comparing the degrees of rotation in the three dimensions.
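The patent does not spell out how the cluster assignments become probabilities. One plausible reading, sketched below with scikit-learn, clusters each dimension's rotation degrees into a "neutral" and a "rotated" group and takes the rotated group's share of the library as that dimension's weight; the neutral values and the function name rotation_weights are assumptions of this sketch:

    import numpy as np
    from sklearn.cluster import KMeans

    def rotation_weights(rotations, neutral=(1.0, 0.0, 1.0)):
        """rotations: (n_images, 3) array of (Yaw, Roll, Pitch) degrees.
        Returns W = [W1, W2, W3], the estimated rotation probabilities."""
        weights = []
        for d in range(3):
            values = rotations[:, d].reshape(-1, 1)
            km = KMeans(n_clusters=2, n_init=10).fit(values)
            centers = km.cluster_centers_.ravel()
            # The cluster whose centre lies farther from the neutral value
            # is treated as "rotated"; its share of the images is W[d].
            rotated = int(np.argmax(np.abs(centers - neutral[d])))
            weights.append(float(np.mean(km.labels_ == rotated)))
        return weights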
S4, extracting the key points PointP of the human torso and their positions from the image P to be matched by the method of S2, and calculating the degrees of rotation Yp, Rp, and Pp of its upper torso and lower torso in the three dimensions Yaw, Roll, and Pitch.
S5, calculating the absolute differences in upper-torso rotation between the posture image and the image to be matched in the three dimensions, Dy = |Ys1 - Yp|, Dr = |Rs1 - Rp|, Dp = |Ps1 - Pp|, and taking their weighted sum, with the rotation probabilities as weights, as the first metric: S1 = Dy*W1 + Dr*W2 + Dp*W3.
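In code form, a direct transcription of this formula (the argument names are illustrative):

    def first_metric(rot_s, rot_p, w):
        """S1 = |Ys - Yp|*W1 + |Rs - Rp|*W2 + |Ps - Pp|*W3."""
        return sum(abs(s - p) * wi for s, p, wi in zip(rot_s, rot_p, w))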
S6, determining the corresponding upper-torso key points of the posture image S and the image P to be matched, and taking the sum of the distances between each pair of corresponding key points as the second metric.
In the first embodiment, the correspondence between the upper-torso key points of the image P to be matched and those of a posture image S in the library is determined as follows: for any upper-torso key point (xp, yp) of P, the Euclidean distance O = (|xp - xs|^2 + |yp - ys|^2)^0.5 to each upper-torso key point (xs, ys) of S is calculated, and the point (xs, ys) with the minimum distance min(O) is taken as the key point corresponding to (xp, yp). The sum of these minimum Euclidean distances over all pairs, S2 = Σ min(O), is the second metric.
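A sketch of this first embodiment, where points_p and points_s are the upper-torso key points of P and S as (x, y) pairs:

    def euclidean(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    def second_metric(points_p, points_s, dist=euclidean):
        """S2 = sum over the key points of P of the minimum distance to S."""
        return sum(min(dist(p, s) for s in points_s) for p in points_p)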
In the second embodiment, the correspondence is determined in the same way but with the Hamming distance H = |xp - xs| + |yp - ys|: the point (xs, ys) with the minimum distance min(H) is taken as the key point corresponding to (xp, yp), and the second metric is S2 = Σ min(H).
In the third embodiment, the two distances are combined: for any upper-torso key point (xp, yp) of P, the Euclidean distance O = (|xp - xs|^2 + |yp - ys|^2)^0.5 and the Hamming distance H = |xp - xs| + |yp - ys| to each upper-torso key point (xs, ys) of S are calculated, and the point with the minimum sum min(O + H) is taken as the key point corresponding to (xp, yp); the second metric is S2 = Σ min(O + H). Compared with the first and second embodiments, the combined distance of this embodiment matches the corresponding key points, and measures their separation, more accurately.
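The second and third embodiments only swap the distance function in the sketch above. Note that the "Hamming distance" as defined in this description is the coordinate-wise sum of absolute differences, i.e. what is usually called the city-block (Manhattan) distance:

    def hamming(p, q):
        # |xp - xs| + |yp - ys|, as defined in the description.
        return abs(p[0] - q[0]) + abs(p[1] - q[1])

    def combined(p, q):
        return euclidean(p, q) + hamming(p, q)

    # Second embodiment: second_metric(points_p, points_s, dist=hamming)
    # Third embodiment:  second_metric(points_p, points_s, dist=combined)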
S7, summing the two metrics: Ssum1 = S1 + S2 is the similarity index between the upper torso of the posture image S and that of the image P to be matched.
S8, repeating steps S5 to S7 for the lower torso to obtain its similarity index Ssum2.
S9, finally, summing the upper-torso similarity index Ssum1 and the lower-torso similarity index Ssum2 to give Ssum = Ssum1 + Ssum2; the posture image with the minimum Ssum is the match for the image P, which yields the torso posture of the image to be matched.
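Putting steps S5 to S9 together and reusing the helper sketches above, a minimal matching loop might look as follows; the dictionary layout of the samples (keys such as "upper_rot" and "upper_pts") is hypothetical:

    def match_pose(p, library, w_upper, w_lower, dist=euclidean):
        """Return the library sample minimising Ssum = Ssum1 + Ssum2."""
        best, best_score = None, float("inf")
        for s in library:
            ssum1 = (first_metric(s["upper_rot"], p["upper_rot"], w_upper)
                     + second_metric(p["upper_pts"], s["upper_pts"], dist))
            ssum2 = (first_metric(s["lower_rot"], p["lower_rot"], w_lower)
                     + second_metric(p["lower_pts"], s["lower_pts"], dist))
            if ssum1 + ssum2 < best_score:
                best, best_score = s, ssum1 + ssum2
        return best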
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (10)

1. A method for matching the human torso posture in a 2D e-commerce image, characterized by comprising the following steps:
S1, selecting images of different human torso postures as samples to construct a posture library;
S2, for each posture image in the library, extracting the key points of the human torso and calculating the degrees of rotation of the upper torso and of the lower torso in three dimensions;
S3, clustering the posture images in the library to obtain the probability of rotation of the upper torso and of the lower torso in each of the three dimensions;
S4, extracting the torso key points of the image to be matched and calculating the degrees of rotation of its upper torso and lower torso in the three dimensions;
S5, calculating the absolute differences in upper-torso rotation between a posture image and the image to be matched in the three dimensions, and taking their weighted sum, with the rotation probabilities as weights, as a first metric;
S6, determining the corresponding upper-torso key points of the posture image and the image to be matched, and taking the sum of the distances between each pair of corresponding key points as a second metric;
S7, summing the first metric and the second metric to obtain the similarity index of the upper torso;
S8, repeating steps S5 to S7 for the lower torso to obtain the similarity index of the lower torso;
and S9, summing the upper-torso and lower-torso similarity indices; the posture image with the minimum sum is the match for the image to be matched.
2. The method for matching the human torso posture in a 2D e-commerce image according to claim 1, wherein in step S2 the AlphaPose algorithm is used to extract the key points of the human torso.
3. The method for matching the human torso posture in a 2D e-commerce image according to claim 1, wherein in step S2,
the degrees of rotation of the upper torso in the three dimensions are calculated as follows: the ratio of the distances from the left and right shoulder points to the neck point is the degree of rotation of the upper torso in the Yaw direction; the ratio of the horizontal distance to the vertical distance between the left and right shoulder points is the degree of rotation of the upper torso in the Roll direction; the ratio Pa of the left-shoulder-to-left-hand distance to the neck-to-left-waist distance is calculated, then the ratio Pb of the arm length to the upper-body height, and Pa/Pb is the degree of rotation of the upper torso in the Pitch direction;
the degrees of rotation of the lower torso in the three dimensions are calculated as follows: the ratio of the distances from the left and right waist points to the neck point is the degree of rotation in the Yaw direction; the ratio of the horizontal distance to the vertical distance between the left waist point and the left knee point is the degree of rotation of the lower torso in the Roll direction; the ratio of the thigh length to the calf length is the degree of rotation of the lower torso in the Pitch direction.
4. The method for matching the human torso posture in a 2D e-commerce image according to claim 1, wherein in step S3 K-means is used for the clustering.
5. The method for matching the human torso posture in a 2D e-commerce image according to claim 1, wherein in step S6 the distance is the Euclidean distance.
6. The method for matching the human torso posture in a 2D e-commerce image according to claim 5, wherein the corresponding upper-torso key points are determined as follows: for any upper-torso key point of the image to be matched, the Euclidean distance to each upper-torso key point in the posture image is calculated, and the pair of key points with the minimum Euclidean distance is taken as a pair of corresponding key points.
7. The method for matching the human torso posture in a 2D e-commerce image according to claim 1, wherein in step S6 the distance is the Hamming distance.
8. The method for matching the human torso posture in a 2D e-commerce image according to claim 7, wherein the corresponding upper-torso key points are determined as follows: for any upper-torso key point of the image to be matched, the Hamming distance to each upper-torso key point in the posture image is calculated, and the pair of key points with the minimum Hamming distance is taken as a pair of corresponding key points.
9. The method for matching the human torso posture in a 2D e-commerce image according to claim 1, wherein in step S6 the distance is the sum of the Euclidean distance and the Hamming distance.
10. The method for matching the human torso posture in a 2D e-commerce image according to claim 9, wherein the corresponding upper-torso key points are determined as follows: for any upper-torso key point of the image to be matched, the sum of the Euclidean distance and the Hamming distance to each upper-torso key point in the posture image is calculated, and the pair of key points with the minimum sum is taken as a pair of corresponding key points.
Application CN202010071078.8A, filed 2020-01-21 (priority 2020-01-21): Human body trunk posture matching method in measurement 2d image — granted as CN111291656B (Active).

Priority Applications (1)

CN202010071078.8A — priority date 2020-01-21, filing date 2020-01-21 — Human body trunk posture matching method in measurement 2d image

Publications (2)

CN111291656A — published 2020-06-16
CN111291656B — granted 2023-06-02

Family

ID=71021328

Family Applications (1)

CN202010071078.8A — Active — granted as CN111291656B

Country Status (1)

CN: CN111291656B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101350064A (en) * 2008-08-29 2009-01-21 北京中星微电子有限公司 Method and apparatus for estimating two-dimension human body guise
CN102609684A (en) * 2012-01-16 2012-07-25 宁波江丰生物信息技术有限公司 Human body posture detection method and device
CN102831383A (en) * 2011-06-13 2012-12-19 索尼公司 Active behavior detecting method and device
CN105224921A (en) * 2015-09-17 2016-01-06 桂林远望智能通信科技有限公司 A kind of facial image preferentially system and disposal route
JP2018073385A (en) * 2016-10-22 2018-05-10 俊之 坂本 Image processing device and program
CN109345513A (en) * 2018-09-13 2019-02-15 红云红河烟草(集团)有限责任公司 Cigarette package defect detection method with cigarette package posture calculation function
CN109657631A (en) * 2018-12-25 2019-04-19 上海智臻智能网络科技股份有限公司 Human posture recognition method and device
CN109948505A (en) * 2019-03-14 2019-06-28 郑州大学 A kind of optimization method of human body three-dimensional attitude matching algorithm
CN110110593A (en) * 2019-03-27 2019-08-09 广州杰赛科技股份有限公司 Face Work attendance method, device, equipment and storage medium based on self study
CN110349206A (en) * 2019-07-18 2019-10-18 科大讯飞(苏州)科技有限公司 A kind of method and relevant apparatus of human body symmetrical detection

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9165199B2 (en) * 2007-12-21 2015-10-20 Honda Motor Co., Ltd. Controlled human pose estimation from depth image streams


Also Published As

Publication number Publication date
CN111291656A (en) 2020-06-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant