CN112990089A - Method for judging human motion posture - Google Patents

Method for judging human motion posture

Info

Publication number
CN112990089A
Authority
CN
China
Prior art keywords
parameters
differ
joint
distance
posture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110379043.5A
Other languages
Chinese (zh)
Other versions
CN112990089B (en)
Inventor
王桢桢
杨卓锐
赵志威
王曜怡
邓仁为
钱林俊
刘煜程
岑磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University
Original Assignee
Chongqing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University filed Critical Chongqing University
Priority to CN202110379043.5A priority Critical patent/CN112990089B/en
Publication of CN112990089A publication Critical patent/CN112990089A/en
Application granted granted Critical
Publication of CN112990089B publication Critical patent/CN112990089B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/23 Recognition of whole body movements, e.g. for sport training

Abstract

The invention relates to a method for judging human motion postures and belongs to the field of computer vision. The method comprises the following steps: S1: acquiring a motion video and processing each frame of image in the video; S2: extracting joint point data from the skeleton and calculating key parameters that describe the human body posture, including angle parameters, height parameters and distance parameters; S31: obtaining the posture key parameter sets of the exerciser and of the standard action by the calculation method for the posture key parameters of human motion; S32: calculating the differences between the parameters of the two sets that correspond to each other one by one in order; S33: determining the degree of difference between the exerciser's action and the standard action, and giving a posture adjustment scheme according to that degree of difference. The invention can accurately calculate the key parameters of the human motion posture, judge whether the human motion action is standard according to the key parameters, and then produce an effective action adjustment scheme.

Description

Method for judging human motion posture
Technical Field
The invention belongs to the field of computer vision, relates to the technical field of human body posture recognition, and particularly relates to a method for judging a human body motion posture.
Background
With the development of artificial intelligence technology, technologies such as character recognition and speech recognition have gradually matured, but understanding human postures is still very difficult.
Existing human posture recognition algorithms usually recognize the state of the human body at a single moment, and such methods can accurately recognize simple static postures such as "raising a hand" or "lifting a leg". In natural interpersonal communication, however, people tend to express more complex meanings through a series of combined actions. Recognizing the dynamic postures formed by such series of actions is a focus and a difficulty of current research on human posture recognition.
Existing posture recognition methods fall mainly into two categories: human posture recognition based on motion sensors and human posture recognition based on image analysis. Sensor-based recognition collects motion data by having the subject wear sensors, commonly accelerometers, magnetoresistive sensors and gyroscopes; after the sensors capture the subject's motion information, the posture is recognized with machine-learning methods. The recognition result is mainly affected by the feature extraction method, the sensors used and the choice of classifier, so the posture recognition is not accurate enough. In image-based analysis, the system extracts images of the subject as the features for study; most existing image-based methods analyze the contour features of the image with measures such as the image aspect ratio, the change in shape complexity and the eccentricity, combined with K-means or an SVM to judge the posture category. However, these traditional methods can only recognize the posture of a single individual and cannot provide a comparative analysis against standard actions.
In the prior art, whether the actions of a tested person are standard during a sports test or sports exercise is basically evaluated by manual judgment; existing computer models cannot make an objective and accurate evaluation.
Therefore, a method for effectively recognizing the motion posture of the human body and accurately evaluating whether the motion posture is standard is needed.
Disclosure of Invention
In view of the above, the present invention provides a method for judging a human motion posture, which is used to accurately calculate key parameters of the human motion posture, judge whether the human motion action is standard according to the key parameters, and then produce an action adjustment scheme.
In order to achieve the purpose, the invention provides the following technical scheme:
scheme 1: a method for calculating key parameters of human motion postures comprises the following steps:
s1: acquiring a motion video, and processing each frame of image in the video;
s2: extracting joint point data according to the skeleton, and calculating key parameters capable of describing human body postures, including angle parameters, height parameters and distance parameters;
the angle parameters include: the bending angle of the elbow joint, the bending angle of the knee joint and the bending angle of the waist;
the height parameters include: the ground clearance of the left elbow joint and the right elbow joint, the ground clearance of the left knee joint and the right knee joint and the ground clearance of the waist at the left side and the right side;
the distance parameters include: the lateral and longitudinal distances between the feet, the lateral and longitudinal distances between the arms, and the lateral and longitudinal distances between the knees.
Further, in step S2, the angle parameters are calculated mainly from the data of three related joint points; the main idea is to construct a triangle from the three joint points and to calculate the distances between each pair of them, i.e. the side lengths of the triangle.
Further, in step S2, the distance parameters are obtained by constructing a right triangle from the projections of the joint points in the horizontal and vertical directions; the transverse and longitudinal distances between the joint points are obtained by calculating the lengths of the two right-angle sides of this triangle.
Further, in step S2, the height parameters are the absolute values of the differences between the vertical coordinates of the related joint points.
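Purely as an illustration of how the key parameters listed above could be grouped in software (the field names, types and units are assumptions, not taken from the text), a compact container might look like the following Python sketch:

from dataclasses import dataclass
from typing import Tuple

@dataclass
class PostureKeyParameters:
    # Angle parameters, in degrees
    elbow_angle: float
    knee_angle: float
    waist_angle: float
    # Distance parameters as (lateral, longitudinal) pairs
    feet_distance: Tuple[float, float]
    arms_distance: Tuple[float, float]
    knees_distance: Tuple[float, float]
    # Height-above-ground parameters as (left, right) pairs
    elbow_height: Tuple[float, float]
    knee_height: Tuple[float, float]
    waist_height: Tuple[float, float]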
Scheme 2: a method for judging human motion gestures specifically comprises the following steps:
s31: obtaining the posture key parameter group P of the exerciser by the calculation method of the human motion posture key parameteruserAnd pose key parameter set P of standard actionstandard
S32: calculate PuserAnd PstandardThe difference values of the parameters which correspond to each other in sequence in the parameter group are difference;
s33: setting the degree of difference between the action of the exerciser and the standard action, and giving a posture adjustment scheme according to the degree of difference between the action and the standard action.
Further, in step S33, the posture adjustment scheme is:
(1) adjusting angle parameters:
when the difference in the elbow joint bending angle is differ degrees, the arm is moved outwards/inwards by approximately |differ|°, and the correct postures of the other parts of the body are kept unchanged;
when the difference in the knee joint bending angle is differ degrees, the body part above the knee joint is lifted upwards/pressed downwards by approximately |differ|°, and the correct postures of the other parts of the body are kept unchanged;
when the difference in the waist bending angle is differ degrees, the body part above the waist is moved towards the back/chest by approximately |differ|°, and the correct postures of the other parts of the body are kept unchanged;
(2) adjusting distance parameters:
when the difference in the transverse or longitudinal distance between the two feet is differ cm, the left foot and the right foot are translated outwards/inwards by approximately |differ|/2 cm each, and the correct postures of the other parts of the body are kept unchanged;
when the difference in the transverse or longitudinal distance between the arms is differ cm, the left arm and the right arm are translated outwards/inwards by approximately |differ|/2 cm each, and the correct postures of the other parts of the body are kept unchanged;
when the difference in the transverse or longitudinal distance between the two knees is differ cm, the left knee joint and the right knee joint are translated outwards/inwards by approximately |differ|/2 cm each, and the correct postures of the other parts of the body are kept unchanged;
(3) adjusting height parameters:
when the difference in the elbow joint height above the ground is differ cm, the left elbow joint and the right elbow joint are translated upwards/downwards by approximately |differ| cm, and the correct postures of the other parts of the body are kept unchanged;
when the difference in the knee joint height above the ground is differ cm, the left knee joint and the right knee joint are translated upwards/downwards by approximately |differ| cm, and the correct postures of the other parts of the body are kept unchanged;
and when the difference in the waist height above the ground is differ cm, the waist is translated upwards/downwards by approximately |differ| cm, and the correct postures of the other parts of the body are kept unchanged.
The invention has the following beneficial effects: it can accurately calculate the key parameters of the human motion posture, judge whether the human motion action is standard according to these key parameters, and then produce an effective action adjustment scheme. It gives the exerciser an efficient movement reference, so that the exerciser can adjust the posture in time, improving the safety and healthiness of exercise.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention may be realized and attained by the means of the instrumentalities and combinations particularly pointed out hereinafter.
Drawings
For the purposes of promoting a better understanding of the objects, aspects and advantages of the invention, reference will now be made to the following detailed description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a schematic diagram of angle parameters of human body posture;
FIG. 2 is a schematic diagram of distance parameters for human body gestures;
FIG. 3 is a schematic diagram of height parameters of a human body posture;
FIG. 4 is a schematic view of a triangle of a joint point when calculating angle parameters;
FIG. 5 is a schematic view of a triangle of a joint point when calculating a distance parameter;
FIG. 6 is a schematic view of the joint point for calculating the height parameter;
FIG. 7 is a diagram of a joint point for a human body posture.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention in a schematic way, and the features in the following embodiments and examples may be combined with each other without conflict.
The drawings are provided only to illustrate the invention and are not intended to limit it. To better illustrate the embodiments of the present invention, some parts of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and their descriptions may be omitted.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if there is an orientation or positional relationship indicated by terms such as "upper", "lower", "left", "right", "front", "rear", etc., based on the orientation or positional relationship shown in the drawings, it is only for convenience of description and simplification of description, but it is not an indication or suggestion that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and therefore, the terms describing the positional relationship in the drawings are only used for illustrative purposes, and are not to be construed as limiting the present invention, and the specific meaning of the terms may be understood by those skilled in the art according to specific situations.
Referring to fig. 1 to 7, the present invention provides a method for determining a human body movement posture, which includes a posture key parameter calculation stage and a stage of determining whether the human body movement posture is standard.
1. Posture key parameter calculation stage
In order to let the exerciser understand more intuitively the difference between his or her own action and the standard action, the present embodiment processes each frame of image extracted from the standard-action video and the test video. From the joint point data obtained in the skeleton extraction stage, the key parameters that describe the human body posture are calculated, comprising angle parameters, height parameters and distance parameters; the specific parameters are shown in Table 1.
TABLE 1 key parameters table of human body posture
Angle parameters | Distance parameters | Height parameters
Bending angle of the elbow joint | Lateral and longitudinal distance between the feet | Height of the left and right elbow joints above the ground
Bending angle of the knee joint | Lateral and longitudinal distance between the arms | Height of the left and right knee joints above the ground
Bending angle of the waist | Lateral and longitudinal distance between the knees | Height of the waist on the left and right sides above the ground
Specific meanings of the above parameters are shown in FIGS. 1 to 3.
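Before these parameters can be computed, per-frame joint coordinates must be extracted from the video (step S1 and the skeleton extraction stage mentioned above). The patent does not name a specific skeleton-extraction tool; purely as an illustration, the sketch below uses OpenCV and MediaPipe Pose, both of which are assumptions rather than the claimed method, to obtain per-frame joint coordinates in pixel units:

import cv2
import mediapipe as mp

def extract_joints(video_path):
    """Return, for every frame, a dict {landmark_index: (x, y)} in pixel coordinates, or None."""
    frames_joints = []
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            h, w = frame.shape[:2]
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks is None:
                frames_joints.append(None)  # no person detected in this frame
                continue
            frames_joints.append({i: (lm.x * w, lm.y * h)
                                  for i, lm in enumerate(result.pose_landmarks.landmark)})
    cap.release()
    return frames_joints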
From the joint point data acquired in the skeleton extraction stage, the calculation formulas of the above 15 human body posture key parameters can be obtained. The angle parameters include the bending angles of the elbow joint, the knee joint and the waist; each angle parameter is calculated mainly from the data of three related joint points, the main idea being to construct a triangle from the three joint points, as shown in Fig. 4.
According to the triangle shown in Fig. 4, the distances between each pair of the three joint points, i.e. the side lengths A, B and C of the triangle, are calculated as follows:
A = √((x_b - x_c)² + (y_b - y_c)²)
B = √((x_a - x_c)² + (y_a - y_c)²)
C = √((x_a - x_b)² + (y_a - y_b)²)
where (x_a, y_a), (x_b, y_b) and (x_c, y_c) denote the coordinates of the three joint points a, b and c, and each side is labelled with the joint point opposite to it.
Then the bending angle α at joint point a is calculated with the law of cosines:
cos α = (B² + C² - A²) / (2BC)
α = arccos((B² + C² - A²) / (2BC))
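A minimal Python sketch of this angle computation (an illustration under the labelling above, not the patent's implementation), taking the measured joint as vertex a and its two adjacent joints as b and c, e.g. the elbow with the shoulder and the wrist for the elbow bending angle:

import math

def bend_angle(a, b, c):
    """Bending angle alpha (degrees) at joint a, where a, b, c are (x, y) joint coordinates."""
    A = math.dist(b, c)  # side opposite joint a
    B = math.dist(a, c)  # side opposite joint b
    C = math.dist(a, b)  # side opposite joint c
    cos_alpha = (B ** 2 + C ** 2 - A ** 2) / (2 * B * C)
    cos_alpha = max(-1.0, min(1.0, cos_alpha))  # guard against rounding just outside [-1, 1]
    return math.degrees(math.acos(cos_alpha))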
The distance parameters among the key parameters of the human body posture mainly comprise the transverse and longitudinal distances between the feet, between the arms and between the knees. In most practical situations the foot, elbow and knee joints are not on the same horizontal line, so the distance cannot be calculated simply with the formula for the distance between two points; instead, following the idea of the angle calculation, a triangle is constructed and the distances between the joint points are obtained from its side lengths.
Fig. 5 shows a triangle constructed by the projections of the joint points in the horizontal and vertical directions, and it can be known from fig. 5 that the transverse distance HD and the longitudinal distance VD between the joint points can be obtained by calculating the side lengths of the two right-angle sides of the triangle.
The calculation formulas of the transverse distance HD and the longitudinal distance VD between the joint points are respectively:
HD = |x_2 - x_1|
VD = |y_2 - y_1|
The height parameters among the key parameters of the human posture mainly comprise the heights of the elbow joints, the knee joints and the waist above the ground. As shown in Fig. 6, every height parameter has a left-side and a right-side component: the elbow height comprises the left and right elbow heights above the ground, the knee height comprises the left and right knee heights above the ground, and the waist height comprises the left and right waist heights above the ground.
Compared with the angle and distance parameters, the calculation of the height parameters is simpler: each height is the absolute value of the difference between the vertical coordinates of the related joint points.
FIG. 7 is a diagram of the joint points of the human body posture, in which the subscript r indicates that a joint point belongs to the right side of the body and l indicates that it belongs to the left side.
From Fig. 7, the height of the left elbow joint above the ground H_zl, the height of the right elbow joint above the ground H_zr, the height of the left knee joint above the ground H_xl, the height of the right knee joint above the ground H_xr, the height of the left waist above the ground H_yl and the height of the right waist above the ground H_yr are easily obtained as follows:
H_zl = |y_l1 - y_l4|
H_zr = |y_r1 - y_r4|
H_xl = |y_l3 - y_l4|
H_xr = |y_r3 - y_r4|
H_yl = |y_l2 - y_l4|
H_yr = |y_r2 - y_r4|
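A short Python sketch of these distance and height computations (illustrative only; the choice of the foot joint as the ground reference and any pixel-to-centimetre calibration are assumptions not fixed by the text):

def lateral_longitudinal(p1, p2):
    """Transverse distance HD and longitudinal distance VD between two joints given as (x, y)."""
    return abs(p2[0] - p1[0]), abs(p2[1] - p1[1])

def height_above_ground(joint, ground_reference):
    """Height of a joint above the ground, taken as the absolute difference of vertical coordinates.
    ground_reference is assumed here to be the foot joint on the same side (index 4 in Fig. 7)."""
    return abs(joint[1] - ground_reference[1])

# Note: these values are in pixels; converting them to centimetres requires a calibration
# factor (e.g. a known body length), which the text does not specify.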
Using the above formulas, the key parameter values of the human body posture can be obtained. By comparing the angle, distance and height parameter values in the key image frames of the test video with those in the key image frames of the standard video, the body parts where the exerciser's action differs from the action in the standard library can be identified accurately, and the difference can be quantified. For example, if an exerciser's elbow bending angle is larger than that of the body in the standard library, a text prompt can be output to the user, such as: the elbow joint bending angle is too large, and it is recommended to reduce it.
2. Stage for judging whether human body motion posture is standard or not
In order to let the exerciser know the specific problems of his or her movement posture more intuitively, this embodiment processes the results obtained in the posture key parameter calculation stage, sets threshold values, and generates a detailed text description. The specific implementation is as follows.
First, assume that the key parameters of the human body posture in the key image frames of the exerciser's video and of the standard-library video have been obtained through the posture key parameter calculation stage:
P_user = {α_zu, α_xu, α_yu, hd_fu, vd_fu, hd_au, vd_au, hd_ku, vd_ku, h_zlu, h_zru, h_xlu, h_xru, h_ylu, h_yru}
P_standard = {α_zs, α_xs, α_ys, hd_fs, vd_fs, hd_as, vd_as, hd_ks, vd_ks, h_zls, h_zrs, h_xls, h_xrs, h_yls, h_yrs}
where P_user denotes the posture key parameter set of the exerciser and P_standard denotes the posture key parameter set of the standard action; the specific meaning of each parameter is given in the description of the posture key parameter calculation stage.
Then the differences differ between the parameters of P_user and P_standard that correspond to each other one by one in order are calculated; the general formula is:
differ = x_u - x_s
where x_u denotes a parameter value in the set P_user and x_s denotes the corresponding parameter value in the set P_standard.
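Expressed in code, this step is an element-wise subtraction over the two ordered parameter sets; a minimal sketch, assuming both sets are stored as equally ordered lists of numbers:

def parameter_differences(p_user, p_standard):
    """Element-wise differ = x_u - x_s for two key-parameter lists ordered identically."""
    if len(p_user) != len(p_standard):
        raise ValueError("parameter sets must have the same length and ordering")
    return [x_u - x_s for x_u, x_s in zip(p_user, p_standard)]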
Next, output thresholds corresponding to explanatory text of different degrees are set. The degree refers to how much the exerciser's action differs from the standard action, for example: slightly too large, too large, much too large. According to the calculated differ value, explanatory text of the corresponding degree is output. The text should indicate the body part where the difference occurs, a specific description of the difference (the degree of deviation from the standard action) and an adjustment scheme, for example: the left elbow joint of the upper limb should be bent at a slightly smaller angle and the forearm moved outward by about 5°; the right knee joint of the lower limb is relatively high above the ground and should be pressed downward by about 10 cm; the waist should be bent at a larger angle and the upper body bent downward by about 15°. The exerciser can set a fitness level, with high-level, middle-level and primary as the options; the lower the level, the higher the output thresholds corresponding to the explanatory text. A primary-level exerciser is taken as an example below to give the specific explanatory text, the corresponding thresholds and the adjustment schemes; for middle-level and high-level exercisers, the explanatory text and adjustment schemes are the same as for the primary level, the only difference being the threshold values.
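The following Python sketch illustrates how such threshold-based text could be generated. The numeric thresholds of Tables 2 and 3 are published only as images and are not reproduced in this text, so the threshold values and wording below are placeholders chosen for illustration rather than the patent's actual values:

# Illustrative sketch only: thresholds are placeholders, not the values of Tables 2 and 3.
SEVERITY_LEVELS = [  # (minimum |differ|, wording), checked from strongest to mildest
    (20.0, "much too"),
    (10.0, "too"),
    (5.0, "slightly too"),
]

def describe(parameter_name, differ, unit):
    """Return explanatory text for one key parameter, or a within-tolerance message."""
    magnitude = abs(differ)
    for threshold, degree in SEVERITY_LEVELS:
        if magnitude >= threshold:
            direction = "large" if differ > 0 else "small"
            return (f"The {parameter_name} is {degree} {direction} "
                    f"(about {magnitude:.1f} {unit} away from the standard action).")
    return f"The {parameter_name} is within tolerance."

# Example corresponding to the worked case below:
# describe("transverse distance between the feet", -7, "cm")
# -> "The transverse distance between the feet is slightly too small (about 7.0 cm away from the standard action)."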
For a primary-level exerciser, when the exerciser's action differs greatly from the standard action, explanatory text is output to prompt the exerciser that the action is wrong and an adjustment scheme is given; the output thresholds are shown in Table 2.
Table 2 Output explanatory text and corresponding thresholds (primary level)
For the middle and high exercisers, the thresholds are shown in table 3;
TABLE 3 MEDIUM AND HIGH-LEVEL EXERCISE THRESHOLD PARAMETERS
According to the threshold values and the explanatory text in Table 2, the corresponding adjustment scheme can be provided to the exerciser. The detailed adjustment scheme, which indicates how the exerciser should adjust his or her posture, is shown in Table 4.
Table 4 posture adjustment scheme table
According to Tables 2, 3 and 4, the present embodiment can point out the problems in the motion postures of exercisers of different levels and give the corresponding detailed adjustment schemes; two cases are given as examples:
1) A primary-level exerciser: the transverse distance between the exerciser's feet hd_fu and the transverse distance between the feet in the corresponding standard action hd_fs are obtained by calculation. If the difference value differ is -7 cm, the explanatory text for the differing body part is: the transverse distance between the feet is slightly too small; and the corresponding adjustment scheme is: translate the left foot and the right foot outwards by about 3.5 cm each, keeping the correct posture of all other body parts unchanged during the movement.
2) A middle-level exerciser: the bending angle of the exerciser's waist α_yu and the bending angle of the waist in the corresponding standard action α_ys are obtained by calculation. If the difference value differ is 13°, the explanatory text for the differing body part is: the bending angle of the waist is too large; and the corresponding adjustment scheme is: move the body parts above the waist by about 13° towards the chest, keeping the correct posture of all body parts unchanged during the movement.
Finally, the above embodiments are only intended to illustrate the technical solutions of the present invention and not to limit the present invention, and although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions, and all of them should be covered by the claims of the present invention.

Claims (6)

1. A method for calculating key parameters of human motion postures is characterized by comprising the following steps:
s1: acquiring a motion video, and processing each frame of image in the video;
s2: extracting joint point data according to the skeleton, and calculating key parameters capable of describing human body postures, including angle parameters, height parameters and distance parameters;
the angle parameters include: the bending angle of the elbow joint, the bending angle of the knee joint and the bending angle of the waist;
the height parameters include: the ground clearance of the left elbow joint and the right elbow joint, the ground clearance of the left knee joint and the right knee joint and the ground clearance of the waist at the left side and the right side;
the distance parameters include: the lateral and longitudinal distances between the feet, the lateral and longitudinal distances between the arms, and the lateral and longitudinal distances between the knees.
2. The method as claimed in claim 1, wherein in step S2, the angle parameters are calculated from the data of three related joint points; specifically, a triangle is constructed from the three joint points and the distances between each pair of the three joint points, i.e. the side lengths of the triangle, are calculated.
3. The method as claimed in claim 1, wherein in step S2, the distance parameters are obtained by constructing a right triangle from the projections of the joint points in the horizontal and vertical directions, and the transverse and longitudinal distances between the joint points are obtained by calculating the lengths of the two right-angle sides of the triangle.
4. The method as claimed in claim 1, wherein in step S2, the height parameters are the absolute values of the differences between the vertical coordinates of the related joint points.
5. A method for judging the motion posture of a human body is characterized by comprising the following steps:
S31: obtaining, by the method for calculating key parameters of human motion postures according to any one of claims 1 to 4, the posture key parameter set P_user of the exerciser and the posture key parameter set P_standard of the standard action;
S32: calculating the differences differ between the parameters of P_user and P_standard that correspond to each other one by one in order;
S33: determining the degree of difference between the exerciser's action and the standard action, and giving a posture adjustment scheme according to that degree of difference.
6. The method for judging a human motion posture according to claim 5, wherein in step S33, the posture adjustment scheme is:
(1) adjusting angle parameters:
when the difference in the elbow joint bending angle is differ degrees, moving the arm outwards/inwards by |differ|°, and keeping the correct postures of the other parts of the body unchanged;
when the difference in the knee joint bending angle is differ degrees, lifting the body part above the knee joint upwards/pressing it downwards by |differ|°, and keeping the correct postures of the other parts of the body unchanged;
when the difference in the waist bending angle is differ degrees, moving the body part above the waist towards the back/chest by |differ|°, and keeping the correct postures of the other parts of the body unchanged;
(2) adjusting distance parameters:
when the difference in the transverse or longitudinal distance between the two feet is differ cm, translating the left foot and the right foot outwards/inwards by |differ|/2 cm each, and keeping the correct postures of the other parts of the body unchanged;
when the difference in the transverse or longitudinal distance between the arms is differ cm, translating the left arm and the right arm outwards/inwards by |differ|/2 cm each, and keeping the correct postures of the other parts of the body unchanged;
when the difference in the transverse or longitudinal distance between the two knees is differ cm, translating the left knee joint and the right knee joint outwards/inwards by |differ|/2 cm each, and keeping the correct postures of the other parts of the body unchanged;
(3) adjusting height parameters:
when the difference in the elbow joint height above the ground is differ cm, translating the left elbow joint and the right elbow joint upwards/downwards by |differ| cm, and keeping the correct postures of the other parts of the body unchanged;
when the difference in the knee joint height above the ground is differ cm, translating the left knee joint and the right knee joint upwards/downwards by |differ| cm, and keeping the correct postures of the other parts of the body unchanged;
and when the difference in the waist height above the ground is differ cm, translating the waist upwards/downwards by |differ| cm, and keeping the correct postures of the other parts of the body unchanged.
CN202110379043.5A 2021-04-08 2021-04-08 Method for judging human motion gesture Active CN112990089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110379043.5A CN112990089B (en) 2021-04-08 2021-04-08 Method for judging human motion gesture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110379043.5A CN112990089B (en) 2021-04-08 2021-04-08 Method for judging human motion gesture

Publications (2)

Publication Number Publication Date
CN112990089A true CN112990089A (en) 2021-06-18
CN112990089B CN112990089B (en) 2023-09-26

Family

ID=76339530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110379043.5A Active CN112990089B (en) 2021-04-08 2021-04-08 Method for judging human motion gesture

Country Status (1)

Country Link
CN (1) CN112990089B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114974506A (en) * 2022-05-17 2022-08-30 重庆大学 Human body posture data processing method and system


Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10261090A (en) * 1997-03-19 1998-09-29 Tokyo Electric Power Co Inc:The Motion capture system
CN105608467A (en) * 2015-12-16 2016-05-25 西北工业大学 Kinect-based non-contact type student physical fitness evaluation method
CN110321754A (en) * 2018-03-28 2019-10-11 西安铭宇信息科技有限公司 A kind of human motion posture correcting method based on computer vision and system
CN108537284A (en) * 2018-04-13 2018-09-14 东莞松山湖国际机器人研究院有限公司 Posture assessment scoring method based on computer vision deep learning algorithm and system
US20200245904A1 (en) * 2019-01-31 2020-08-06 Konica Minolta, Inc. Posture estimation device, behavior estimation device, storage medium storing posture estimation program, and posture estimation method
CN110245623A (en) * 2019-06-18 2019-09-17 重庆大学 A kind of real time human movement posture correcting method and system
CN110969114A (en) * 2019-11-28 2020-04-07 四川省骨科医院 Human body action function detection system, detection method and detector
CN111883229A (en) * 2020-07-31 2020-11-03 焦点科技股份有限公司 Intelligent movement guidance method and system based on visual AI
CN112329513A (en) * 2020-08-24 2021-02-05 苏州荷露斯科技有限公司 High frame rate 3D (three-dimensional) posture recognition method based on convolutional neural network
CN111931733A (en) * 2020-09-25 2020-11-13 西南交通大学 Human body posture detection method based on depth camera
CN112200074A (en) * 2020-10-09 2021-01-08 广州健康易智能科技有限公司 Attitude comparison method and terminal
CN112184898A (en) * 2020-10-21 2021-01-05 安徽动感智能科技有限公司 Digital human body modeling method based on motion recognition
CN112381002A (en) * 2020-11-16 2021-02-19 深圳技术大学 Human body risk posture identification method and system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DIANCHEN HE et al.: "A New Kinect-Based Posture Recognition Method in Physical Sports Training Based on Urban Data", Wireless Communications and Mobile Computing, pages 1-9 *
李杰: "Research on a real-time human body posture capture system based on motion sensing" (in Chinese), China Master's Theses Full-text Database, Information Science and Technology series, no. 2, pages 138-4090 *
李鑫 et al.: "A Kinect-based sports self-training system" (in Chinese), Computer Technology and Development, no. 4, pages 128-133 *
韩艳杰: "A new human body posture recognition system based on 3D printing and fiber Bragg grating sensing technology" (in Chinese), China Master's Theses Full-text Database, Information Science and Technology series, no. 1, pages 140-1175 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114974506A (en) * 2022-05-17 2022-08-30 重庆大学 Human body posture data processing method and system
CN114974506B (en) * 2022-05-17 2024-05-03 重庆大学 Human body posture data processing method and system

Also Published As

Publication number Publication date
CN112990089B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN110969114B (en) Human body action function detection system, detection method and detector
CN111144217B (en) Motion evaluation method based on human body three-dimensional joint point detection
CN112069933A (en) Skeletal muscle stress estimation method based on posture recognition and human body biomechanics
WO2018120964A1 (en) Posture correction method based on depth information and skeleton information
CN114399826A (en) Image processing method and apparatus, image device, and storage medium
CN102567703B (en) Hand motion identification information processing method based on classification characteristic
CN102184541A (en) Multi-objective optimized human body motion tracking method
CN108875586B (en) Functional limb rehabilitation training detection method based on depth image and skeleton data multi-feature fusion
CN109766782B (en) SVM-based real-time limb action recognition method
CN112668531A (en) Motion posture correction method based on motion recognition
CN112435731B (en) Method for judging whether real-time gesture meets preset rules
CN106815855A (en) Based on the human body motion tracking method that production and discriminate combine
CN113065505A (en) Body action rapid identification method and system
Yamao et al. Development of human pose recognition system by using raspberry pi and posenet model
He et al. A new Kinect-based posture recognition method in physical sports training based on urban data
Goyal et al. Yoga pose perfection using deep learning: An algorithm to estimate the error in yogic poses
CN112990089B (en) Method for judging human motion gesture
Li et al. Posture recognition technology based on kinect
KR102013705B1 (en) Apparatus and method for recognizing user's posture in horse-riding simulator
CN106295616A (en) Exercise data analyses and comparison method and device
CN115006822A (en) Intelligent fitness mirror control system
CN112233769A (en) Recovery system after suffering from illness based on data acquisition
CN116266415A (en) Action evaluation method, system and device based on body building teaching training and medium
CN114360060B (en) Human body action recognition and counting method
Kishore et al. Smart yoga instructor for guiding and correcting yoga postures in real time

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant