CN112364694A - Human body sitting posture identification method based on key point detection - Google Patents

Human body sitting posture identification method based on key point detection

Info

Publication number
CN112364694A
Authority
CN
China
Prior art keywords
sitting posture
deviation
value
data set
coordinates
Prior art date
Legal status
Granted
Application number
CN202011088718.2A
Other languages
Chinese (zh)
Other versions
CN112364694B (en)
Inventor
郑佳罄
石守东
胡加钿
房志远
Current Assignee
Ningbo University
Original Assignee
Ningbo University
Priority date
Filing date
Publication date
Application filed by Ningbo University
Priority to CN202011088718.2A
Publication of CN112364694A
Application granted
Publication of CN112364694B
Legal status: Active

Classifications

    • G06V40/20: Recognition of biometric, human-related or animal-related patterns in image or video data; movements or behaviour, e.g. gesture recognition
    • A61B5/1116: Measuring movement of the entire body or parts thereof for diagnostic purposes; determining posture transitions
    • A61B5/1128: Measuring movement of the entire body or parts thereof using image analysis

Abstract

The invention discloses a human body sitting posture identification method based on key point detection. Four decision thresholds T1, T2, T3 and T4 are determined in advance from key point coordinates. When a human sitting posture is detected, the key point coordinates of the person's correct sitting posture are first acquired as a reference; the person's real-time key point coordinates are then combined with the four decision thresholds to judge the sitting posture in real time. When the same sitting posture is judged incorrect 3 times in a row, a voice broadcast reminds the user; when more than two incorrect sitting postures each appear 3 times in a row, the voice broadcast announces the sitting posture with the highest priority. The method has a simple implementation process, low demands on hardware computing power, high practicability, low cost, high real-time performance and good interactivity, and can later be ported to embedded devices.

Description

Human body sitting posture identification method based on key point detection
Technical Field
The invention relates to a human body sitting posture identification method, in particular to a human body sitting posture identification method based on key point detection.
Background
In work and daily life people spend most of their time sitting, and pay little attention when they adopt incorrect sitting postures; over the long term, incorrect sitting postures can cause scoliosis, cervical spondylosis, myopia and a series of complications. A good sitting posture matters greatly for people's quality of life, working efficiency and physical and mental health, and correctly recognizing a person's sitting posture can help them form good sitting habits. For this reason, human sitting posture recognition technology has been widely studied.
Most existing human sitting posture recognition technologies are based on machine learning. For example, Chinese patent application publication No. CN111414780A discloses a human sitting posture recognition method that collects images of the user's sitting posture in real time, recognizes human-body feature key points, and computes current sitting posture data from the key point data. The key point data include eye, mouth, neck and shoulder coordinates; the current sitting posture data include the current head inclination angle, the current shoulder inclination angle, the current height difference between the neck and the face, and the current height difference between the shoulders and the face. The current sitting posture data are finally compared with standard sitting posture data to decide whether the current sitting posture is abnormal. The standard sitting posture data comprise a standard head inclination angle threshold, a standard shoulder inclination angle threshold, a standard eyes-too-close difference-ratio threshold and a standard lying-on-desk difference-ratio threshold, all four obtained by big-data training with a supervised machine-learning classification algorithm. Supervised classification places a high demand on hardware computing power, needs a large amount of data during training to guarantee accuracy, and takes a certain time to compute a result. Consequently, implementing such a method demands powerful and costly hardware, requires a great deal of time to produce large training sets, has a complex implementation process, spends more time computing results during recognition, and is not very real-time.
Disclosure of Invention
The technical problem the invention aims to solve is to provide a human body sitting posture identification method based on key point detection that has a simple implementation process, low demands on hardware computing power, high practicability, low cost, high real-time performance and good interactivity.
The technical scheme adopted by the invention for solving the technical problems is as follows: a human body sitting posture identification method based on key point detection comprises the following steps:
(1) Provide a PC pre-loaded with an image processing program, an infrared distance measuring sensor and a camera, and connect the infrared distance measuring sensor and the camera to the PC. The infrared distance measuring sensor and the camera lie in the same vertical plane, no more than 5 cm apart. Establish a coordinate system with the upper left corner of the picture captured by the camera in real time as the origin, the horizontal rightward direction as the positive x-axis and the vertical downward direction as the positive y-axis. Four decision thresholds T1, T2, T3 and T4 are pre-stored in the PC, determined in advance by the following method:
Step 1-1, divide sitting-posture behaviour into 9 categories: too close, too far, head leaning left, head leaning right, body leaning left, body leaning right, shoulders not parallel, spine bent, and correct sitting posture;
Step 1-2, select 120 females 120 cm-180 cm tall and 120 males 130 cm-190 cm tall as pre-inspectors; the 120 cm-180 cm range is divided into six 10 cm bands with 20 females per band, and the 130 cm-190 cm range into six 10 cm bands with 20 males per band. Randomly number the 240 pre-inspectors 1-240, and call the pre-inspector numbered i the i-th pre-inspector, i = 1, 2, …, 240;
Step 1-3, pre-test the 240 pre-inspectors respectively, the specific process being as follows:
S1, the camera squarely faces the pre-inspector's face at a distance of 30-50 cm, and the pre-inspector's face and shoulders must not be obstructed;
S2, each pre-inspector adopts, in front of the camera and in turn, 7 sitting postures: correct sitting posture, head leaning left, head leaning right, body leaning left, body leaning right, spine bent and shoulders not parallel; the camera photographs the pre-inspector's 7 sitting postures and sends the images to the PC. The 7 sitting postures are numbered 1-7 in the above order, and the sitting posture numbered j is called the j-th sitting posture, j = 1, 2, …, 7. The correct sitting posture keeps the waist and back naturally straight, the chest open, the shoulders level, and the neck, chest and waist upright; the other 6 sitting postures are performed according to each person's ordinary habits;
S3, on the PC, use the image processing program to acquire and record the coordinates of 6 key points, namely the left eye pupil, right eye pupil, nose tip, neck (the concave point where the two clavicles meet), left shoulder and right shoulder, of each pre-inspector in each of the 7 sitting postures, obtaining 240 sets of coordinate data; each set of coordinate data contains one pre-inspector's left eye pupil, right eye pupil, nose tip, neck, left shoulder and right shoulder coordinates in the 7 sitting postures. For the i-th pre-inspector in the j-th sitting posture, the left eye pupil coordinates are recorded as $(lx_i^j, ly_i^j)$, the right eye pupil coordinates as $(rx_i^j, ry_i^j)$, the nose tip coordinates as $(nx_i^j, ny_i^j)$, the neck coordinates as $(bx_i^j, by_i^j)$, the left shoulder coordinates as $(lsx_i^j, lsy_i^j)$, and the right shoulder coordinates as $(rsx_i^j, rsy_i^j)$;
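For concreteness, the 240 sets of coordinate data of S3 can be held in a small record type; a minimal Python sketch (the names are illustrative, not from the patent):

```python
from typing import Dict, NamedTuple, Tuple

class Keypoints(NamedTuple):
    """The 6 key points of one person in one sitting posture (pixel coordinates, y grows downward)."""
    left_eye: Tuple[float, float]        # left eye pupil (lx, ly)
    right_eye: Tuple[float, float]       # right eye pupil (rx, ry)
    nose: Tuple[float, float]            # nose tip (nx, ny)
    neck: Tuple[float, float]            # concave point between the clavicles (bx, by)
    left_shoulder: Tuple[float, float]   # (lsx, lsy)
    right_shoulder: Tuple[float, float]  # (rsx, rsy)

# coords[(i, j)]: key points of pre-inspector i (1..240) in sitting posture j (1..7)
coords: Dict[Tuple[int, int], Keypoints] = {}
```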
S4, for the i-th pre-inspector, take the leftward offset of the left eye on the x-axis when the body leans left as the left-lean deviation, recorded as $\Delta L_i$; the rightward offset of the right eye on the x-axis when the body leans right as the right-lean deviation, recorded as $\Delta R_i$; the downward offset of the neck on the y-axis when the spine is bent as the neck deviation, recorded as $\Delta C_i$; and the difference of the two shoulder key points on the y-axis when the shoulders are not parallel as the shoulder deviation, recorded as $\Delta H_i$. Compute $\Delta L_i$, $\Delta R_i$, $\Delta C_i$ and $\Delta H_i$ by formulas (1), (2), (3) and (4):

$\Delta L_i = lx_i^1 - lx_i^4$ (1)

$\Delta R_i = rx_i^5 - rx_i^1$ (2)

$\Delta C_i = by_i^6 - by_i^1$ (3)

$\Delta H_i = |lsy_i^7 - rsy_i^7|$ (4)

where the superscripts are the posture numbers of S2 (1: correct sitting posture; 4: body leaning left; 5: body leaning right; 6: spine bent; 7: shoulders not parallel), and | | in formula (4) is the absolute-value symbol;
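Under the posture numbering of S2 and the reconstruction of formulas (1)-(4) above, the four deviations are plain coordinate differences; a sketch reusing the `coords` mapping and `Keypoints` record from the S3 sketch:

```python
def deviations(coords, i):
    """Return (dL, dR, dC, dH) for pre-inspector i, per formulas (1)-(4)."""
    correct = coords[(i, 1)]                                   # posture 1: correct sitting posture
    dL = correct.left_eye[0] - coords[(i, 4)].left_eye[0]      # (1) leftward x-shift, body leaning left
    dR = coords[(i, 5)].right_eye[0] - correct.right_eye[0]    # (2) rightward x-shift, body leaning right
    dC = coords[(i, 6)].neck[1] - correct.neck[1]              # (3) downward y-shift of neck, spine bent
    p7 = coords[(i, 7)]                                        # posture 7: shoulders not parallel
    dH = abs(p7.left_shoulder[1] - p7.right_shoulder[1])       # (4) |y-difference of the two shoulders|
    return dL, dR, dC, dH
```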
S5, pool the 240 sets of coordinate data and regroup them by sitting-posture category into 7 groups, obtaining 7 groups of sitting-posture data; each group contains the left eye pupil, right eye pupil, nose tip, neck, left shoulder and right shoulder coordinates of all 240 pre-inspectors in that sitting posture;
S6, determine the decision thresholds T1, T2, T3 and T4 separately, where the decision threshold T1 is determined as follows:
A. Form an original data set from the 240 left-lean deviations $\Delta L_1$~$\Delta L_{240}$ and take it as the generation-0 data set;
B. Set an iteration variable t and initialize it to t = 1;
C. Perform the t-th iteration update to obtain the generation-t data set, as follows:
C1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
C2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T1 from the mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest left-lean deviation in the generation-(t-1) data set as the decision threshold T1; if $K_{t-1}$ is greater than 3, compute the squared difference between each left-lean deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the left-lean deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step C and perform the next iteration, until $K_{t-1}$ is not greater than 3;
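The same kurtosis-trimming loop serves all four thresholds, so it is worth a sketch. Note that the patent gives the threshold of the near-normal branch as a formula image that is not recoverable here, so the `mu + 3 * sigma` below is only an assumption (the classical three-sigma bound); everything else follows steps A-C above:

```python
import statistics

def kurtosis(xs):
    """Population kurtosis; equals 3 for a normal distribution."""
    mu = statistics.fmean(xs)
    sigma = statistics.pstdev(xs)
    return sum((x - mu) ** 4 for x in xs) / (len(xs) * sigma ** 4)

def trim_threshold(data):
    """Steps A-C: trim the worst outlier until kurtosis <= 3, then set the threshold."""
    xs = list(data)                           # generation-0 data set
    while True:
        mu = statistics.fmean(xs)             # mean of the generation-(t-1) data set
        sigma = statistics.pstdev(xs)         # standard deviation of the same set
        k = kurtosis(xs)
        if k <= 3:
            if abs(k - 3) <= 1:
                return mu + 3 * sigma         # ASSUMPTION: the original gives this as a formula image
            return max(xs)                    # far from normal: take the largest deviation as the threshold
        # k > 3: delete the sample with the largest squared difference from the mean
        xs.remove(max(xs, key=lambda x: (x - mu) ** 2))
```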
The decision threshold T2 is determined as follows:
D. Form an original data set from the 240 right-lean deviations $\Delta R_1$~$\Delta R_{240}$ and take it as the generation-0 data set;
E. Set an iteration variable t and initialize it to t = 1;
F. Perform the t-th iteration update to obtain the generation-t data set, as follows:
F1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
F2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T2 from $\mu_{t-1}$ and $\sigma_{t-1}$; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest right-lean deviation in the generation-(t-1) data set as the decision threshold T2; if $K_{t-1}$ is greater than 3, compute the squared difference between each right-lean deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the right-lean deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step F and perform the next iteration, until $K_{t-1}$ is not greater than 3;
The decision threshold T3 is determined as follows:
G. Form an original data set from the 240 neck deviations $\Delta C_1$~$\Delta C_{240}$ and take it as the generation-0 data set;
H. Set an iteration variable t and initialize it to t = 1;
I. Perform the t-th iteration update to obtain the generation-t data set, as follows:
I1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
I2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T3 from $\mu_{t-1}$ and $\sigma_{t-1}$; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest neck deviation in the generation-(t-1) data set as the decision threshold T3; if $K_{t-1}$ is greater than 3, compute the squared difference between each neck deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the neck deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step I and perform the next iteration, until $K_{t-1}$ is not greater than 3;
The decision threshold T4 is determined as follows:
J. Form an original data set from the 240 shoulder deviations $\Delta H_1$~$\Delta H_{240}$ and take it as the generation-0 data set;
K. Set an iteration variable t and initialize it to t = 1;
L. Perform the t-th iteration update to obtain the generation-t data set, as follows:
L1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
L2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T4 from $\mu_{t-1}$ and $\sigma_{t-1}$; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest shoulder deviation in the generation-(t-1) data set as the decision threshold T4; if $K_{t-1}$ is greater than 3, compute the squared difference between each shoulder deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the shoulder deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step L and perform the next iteration, until $K_{t-1}$ is not greater than 3;
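Assuming the `deviations` and `trim_threshold` sketches above, the four thresholds then follow in two lines:

```python
# ASSUMES: coords, deviations (S4 sketch) and trim_threshold (S6 sketch) as defined above.
dLs, dRs, dCs, dHs = zip(*(deviations(coords, i) for i in range(1, 241)))
T1, T2, T3, T4 = (trim_threshold(s) for s in (dLs, dRs, dCs, dHs))
```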
(2) When the user's sitting posture needs to be identified, the camera is installed at the position where sitting posture identification is needed, and the user's reference data are collected first, as follows: the user sits in front of the camera in the correct sitting posture, with the camera facing the user's face at a distance of 30-50 cm and the user's face and shoulders unobstructed; the camera photographs the user's correct sitting posture and sends the image to the PC; the PC processes the correct-sitting-posture image with its pre-stored image processing program, and determines and records the coordinates of the user's 6 key points in the correct sitting posture, namely the left eye pupil, right eye pupil, nose tip, neck (the concave point where the two clavicles meet), left shoulder and right shoulder; the left eye pupil coordinates are recorded as (lx, ly), the right eye pupil coordinates as (rx, ry), the nose tip coordinates as (nx, ny), the neck coordinates as (bx, by), the left shoulder coordinates as (lsx, lsy), and the right shoulder coordinates as (rsx, rsy);
(3) After the user's reference data are determined, identify the user's sitting posture in real time, as follows:
Step 3-1, every 2 seconds the PC collects an image of the user's sitting posture from the camera, processes the real-time sitting-posture image with the image processing program, and determines and records the coordinates of the user's 6 key points in the current sitting posture, namely the left eye pupil, right eye pupil, nose tip, neck (the concave point where the two clavicles meet), left shoulder and right shoulder, while also receiving the user-to-camera distance measured by the infrared distance measuring sensor. The left eye pupil coordinates in the current sitting posture are recorded as $(lx_N, ly_N)$, the right eye pupil coordinates as $(rx_N, ry_N)$, the nose tip coordinates as $(nx_N, ny_N)$, the neck coordinates as $(bx_N, by_N)$, the left shoulder coordinates as $(lsx_N, lsy_N)$, the right shoulder coordinates as $(rsx_N, rsy_N)$, and the user-to-camera distance as D. Connect the left eye pupil key point to the nose tip key point and record the connecting line as segment a; connect the right eye pupil key point to the nose tip key point and record the connecting line as segment b; connect the nose tip key point to the neck key point and record the connecting line as segment c; connect the left shoulder key point to the neck key point and record the connecting line as segment d; connect the right shoulder key point to the neck key point and record the connecting line as segment e. Record the included angle between segment c and segment d as angle α, and the included angle between segment c and segment e as angle β;
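The 2-second acquisition loop of step 3-1 can be sketched with OpenCV on the camera side; the infrared ranging sensor's interface is not specified in the patent, so its read is left as a stub:

```python
import time
import cv2

def read_ir_distance_cm():
    """Stub for the infrared ranging sensor; its interface is not specified in the patent."""
    raise NotImplementedError

cap = cv2.VideoCapture(0)       # the camera connected to the PC
while True:
    ok, frame = cap.read()      # image with origin at the top-left corner, x rightward, y downward
    if not ok:
        break
    D = read_ir_distance_cm()   # user-to-camera distance in cm
    # ... run the image processing program on `frame` to get the 6 key points ...
    time.sleep(2)               # one sitting-posture sample every 2 seconds
```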
Step 3-2, judge the user's sitting posture in real time from the real-time data of step 3-1; the specific criteria are as follows (a code sketch of these tests follows the list):
if D < 30 cm, judge the distance too close;
if D > 50 cm, judge the distance too far;
if 0° < α ≤ 70°, judge the current sitting posture to be head leaning left;
if 0° < β ≤ 70°, judge the current sitting posture to be head leaning right;
if $lx - lx_N > T_1$, judge the current sitting posture to be body leaning left;
if $rx_N - rx > T_2$, judge the current sitting posture to be body leaning right;
if $|lsy_N - rsy_N| > T_4$, judge the current sitting posture to be shoulders not parallel;
if $by_N - by > T_3$, judge the current sitting posture to be spine bent;
in any other case, judge the current sitting posture to be the correct sitting posture;
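Since several of the step 3-2 criteria can hold at the same time (which is why step 3-3 needs a priority order), one reading is a per-frame test that returns every violated label. A minimal Python sketch under that reading, reusing the `Keypoints` record from the S3 sketch, with `cur` the current frame and `ref` the reference from step (2); all names are illustrative:

```python
def violations(D, alpha, beta, cur, ref, T1, T2, T3, T4):
    """Return every incorrect-posture label holding for this frame (empty list = correct posture).
    D: user-camera distance in cm; alpha/beta in degrees."""
    v = []
    if D < 30:
        v.append("too close")
    elif D > 50:
        v.append("too far")
    if 0 < alpha <= 70:
        v.append("head leaning left")
    if 0 < beta <= 70:
        v.append("head leaning right")
    if ref.left_eye[0] - cur.left_eye[0] > T1:
        v.append("body leaning left")
    if cur.right_eye[0] - ref.right_eye[0] > T2:
        v.append("body leaning right")
    if abs(cur.left_shoulder[1] - cur.right_shoulder[1]) > T4:
        v.append("shoulders not parallel")
    if cur.neck[1] - ref.neck[1] > T3:
        v.append("spine bent")
    return v
```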
Step 3-3, if the same incorrect sitting posture is judged 3 times in a row, a voice broadcast reminds the user; when more than two incorrect sitting postures each appear 3 times in a row at the same time, the voice broadcast announces only the sitting posture with the highest priority. From high to low, the priorities of the 8 incorrect sitting postures are: too close, too far, head leaning left, head leaning right, body leaning left, body leaning right, shoulders not parallel, spine bent.
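The 3-in-a-row reminder of step 3-3 then needs only a consecutive count per label plus the fixed priority order. A sketch, with a hypothetical `speak` function standing in for the voice broadcast and the assumption that a streak resets when its posture is absent for a frame:

```python
from collections import defaultdict

PRIORITY = ["too close", "too far", "head leaning left", "head leaning right",
            "body leaning left", "body leaning right", "shoulders not parallel", "spine bent"]

streak = defaultdict(int)       # consecutive occurrences per incorrect posture

def on_frame(labels, speak=print):
    """labels: incorrect postures detected this frame (empty list = correct posture)."""
    for p in PRIORITY:
        streak[p] = streak[p] + 1 if p in labels else 0
    due = [p for p in PRIORITY if streak[p] == 3]
    if due:
        speak(due[0])           # announce only the highest-priority posture
```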
Compared with the prior art, the invention has the following advantages. In a hardware environment consisting of a PC pre-loaded with an image processing program, an infrared distance measuring sensor and a camera, a coordinate system is established with the upper left corner of the picture captured by the camera in real time as the origin, the horizontal rightward direction as the positive x-axis and the vertical downward direction as the positive y-axis, and four decision thresholds T1, T2, T3 and T4, determined in advance from key point coordinates, are pre-stored in the PC. When detecting a human sitting posture, the key point coordinates of the correct sitting posture are first acquired as a reference; the real-time key point coordinates are then combined with the four decision thresholds T1, T2, T3 and T4 to judge the sitting posture in real time. When the same incorrect sitting posture is judged 3 times in a row, a voice broadcast reminds the user; when more than two incorrect sitting postures each appear 3 times in a row, the voice broadcast announces the sitting posture with the highest priority. The implementation process is simple, the demand on hardware computing power is low, and the cost is low; compared with existing machine-learning methods, determining the four decision thresholds requires no large body of training data, which simplifies the computation while maintaining high accuracy and shortens the time needed for calculation. The method is therefore practical, highly real-time and interactive, and can later be ported to embedded devices.
Drawings
Fig. 1 is a schematic diagram of the key points, the key point connecting lines and the included angles between the connecting lines in the human body sitting posture identification method based on key point detection.
Detailed Description
The invention is described in further detail below with reference to the following embodiment.
Embodiment: a human body sitting posture identification method based on key point detection comprises the following steps:
(1) Provide a PC pre-loaded with an image processing program, an infrared distance measuring sensor and a camera, and connect the infrared distance measuring sensor and the camera to the PC. The infrared distance measuring sensor and the camera lie in the same vertical plane, no more than 5 cm apart. Establish a coordinate system with the upper left corner of the picture captured by the camera in real time as the origin, the horizontal rightward direction as the positive x-axis and the vertical downward direction as the positive y-axis. Four decision thresholds T1, T2, T3 and T4 are pre-stored in the PC, determined in advance by the following method:
Step 1-1, divide sitting-posture behaviour into 9 categories: too close, too far, head leaning left, head leaning right, body leaning left, body leaning right, shoulders not parallel, spine bent, and correct sitting posture;
Step 1-2, select 120 females 120 cm-180 cm tall and 120 males 130 cm-190 cm tall as pre-inspectors; the 120 cm-180 cm range is divided into six 10 cm bands with 20 females per band, and the 130 cm-190 cm range into six 10 cm bands with 20 males per band. Randomly number the 240 pre-inspectors 1-240, and call the pre-inspector numbered i the i-th pre-inspector, i = 1, 2, …, 240;
Step 1-3, pre-test the 240 pre-inspectors respectively, the specific process being as follows:
S1, the camera squarely faces the pre-inspector's face at a distance of 30-50 cm, and the pre-inspector's face and shoulders must not be obstructed;
S2, each pre-inspector adopts, in front of the camera and in turn, 7 sitting postures: correct sitting posture, head leaning left, head leaning right, body leaning left, body leaning right, spine bent and shoulders not parallel; the camera photographs the pre-inspector's 7 sitting postures and sends the images to the PC. The 7 sitting postures are numbered 1-7 in the above order, and the sitting posture numbered j is called the j-th sitting posture, j = 1, 2, …, 7. The correct sitting posture keeps the waist and back naturally straight, the chest open, the shoulders level, and the neck, chest and waist upright; the other 6 sitting postures are performed according to each person's ordinary habits;
S3, on the PC, use the image processing program to acquire and record the coordinates of 6 key points, namely the left eye pupil, right eye pupil, nose tip, neck (the concave point where the two clavicles meet), left shoulder and right shoulder, of each pre-inspector in each of the 7 sitting postures, obtaining 240 sets of coordinate data; each set of coordinate data contains one pre-inspector's left eye pupil, right eye pupil, nose tip, neck, left shoulder and right shoulder coordinates in the 7 sitting postures. For the i-th pre-inspector in the j-th sitting posture, the left eye pupil coordinates are recorded as $(lx_i^j, ly_i^j)$, the right eye pupil coordinates as $(rx_i^j, ry_i^j)$, the nose tip coordinates as $(nx_i^j, ny_i^j)$, the neck coordinates as $(bx_i^j, by_i^j)$, the left shoulder coordinates as $(lsx_i^j, lsy_i^j)$, and the right shoulder coordinates as $(rsx_i^j, rsy_i^j)$;
S4, for the i-th pre-inspector, take the leftward offset of the left eye on the x-axis when the body leans left as the left-lean deviation, recorded as $\Delta L_i$; the rightward offset of the right eye on the x-axis when the body leans right as the right-lean deviation, recorded as $\Delta R_i$; the downward offset of the neck on the y-axis when the spine is bent as the neck deviation, recorded as $\Delta C_i$; and the difference of the two shoulder key points on the y-axis when the shoulders are not parallel as the shoulder deviation, recorded as $\Delta H_i$. Compute $\Delta L_i$, $\Delta R_i$, $\Delta C_i$ and $\Delta H_i$ by formulas (1), (2), (3) and (4):

$\Delta L_i = lx_i^1 - lx_i^4$ (1)

$\Delta R_i = rx_i^5 - rx_i^1$ (2)

$\Delta C_i = by_i^6 - by_i^1$ (3)

$\Delta H_i = |lsy_i^7 - rsy_i^7|$ (4)

where the superscripts are the posture numbers of S2 (1: correct sitting posture; 4: body leaning left; 5: body leaning right; 6: spine bent; 7: shoulders not parallel), and | | in formula (4) is the absolute-value symbol;
S5, pool the 240 sets of coordinate data and regroup them by sitting-posture category into 7 groups, obtaining 7 groups of sitting-posture data; each group contains the left eye pupil, right eye pupil, nose tip, neck, left shoulder and right shoulder coordinates of all 240 pre-inspectors in that sitting posture;
S6, determine the decision thresholds T1, T2, T3 and T4 separately, where the decision threshold T1 is determined as follows:
A. Form an original data set from the 240 left-lean deviations $\Delta L_1$~$\Delta L_{240}$ and take it as the generation-0 data set;
B. Set an iteration variable t and initialize it to t = 1;
C. Perform the t-th iteration update to obtain the generation-t data set, as follows:
C1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
C2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T1 from the mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest left-lean deviation in the generation-(t-1) data set as the decision threshold T1; if $K_{t-1}$ is greater than 3, compute the squared difference between each left-lean deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the left-lean deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step C and perform the next iteration, until $K_{t-1}$ is not greater than 3;
The decision threshold T2 is determined as follows:
D. Form an original data set from the 240 right-lean deviations $\Delta R_1$~$\Delta R_{240}$ and take it as the generation-0 data set;
E. Set an iteration variable t and initialize it to t = 1;
F. Perform the t-th iteration update to obtain the generation-t data set, as follows:
F1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
F2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T2 from $\mu_{t-1}$ and $\sigma_{t-1}$; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest right-lean deviation in the generation-(t-1) data set as the decision threshold T2; if $K_{t-1}$ is greater than 3, compute the squared difference between each right-lean deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the right-lean deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step F and perform the next iteration, until $K_{t-1}$ is not greater than 3;
The decision threshold T3 is determined as follows:
G. Form an original data set from the 240 neck deviations $\Delta C_1$~$\Delta C_{240}$ and take it as the generation-0 data set;
H. Set an iteration variable t and initialize it to t = 1;
I. Perform the t-th iteration update to obtain the generation-t data set, as follows:
I1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
I2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T3 from $\mu_{t-1}$ and $\sigma_{t-1}$; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest neck deviation in the generation-(t-1) data set as the decision threshold T3; if $K_{t-1}$ is greater than 3, compute the squared difference between each neck deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the neck deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step I and perform the next iteration, until $K_{t-1}$ is not greater than 3;
The decision threshold T4 is determined as follows:
J. Form an original data set from the 240 shoulder deviations $\Delta H_1$~$\Delta H_{240}$ and take it as the generation-0 data set;
K. Set an iteration variable t and initialize it to t = 1;
L. Perform the t-th iteration update to obtain the generation-t data set, as follows:
L1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
L2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T4 from $\mu_{t-1}$ and $\sigma_{t-1}$; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest shoulder deviation in the generation-(t-1) data set as the decision threshold T4; if $K_{t-1}$ is greater than 3, compute the squared difference between each shoulder deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the shoulder deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step L and perform the next iteration, until $K_{t-1}$ is not greater than 3;
(2) When the user's sitting posture needs to be identified, the camera is installed at the position where sitting posture identification is needed, and the user's reference data are collected first, as follows: the user sits in front of the camera in the correct sitting posture, with the camera facing the user's face at a distance of 30-50 cm and the user's face and shoulders unobstructed; the camera photographs the user's correct sitting posture and sends the image to the PC; the PC processes the correct-sitting-posture image with its pre-stored image processing program, and determines and records the coordinates of the user's 6 key points in the correct sitting posture, namely the left eye pupil, right eye pupil, nose tip, neck (the concave point where the two clavicles meet), left shoulder and right shoulder; the left eye pupil coordinates are recorded as (lx, ly), the right eye pupil coordinates as (rx, ry), the nose tip coordinates as (nx, ny), the neck coordinates as (bx, by), the left shoulder coordinates as (lsx, lsy), and the right shoulder coordinates as (rsx, rsy);
(3) After the user's reference data are determined, identify the user's sitting posture in real time, as follows:
Step 3-1, every 2 seconds the PC collects an image of the user's sitting posture from the camera, processes the real-time sitting-posture image with the image processing program, and determines and records the coordinates of the user's 6 key points in the current sitting posture, namely the left eye pupil, right eye pupil, nose tip, neck (the concave point where the two clavicles meet), left shoulder and right shoulder, while also receiving the user-to-camera distance measured by the infrared distance measuring sensor. The left eye pupil coordinates in the current sitting posture are recorded as $(lx_N, ly_N)$, the right eye pupil coordinates as $(rx_N, ry_N)$, the nose tip coordinates as $(nx_N, ny_N)$, the neck coordinates as $(bx_N, by_N)$, the left shoulder coordinates as $(lsx_N, lsy_N)$, the right shoulder coordinates as $(rsx_N, rsy_N)$, and the user-to-camera distance as D. Connect the left eye pupil key point to the nose tip key point and record the connecting line as segment a; connect the right eye pupil key point to the nose tip key point and record the connecting line as segment b; connect the nose tip key point to the neck key point and record the connecting line as segment c; connect the left shoulder key point to the neck key point and record the connecting line as segment d; connect the right shoulder key point to the neck key point and record the connecting line as segment e. Record the included angle between segment c and segment d as angle α, and the included angle between segment c and segment e as angle β;
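The angles α and β of step 3-1 are the included angles at the neck key point between segment c (nose-neck) and segments d (left shoulder-neck) and e (right shoulder-neck). A sketch via the dot product, with `cur` a `Keypoints` record as in the earlier sketches:

```python
import math

def angle_deg(p, q, o):
    """Angle at vertex o between rays o->p and o->q, in degrees."""
    v1 = (p[0] - o[0], p[1] - o[1])
    v2 = (q[0] - o[0], q[1] - o[1])
    cos_t = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def head_angles(cur):
    """alpha: between segment c (nose-neck) and d (left shoulder-neck);
    beta: between segment c and e (right shoulder-neck)."""
    alpha = angle_deg(cur.nose, cur.left_shoulder, cur.neck)
    beta = angle_deg(cur.nose, cur.right_shoulder, cur.neck)
    return alpha, beta
```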
Step 3-2, judge the user's sitting posture in real time from the real-time data of step 3-1; the specific criteria are as follows:
if D < 30 cm, judge the distance too close;
if D > 50 cm, judge the distance too far;
if 0° < α ≤ 70°, judge the current sitting posture to be head leaning left;
if 0° < β ≤ 70°, judge the current sitting posture to be head leaning right;
if $lx - lx_N > T_1$, judge the current sitting posture to be body leaning left;
if $rx_N - rx > T_2$, judge the current sitting posture to be body leaning right;
if $|lsy_N - rsy_N| > T_4$, judge the current sitting posture to be shoulders not parallel;
if $by_N - by > T_3$, judge the current sitting posture to be spine bent;
in any other case, judge the current sitting posture to be the correct sitting posture;
Step 3-3, if the same incorrect sitting posture is judged 3 times in a row, a voice broadcast reminds the user; when more than two incorrect sitting postures each appear 3 times in a row at the same time, the voice broadcast announces only the sitting posture with the highest priority. From high to low, the priorities of the 8 incorrect sitting postures are: too close, too far, head leaning left, head leaning right, body leaning left, body leaning right, shoulders not parallel, spine bent.

Claims (1)

1. A human body sitting posture identification method based on key point detection is characterized by comprising the following steps:
(1) Provide a PC pre-loaded with an image processing program, an infrared distance measuring sensor and a camera, and connect the infrared distance measuring sensor and the camera to the PC. The infrared distance measuring sensor and the camera lie in the same vertical plane, no more than 5 cm apart. Establish a coordinate system with the upper left corner of the picture captured by the camera in real time as the origin, the horizontal rightward direction as the positive x-axis and the vertical downward direction as the positive y-axis. Four decision thresholds T1, T2, T3 and T4 are pre-stored in the PC, determined in advance by the following method:
Step 1-1, divide sitting-posture behaviour into 9 categories: too close, too far, head leaning left, head leaning right, body leaning left, body leaning right, shoulders not parallel, spine bent, and correct sitting posture;
Step 1-2, select 120 females 120 cm-180 cm tall and 120 males 130 cm-190 cm tall as pre-inspectors; the 120 cm-180 cm range is divided into six 10 cm bands with 20 females per band, and the 130 cm-190 cm range into six 10 cm bands with 20 males per band. Randomly number the 240 pre-inspectors 1-240, and call the pre-inspector numbered i the i-th pre-inspector, i = 1, 2, …, 240;
Step 1-3, pre-test the 240 pre-inspectors respectively, the specific process being as follows:
S1, the camera squarely faces the pre-inspector's face at a distance of 30-50 cm, and the pre-inspector's face and shoulders must not be obstructed;
S2, each pre-inspector adopts, in front of the camera and in turn, 7 sitting postures: correct sitting posture, head leaning left, head leaning right, body leaning left, body leaning right, spine bent and shoulders not parallel; the camera photographs the pre-inspector's 7 sitting postures and sends the images to the PC. The 7 sitting postures are numbered 1-7 in the above order, and the sitting posture numbered j is called the j-th sitting posture, j = 1, 2, …, 7. The correct sitting posture keeps the waist and back naturally straight, the chest open, the shoulders level, and the neck, chest and waist upright; the other 6 sitting postures are performed according to each person's ordinary habits;
S3, on the PC, use the image processing program to acquire and record the coordinates of 6 key points, namely the left eye pupil, right eye pupil, nose tip, neck (the concave point where the two clavicles meet), left shoulder and right shoulder, of each pre-inspector in each of the 7 sitting postures, obtaining 240 sets of coordinate data; each set of coordinate data contains one pre-inspector's left eye pupil, right eye pupil, nose tip, neck, left shoulder and right shoulder coordinates in the 7 sitting postures. For the i-th pre-inspector in the j-th sitting posture, the left eye pupil coordinates are recorded as $(lx_i^j, ly_i^j)$, the right eye pupil coordinates as $(rx_i^j, ry_i^j)$, the nose tip coordinates as $(nx_i^j, ny_i^j)$, the neck coordinates as $(bx_i^j, by_i^j)$, the left shoulder coordinates as $(lsx_i^j, lsy_i^j)$, and the right shoulder coordinates as $(rsx_i^j, rsy_i^j)$;
S4, for the i-th pre-inspector, take the leftward offset of the left eye on the x-axis when the body leans left as the left-lean deviation, recorded as $\Delta L_i$; the rightward offset of the right eye on the x-axis when the body leans right as the right-lean deviation, recorded as $\Delta R_i$; the downward offset of the neck on the y-axis when the spine is bent as the neck deviation, recorded as $\Delta C_i$; and the difference of the two shoulder key points on the y-axis when the shoulders are not parallel as the shoulder deviation, recorded as $\Delta H_i$. Compute $\Delta L_i$, $\Delta R_i$, $\Delta C_i$ and $\Delta H_i$ by formulas (1), (2), (3) and (4):

$\Delta L_i = lx_i^1 - lx_i^4$ (1)

$\Delta R_i = rx_i^5 - rx_i^1$ (2)

$\Delta C_i = by_i^6 - by_i^1$ (3)

$\Delta H_i = |lsy_i^7 - rsy_i^7|$ (4)

where the superscripts are the posture numbers of S2 (1: correct sitting posture; 4: body leaning left; 5: body leaning right; 6: spine bent; 7: shoulders not parallel), and | | in formula (4) is the absolute-value symbol;
S5, pool the 240 sets of coordinate data and regroup them by sitting-posture category into 7 groups, obtaining 7 groups of sitting-posture data; each group contains the left eye pupil, right eye pupil, nose tip, neck, left shoulder and right shoulder coordinates of all 240 pre-inspectors in that sitting posture;
S6, determine the decision thresholds T1, T2, T3 and T4 separately, where the decision threshold T1 is determined as follows:
A. Form an original data set from the 240 left-lean deviations $\Delta L_1$~$\Delta L_{240}$ and take it as the generation-0 data set;
B. Set an iteration variable t and initialize it to t = 1;
C. Perform the t-th iteration update to obtain the generation-t data set, as follows:
C1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
C2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T1 from the mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest left-lean deviation in the generation-(t-1) data set as the decision threshold T1; if $K_{t-1}$ is greater than 3, compute the squared difference between each left-lean deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the left-lean deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step C and perform the next iteration, until $K_{t-1}$ is not greater than 3;
The decision threshold T2 is determined as follows:
D. Form an original data set from the 240 right-lean deviations $\Delta R_1$~$\Delta R_{240}$ and take it as the generation-0 data set;
E. Set an iteration variable t and initialize it to t = 1;
F. Perform the t-th iteration update to obtain the generation-t data set, as follows:
F1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
F2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T2 from $\mu_{t-1}$ and $\sigma_{t-1}$; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest right-lean deviation in the generation-(t-1) data set as the decision threshold T2; if $K_{t-1}$ is greater than 3, compute the squared difference between each right-lean deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the right-lean deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step F and perform the next iteration, until $K_{t-1}$ is not greater than 3;
The decision threshold T3 is determined as follows:
G. Form an original data set from the 240 neck deviations $\Delta C_1$~$\Delta C_{240}$ and take it as the generation-0 data set;
H. Set an iteration variable t and initialize it to t = 1;
I. Perform the t-th iteration update to obtain the generation-t data set, as follows:
I1. Compute the kurtosis $K_{t-1}$, mean $\mu_{t-1}$ and standard deviation $\sigma_{t-1}$ of the generation-(t-1) data set;
I2. Judge whether $K_{t-1}$ is greater than 3. If $K_{t-1}$ is not greater than 3 and its difference from 3 is not greater than 1, set the decision threshold T3 from $\mu_{t-1}$ and $\sigma_{t-1}$; if $K_{t-1}$ is not greater than 3 and its difference from 3 is greater than 1, take the value of the largest neck deviation in the generation-(t-1) data set as the decision threshold T3; if $K_{t-1}$ is greater than 3, compute the squared difference between each neck deviation in the generation-(t-1) data set and $\mu_{t-1}$, delete the neck deviation with the largest squared difference from the generation-(t-1) data set to obtain the generation-t data set, then add 1 to the current value of t, return to step I and perform the next iteration, until $K_{t-1}$ is not greater than 3;
determining the decision threshold T_4, the specific process being:
J. using the 240 shoulder offsets ΔH_1 ~ ΔH_240 to form an original data set, and taking the original data set as the 0-th generation data set;
K. setting an iteration variable t and initializing t to 1;
L. performing the t-th iterative update to obtain the t-th generation data set, the specific process being:
L1. calculating the kurtosis K_{t-1}, the mean μ_{t-1} and the standard deviation σ_{t-1} of the (t-1)-th generation data set;
L2. judging whether K_{t-1} is greater than 3: if K_{t-1} is not greater than 3 and the difference between K_{t-1} and 3 is not greater than 1, the decision threshold T_4 is determined from the mean μ_{t-1} and the standard deviation σ_{t-1}; if K_{t-1} is not greater than 3 and the difference between K_{t-1} and 3 is greater than 1, the value of the maximum shoulder offset in the (t-1)-th generation data set is taken as the decision threshold T_4; if K_{t-1} is greater than 3, the squared difference between each shoulder offset in the (t-1)-th generation data set and the mean μ_{t-1} is calculated, the shoulder offset corresponding to the maximum squared value is deleted from the (t-1)-th generation data set to obtain the t-th generation data set, then 1 is added to the current value of t and the value of t is updated, and the process returns to step L for the next iteration, until K_{t-1} is not greater than 3;
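All four thresholds are thus obtained by the same prune-until-near-normal procedure applied to four different offset data sets. The following minimal Python sketch illustrates that procedure; since the exact near-normal threshold formula sits in the claim figures and is not recoverable from this text, the common 3-sigma rule T = μ + 3σ is assumed for that case, and scipy is used for the kurtosis.

import numpy as np
from scipy.stats import kurtosis

def determine_threshold(offsets):
    # Iteratively prune the data set until its kurtosis is at most 3,
    # then derive the decision threshold (steps F to L above).
    data = np.asarray(offsets, dtype=float)
    while True:
        k = kurtosis(data, fisher=False)  # Pearson kurtosis; normal data gives 3
        mu, sigma = data.mean(), data.std()
        if k <= 3:
            if 3 - k <= 1:
                # Near-normal data set: assumed 3-sigma threshold (the exact
                # formula is in the claim figures, not in this text).
                return mu + 3 * sigma
            # Kurtosis well below 3: use the largest offset as the threshold.
            return data.max()
        # Kurtosis above 3: drop the sample with the largest squared
        # difference from the mean and iterate on the reduced data set.
        data = np.delete(data, ((data - mu) ** 2).argmax())

For example, T_3 would be determine_threshold applied to the 240 neck offsets ΔC_1 ~ ΔC_240, and likewise for the left-deviation, right-deviation and shoulder offset data sets.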
(2) When the sitting posture of a user needs to be identified, the camera is installed at the corresponding position where sitting posture identification is required, and reference data acquisition for the user is performed first, the specific process being: the user sits in front of the camera in a correct sitting posture, with the camera facing the user's face at a distance of 30 to 50 cm and the face and shoulders unoccluded; the camera captures an image of the user's correct sitting posture and sends it to a PC, and the PC processes the image with a pre-stored image processing program, determining and recording the coordinates of 6 key points of the user in the correct sitting posture, namely the left eye pupil, right eye pupil, nose tip, neck (the hollow at the junction of the two clavicles), left shoulder and right shoulder; the coordinates of the left eye pupil are recorded as (lx, ly), the coordinates of the right eye pupil as (rx, ry), the coordinates of the nose tip as (nx, ny), the coordinates of the neck as (bx, by), the coordinates of the left shoulder as (lsx, lsy) and the coordinates of the right shoulder as (rsx, rsy);
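By way of illustration only, the sketch below shows how such reference key points might be obtained, using MediaPipe Pose as a stand-in for the unspecified image processing program; the eye-center landmarks approximate the pupils, and the neck key point is approximated as the shoulder midpoint, both assumptions not made in the claim itself.

import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def capture_reference_keypoints(camera_index=0):
    # Grab one frame of the user's correct sitting posture.
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("camera read failed")
    h, w = frame.shape[:2]
    with mp_pose.Pose(static_image_mode=True) as pose:
        result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    lm = result.pose_landmarks.landmark
    P = mp_pose.PoseLandmark
    def px(i):
        # Convert a normalized landmark to pixel coordinates.
        return (lm[i].x * w, lm[i].y * h)
    ls, rs = px(P.LEFT_SHOULDER), px(P.RIGHT_SHOULDER)
    return {
        "left_eye": px(P.LEFT_EYE),    # approximates the left pupil (lx, ly)
        "right_eye": px(P.RIGHT_EYE),  # approximates the right pupil (rx, ry)
        "nose": px(P.NOSE),            # nose tip (nx, ny)
        "neck": ((ls[0] + rs[0]) / 2, (ls[1] + rs[1]) / 2),  # shoulder midpoint as neck (bx, by)
        "left_shoulder": ls,           # (lsx, lsy)
        "right_shoulder": rs,          # (rsx, rsy)
    }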
(3) after the reference data of the user is determined, the sitting posture of the user is identified in real time, and the specific process is as follows:
step 3-1, the PC acquires an image of the user's sitting posture from the camera every 2 seconds, processes the real-time image with the image processing program, and determines and records the coordinates of the same 6 key points (left eye pupil, right eye pupil, nose tip, neck (the hollow at the junction of the two clavicles), left shoulder and right shoulder) in the current sitting posture, while also receiving the distance between the user and the camera measured by the infrared ranging sensor; the coordinates of the left eye pupil in the current sitting posture are recorded as (lx_N, ly_N), the coordinates of the right eye pupil as (rx_N, ry_N), the coordinates of the nose tip as (nx_N, ny_N), the coordinates of the neck as (bx_N, by_N), the coordinates of the left shoulder as (lsx_N, lsy_N), the coordinates of the right shoulder as (rsx_N, rsy_N), and the distance between the user and the camera as D; the left eye pupil key point is connected to the nose tip key point and their connecting line is recorded as line segment a, the right eye pupil key point is connected to the nose tip key point and their connecting line is recorded as line segment b, the nose tip key point is connected to the neck key point and their connecting line is recorded as line segment c, the left shoulder key point is connected to the neck key point and their connecting line is recorded as line segment d, the right shoulder key point is connected to the neck key point and their connecting line is recorded as line segment e, and the included angle used for judging head left deviation is recorded as α and the included angle used for judging head right deviation as β;
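The claim text truncates before naming which segment pairs define α and β; the generic helper below computes the included angle between two segments sharing an endpoint, with the pairing in the final comment being illustrative only.

import math

def angle_at(vertex, p, q):
    # Included angle, in degrees, at `vertex` between segments vertex-p and vertex-q.
    v1 = (p[0] - vertex[0], p[1] - vertex[1])
    v2 = (q[0] - vertex[0], q[1] - vertex[1])
    cos_theta = (v1[0] * v2[0] + v1[1] * v2[1]) / (math.hypot(*v1) * math.hypot(*v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

# Illustrative pairing only: the included angle between segments a and c,
# for instance, would be angle_at(nose, left_eye, neck).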
step 3-2, judging the sitting posture of the user in real time according to the real-time data of step 3-1, the specific decision criteria being:
if D is less than 30 cm, the current sitting posture is judged as too close;
if D is greater than 50 cm, the current sitting posture is judged as too far;
if α is greater than 0° and not greater than 70°, the current sitting posture is judged as head left deviation;
if β is greater than 0° and not greater than 70°, the current sitting posture is judged as head right deviation;
if lx - lx_N > T_1, the current sitting posture is judged as body left-leaning;
if rx_N - rx > T_2, the current sitting posture is judged as body right-leaning;
if |lsy_N - rsy_N| > T_4, the current sitting posture is judged as shoulders not parallel;
if by_N - by > T_3, the current sitting posture is judged as spinal curvature;
in all other cases, the current sitting posture is judged as the correct sitting posture;
step 3-3, if the same incorrect sitting posture is judged 3 consecutive times, a voice broadcast is issued to remind the user; when two or more incorrect sitting postures are each detected 3 consecutive times at the same time, the voice broadcast announces the one with the highest priority, the priorities of the 8 incorrect sitting postures being, from high to low: too close, too far, head left deviation, head right deviation, body left-leaning, body right-leaning, shoulders not parallel, and spinal curvature.
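A compact sketch of the decision rules of step 3-2 and the reminder logic of step 3-3 follows, assuming coordinates are supplied as plain floats and one classification is produced per 2-second frame; the names classify and PostureReminder are illustrative.

# Priorities of step 3-3, highest first.
PRIORITY = ["too close", "too far", "head left deviation", "head right deviation",
            "body left-leaning", "body right-leaning",
            "shoulders not parallel", "spinal curvature"]

def classify(lx, rx, by, lx_N, rx_N, by_N, lsy_N, rsy_N,
             D, alpha, beta, T1, T2, T3, T4):
    # Decision criteria of step 3-2: reference coordinates carry no suffix,
    # current coordinates carry the suffix _N, D is in centimetres and
    # alpha/beta are in degrees.
    faults = []
    if D < 30:
        faults.append("too close")
    if D > 50:
        faults.append("too far")
    if 0 < alpha <= 70:
        faults.append("head left deviation")
    if 0 < beta <= 70:
        faults.append("head right deviation")
    if lx - lx_N > T1:
        faults.append("body left-leaning")
    if rx_N - rx > T2:
        faults.append("body right-leaning")
    if abs(lsy_N - rsy_N) > T4:
        faults.append("shoulders not parallel")
    if by_N - by > T3:
        faults.append("spinal curvature")
    return faults  # an empty list means the correct sitting posture

class PostureReminder:
    # Step 3-3: announce a fault only after it has been detected in 3
    # consecutive frames; among several persistent faults, announce the
    # one with the highest priority.
    def __init__(self):
        self.streaks = {p: 0 for p in PRIORITY}

    def update(self, faults):
        for p in PRIORITY:
            self.streaks[p] = self.streaks[p] + 1 if p in faults else 0
        persistent = [p for p in PRIORITY if self.streaks[p] >= 3]
        return persistent[0] if persistent else None  # posture to announce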
CN202011088718.2A 2020-10-13 2020-10-13 Human body sitting posture identification method based on key point detection Active CN112364694B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011088718.2A CN112364694B (en) 2020-10-13 2020-10-13 Human body sitting posture identification method based on key point detection

Publications (2)

Publication Number Publication Date
CN112364694A true CN112364694A (en) 2021-02-12
CN112364694B CN112364694B (en) 2023-04-18

Family

ID=74507159

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011088718.2A Active CN112364694B (en) 2020-10-13 2020-10-13 Human body sitting posture identification method based on key point detection

Country Status (1)

Country Link
CN (1) CN112364694B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190313915A1 (en) * 2015-06-14 2019-10-17 Facense Ltd. Posture-adjusted calculation of physiological signals
CN111382653A (en) * 2018-12-29 2020-07-07 沈阳新松机器人自动化股份有限公司 Human body sitting posture monitoring method
CN111414780A (en) * 2019-01-04 2020-07-14 卓望数码技术(深圳)有限公司 Sitting posture real-time intelligent distinguishing method, system, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
郭园; 郭晨旭; 时新; 申黎明: "Research on the human-machine adaptability of desks and chairs based on OpenPose sitting posture analysis" *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113112770A (en) * 2021-03-11 2021-07-13 合肥视其佳科技有限公司 Detection method for recognizing head postures of students
CN113034322A (en) * 2021-04-01 2021-06-25 珠海爱浦京软件股份有限公司 Internet-based online education supervision system and method
CN113034322B (en) * 2021-04-01 2024-02-02 珠海爱浦京软件股份有限公司 Internet-based online education supervision system and method
CN113627369A (en) * 2021-08-16 2021-11-09 南通大学 Action recognition and tracking method in auction scene
CN113657271A (en) * 2021-08-17 2021-11-16 上海科技大学 Sitting posture detection method and system combining quantifiable factors and non-quantifiable factors for judgment
CN113657271B (en) * 2021-08-17 2023-10-03 上海科技大学 Sitting posture detection method and system combining quantifiable factors and unquantifiable factor judgment
CN113743255A (en) * 2021-08-18 2021-12-03 广东机电职业技术学院 Neural network-based child sitting posture identification and correction method and system

Also Published As

Publication number Publication date
CN112364694B (en) 2023-04-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant