CN108170281B - Measurement method for a working posture analysis system - Google Patents

Measurement method for a working posture analysis system

Info

Publication number
CN108170281B
CN108170281B (application CN201810051149.0A)
Authority
CN
China
Prior art keywords
coordinate points
points
vector
work
shoulder
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810051149.0A
Other languages
Chinese (zh)
Other versions
CN108170281A (en
Inventor
姜盛乾
刘鹏
张开淦
张昕莹
陈雪纯
黄卓
徐杨
高大伟
曹明钰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University filed Critical Jilin University
Priority to CN201810051149.0A priority Critical patent/CN108170281B/en
Publication of CN108170281A publication Critical patent/CN108170281A/en
Application granted granted Critical
Publication of CN108170281B publication Critical patent/CN108170281B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition

Abstract

The invention belongs to the field of somatosensory interaction, and specifically concerns a measurement method for a working posture analysis system. The method comprises the following steps: Step 1: start a Kinect V2 device and acquire template data for 25 skeleton points while the operator stands upright; Step 2: score the operator's head posture; Step 3: score the operator's back posture; Step 4: score the operator's leg posture; Step 5: score the operator's arm posture; Step 6: look up the final grade in a table from the scores of the four body parts. The invention captures human skeleton data with a Kinect V2, evaluates the operator's head, back, arms and legs, and classifies each action into a grade of the working posture analysis system, filling a gap in the existing market.

Description

Measurement method for a working posture analysis system
Technical field
The invention belongs to the field of somatosensory interaction, and specifically concerns a measurement method for a working posture analysis system.
Background art
The working posture analysis system (OWAS) was first proposed in the 1970s by a Finnish steel company. Steel production at the time involved a large amount of heavy physical work, and incorrect working postures caused many workers to take sick leave or retire early because of occupational injuries, so a complete system was urgently needed to define the various working postures and improve them.
Invention content
The invention provides a measurement method for a working posture analysis system that captures human skeleton data with a Kinect V2, evaluates the operator's head, back, arms and legs, and classifies each action into a grade, filling a gap in the existing market.
The technical solution of the invention is described with reference to the drawings as follows:
A measurement method for a working posture analysis system comprises the following steps:
Step 1: start a Kinect V2 device and acquire template data for 25 skeleton points while the operator stands upright;
Step 2: score the operator's head posture;
Step 3: score the operator's back posture;
Step 4: score the operator's leg posture;
Step 5: score the operator's arm posture;
Step 6: look up the final grade in a table from the scores of the four body parts.
Step 1 proceeds as follows:
Start the Kinect V2 facing the operator and acquire template data for 25 skeleton points while the operator stands upright. The 25 skeleton points are: head, neck, shoulder center, left thumb, right thumb, left fingertip, right fingertip, left hand, right hand, left wrist, right wrist, left elbow, right elbow, left shoulder, right shoulder, spine, hip-joint center, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, right foot.
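The 25 joints listed above correspond one-to-one to the joints tracked by the Kinect V2; a minimal sketch of the template capture is given below. The joint names mirror the Kinect V2 JointType enumeration, and `capture_template` is a hypothetical helper, since the patent specifies no code or API.

```python
# The 25 Kinect V2 body joints named in Step 1 (JointType naming).
KINECT_V2_JOINTS = [
    "Head", "Neck", "SpineShoulder", "ThumbLeft", "ThumbRight",
    "HandTipLeft", "HandTipRight", "HandLeft", "HandRight",
    "WristLeft", "WristRight", "ElbowLeft", "ElbowRight",
    "ShoulderLeft", "ShoulderRight", "SpineMid", "SpineBase",
    "HipLeft", "HipRight", "KneeLeft", "KneeRight",
    "AnkleLeft", "AnkleRight", "FootLeft", "FootRight",
]

def capture_template(frame):
    """Store the upright-pose template: joint name -> (x, y, z) in metres.

    `frame` stands in for one tracked-body frame from the sensor; here it is
    simply a mapping from joint name to coordinates."""
    return {name: frame[name] for name in KINECT_V2_JOINTS}
```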
Step 2 proceeds as follows:
Acquire the head point A1(x1,y1,z1), neck A2(x2,y2,z2), shoulder center A3(x3,y3,z3), left shoulder A4(x4,y4,z4), right shoulder A5(x5,y5,z5), left foot A6(x6,y6,z6), left ankle A7(x7,y7,z7), right foot A8(x8,y8,z8) and right ankle A9(x9,y9,z9); through the face-recognition module of the Kinect V2, capture the left-eye point A10(x10,y10,z10) and the right-eye point A11(x11,y11,z11).
The vector from the neck to the head is then (x1−x2, y1−y2, z1−z2);
the vector from the shoulder center to the neck is (x2−x3, y2−y3, z2−z3);
the vector from the left shoulder to the right shoulder is (x5−x4, y5−y4, z5−z4);
the vector from the left ankle to the left foot is (x6−x7, y6−y7, z6−z7);
the vector from the right ankle to the right foot is (x8−x9, y8−y9, z8−z9);
the vector from the left eye to the right eye is (x11−x10, y11−y10, z11−z10).
Compute the angles θ1, θ2, θ3, θ4 and θ5 between the corresponding pairs of these vectors. Choose θ6 = min{|θ3 − 90°|, |θ4 − 90°|}; if θ6 = |θ3 − 90°|, compute θ7 from the left-side vectors, and otherwise from the right-side vectors.
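Each θ above is an angle between two of the vectors just defined; the formulas themselves appear only in the drawings. A standard helper for computing the angle between two 3-D vectors, as a sketch:

```python
import math

def angle_deg(u, v):
    """Angle in degrees between 3-D vectors u and v (arccos of normalized dot)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    # Clamp to [-1, 1] to absorb floating-point error before arccos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))
```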
When θ1 ≥ 135° and 45° ≤ θ2 ≤ 135° and 45° ≤ θ7 ≤ 135°, the operator's head is in the free region I, scored 1 point;
when θ1 < 135° and θ7 < 45° and θ5 < 10°, the head is in the bent-forward region II, scored 2 points;
when θ2 > 135° or θ2 < 45°, the head is in the lateral-bending region III, scored 3 points;
when θ1 < 135° and θ7 > 135° and θ5 < 10°, the head is in the tilted-back region IV, scored 4 points;
when the head is not in the lateral-bending region III, twisting can be judged from the angle between the eye vector and the shoulder vector: when 45° ≤ θ2 ≤ 135° and θ5 > 10°, the head is in the twist region V, scored 5 points;
for all other angles, the head is in the combined twist-and-lateral-bend region VI, scored 6 points.
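Assuming the angles θ1, θ2, θ5 and θ7 have already been computed, the six head regions above reduce to a cascade of threshold checks, evaluated in the order the text lists them; a sketch:

```python
def head_score(t1, t2, t5, t7):
    """Head posture score (1-6) from the angle thresholds in the text.

    Arguments are theta_1, theta_2, theta_5, theta_7 in degrees; regions are
    checked in the order the patent lists them."""
    if t1 >= 135 and 45 <= t2 <= 135 and 45 <= t7 <= 135:
        return 1  # region I: head free
    if t1 < 135 and t7 < 45 and t5 < 10:
        return 2  # region II: head bent forward
    if t2 > 135 or t2 < 45:
        return 3  # region III: head bent sideways
    if t1 < 135 and t7 > 135 and t5 < 10:
        return 4  # region IV: head tilted back
    if 45 <= t2 <= 135 and t5 > 10:
        return 5  # region V: head twisted
    return 6      # region VI: twist combined with side bend
```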
Step 3 proceeds as follows:
Acquire the shoulder center A3(x3,y3,z3), left shoulder A4(x4,y4,z4), right shoulder A5(x5,y5,z5), spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14) and right hip A15(x15,y15,z15).
The vector from the left shoulder to the right shoulder is then (x5−x4, y5−y4, z5−z4);
the vector from the spine to the shoulder center is (x3−x12, y3−y12, z3−z12);
the vector from the hip-joint center to the spine is (x12−x13, y12−y13, z12−z13);
the vector from the left hip to the right hip is (x15−x14, y15−y14, z15−z14).
Compute the angles θ8 and θ9 between the corresponding pairs of these vectors.
When θ8 ≥ 135° and θ9 < 10°, the operator's back is upright, scored 1 point;
when θ8 < 135° and θ9 < 10°, the back is bent forward, scored 2 points;
when θ8 ≥ 135° and θ9 ≥ 10°, the back is twisted, scored 3 points;
when θ8 < 135° and θ9 ≥ 10°, the back is bent and twisted, scored 4 points.
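The four back cases form a complete 2×2 decision on θ8 and θ9, so they can be sketched directly:

```python
def back_score(t8, t9):
    """Back posture score (1-4) from theta_8 and theta_9 in degrees,
    following the four threshold cases in the text."""
    if t8 >= 135 and t9 < 10:
        return 1  # back upright
    if t8 < 135 and t9 < 10:
        return 2  # back bent forward
    if t8 >= 135 and t9 >= 10:
        return 3  # back twisted
    return 4      # back bent forward and twisted
```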
Step 4 proceeds as follows:
Acquire the spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14), right hip A15(x15,y15,z15), left knee A16(x16,y16,z16), right knee A17(x17,y17,z17), left ankle A7(x7,y7,z7) and right ankle A9(x9,y9,z9).
The vector from the hip-joint center to the spine is then (x12−x13, y12−y13, z12−z13);
the vector from the left knee to the left hip is (x14−x16, y14−y16, z14−z16);
the vector from the left ankle to the left knee is (x16−x7, y16−y7, z16−z7);
the vector from the right knee to the right hip is (x15−x17, y15−y17, z15−z17);
the vector from the right ankle to the right knee is (x17−x9, y17−y9, z17−z9).
Compute the angles θ10, θ11 and θ12 between the corresponding pairs of these vectors.
The distance from the left ankle to the left hip is l1 = √((x14−x7)² + (y14−y7)² + (z14−z7)²);
the distance from the right ankle to the right hip is l2 = √((x15−x9)² + (y15−y9)² + (z15−z9)²).
When θ10 < 95°, the operator is sitting, scored 1 point;
when θ10 ≥ 95° and θ11 < 60° and θ12 < 60° and |l2 − l1| < 5 cm, the operator is standing, scored 2 points;
when θ10 ≥ 95° and θ11 < 60° and θ12 < 60° and |l2 − l1| ≥ 5 cm, the operator is standing on one leg, scored 3 points;
when θ10 ≥ 95° and θ11 ≥ 60° and θ12 ≥ 60° and |l2 − l1| < 5 cm, the operator is standing with both knees bent, scored 4 points;
when θ10 ≥ 95° and θ11 ≥ 60° and θ12 ≥ 60° and |l2 − l1| ≥ 5 cm, the operator is standing with one knee bent, scored 5 points.
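A sketch of the five leg cases, assuming angles in degrees and the ankle-to-hip distances l1, l2 in centimetres. The patent does not say how a mixed case (one knee angle above 60°, the other below) is scored; in this sketch it falls into the bent-knee branch.

```python
def leg_score(t10, t11, t12, l1, l2, threshold_cm=5.0):
    """Leg posture score (1-5) from theta_10/11/12 and distances l1, l2."""
    if t10 < 95:
        return 1  # sitting
    both_straight = t11 < 60 and t12 < 60
    balanced = abs(l2 - l1) < threshold_cm
    if both_straight:
        return 2 if balanced else 3  # standing / standing on one leg
    return 4 if balanced else 5      # both knees bent / one knee bent
```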
Step 5 proceeds as follows:
Acquire the left hand A18(x18,y18,z18), right hand A19(x19,y19,z19), shoulder center A3(x3,y3,z3), spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14), right hip A15(x15,y15,z15), left knee A16(x16,y16,z16), right knee A17(x17,y17,z17), left ankle A7(x7,y7,z7), right ankle A9(x9,y9,z9), left foot A20(x20,y20,z20) and right foot A21(x21,y21,z21).
When the leg judgement is sitting, take the left hip, right hip and right knee points, form the vectors from the left hip to the right hip and from the right hip to the right knee, and construct plane 1, first determining the normal vector from their cross product:
B1 = (y15−y14)×(z17−z15) − (z15−z14)×(y17−y15),
B2 = (z15−z14)×(x17−x15) − (x15−x14)×(z17−z15),
B3 = (x15−x14)×(y17−y15) − (y15−y14)×(x17−x15).
The normal vector is n = (B1, B2, B3), and the plane equation is B1(x−x14) + B2(y−y14) + B3(z−z14) = 0, which simplifies to B1x + B2y + B3z = B1x14 + B2y14 + B3z14.
Let C = B1x14 + B2y14 + B3z14; the equation of plane 1 can then be written B1x + B2y + B3z = C.
Compute the distance and the offset distance l7, and construct plane 2: B1x + B2y + B3z = C + l7.
When the leg judgement is not sitting and l1 > l2, take the left ankle, left foot and right foot points, form the vectors, and obtain plane 1 in the same way; compute the distance and the offset distance l7, which gives plane 2.
When l1 ≤ l2, take the right ankle, left foot and right foot points, form the vectors, and obtain plane 1 in the same way; compute the distance and the offset distance l7, which gives plane 2.
Compute the distances from the left and right hands to planes 1 and 2: the distance from the left hand to plane 1 is d1 = |B1x18 + B2y18 + B3z18 − C| / √(B1² + B2² + B3²), and its distance to plane 2 is d2 = |B1x18 + B2y18 + B3z18 − C − l7| / √(B1² + B2² + B3²); the distances d3 and d4 from the right hand to planes 1 and 2 are computed analogously with (x19, y19, z19).
When (d1 < l7 and d2 < l7 and d3 < l7 and d4 < l7) or (d1 < l7 and d2 < l7 and d3 < d4 and d4 ≥ l7) or (d1 < d2 and d2 ≥ l7 and d3 < l7 and d4 < l7) or (d1 < d2 and d2 ≥ l7 and d3 < d4 and d4 ≥ l7), both hands are below the shoulders, scored 1 point;
when (d1 ≥ l7 and d2 < d1 and d3 < l7 and d4 < l7) or (d1 ≥ l7 and d2 < d1 and d3 < d4 and d4 ≥ l7) or (d1 < l7 and d2 < l7 and d3 > d4 and d4 ≥ l7) or (d1 < d2 and d2 ≥ l7 and d3 > d4 and d3 ≥ l7), one hand is above the shoulders, scored 2 points;
when d1 ≥ l7 and d2 < d1 and d3 ≥ l7 and d4 < d3, both hands are above the shoulders, scored 3 points.
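The plane construction and hand-to-plane distances above can be sketched with a cross product and the standard point-to-plane distance. The function names are illustrative; the coefficient formulas follow the B1, B2, B3 and C expressions in the text.

```python
import math

def plane_through(p, q, r):
    """Normal n = (B1, B2, B3) and offset C of the plane through p, q, r.

    Follows the construction in the text: B = (q - p) x (r - q) and
    C = B . p, so the plane is B1*x + B2*y + B3*z = C."""
    ux, uy, uz = q[0] - p[0], q[1] - p[1], q[2] - p[2]
    vx, vy, vz = r[0] - q[0], r[1] - q[1], r[2] - q[2]
    b1 = uy * vz - uz * vy
    b2 = uz * vx - ux * vz
    b3 = ux * vy - uy * vx
    c = b1 * p[0] + b2 * p[1] + b3 * p[2]
    return (b1, b2, b3), c

def point_plane_distance(pt, normal, c):
    """Unsigned distance from pt to the plane normal . x = c."""
    b1, b2, b3 = normal
    return abs(b1 * pt[0] + b2 * pt[1] + b3 * pt[2] - c) / math.sqrt(
        b1 * b1 + b2 * b2 + b3 * b3)
```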
Step 6 proceeds as follows:
Look up the operator's working-posture action grade in the grading table from the computed head, back, leg and arm scores to obtain the final grade. The final grade is one of five levels: grade 1, grade 2, grade 3, grade 4 and grade 5. Grade 1 is a normal posture, acceptable and requiring no action; grade 2 has slight harm, and corrective measures should be taken in the near future; grade 3 is a posture with obvious harm, and corrective measures should be taken as soon as possible; grade 4 has serious harm, and corrective measures must be taken immediately; grade 5 has severe harm, and the work must be stopped immediately.
Beneficial effects of the invention:
The invention turns the evaluation of the working posture analysis system into a mathematical method a computer can carry out, reducing the engineers' workload.
Description of the drawings
Fig. 1 is a schematic diagram of the head action regions;
Fig. 2 is a schematic diagram of the back bending and twisting angles;
Fig. 3 is a schematic diagram of the back bending angles and distances;
Fig. 4 is a schematic diagram of the offset distance when the legs are judged to be sitting;
Fig. 5 is a schematic diagram of the offset distance when the legs are judged non-sitting and l1 > l2;
Fig. 6 is a schematic diagram of the offset distance when the legs are judged non-sitting and l1 ≤ l2.
Detailed description of the embodiments
Starting from the working posture analysis system, the invention aims to compute automatically, using somatosensory interaction and computer technology, the head, back, leg and arm states of personnel taking part in virtual assembly, calculating the specific action from the skeleton points acquired by the Kinect V2 so that it can be evaluated.
The specific method is as follows:
Step 1: start the Kinect V2 device facing the operator and acquire template data for 25 skeleton points while the operator stands upright.
The 25 skeleton points are: head, neck, shoulder center, left thumb, right thumb, left fingertip, right fingertip, left hand, right hand, left wrist, right wrist, left elbow, right elbow, left shoulder, right shoulder, spine, hip-joint center, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, right foot.
Step 2: score the operator's head posture.
Acquire the head point A1(x1,y1,z1), neck A2(x2,y2,z2), shoulder center A3(x3,y3,z3), left shoulder A4(x4,y4,z4), right shoulder A5(x5,y5,z5), left foot A6(x6,y6,z6), left ankle A7(x7,y7,z7), right foot A8(x8,y8,z8) and right ankle A9(x9,y9,z9); through the face-recognition module of the Kinect V2, capture the left-eye point A10(x10,y10,z10) and the right-eye point A11(x11,y11,z11).
The vector from the neck to the head is then (x1−x2, y1−y2, z1−z2);
the vector from the shoulder center to the neck is (x2−x3, y2−y3, z2−z3);
the vector from the left shoulder to the right shoulder is (x5−x4, y5−y4, z5−z4);
the vector from the left ankle to the left foot is (x6−x7, y6−y7, z6−z7);
the vector from the right ankle to the right foot is (x8−x9, y8−y9, z8−z9);
the vector from the left eye to the right eye is (x11−x10, y11−y10, z11−z10).
Compute the angles θ1, θ2, θ3, θ4 and θ5 between the corresponding pairs of these vectors. Choose θ6 = min{|θ3 − 90°|, |θ4 − 90°|}; if θ6 = |θ3 − 90°|, compute θ7 from the left-side vectors, and otherwise from the right-side vectors.
Referring to Fig. 1, when θ1 ≥ 135° and 45° ≤ θ2 ≤ 135° and 45° ≤ θ7 ≤ 135°, the operator's head is in the free region I, scored 1 point;
when θ1 < 135° and θ7 < 45° and θ5 < 10°, the head is in the bent-forward region II, scored 2 points;
when θ2 > 135° or θ2 < 45°, the head is in the lateral-bending region III, scored 3 points;
when θ1 < 135° and θ7 > 135° and θ5 < 10°, the head is in the tilted-back region IV, scored 4 points;
when the head is not in the lateral-bending region III, twisting can be judged from the angle between the eye vector and the shoulder vector: when 45° ≤ θ2 ≤ 135° and θ5 > 10°, the head is in the twist region V, scored 5 points;
for all other angles, the head is in the combined twist-and-lateral-bend region VI, scored 6 points.
Step 3: score the operator's back posture.
Acquire the shoulder center A3(x3,y3,z3), left shoulder A4(x4,y4,z4), right shoulder A5(x5,y5,z5), spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14) and right hip A15(x15,y15,z15).
The vector from the left shoulder to the right shoulder is then (x5−x4, y5−y4, z5−z4);
the vector from the spine to the shoulder center is (x3−x12, y3−y12, z3−z12);
the vector from the hip-joint center to the spine is (x12−x13, y12−y13, z12−z13);
the vector from the left hip to the right hip is (x15−x14, y15−y14, z15−z14).
Referring to Fig. 2, compute the angles θ8 and θ9 between the corresponding pairs of these vectors.
When θ8 ≥ 135° and θ9 < 10°, the operator's back is upright, scored 1 point;
when θ8 < 135° and θ9 < 10°, the back is bent forward, scored 2 points;
when θ8 ≥ 135° and θ9 ≥ 10°, the back is twisted, scored 3 points;
when θ8 < 135° and θ9 ≥ 10°, the back is bent and twisted, scored 4 points.
Step 4: score the operator's leg posture.
Acquire the spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14), right hip A15(x15,y15,z15), left knee A16(x16,y16,z16), right knee A17(x17,y17,z17), left ankle A7(x7,y7,z7) and right ankle A9(x9,y9,z9).
The vector from the hip-joint center to the spine is then (x12−x13, y12−y13, z12−z13);
the vector from the left knee to the left hip is (x14−x16, y14−y16, z14−z16);
the vector from the left ankle to the left knee is (x16−x7, y16−y7, z16−z7);
the vector from the right knee to the right hip is (x15−x17, y15−y17, z15−z17);
the vector from the right ankle to the right knee is (x17−x9, y17−y9, z17−z9).
Referring to Fig. 3, compute the angles θ10, θ11 and θ12 between the corresponding pairs of these vectors.
The distance from the left ankle to the left hip is l1 = √((x14−x7)² + (y14−y7)² + (z14−z7)²);
the distance from the right ankle to the right hip is l2 = √((x15−x9)² + (y15−y9)² + (z15−z9)²).
When θ10 < 95°, the operator is sitting, scored 1 point;
when θ10 ≥ 95° and θ11 < 60° and θ12 < 60° and |l2 − l1| < 5 cm, the operator is standing, scored 2 points;
when θ10 ≥ 95° and θ11 < 60° and θ12 < 60° and |l2 − l1| ≥ 5 cm, the operator is standing on one leg, scored 3 points;
when θ10 ≥ 95° and θ11 ≥ 60° and θ12 ≥ 60° and |l2 − l1| < 5 cm, the operator is standing with both knees bent, scored 4 points;
when θ10 ≥ 95° and θ11 ≥ 60° and θ12 ≥ 60° and |l2 − l1| ≥ 5 cm, the operator is standing with one knee bent, scored 5 points.
Step 5: score the operator's arm posture.
Acquire the left hand A18(x18,y18,z18), right hand A19(x19,y19,z19), shoulder center A3(x3,y3,z3), spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14), right hip A15(x15,y15,z15), left knee A16(x16,y16,z16), right knee A17(x17,y17,z17), left ankle A7(x7,y7,z7), right ankle A9(x9,y9,z9), left foot A20(x20,y20,z20) and right foot A21(x21,y21,z21).
When the leg judgement is sitting, take the left hip, right hip and right knee points, form the vectors from the left hip to the right hip and from the right hip to the right knee, and construct plane 1, first determining the normal vector from their cross product:
B1 = (y15−y14)×(z17−z15) − (z15−z14)×(y17−y15),
B2 = (z15−z14)×(x17−x15) − (x15−x14)×(z17−z15),
B3 = (x15−x14)×(y17−y15) − (y15−y14)×(x17−x15).
The normal vector is n = (B1, B2, B3), and the plane equation is B1(x−x14) + B2(y−y14) + B3(z−z14) = 0, which simplifies to B1x + B2y + B3z = B1x14 + B2y14 + B3z14.
Let C = B1x14 + B2y14 + B3z14; the equation of plane 1 can then be written B1x + B2y + B3z = C.
Referring to Fig. 4, compute the distance and the offset distance l7, and construct plane 2: B1x + B2y + B3z = C + l7.
When the leg judgement is not sitting and l1 > l2, take the left ankle, left foot and right foot points, form the vectors, and obtain plane 1 in the same way; referring to Fig. 5, compute the distance and the offset distance l7, which gives plane 2.
When l1 ≤ l2, take the right ankle, left foot and right foot points, form the vectors, and obtain plane 1 in the same way; referring to Fig. 6, compute the distance and the offset distance l7, which gives plane 2.
Compute the distances from the left and right hands to planes 1 and 2: the distance from the left hand to plane 1 is d1 = |B1x18 + B2y18 + B3z18 − C| / √(B1² + B2² + B3²), and its distance to plane 2 is d2 = |B1x18 + B2y18 + B3z18 − C − l7| / √(B1² + B2² + B3²); the distances d3 and d4 from the right hand to planes 1 and 2 are computed analogously with (x19, y19, z19).
When (d1 < l7 and d2 < l7 and d3 < l7 and d4 < l7) or (d1 < l7 and d2 < l7 and d3 < d4 and d4 ≥ l7) or (d1 < d2 and d2 ≥ l7 and d3 < l7 and d4 < l7) or (d1 < d2 and d2 ≥ l7 and d3 < d4 and d4 ≥ l7), both hands are below the shoulders, scored 1 point;
when (d1 ≥ l7 and d2 < d1 and d3 < l7 and d4 < l7) or (d1 ≥ l7 and d2 < d1 and d3 < d4 and d4 ≥ l7) or (d1 < l7 and d2 < l7 and d3 > d4 and d4 ≥ l7) or (d1 < d2 and d2 ≥ l7 and d3 > d4 and d3 ≥ l7), one hand is above the shoulders, scored 2 points;
when d1 ≥ l7 and d2 < d1 and d3 ≥ l7 and d4 < d3, both hands are above the shoulders, scored 3 points.
Step 6: look up the final grade in a table from the scores of the four body parts.
Look up the operator's working-posture action grade in the grading table from the computed head, back, leg and arm scores to obtain the final grade. The final grade is one of five levels: grade 1, grade 2, grade 3, grade 4 and grade 5. Grade 1 is a normal posture, acceptable and requiring no action; grade 2 has slight harm, and corrective measures should be taken in the near future; grade 3 is a posture with obvious harm, and corrective measures should be taken as soon as possible; grade 4 has serious harm, and corrective measures must be taken immediately; grade 5 has severe harm, and the work must be stopped immediately. The working-posture action grading table for operating personnel is shown in Table 1.
Table 1 Working-posture action grades of operating personnel

Claims (1)

1. A measurement method for a working posture analysis system, characterized in that the method comprises the following steps:
Step 1: start a Kinect V2 device and acquire template data for 25 skeleton points while the operator stands upright;
Step 2: score the operator's head posture;
Step 3: score the operator's back posture;
Step 4: score the operator's leg posture;
Step 5: score the operator's arm posture;
Step 6: look up the final grade in a table from the scores of the four body parts;
Step 1 proceeds as follows:
Start the Kinect V2 facing the operator and acquire template data for 25 skeleton points while the operator stands upright; the 25 skeleton points are: head, neck, shoulder center, left thumb, right thumb, left fingertip, right fingertip, left hand, right hand, left wrist, right wrist, left elbow, right elbow, left shoulder, right shoulder, spine, hip-joint center, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, right foot;
Step 2 proceeds as follows:
Acquire the head point A1(x1,y1,z1), neck A2(x2,y2,z2), shoulder center A3(x3,y3,z3), left shoulder A4(x4,y4,z4), right shoulder A5(x5,y5,z5), left foot A6(x6,y6,z6), left ankle A7(x7,y7,z7), right foot A8(x8,y8,z8) and right ankle A9(x9,y9,z9); through the face-recognition module of the Kinect V2, capture the left-eye point A10(x10,y10,z10) and the right-eye point A11(x11,y11,z11);
the vector from the neck to the head is then (x1−x2, y1−y2, z1−z2);
the vector from the shoulder center to the neck is (x2−x3, y2−y3, z2−z3);
the vector from the left shoulder to the right shoulder is (x5−x4, y5−y4, z5−z4);
the vector from the left ankle to the left foot is (x6−x7, y6−y7, z6−z7);
the vector from the right ankle to the right foot is (x8−x9, y8−y9, z8−z9);
the vector from the left eye to the right eye is (x11−x10, y11−y10, z11−z10);
compute the angles θ1, θ2, θ3, θ4 and θ5 between the corresponding pairs of these vectors; choose θ6 = min{|θ3 − 90°|, |θ4 − 90°|}; if θ6 = |θ3 − 90°|, compute θ7 from the left-side vectors, and otherwise from the right-side vectors;
when θ1 ≥ 135° and 45° ≤ θ2 ≤ 135° and 45° ≤ θ7 ≤ 135°, the operator's head is in the free region I, scored 1 point;
when θ1 < 135° and θ7 < 45° and θ5 < 10°, the head is in the bent-forward region II, scored 2 points;
when θ2 > 135° or θ2 < 45°, the head is in the lateral-bending region III, scored 3 points;
when θ1 < 135° and θ7 > 135° and θ5 < 10°, the head is in the tilted-back region IV, scored 4 points;
when the head is not in the lateral-bending region III, twisting can be judged from the angle between the eye vector and the shoulder vector: when 45° ≤ θ2 ≤ 135° and θ5 > 10°, the head is in the twist region V, scored 5 points;
for all other angles, the head is in the combined twist-and-lateral-bend region VI, scored 6 points;
Step 3 proceeds as follows:
Acquire the shoulder center A3(x3,y3,z3), left shoulder A4(x4,y4,z4), right shoulder A5(x5,y5,z5), spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14) and right hip A15(x15,y15,z15);
the vector from the left shoulder to the right shoulder is then (x5−x4, y5−y4, z5−z4);
the vector from the spine to the shoulder center is (x3−x12, y3−y12, z3−z12);
the vector from the hip-joint center to the spine is (x12−x13, y12−y13, z12−z13);
the vector from the left hip to the right hip is (x15−x14, y15−y14, z15−z14);
compute the angles θ8 and θ9 between the corresponding pairs of these vectors;
when θ8 ≥ 135° and θ9 < 10°, the operator's back is upright, scored 1 point;
when θ8 < 135° and θ9 < 10°, the back is bent forward, scored 2 points;
when θ8 ≥ 135° and θ9 ≥ 10°, the back is twisted, scored 3 points;
when θ8 < 135° and θ9 ≥ 10°, the back is bent and twisted, scored 4 points;
Step 4 proceeds as follows:
Acquire the spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14), right hip A15(x15,y15,z15), left knee A16(x16,y16,z16), right knee A17(x17,y17,z17), left ankle A7(x7,y7,z7) and right ankle A9(x9,y9,z9);
the vector from the hip-joint center to the spine is then (x12−x13, y12−y13, z12−z13);
the vector from the left knee to the left hip is (x14−x16, y14−y16, z14−z16);
the vector from the left ankle to the left knee is (x16−x7, y16−y7, z16−z7);
the vector from the right knee to the right hip is (x15−x17, y15−y17, z15−z17);
the vector from the right ankle to the right knee is (x17−x9, y17−y9, z17−z9);
compute the angles θ10, θ11 and θ12 between the corresponding pairs of these vectors;
the distance from the left ankle to the left hip is l1 = √((x14−x7)² + (y14−y7)² + (z14−z7)²);
the distance from the right ankle to the right hip is l2 = √((x15−x9)² + (y15−y9)² + (z15−z9)²);
when θ10 < 95°, the operator is sitting, scored 1 point;
when θ10 ≥ 95° and θ11 < 60° and θ12 < 60° and |l2 − l1| < 5 cm, the operator is standing, scored 2 points;
when θ10 ≥ 95° and θ11 < 60° and θ12 < 60° and |l2 − l1| ≥ 5 cm, the operator is standing on one leg, scored 3 points;
when θ10 ≥ 95° and θ11 ≥ 60° and θ12 ≥ 60° and |l2 − l1| < 5 cm, the operator is standing with both knees bent, scored 4 points;
when θ10 ≥ 95° and θ11 ≥ 60° and θ12 ≥ 60° and |l2 − l1| ≥ 5 cm, the operator is standing with one knee bent, scored 5 points;
Step 5 proceeds as follows:
Acquire the left hand A18(x18,y18,z18), right hand A19(x19,y19,z19), shoulder center A3(x3,y3,z3), spine A12(x12,y12,z12), hip-joint center A13(x13,y13,z13), left hip A14(x14,y14,z14), right hip A15(x15,y15,z15), left knee A16(x16,y16,z16), right knee A17(x17,y17,z17), left ankle A7(x7,y7,z7), right ankle A9(x9,y9,z9), left foot A20(x20,y20,z20) and right foot A21(x21,y21,z21);
when the leg judgement is sitting, take the left hip, right hip and right knee points, form the vectors from the left hip to the right hip and from the right hip to the right knee, and construct plane 1, first determining the normal vector from their cross product:
B1 = (y15−y14)×(z17−z15) − (z15−z14)×(y17−y15),
B2 = (z15−z14)×(x17−x15) − (x15−x14)×(z17−z15),
B3 = (x15−x14)×(y17−y15) − (y15−y14)×(x17−x15);
the normal vector is n = (B1, B2, B3), and the plane equation is B1(x−x14) + B2(y−y14) + B3(z−z14) = 0, which simplifies to B1x + B2y + B3z = B1x14 + B2y14 + B3z14;
let C = B1x14 + B2y14 + B3z14; the equation of plane 1 can then be written B1x + B2y + B3z = C;
compute the distance and the offset distance l7, and construct plane 2: B1x + B2y + B3z = C + l7;
when the leg judgement is not sitting and l1 > l2, take the left ankle, left foot and right foot points, form the vectors, and obtain plane 1 in the same way; compute the distance and the offset distance l7, which gives plane 2;
when l1 ≤ l2, take the right ankle, left foot and right foot points, form the vectors, and obtain plane 1 in the same way; compute the distance and the offset distance l7, which gives plane 2;
compute the distances from the left and right hands to planes 1 and 2: the distance from the left hand to plane 1 is d1 = |B1x18 + B2y18 + B3z18 − C| / √(B1² + B2² + B3²), and its distance to plane 2 is d2 = |B1x18 + B2y18 + B3z18 − C − l7| / √(B1² + B2² + B3²); the distances d3 and d4 from the right hand to planes 1 and 2 are computed analogously with (x19, y19, z19);
when (d1 < l7 and d2 < l7 and d3 < l7 and d4 < l7) or (d1 < l7 and d2 < l7 and d3 < d4 and d4 ≥ l7) or (d1 < d2 and d2 ≥ l7 and d3 < l7 and d4 < l7) or (d1 < d2 and d2 ≥ l7 and d3 < d4 and d4 ≥ l7), both hands are below the shoulders, scored 1 point;
when (d1 ≥ l7 and d2 < d1 and d3 < l7 and d4 < l7) or (d1 ≥ l7 and d2 < d1 and d3 < d4 and d4 ≥ l7) or (d1 < l7 and d2 < l7 and d3 > d4 and d4 ≥ l7) or (d1 < d2 and d2 ≥ l7 and d3 > d4 and d3 ≥ l7), one hand is above the shoulders, scored 2 points;
when d1 ≥ l7 and d2 < d1 and d3 ≥ l7 and d4 < d3, both hands are above the shoulders, scored 3 points;
Step 6 proceeds as follows:
Look up the operator's working-posture action grade in the grading table from the computed head, back, leg and arm scores to obtain the final grade; the final grade is one of five levels: grade 1, grade 2, grade 3, grade 4 and grade 5; grade 1 is a normal posture, acceptable and requiring no action; grade 2 has slight harm, and corrective measures should be taken in the near future; grade 3 is a posture with obvious harm, and corrective measures should be taken as soon as possible; grade 4 has serious harm, and corrective measures must be taken immediately; grade 5 has severe harm, and the work must be stopped immediately.
CN201810051149.0A 2018-01-19 2018-01-19 A kind of work posture analysis system measuring method Expired - Fee Related CN108170281B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810051149.0A CN108170281B (en) 2018-01-19 2018-01-19 A kind of work posture analysis system measuring method

Publications (2)

Publication Number Publication Date
CN108170281A (en) 2018-06-15
CN108170281B (en) 2018-09-28

Family

ID=62515244

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810051149.0A Expired - Fee Related CN108170281B (en) 2018-01-19 2018-01-19 A kind of work posture analysis system measuring method

Country Status (1)

Country Link
CN (1) CN108170281B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109102887B * 2018-08-20 2019-04-16 Jilin University A kind of body vibration evaluation method
CN110866417A * 2018-08-27 2020-03-06 Alibaba Group Holding Ltd. Image processing method and device and electronic equipment
CN109508755B * 2019-01-22 2022-12-09 The 54th Research Institute of China Electronics Technology Group Corporation Psychological assessment method based on image cognition
CN110047591B * 2019-04-23 2023-02-21 Jilin University Method for evaluating posture of doctor in surgical operation process
CN110321798A * 2019-06-03 2019-10-11 Liaoning Normal University Automatic recognition method for students' in-class attention state
CN111539245B * 2020-02-17 2023-04-07 Jilin University CPR technology training evaluation method based on virtual environment
CN112308914A * 2020-03-06 2021-02-02 Beijing ByteDance Network Technology Co., Ltd. Method, apparatus, device and medium for processing information
CN112674759B * 2020-12-21 2022-04-01 Southwest Jiaotong University Baby standing state identification method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103455657A * 2013-06-21 2013-12-18 Zhejiang Sci-Tech University Kinect based field operation simulation method and Kinect based field operation simulation system
CN103827891A * 2011-07-28 2014-05-28 ARB Labs Inc. Systems and methods of detecting body movements using globally generated multi-dimensional gesture data
CN104460972A * 2013-11-25 2015-03-25 Anhui Huanzhi Information Technology Co., Ltd. Human-computer interaction system based on Kinect
CN107293175A * 2017-08-04 2017-10-24 Huazhong University of Science and Technology A locomotive hand signal operation training method based on motion sensing technology




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20180928

Termination date: 20200119
