CN113633281A - Method and system for evaluating human body posture in assembly and maintenance process - Google Patents


Info

Publication number
CN113633281A
CN113633281A (application CN202110993155.XA)
Authority
CN
China
Prior art keywords
determining
joint
human body
skeleton
joint node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110993155.XA
Other languages
Chinese (zh)
Inventor
周栋
陈承璋
郭子玥
周启迪
仵宏铎
梁宇宁
王妍
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN202110993155.XA priority Critical patent/CN113633281A/en
Publication of CN113633281A publication Critical patent/CN113633281A/en
Pending legal-status Critical Current

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 — Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033 — Features or image-related aspects of imaging apparatus
    • A61B5/0077 — Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/1118 — Determining activity level
    • A61B5/1128 — Measuring movement of the entire body or parts thereof using image analysis
    • A61B5/745 — Notification to user using a holographic display
    • A61B5/746 — Alarms related to a physiological condition, e.g. details of setting alarm thresholds
    • A61B2503/20 — Evaluating a particular growth phase or type of persons: workers


Abstract

The invention relates to a method and a system for evaluating human body posture during assembly and maintenance. The method comprises: acquiring a color image and a depth image of the implementer in real time during assembly and maintenance; determining the implementer's joint nodes and skeleton in real time from the depth image; matching and superimposing the joint nodes and skeleton with the color image captured at the same moment; determining an injury risk score for each joint node from the superimposed color image using the Rapid Upper Limb Assessment (RULA) method, and displaying the scoring result; and, when the injury risk score of a joint node exceeds a set threshold, recording the implementer's current pose and the position of that joint node and issuing an emergency warning. The invention improves the accuracy of human posture assessment during assembly and maintenance.

Description

Method and system for evaluating human body posture in assembly and maintenance process
Technical Field
The invention relates to the field of human body posture assessment, in particular to a method and a system for assessing a human body posture in an assembly and maintenance process.
Background
During assembly and maintenance, some implementers are at risk of injury due to improper posture. The Rapid Upper Limb Assessment (RULA) method is an important posture-assessment tool in ergonomics; it is quick, easy to apply, and produces reasonably accurate results, so it is widely accepted and applied in industrial engineering.
The traditional RULA analysis workflow, however, has low efficiency and high labor cost, is severely disconnected from the actual assembly and maintenance process, and offers very poor reproducibility and little capability for later error correction. Moreover, there is currently no method for quickly constructing a simulation flow for virtual maintenance: an operator can only transfer the real process to a computer by subjectively creating key points from a maintenance manual or video demonstration, which takes at least several days.
Therefore, a method and system are needed that can quickly and accurately analyze the ergonomic condition of the implementer during assembly and maintenance work, so as to improve the accuracy of human posture assessment in this process.
Disclosure of Invention
The invention aims to provide a method and a system for evaluating human body posture in the process of assembly and maintenance, which improve the accuracy of evaluation on human body posture in the process of assembly and maintenance.
In order to achieve the purpose, the invention provides the following scheme:
a method for evaluating the posture of a human body in the process of assembly and maintenance comprises the following steps:
in the process of assembly and maintenance, acquiring a color image and a depth image of an implementer in real time;
determining the joint nodes and skeleton of the implementer in real time from the depth image;
matching and superimposing the joint nodes and skeleton of the implementer with the color image captured at the same moment;
determining an injury risk score for each joint node from the superimposed color image using the Rapid Upper Limb Assessment (RULA) method, and displaying the scoring result;
when the injury risk score of a joint node exceeds a set threshold, recording the current pose of the implementer and the position of the joint node whose score exceeds the threshold, and issuing an emergency warning.
Optionally, determining the joint nodes and skeleton of the implementer in real time from the depth image specifically comprises:
determining the three-dimensional coordinates of 25 human body joint nodes of the implementer from the depth image using a Kinect V2.0 somatosensory sensor, the 25 joint nodes being: head, neck, left shoulder, left elbow, left wrist, left palm, left thumb, left middle fingertip, right shoulder, right elbow, right wrist, right palm, right thumb, right middle fingertip, upper spine, middle spine, lower spine, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, and right foot;
determining a three-dimensional graph of the human body joint nodes from the three-dimensional coordinates of the 25 joint nodes;
converting the three-dimensional graph of the joint nodes into a two-dimensional graph, determining the human skeleton from adjacent joint nodes in the two-dimensional graph, and thereby determining a two-dimensional projection of the human skeleton.
Optionally, the determining joint nodes and skeletons of the practitioner in real time according to the depth image further includes:
the position of each joint node and bone is corrected.
Optionally, determining the injury risk score of each joint node from the superimposed color image using the RULA analysis method and displaying the scoring result specifically comprises:
determining the included angle between adjacent bones in real time from the superimposed color image;
determining the injury risk score of each joint node from the included angles using the RULA analysis method;
and displaying the scoring result on the two-dimensional projection of the human skeleton.
A system for assessing the pose of a human body during assembly and maintenance, comprising:
the data acquisition module is used for acquiring a color image and a depth image of an implementer in real time in the process of assembly and maintenance;
the depth image processing module is used for determining the joint nodes and skeleton of the implementer in real time from the depth image;
the color image processing module is used for matching and superimposing the joint nodes and skeleton of the implementer with the color image captured at the same moment;
the RULA evaluation module is used for determining the injury risk score of each joint node by adopting an RULA analysis method on the superposed color images; and displaying the scoring result;
the RULA result display module is used for recording the current pose of an implementer and the position of the joint node of which the injury risk score exceeds a set threshold when the injury risk score of the joint node exceeds the set threshold; and sending out an emergency warning.
Optionally, the depth image processing module specifically includes:
the joint node three-dimensional coordinate determination unit is used for determining the three-dimensional coordinates of 25 human body joint nodes of the implementer from the depth image using a Kinect V2.0 somatosensory sensor, the 25 joint nodes being: head, neck, left shoulder, left elbow, left wrist, left palm, left thumb, left middle fingertip, right shoulder, right elbow, right wrist, right palm, right thumb, right middle fingertip, upper spine, middle spine, lower spine, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, and right foot;
the human body joint node three-dimensional graph determining unit is used for determining a human body joint node three-dimensional graph according to the three-dimensional coordinates of 25 human body joint nodes;
the human body skeleton two-dimensional projection graph determining unit is used for converting the human body joint node three-dimensional graph into a two-dimensional graph; and determining human skeleton according to adjacent joint nodes in the two-dimensional graph, and further determining a two-dimensional projection graph of the human skeleton.
Optionally, the depth image processing module further includes:
and the correction unit is used for correcting the position of each joint node and the bone.
Optionally, the RULA evaluation module specifically includes:
the included angle determining unit is used for determining the included angle between the adjacent bones in real time according to the superposed color images;
the injury risk score determining unit is used for determining an included angle between adjacent bones in real time and determining an injury risk score of each joint node by adopting a rapid upper limb assessment RULA analysis method;
and the scoring result display unit is used for displaying the scoring result on the two-dimensional projection drawing of the human body skeleton.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
the invention provides a method and a system for evaluating human body posture in the process of assembly and maintenance, which are characterized in that joint nodes and a skeleton of an implementer at the same moment are matched and superposed with a color image at the corresponding moment, the real-time pose of the implementer is determined by utilizing the superposed color image, the injury risk score of each joint node is determined by adopting an RULA analysis method, and the score result is displayed; a clear judgment standard is indicated, the ergonomic condition of an assembly maintenance worker in reality is analyzed quickly and accurately, and the result has higher reproducibility and recheckability; visually displaying the change trend of the recent injury risk of the implementer; the accuracy of human posture assessment in the assembling and maintaining process is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without inventive exercise.
FIG. 1 is a schematic flow chart of a method for evaluating human body posture during assembly and maintenance according to the present invention;
FIG. 2 is a two-dimensional projection of a human skeleton;
FIG. 3 is a joint node topology diagram;
fig. 4 is a schematic structural diagram of a system for evaluating human body posture during assembly and maintenance provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide a method and a system for evaluating human body posture in the process of assembly and maintenance, which improve the accuracy of evaluation on human body posture in the process of assembly and maintenance.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Fig. 1 is a schematic flow chart of a method for evaluating a posture of a human body during an assembly and maintenance process, as shown in fig. 1, the method for evaluating a posture of a human body during an assembly and maintenance process, provided by the present invention, includes:
s101, in the process of assembly and maintenance, acquiring a color image and a depth image of an implementer in real time;
the data specifically require that as shown in table 1:
TABLE 1
Figure BDA0003229958250000051
S102, determining joint nodes and a skeleton of the implementer in real time according to the depth image;
s102 specifically comprises the following steps:
determining the three-dimensional coordinates of 25 human body joint nodes of the implementer from the depth image using a Kinect V2.0 somatosensory sensor, the 25 joint nodes being: head, neck, left shoulder, left elbow, left wrist, left palm, left thumb, left middle fingertip, right shoulder, right elbow, right wrist, right palm, right thumb, right middle fingertip, upper spine, middle spine, lower spine, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, and right foot;
determining a three-dimensional graph of the human body joint nodes according to the three-dimensional coordinates of the 25 human body joint nodes;
converting the three-dimensional graph of the human joint nodes into a two-dimensional graph, wherein the topological graph of the joint nodes is shown in FIG. 3; and determining human skeleton according to adjacent joint nodes in the two-dimensional graph, and further determining a two-dimensional projection graph of the human skeleton, as shown in fig. 2.
The Kinect specifies that, when a person faces the Kinect, the person's back is the positive X direction, up is the positive Y direction, and the person's left is the positive Z direction. The joint nodes are placed in the three-dimensional space defined by the Kinect SDK according to their three-dimensional coordinates; this space is then projected onto a two-dimensional plane, and nodes that are adjacent in the human body structure are connected by straight lines to form the two-dimensional projection of the human skeleton. The line segments connecting the nodes are called bones.
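The projection step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the joint coordinates and bone list are invented examples (a subset of the 25 Kinect V2 joints), and the projection simply drops the depth axis, one straightforward way to realize the plane projection the text describes.

```python
# Sketch of the 3D-to-2D projection step: joints are placed by their
# 3D coordinates, the depth axis is dropped, and adjacent joints are
# connected into bones.  Coordinates and bone list are illustrative.
joints_3d = {
    "head": (0.02, 1.60, 2.10),
    "neck": (0.02, 1.45, 2.12),
    "shoulder_left": (-0.18, 1.40, 2.13),
    "elbow_left": (-0.25, 1.15, 2.10),
}
bones = [("head", "neck"), ("neck", "shoulder_left"),
         ("shoulder_left", "elbow_left")]

def project_to_2d(joints):
    """Orthographic projection: drop the depth component of each joint."""
    return {name: (x, y) for name, (x, y, z) in joints.items()}

joints_2d = project_to_2d(joints_3d)
# Each bone becomes a 2D line segment between its two projected endpoints.
skeleton_2d = [(joints_2d[a], joints_2d[b]) for a, b in bones]
```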
S102 further comprises:
the position of each joint node and bone is corrected.
At each moment, the difference between the current position of each joint node and bone and its position at the previous moment is calculated; if the difference exceeds a set value, the position is corrected.
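A minimal sketch of this correction, under the assumption that "correcting" means keeping the previous position when a joint jumps implausibly far between frames; the 0.3 m threshold is an illustrative assumption, since the patent does not specify the set value:

```python
def correct_joint(prev, curr, max_jump=0.3):
    """Keep the previous joint position if the current frame jumps too far.

    prev, curr: (x, y, z) coordinates in meters; max_jump: threshold
    (0.3 m is an assumed value, not from the patent).
    """
    dist = sum((c - p) ** 2 for c, p in zip(curr, prev)) ** 0.5
    return prev if dist > max_jump else curr

prev = (0.0, 1.0, 2.0)
rejected = correct_joint(prev, (0.9, 1.0, 2.0))    # 0.9 m jump in one frame
accepted = correct_joint(prev, (0.02, 1.01, 2.0))  # small, plausible motion
```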
S103, matching and superimposing the joint nodes and skeleton of the implementer with the color image captured at the same moment;
s104, determining an injury risk score of each joint node by adopting a rapid upper limb assessment RULA analysis method for the superposed color image; and displaying the scoring result;
s104 specifically comprises the following steps:
determining an included angle between adjacent bones in real time according to the superposed color images;
Specifically, the included angle between adjacent bones is determined using the formula

θ = arccos( (α · β) / (|α| |β|) )

where θ is the included angle between the two bones; α = (x1 − x2, y1 − y2, z1 − z2) is the vector of one of the bones, with (x1, y1, z1) and (x2, y2, z2) being the three-dimensional coordinates of the joint nodes at its two ends; and β is the vector of the adjacent bone, obtained in the same way. This computation is performed at a fixed interval of once every 0.1 second.
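This is the standard dot-product angle formula; a minimal sketch in Python with illustrative joint coordinates:

```python
import math

def bone_vector(p1, p2):
    """Bone vector between two joints, e.g. alpha = (x1-x2, y1-y2, z1-z2)."""
    return tuple(a - b for a, b in zip(p1, p2))

def bone_angle(u, v):
    """Included angle in degrees between two bone vectors, via arccos."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / norm))

# Illustrative coordinates: an upper arm pointing down, a forearm pointing
# sideways, giving a 90-degree elbow angle.
upper_arm = bone_vector((0.0, 0.0, 0.0), (0.0, 1.0, 0.0))
forearm = bone_vector((1.0, 0.0, 0.0), (0.0, 0.0, 0.0))
angle = bone_angle(upper_arm, forearm)
```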
From the included angles between adjacent bones determined in real time, the injury risk score of each joint node is determined using the RULA analysis method;
and the scoring result is displayed on the two-dimensional projection of the human skeleton.
The specific analysis is as follows:
(1) Judge the upper-arm position from the angle between the upper arm and the trunk: +1 point when the arm hangs naturally at the side of the body; +2 points when the arm extends backwards more than 20 degrees from the trunk; +2 points when the arm extends forwards at 20–45 degrees; +3 points at 45–90 degrees; +4 points beyond 90 degrees.
The shoulder position is then judged: +1 point if the shoulder is raised, +1 point if the arm is fully abducted, and −1 point if the arm is supported.
The sum of the above scores is the upper-arm score.
(2) Judge the forearm score: +1 point when the angle between the forearm and the upper arm is 60–100 degrees, otherwise +2 points; an additional +1 point when the forearm crosses the central axis of the body or moves away from the body. The total is the forearm score. In the method, the angle between the forearm and the upper arm is calculated as a vector angle. Whether the forearm crosses the central axis of the body is determined from the angle between vector α1 (the projection onto the horizontal plane of the vector formed by the wrist and the middle of the spine) and vector α2 (the horizontal projection of the vector formed by the upper spine and the shoulder); when this angle exceeds 90 degrees, the wrist is considered to have crossed the central axis. Whether the forearm is away from the body is determined from the angle between vector α3 (the projection onto the horizontal plane of the vector formed by the elbow and the middle of the spine) and vector α4 (the horizontal projection of the vector formed by the upper spine and the shoulder); when this angle is less than 45 degrees, the forearm is considered to be away from the body.
(3) Judge the wrist score: +1 point when the wrist is horizontal; +2 points when the angle between the wrist and horizontal is within ±15 degrees; +3 points in other states; plus an additional +1 point when the wrist bends left or right. The sum of these scores is the wrist position score. The wrist angle is determined from the angle between vector α5 (the vector formed by the wrist and the palm) and vector α6 (the vector formed by the elbow and the wrist); whether the wrist bends left or right is specified via the control panel because of sensor accuracy limitations.
(4) Judge whether the wrist is twisted: +1 point if not twisted, +2 points if twisted. This score is the wrist twist score.
(5) If the arm continuously repeats the same action more than four times per minute, +1 point. If the arm load is less than 1 kg, no points are added; for an arm load of 1–5 kg that is dynamic and not repeated, +1 point; for an arm load of 1–5 kg that is static or repeated, +2 points; for a load over 5 kg, +3 points. The score with these adjustments added is recorded as SCORE A for later use.
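The upper-arm rules from step (1) can be sketched as a scoring function. This is a simplified illustration: the parameter names (`flexion_deg`, `extension_deg`) and the way the adjustment flags combine are assumptions about how the textual rules fit together, not the patent's implementation.

```python
def upper_arm_score(flexion_deg, extension_deg=0.0,
                    shoulder_raised=False, arm_abducted=False,
                    arm_supported=False):
    """Step (1) upper-arm score from the arm/trunk angle in degrees.

    flexion_deg: forward angle between upper arm and trunk;
    extension_deg: backward angle.  Flags are the shoulder adjustments.
    """
    if extension_deg > 20:        # arm extended backwards beyond 20 degrees
        score = 2
    elif flexion_deg <= 20:       # hanging naturally near the body
        score = 1
    elif flexion_deg <= 45:       # extended forwards 20-45 degrees
        score = 2
    elif flexion_deg <= 90:       # extended forwards 45-90 degrees
        score = 3
    else:                         # extended forwards beyond 90 degrees
        score = 4
    # Shoulder adjustments: raised +1, abducted +1, supported -1.
    score += int(shoulder_raised) + int(arm_abducted) - int(arm_supported)
    return score
```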
(6) Then calculate the neck and head score: +1 point when the head is lowered less than 10 degrees; +2 points when lowered 10–20 degrees; +3 points when lowered more than 20 degrees; +4 points when the head leans backward. The head turning left or right adds +1 point; the head tilting left or right adds +1 point. The final sum is recorded as the head score. In the method, the head-lowering angle is determined from the angle between vector α7 (the vector formed by the Y coordinates of the head and neck) and vector α8 (the vector formed by the Y coordinates of the upper spine and lower spine); head turning and tilting are specified via the control panel because of sensor accuracy limitations.
(7) Calculate the trunk score: +1 point when the trunk is vertical or within 10 degrees of vertical; +2 points at 10–20 degrees from vertical; +3 points at 20–60 degrees; +4 points beyond 60 degrees. The trunk twisting left or right adds +1 point; the trunk tilting left or right adds +1 point. The final sum is recorded as the trunk score. In the method, the trunk bending angle is determined from the angle between the vertical vector (0, 1, 0) and vector α9 (the vector formed by the Y coordinates of the upper spine and lower spine); trunk twisting and tilting are specified via the control panel because of sensor accuracy limitations.
(8) If the trunk or legs continuously repeat the same action more than four times per minute, +1 point. If the trunk or leg load is less than 1 kg, no points are added; for a load of 1–5 kg that is dynamic and not repeated, +1 point; for a load of 1–5 kg that is static or repeated, +2 points; for a load over 5 kg, +3 points. The score with these adjustments added is recorded as SCORE B for later use.
(9) The final RULA score is calculated from SCORE A and SCORE B.
A final score of 1–2 points indicates that the implementer's ergonomic state is good and there is no injury risk;
3–4 points indicates a potential injury risk, requiring further observation and investigation;
5–6 points indicates an injury risk, and the posture should be changed as soon as possible;
7 points or more indicates that injury is likely, and the implementer should immediately stop the operation and change posture.
For judgment items that cannot be accurately determined or measured with the Kinect sensor alone, the operator can specify the corresponding options through the control panel by selecting from checkboxes or drop-down lists, and the score of each joint node is then determined. The options are: whether the left or right arm is supported or leaning; whether the left or right arm repeats an action more than four times per minute; whether the corresponding arm is twisted and whether it bears a load; whether the feet are supported; whether the head is lowered or turned; and whether the trunk is bending or twisting.
s105, when the injury risk score of the joint node exceeds a set threshold, recording the current pose of the implementer and the position of the joint node of which the injury risk score exceeds the set threshold; and sending out an emergency warning.
As a specific embodiment, a window is established that displays the injury risk of each body part in real time. When the RULA analysis result for a part indicates an elevated injury risk, a colored circle is displayed at the corresponding joint as a reminder or warning; the circle may be yellow, orange, or red depending on the level of risk. When the risk is low, the joint has no color marker; at moderate risk, the joint is marked yellow; at higher risk, where the action should be changed or the maintenance operation cancelled as soon as possible, the joint is marked orange; at extremely high risk, where the operation must be interrupted immediately, the joint is marked red. The specific thresholds are as follows:
shoulder part: the big arm is 3-4 yellow, 5 orange and 6 red
Elbow: the forearm score 2 was yellow and 3 was orange.
Wrist: wrist position score + wrist twist score, 4 for yellow, 5 for orange, 6 for red.
Head and neck: the neck and head were scored 3 yellow, 4 orange and 5 red.
Trunk: the trunk scored 3 yellow, 4 orange and 5 red.
Foot part: the foot score 2 was divided into yellow.
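The per-part color thresholds listed above can be encoded as a small lookup table; a sketch (the part names and table layout are assumptions for illustration, while the threshold values follow the list above):

```python
# Warning-color thresholds per body part, from the list above.
# A score below the first threshold yields no color marker (low risk).
COLOR_THRESHOLDS = {
    "shoulder": [(3, "yellow"), (5, "orange"), (6, "red")],
    "elbow":    [(2, "yellow"), (3, "orange")],
    "wrist":    [(4, "yellow"), (5, "orange"), (6, "red")],
    "neck":     [(3, "yellow"), (4, "orange"), (5, "red")],
    "trunk":    [(3, "yellow"), (4, "orange"), (5, "red")],
    "foot":     [(2, "yellow")],
}

def warning_color(part, score):
    """Return the highest color whose threshold the score reaches, or None."""
    color = None
    for threshold, c in COLOR_THRESHOLDS[part]:
        if score >= threshold:
            color = c
    return color
```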
The RULA analysis results for each body part and for the whole body of the implementer are recorded and can be displayed in real time as the user desires, in a chart showing the RULA assessment scores from the start of assembly up to the current time and a line chart showing the RULA assessment scores over the past ten seconds.
When the score of any body part reaches orange, or the total injury risk score is 5 points or more, a screenshot is captured automatically and the current time is recorded and stored for subsequent analysis.
The coordinates of the human body joint nodes and the RULA assessment scores are output to a CSV file in a specified folder for recording and subsequent analysis.
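A minimal sketch of this CSV export. The row layout (timestamp, joint coordinates in alphabetical name order, then the score) is an assumption for illustration; the patent only specifies that node coordinates and scores are written to a CSV file.

```python
import csv
import io

def export_frame(writer, timestamp, joints, rula_score):
    """Write one analysis frame: timestamp, joint x/y/z triples, RULA score."""
    row = [timestamp]
    for name in sorted(joints):          # fixed column order across frames
        row.extend(joints[name])
    row.append(rula_score)
    writer.writerow(row)

# Illustrative usage with an in-memory buffer standing in for the file.
buf = io.StringIO()
writer = csv.writer(buf)
export_frame(writer, 0.1,
             {"head": (0.02, 1.6, 2.1), "neck": (0.02, 1.45, 2.12)}, 3)
```

In a real deployment the buffer would be replaced by an open file in the specified folder, with one row appended per analysis frame.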
The invention uses the Kinect to analyze the implementer performing assembly and maintenance work directly, eliminating the need for a separate analyst. The analysis runs thirty times per second, i.e. each analysis takes only about 0.033 seconds, and is continuous: it completes without the implementer having to stop the maintenance work. The results are more intuitive, since the implementer's injury risk is displayed on screen in real time and the recent trend of the injury risk can be shown visually in charts. Debugging is also more convenient: in addition to photographs, the system records the implementer's current skeleton model, the coordinates of each joint node, the injury risk of each body part, and the time of each analysis, which facilitates later inspection and error correction. The method also supports the subsequent construction of a virtual maintenance process: the coordinates of each node of the implementer's body are stored directly in the computer and can be imported at any time, so a virtual maintenance process can be constructed simply and quickly.
Fig. 4 is a schematic structural diagram of a system for evaluating human body posture in an assembly and maintenance process, as shown in fig. 4, the system for evaluating human body posture in an assembly and maintenance process, provided by the present invention, includes:
a data acquisition module 401, configured to acquire a color image and a depth image of the practitioner in real time during assembly and maintenance;
a depth image processing module 402, configured to determine the joint nodes and skeleton of the practitioner in real time according to the depth image;
a color image processing module 403, configured to match and superimpose the joint nodes and skeleton of the practitioner at a given moment with the color image of the corresponding moment;
a RULA evaluation module 404, configured to determine an injury risk score for each joint node by applying the Rapid Upper Limb Assessment (RULA) method to the superimposed color image, and to display the scoring result;
a RULA result display module 405, configured to, when the injury risk score of a joint node exceeds a set threshold, record the current pose of the practitioner and the position of the joint node whose score exceeds the threshold, and to issue an emergency warning.
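Purely as an illustration (not part of the patent text), the five modules above form a per-frame pipeline that might be sketched as follows; all class, argument, and function names are hypothetical:

```python
# Hypothetical skeleton of the five-module pipeline (401-405); each argument
# is a callable standing in for the corresponding module.
class PostureEvaluationSystem:
    def __init__(self, acquire, extract_skeleton, overlay, rula_score, report):
        self.acquire = acquire                    # 401: color + depth frames
        self.extract_skeleton = extract_skeleton  # 402: joints + skeleton from depth
        self.overlay = overlay                    # 403: superimpose onto color image
        self.rula_score = rula_score              # 404: RULA injury-risk scores
        self.report = report                      # 405: record pose + warn on threshold

    def process_frame(self):
        color, depth = self.acquire()
        joints, skeleton = self.extract_skeleton(depth)
        frame = self.overlay(color, joints, skeleton)
        scores = self.rula_score(frame)
        self.report(scores)
        return scores
```

Running `process_frame` in a loop at 30 Hz would reproduce the per-frame analysis cycle described above.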
The depth image processing module 402 specifically includes:
a joint node three-dimensional coordinate determination unit, configured to determine the three-dimensional coordinates of 25 joint nodes of the practitioner's body from the depth image using a Kinect V2.0 somatosensory sensor; the 25 joint nodes are: head, neck, left shoulder, left elbow, left wrist, left palm, left thumb, left middle fingertip, right shoulder, right elbow, right wrist, right palm, right thumb, right middle fingertip, upper spine, middle spine, lower spine, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, and right foot;
a human joint node three-dimensional graph determination unit, configured to determine a three-dimensional graph of the joint nodes from the three-dimensional coordinates of the 25 joint nodes;
a human skeleton two-dimensional projection determination unit, configured to convert the three-dimensional graph of the joint nodes into a two-dimensional graph, determine the human skeleton by connecting adjacent joint nodes in the two-dimensional graph, and thereby obtain a two-dimensional projection of the human skeleton.
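As an illustrative sketch only (not the patent's implementation), converting the three-dimensional joint coordinates into a two-dimensional projection and connecting adjacent joints into bones might look like the following; the bone list is a partial, assumed subset of the Kinect V2.0 joint hierarchy, and a simple orthographic projection onto the X-Y plane is assumed:

```python
# Partial bone list (parent, child) -- an assumed subset of the Kinect V2.0 hierarchy.
BONES = [
    ("neck", "head"), ("neck", "left_shoulder"), ("left_shoulder", "left_elbow"),
    ("left_elbow", "left_wrist"), ("neck", "right_shoulder"),
    ("right_shoulder", "right_elbow"), ("right_elbow", "right_wrist"),
]


def project_to_2d(joints_3d):
    """Orthographic projection: drop the depth (Z) component of each joint."""
    return {name: (x, y) for name, (x, y, z) in joints_3d.items()}


def skeleton_segments(joints_2d):
    """Build 2D line segments for each bone whose two joints were both detected."""
    return [(joints_2d[a], joints_2d[b])
            for a, b in BONES if a in joints_2d and b in joints_2d]
```

A perspective projection using the sensor's intrinsics could replace the orthographic one without changing the structure of the sketch.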
The depth image processing module 402 further includes:
a correction unit, configured to correct the position of each joint node and bone.
The RULA evaluation module 404 specifically includes:
an included angle determination unit, configured to determine the included angle between adjacent bones in real time according to the superimposed color image;
an injury risk score determination unit, configured to determine, from the included angle between adjacent bones, the injury risk score of each joint node using the Rapid Upper Limb Assessment (RULA) method;
a scoring result display unit, configured to display the scoring result on the two-dimensional projection of the human skeleton.
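A minimal sketch (illustrative only, not the patent's implementation) of computing the included angle between two adjacent bones from joint coordinates; the joint names in the usage note are assumptions:

```python
import math


def bone_angle(parent, joint, child):
    """Included angle in degrees at `joint`, between bones parent->joint and joint->child."""
    v1 = (parent[0] - joint[0], parent[1] - joint[1])
    v2 = (child[0] - joint[0], child[1] - joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point error before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))
```

For example, the elbow angle computed from the shoulder, elbow, and wrist coordinates would feed the lower-arm portion of the RULA score.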
The embodiments in this description are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts among the embodiments may be referred to one another. Since the system disclosed by the embodiment corresponds to the disclosed method, its description is relatively brief; for relevant details, refer to the description of the method.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and core concept of the invention. Meanwhile, a person skilled in the art may, following the idea of the present invention, modify the specific embodiments and their scope of application. In view of the above, the contents of this specification should not be construed as limiting the invention.

Claims (8)

1. A method for evaluating human body posture in an assembly and maintenance process, characterized by comprising:
acquiring a color image and a depth image of a practitioner in real time during assembly and maintenance;
determining joint nodes and a skeleton of the practitioner in real time according to the depth image;
matching and superimposing the joint nodes and skeleton of the practitioner at a given moment with the color image of the corresponding moment;
determining an injury risk score for each joint node by applying the Rapid Upper Limb Assessment (RULA) method to the superimposed color image, and displaying the scoring result;
when the injury risk score of a joint node exceeds a set threshold, recording the current pose of the practitioner and the position of the joint node whose score exceeds the threshold, and issuing an emergency warning.
2. The method for evaluating human body posture in an assembly and maintenance process according to claim 1, wherein determining the joint nodes and skeleton of the practitioner in real time according to the depth image specifically comprises:
determining the three-dimensional coordinates of 25 joint nodes of the practitioner's body from the depth image using a Kinect V2.0 somatosensory sensor, the 25 joint nodes being: head, neck, left shoulder, left elbow, left wrist, left palm, left thumb, left middle fingertip, right shoulder, right elbow, right wrist, right palm, right thumb, right middle fingertip, upper spine, middle spine, lower spine, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, and right foot;
determining a three-dimensional graph of the joint nodes from the three-dimensional coordinates of the 25 joint nodes;
converting the three-dimensional graph of the joint nodes into a two-dimensional graph, determining the human skeleton by connecting adjacent joint nodes in the two-dimensional graph, and thereby obtaining a two-dimensional projection of the human skeleton.
3. The method for evaluating human body posture in an assembly and maintenance process according to claim 2, wherein determining the joint nodes and skeleton of the practitioner in real time according to the depth image further comprises:
correcting the position of each joint node and bone.
4. The method for evaluating human body posture in an assembly and maintenance process according to claim 2, wherein determining the injury risk score of each joint node by applying the RULA analysis method to the superimposed color image, and displaying the scoring result, specifically comprises:
determining the included angle between adjacent bones in real time according to the superimposed color image;
determining, from the included angle between adjacent bones, the injury risk score of each joint node using the RULA analysis method;
displaying the scoring result on the two-dimensional projection of the human skeleton.
5. A system for evaluating human body posture in an assembly and maintenance process, characterized by comprising:
a data acquisition module, configured to acquire a color image and a depth image of a practitioner in real time during assembly and maintenance;
a depth image processing module, configured to determine the joint nodes and skeleton of the practitioner in real time according to the depth image;
a color image processing module, configured to match and superimpose the joint nodes and skeleton of the practitioner at a given moment with the color image of the corresponding moment;
a RULA evaluation module, configured to determine an injury risk score for each joint node by applying the RULA analysis method to the superimposed color image, and to display the scoring result;
a RULA result display module, configured to, when the injury risk score of a joint node exceeds a set threshold, record the current pose of the practitioner and the position of the joint node whose score exceeds the threshold, and to issue an emergency warning.
6. The system for evaluating human body posture in an assembly and maintenance process according to claim 5, wherein the depth image processing module comprises:
a joint node three-dimensional coordinate determination unit, configured to determine the three-dimensional coordinates of 25 joint nodes of the practitioner's body from the depth image using a Kinect V2.0 somatosensory sensor, the 25 joint nodes being: head, neck, left shoulder, left elbow, left wrist, left palm, left thumb, left middle fingertip, right shoulder, right elbow, right wrist, right palm, right thumb, right middle fingertip, upper spine, middle spine, lower spine, left hip, left knee, left ankle, left foot, right hip, right knee, right ankle, and right foot;
a human joint node three-dimensional graph determination unit, configured to determine a three-dimensional graph of the joint nodes from the three-dimensional coordinates of the 25 joint nodes;
a human skeleton two-dimensional projection determination unit, configured to convert the three-dimensional graph of the joint nodes into a two-dimensional graph, determine the human skeleton by connecting adjacent joint nodes in the two-dimensional graph, and thereby obtain a two-dimensional projection of the human skeleton.
7. The system of claim 6, wherein the depth image processing module further comprises:
a correction unit, configured to correct the position of each joint node and bone.
8. The system for evaluating human body posture in an assembly and maintenance process according to claim 6, wherein the RULA evaluation module comprises:
an included angle determination unit, configured to determine the included angle between adjacent bones in real time according to the superimposed color image;
an injury risk score determination unit, configured to determine, from the included angle between adjacent bones, the injury risk score of each joint node using the Rapid Upper Limb Assessment (RULA) method;
a scoring result display unit, configured to display the scoring result on the two-dimensional projection of the human skeleton.
CN202110993155.XA 2021-08-25 2021-08-25 Method and system for evaluating human body posture in assembly and maintenance process Pending CN113633281A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110993155.XA CN113633281A (en) 2021-08-25 2021-08-25 Method and system for evaluating human body posture in assembly and maintenance process

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110993155.XA CN113633281A (en) 2021-08-25 2021-08-25 Method and system for evaluating human body posture in assembly and maintenance process

Publications (1)

Publication Number Publication Date
CN113633281A true CN113633281A (en) 2021-11-12

Family

ID=78424125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110993155.XA Pending CN113633281A (en) 2021-08-25 2021-08-25 Method and system for evaluating human body posture in assembly and maintenance process

Country Status (1)

Country Link
CN (1) CN113633281A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010114208A1 (en) * 2009-03-31 2010-10-07 전북대학교산학협력단 Method for evaluating inconvenience of driving posture of bus driver
CN104239605A (en) * 2014-07-25 2014-12-24 北京航空航天大学 Aircraft assembling process risk evaluation method
CN105930795A (en) * 2016-04-20 2016-09-07 东北大学 Walking state identification method based on space vector between human body skeleton joints
CN107831897A (en) * 2017-11-17 2018-03-23 吉林大学 RULA evaluating methods in a kind of Virtual assemble operation
CN110480634A (en) * 2019-08-08 2019-11-22 北京科技大学 A kind of arm guided-moving control method for manipulator motion control
CN111652047A (en) * 2020-04-17 2020-09-11 福建天泉教育科技有限公司 Human body gesture recognition method based on color image and depth image and storage medium
CN112131928A (en) * 2020-08-04 2020-12-25 浙江工业大学 Human body posture real-time estimation method based on RGB-D image feature fusion
CN112568898A (en) * 2019-09-29 2021-03-30 杭州福照光电有限公司 Method, device and equipment for automatically evaluating injury risk and correcting motion of human body motion based on visual image
CN113158910A (en) * 2021-04-25 2021-07-23 北京华捷艾米科技有限公司 Human skeleton recognition method and device, computer equipment and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JIANG, SHENGQIAN: "Research on Assembly and Human Factors Evaluation Based on Virtual Reality Technology", China Master's Theses Full-text Database, Engineering Science and Technology I *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117237880A (en) * 2023-11-13 2023-12-15 东莞先知大数据有限公司 Diesel oil unloading standard detection method and device, electronic equipment and storage medium
CN117237880B (en) * 2023-11-13 2024-02-09 东莞先知大数据有限公司 Diesel oil unloading standard detection method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109432753B (en) Action correcting method, device, storage medium and electronic equipment
CN111443619B (en) Virtual-real fused human-computer cooperation simulation method and system
JP4451817B2 (en) Dental technology evaluation system
US20150079565A1 (en) Automated intelligent mentoring system (aims)
JP6124308B2 (en) Operation evaluation apparatus and program thereof
KR101416282B1 (en) Functional measurement and evaluation system for exercising Health and Rehabilitation based on Natural Interaction
CN111091732A (en) Cardio-pulmonary resuscitation (CPR) guiding device and method based on AR technology
JP2005224452A (en) Posture diagnostic apparatus and program thereof
US11426099B2 (en) Mobile device avatar generation for biofeedback to customize movement control
JP2015186531A (en) Action information processing device and program
CN102012742A (en) Method and device for correcting eye mouse
CN109219426B (en) Rehabilitation training assistance control device and computer-readable recording medium
CN111539245A (en) CPR (CPR) technology training evaluation method based on virtual environment
CN113345069A (en) Modeling method, device and system of three-dimensional human body model and storage medium
CN113633281A (en) Method and system for evaluating human body posture in assembly and maintenance process
CN110717972B (en) Transformer substation exception handling simulation system based on VR local area network online system
US20230240594A1 (en) Posture assessment program, posture assessment apparatus, posture assessment method, and posture assessment system
KR20160076488A (en) Apparatus and method of measuring the probability of muscular skeletal disease
Yun et al. Animation fidelity in self-avatars: impact on user performance and sense of agency
Plantard et al. Usability of corrected Kinect measurement for ergonomic evaluation in constrained environment
CN108363984B (en) Fatigue strength monitoring method in a kind of Virtual assemble
CN113051973A (en) Method and device for posture correction and electronic equipment
JP6577150B2 (en) Human body model display system, human body model display method, communication terminal device, and computer program
CN113100717B (en) Naked eye 3D dizziness training system suitable for dizziness patient and evaluation method
JP6744139B2 (en) Rehabilitation support control device and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination