WO2016105102A1 - Device and method for determining the likelihood of occurrence of musculoskeletal disorders - Google Patents

Device and method for determining the likelihood of occurrence of musculoskeletal disorders

Info

Publication number
WO2016105102A1
WO2016105102A1 (application PCT/KR2015/014128, KR2015014128W)
Authority
WO
WIPO (PCT)
Prior art keywords
likelihood
determining
data
occurrence
sensor
Prior art date
Application number
PCT/KR2015/014128
Other languages
English (en)
Korean (ko)
Inventor
이성균
Original Assignee
디게이트 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 디게이트 주식회사
Publication of WO2016105102A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb

Definitions

  • The present invention relates to an apparatus and method for determining the likelihood of occurrence of musculoskeletal disorders, and more particularly, to an apparatus and method for determining whether a subject is at risk of musculoskeletal disorders using motion information of a body part of the subject.
  • musculoskeletal disorders are health disorders caused by repetitive movements, improper working postures, or excessive use of force. They include diseases of the neck, shoulders, waist, upper and lower extremity muscles, and surrounding body tissues.
  • Social costs for preventing musculoskeletal disorders are steadily increasing; for example, the Occupational Safety and Health Act stipulates appropriate measures to prevent health disorders caused by simple repetitive tasks or tasks that place an excessive burden on the human body.
  • Accordingly, the risk that a motion poses to the musculoskeletal system, or the likelihood of musculoskeletal disorders, should be assessed in advance.
  • Conventionally, the likelihood of musculoskeletal disorders has been judged by a person observing the working environment, or through examinations of the symptoms of the disease.
  • The present invention provides an apparatus and method for determining the likelihood of occurrence of musculoskeletal disorders of a subject based on motion information obtained according to the movement of the subject.
  • To this end, an apparatus and method for determining the likelihood of occurrence of musculoskeletal disorders are provided as follows.
  • The apparatus for determining the likelihood of occurrence of musculoskeletal disorders may include: a sensing unit including an inertial sensor (inertial measurement unit, IMU) attached to an object and a three-dimensional (3D) sensor for capturing the object from outside the object; a data synchronization unit configured to generate synchronized data by synchronizing inertial data output from the inertial sensor and 3D image data output from the 3D sensor; a determination unit configured to determine the likelihood of occurrence of musculoskeletal disorders according to the motion of the object based on the synchronized data; and a user interface for outputting the determined result.
  • the 3D sensor may include at least one of an image sensor, an infrared sensor, and an ultrasonic sensor.
  • the 3D image data may include a color image and a distance image.
  • the data synchronizer may correct 3D image data based on the inertial data.
  • the data synchronization unit may generate synchronized data based on the inertia data and the corrected 3D image data.
  • The determination unit may determine the likelihood of occurrence of musculoskeletal disorders using at least one of a Rapid Upper Limb Assessment (RULA), a Rapid Entire Body Assessment (REBA), and a NIOSH Lifting Equation (NLE).
  • the user interface may receive a user command for classifying an operation of the object into a plurality of operation patterns.
  • the determiner may classify the synchronized data corresponding to the motion patterns.
  • the determination unit may determine the possibility of occurrence of musculoskeletal disorders for each operation pattern based on the synchronized data of each operation pattern.
  • the determination unit may adopt and use at least one of a Rapid Upper Limb Assessment (RULA), a Rapid Entire Body Assessment (REBA), and a NIOSH Lifting Equation (NLE) for each operation pattern.
  • the user interface may receive a user command for selecting at least one operation pattern among the operation patterns.
  • the determination unit may determine a possibility of occurrence of the musculoskeletal disorder for the selected operation pattern based on the synchronized data of the selected operation pattern.
  • the determination unit may adopt and use at least one of a Rapid Upper Limb Assessment (RULA), a Rapid Entire Body Assessment (REBA), and a NIOSH Lifting Equation (NLE) according to the selected operation pattern.
  • the user interface may receive a user command for setting an object.
  • the determination unit may adopt and use at least one of a Rapid Upper Limb Assessment (RULA), a Rapid Entire Body Assessment (REBA), and a NIOSH Lifting Equation (NLE) according to a part of the object.
  • A method of determining the likelihood of occurrence of musculoskeletal disorders includes obtaining inertial data from an inertial sensor (inertial measurement unit, IMU) attached to an object and 3D image data from a three-dimensional (3D) sensor photographing the object from outside the object.
  • determining the likelihood of musculoskeletal disease according to the motion of the subject may be performed using at least one of a Rapid Upper Limb Assessment (RULA), a Rapid Entire Body Assessment (REBA), and a NIOSH Lifting Equation (NLE).
  • According to the disclosed apparatus and method, the motion information of the subject can be obtained more accurately, and the methods for determining the likelihood of the disorder can be selectively applied according to the subject's motion pattern or body part.
  • In addition, since the determination of the likelihood of occurrence is automated, the objectivity and reliability of the determination result can be improved.
  • FIG. 1 is a control block diagram according to an embodiment of an apparatus for determining the likelihood of occurrence of musculoskeletal disorders.
  • FIG. 2 is a diagram illustrating an example of setting an object through a user interface.
  • FIG. 3 is a diagram illustrating another example of setting an object through a user interface.
  • FIG. 4 is a diagram for explaining a process of evaluating a workload using RULA.
  • FIG. 5 is a diagram illustrating an action step table finally generated by the RULA.
  • FIG. 6 is a diagram illustrating a decision table of REBA.
  • FIG. 7 is a flowchart illustrating a method of determining the likelihood of occurrence of musculoskeletal disorders according to an embodiment.
  • first, second, A, and B may be used to describe various components, but the components should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another.
  • the first component may be referred to as the second component, and similarly, the second component may also be referred to as the first component.
  • FIG. 1 is a control block diagram according to an embodiment of an apparatus for determining the likelihood of occurrence of musculoskeletal disorders.
  • The apparatus 100 for determining the likelihood of occurrence of musculoskeletal disorders includes a detector 110, a controller 120, a storage unit 130, and a user interface 140; it detects the motion of an object and, based on the detected motion, determines the likelihood of occurrence of musculoskeletal disorders and outputs the determination result.
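  • For illustration only, the block diagram of FIG. 1 might be mapped to software components roughly as sketched below; the class and method names are hypothetical and are not taken from the description above.

      class MusculoskeletalRiskApparatus:
          """Hypothetical outline of apparatus 100: detector 110 (IMU + 3D sensor),
          controller 120 (data synchronization unit 121 + determination unit 122),
          storage unit 130, and user interface 140."""

          def __init__(self, detector, synchronizer, determiner, storage, ui):
              self.detector = detector          # sensing unit 110
              self.synchronizer = synchronizer  # data synchronization unit 121
              self.determiner = determiner      # determination unit 122
              self.storage = storage            # storage unit 130
              self.ui = ui                      # user interface 140

          def run_once(self):
              inertial, frames = self.detector.acquire()                 # acquire inertial + 3D data
              synced = self.synchronizer.synchronize(inertial, frames)   # generate synchronized data
              self.storage.save(synced)                                  # build the DB of synchronized data
              result = self.determiner.assess(synced)                    # apply RULA / REBA / NLE
              self.ui.display(result)                                    # output the determination result
              return result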
  • the subject may be a living body of a human or an animal, or a specific part of the living body such as a neck, shoulder, arm, waist, upper and lower limbs, but is not limited thereto.
  • The object may be a worker who performs work according to a predetermined operation pattern, that is, a process, in a certain working environment such as a factory; it may also be the worker's upper and lower limbs, or the worker's waist.
  • The object may also be a robot, or a specific portion of a robot, that performs a process on behalf of a worker.
  • Hereinafter, the object will be described as a worker performing work according to a predetermined pattern, or as a specific part of the worker's body.
  • the user interface 140 includes an input unit 141 and a display unit 142 to provide a user interface by receiving a user command or displaying various information to the user.
  • the user is a person who performs monitoring of the object using the apparatus 100 for determining the likelihood of occurrence of musculoskeletal disorders.
  • When the object is a worker, the user may be a manager who manages a predetermined process, or a member of staff who monitors the worker's health.
  • the present invention is not limited thereto, and any person using the apparatus 100 for determining the likelihood of occurrence of musculoskeletal disorders may be a user.
  • the user interface 140 may output an image or a result obtained in the process of determining a disease occurrence possibility.
  • the user interface 140 may receive various conditions from the user for determining the likelihood of occurrence of musculoskeletal disorders.
  • The user may set the object to be monitored, the process to be determined, or the time point to be determined through the user interface 140, which will be described in detail with reference to FIGS. 2 and 3.
  • FIG. 2 is a diagram illustrating an example of setting an object through a user interface.
  • FIG. 3 is a diagram illustrating another example of setting an object through a user interface.
  • a captured image of the object or a captured image of a working environment including the object may be obtained.
  • As illustrated in FIG. 2, the captured image may be output through the display unit 142 of the user interface 140.
  • the captured image may include only a color image or may include a color image and a distance image.
  • The user may check the number of workers and their positions from the captured image of the working environment output on the display unit 142; as shown in FIG. 2B, a single worker ob1 may be set as the object, or, as illustrated in FIG. 2C, a plurality of workers ob1 and ob2 may be set as objects.
  • When the display unit 142 outputs the working environment of a single worker, the user may set a specific part of the worker's body as the object.
  • The worker's upper limb ob3 may be set as the object, and, as shown in FIG. 3 (c), a plurality of body parts, for example the upper limb ob3 and the lower limb ob4, may be set as objects.
  • the setting of the object is performed through the input unit 141 or the display unit 142 constituting the user interface 140.
  • The input unit 141 may include a hardware input device for the user's input, such as various buttons or switches, a keyboard, a mouse, a track-ball, various levers, a handle, or a stick.
  • The input unit 141 may also include a software input device for user input, such as a touch pad providing a graphical user interface (GUI).
  • the touch pad may be implemented as a touch screen panel (TSP) to form a mutual layer structure with the display unit 142.
  • The display unit 142 may include a cathode ray tube (CRT), a digital light processing (DLP) panel, a plasma display panel (PDP), a liquid crystal display (LCD) panel, an electroluminescence (EL) panel, an electrophoretic display (EPD) panel, an electrochromic display (ECD) panel, a light-emitting diode (LED) panel, or an organic light-emitting diode (OLED) panel, but is not limited thereto.
  • the display unit 142 may be used as an input device in addition to the display device.
  • the detector 110 detects the motion of the object and outputs the detected motion information to the controller 120.
  • The sensing unit 110 may be implemented with an inertial sensor (inertial measurement unit, IMU) and a three-dimensional (3D) sensor to detect the motion of the object.
  • the inertial sensor is attached to the object and detects motion information such as the inclination of the attachment site, the direction of motion of the object, the speed of motion, and the acceleration of motion. At this time, the operation information detected from the inertial sensor will be defined as 'inertial data'.
  • the inertial sensor may be provided in a form that is mounted on a sensor wearing member, for example, a wear band wound around a body, an attachment patch, or the like.
  • the inertial sensor may be provided as a gyro sensor, an acceleration sensor, or a geo-magnetic sensor.
  • the inertial sensor may be provided as at least one, and may be configured by a combination of a gyro sensor, an acceleration sensor, and a geomagnetic sensor.
  • For example, the inertial sensor may combine an acceleration sensor with a gyro sensor, which can accurately measure the rotation angle of the object, or with a geomagnetic sensor, which can accurately measure position and orientation, thereby improving the precision of the inertial data.
  • The 3D sensor, without being attached to the object, acquires a color image or a distance (depth) image of the object from outside the object, and detects motion information such as the object's motion direction, motion speed, and acceleration from the color image or the distance image.
  • the motion information detected from the 3D sensor will be defined as '3D image data'.
  • the 3D sensor may be provided as at least one, and may be implemented as an image sensor, an infrared sensor, or an ultrasonic sensor.
  • the image sensor may be divided into a charge coupled device (CCD) image sensor and a complementary metal oxide semiconductor (CMOS) image sensor.
  • the image sensor may acquire a color image of the object, and when a plurality of image sensors are provided, the image sensor may acquire not only a color image of the object but also a distance image.
  • The infrared sensor includes a light-emitting element and a light-receiving element; infrared rays generated by the light-emitting element are reflected off the object, and the light-receiving element detects the motion of the object from the change in the amount or intensity of the reflected light.
  • the distance image of the object can be obtained through the infrared sensor.
  • Ultrasonic sensors use piezoelectricity or magnetostriction to generate ultrasonic waves, measure the time it takes for the waves to return after reflecting off the object, and calculate the distance to the object from that round-trip time and the speed of sound.
  • Like the infrared sensor, the ultrasonic sensor can therefore be used to acquire a distance image of the object.
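  • As a simple numerical illustration of the time-of-flight principle just described (assuming a speed of sound of about 343 m/s in air; the constant is not given in the description above):

      def ultrasonic_distance(round_trip_time_s, speed_of_sound_m_s=343.0):
          # The wave travels to the object and back, so halve the round-trip distance.
          return speed_of_sound_m_s * round_trip_time_s / 2.0

      # Example: a round trip of 5.8 ms corresponds to a distance of roughly 1 m.
      # ultrasonic_distance(0.0058)  ->  ~0.99 m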
  • the 3D sensor may be composed of a combination of an image sensor, an infrared sensor, and an ultrasonic sensor.
  • the 3D sensor may be configured of an image sensor and an infrared sensor, so that a color image may be obtained from the image sensor and a distance image may be obtained through the infrared sensor.
  • the 3D sensor may be configured of an image sensor and an ultrasonic sensor, so that a color image may be obtained from the image sensor and a distance image may be obtained through the ultrasonic sensor.
  • 3D image data generated from the 3D sensor may be output through the display unit 142.
  • the display unit 142 may output only the color image of the 3D image data, or may output both the color image and the distance image.
  • The user may check the 3D image data output on the display unit 142 and perform recording, playback, and tracking.
  • Using recording, playback, and tracking, the user may divide the 3D image data by operation pattern or by process.
  • The divided 3D image data may be stored, for each operation pattern, as a database (DB) in the storage unit 130, which will be described later.
  • The motion information of the object detected by the inertial sensor and the 3D sensor, that is, the inertial data and the 3D image data, is output to the controller 120.
  • the controller 120 may generate various control signals for controlling the apparatus 100 for determining the possibility of musculoskeletal disease occurrence.
  • the control unit 120 may include a data synchronization unit 121 and a determination unit 122 to perform various operations and determinations for the operation of the apparatus 100 for determining the possibility of musculoskeletal disease occurrence.
  • the controller 120 may be various processors including at least one chip in which an integrated circuit is formed.
  • The controller 120 may be provided as a single processor or distributed over a plurality of processors.
  • Although the controller 120 has been described as divided into the data synchronization unit 121 and the determination unit 122, the data synchronization unit 121 and the determination unit 122 may likewise be provided together in one processor or separately in a plurality of processors.
  • the data synchronizer 121 synchronizes the motion information output from the detector 110, that is, the inertial data of the inertial sensor and the 3D data of the 3D sensor.
  • the data synchronizer 121 synchronizes the inertial data with the 3D image data to obtain the synchronized data.
  • the 3D image data may be divided by operation patterns or by processes.
  • The user may divide the synchronized data, rather than the 3D image data, by operation pattern or by process.
  • The synchronized data generated by the data synchronization unit 121 is output to the display unit 142; the user may check the synchronized data on the display unit 142 and, using recording, playback, and tracking, divide it by operation pattern or by process.
  • the divided synchronized data may be constructed as a DB in the storage unit 130 to be described later for each operation pattern.
  • the data synchronizer 121 may correct 3D image data based on inertial data.
  • an error may occur in 3D image data acquired by the 3D sensor.
  • Since the inertial sensor is attached to the object, accurate inertial data can be obtained even when the 3D sensor shakes or all or part of the object is occluded. Therefore, when an error occurs in the 3D image data, the data synchronizer 121 corrects the 3D image data based on the inertial data, thereby improving the reliability of the 3D image data or the synchronized data.
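  • A minimal sketch of what timestamp-based synchronization with inertial-based correction could look like is shown below; the data layout, field names, and the simple rule of replacing invalid frames are illustrative assumptions, not the algorithm of the description above.

      import bisect

      def synchronize(inertial_samples, frames_3d, max_gap_s=0.02):
          """Pair each 3D frame with the nearest-in-time inertial sample.

          inertial_samples: list of (timestamp, inertial_data), sorted by timestamp
          frames_3d:        list of (timestamp, frame_data, is_valid)
          """
          times = [t for t, _ in inertial_samples]
          synced = []
          if not times:
              return synced
          for t_frame, frame, is_valid in frames_3d:
              i = bisect.bisect_left(times, t_frame)
              candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
              j = min(candidates, key=lambda k: abs(times[k] - t_frame))
              if abs(times[j] - t_frame) > max_gap_s:
                  continue  # no inertial sample close enough in time; skip this frame
              inertial = inertial_samples[j][1]
              if not is_valid:
                  # e.g. camera shake or occlusion: fall back to the inertial data,
                  # standing in for the correction of the 3D image data
                  frame = {"pose_from_inertial": inertial}
              synced.append({"t": t_frame, "inertial": inertial, "frame": frame})
          return synced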
  • the determination unit 122 determines the possibility of musculoskeletal disorders based on the synchronized data.
  • The determination unit 122 may make the determination using an ergonomic evaluation methodology, for example, RULA (Rapid Upper Limb Assessment), REBA (Rapid Entire Body Assessment), or NLE (NIOSH Lifting Equation).
  • the present invention is not limited to the examples, and all of them can be applied as long as the possibility of disease occurrence can be determined with respect to the motion of the subject.
  • The determination unit 122 applies the synchronized data to ergonomic evaluation methodologies such as RULA, REBA, and NLE to quantify the risk level of the object's motion, and determines the likelihood of disease occurrence according to the motion based on the resulting score.
  • RULA (Rapid Upper Limb Assessment) is a method of evaluating the workload due to working posture, focusing on the upper limbs such as the shoulder, forearm, wrist, and neck, and consists of a total of 15 steps using three score tables.
  • Each working posture is divided into two groups by body part, and the posture, the degree of muscle use, and the force of each group are evaluated; a detailed description is given with reference to FIGS. 4 and 5.
  • FIG. 5 is a diagram illustrating the action step table finally generated by RULA.
  • RULA divides each working posture by body part into group A (upper arm, lower arm, wrist, and wrist twist) and group B (neck, torso, and legs).
  • The posture score P1 for group A is obtained first; if group A's motion falls under the additional criteria for muscle use and force, the corresponding additional score is added to the posture score P1 to obtain a score P2.
  • Likewise, the posture score P3 for group B is obtained; if group B's motion falls under the additional criteria for muscle use and force, the corresponding additional score is added to the posture score P3 to obtain a score P4.
  • The evaluation result for group A (score P2) and the evaluation result for group B (score P4) are then combined and compared with the RULA final score table (the third score table) to obtain a final score P5.
  • The third score table yields a total score between 1 and 7 points, which is classified into four action levels. As illustrated in FIG. 5, a final score of 1 to 2 corresponds to Action Step 1, 3 to 4 to Action Step 2, 5 to 6 to Action Step 3, and 7 or more to Action Step 4; in this way the workload is quantified and evaluated.
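  • A small sketch of this final classification step is given below; the score bands follow the description of FIG. 5 above, while the advisory phrases are the commonly used RULA wording and may differ from the figures.

      def rula_action_step(final_score):
          """Map a RULA final score (1..7 or more) to an action step and advisory text."""
          if final_score <= 2:
              return 1, "posture is acceptable if not maintained or repeated for long periods"
          if final_score <= 4:
              return 2, "further investigation is needed and changes may be required"
          if final_score <= 6:
              return 3, "need to change posture as soon as possible"
          return 4, "investigate and change posture immediately"

      # Example: a final score of 5 falls into action step 3, matching the
      # "need to change posture as soon as possible" phrase cited later in this text.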
  • REBA (Rapid Entire Body Assessment) analyzes the posture of the waist, neck, and legs and adds the scores for weight and force to obtain a score S1.
  • Posture analysis is then performed on the arms and wrists, and the score for the coupling (handles) is added to obtain a score S2.
  • A score S3 is obtained by combining the score S1 and the score S2, and the final score is obtained by adding the activity score to S3.
  • The level corresponding to the final score can be found in the REBA decision table shown in FIG. 6.
  • FIG. 6 is a diagram illustrating a decision table of REBA.
  • The risk level is classified into five levels according to the final score. Specifically, a final score of 1 corresponds to level 0 (negligible risk), a score of 2 to 3 to level 1 (low risk), a score of 4 to 7 to level 2 (medium risk), a score of 8 to 10 to level 3 (high risk), and a score of 11 to 15 to level 4 (very high risk).
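  • A corresponding sketch for the REBA decision step just described; the score bands and level names follow the paragraph above.

      def reba_risk_level(final_score):
          """Map a REBA final score (1..15) to the risk level described above."""
          if final_score <= 1:
              return 0, "negligible"
          if final_score <= 3:
              return 1, "low"
          if final_score <= 7:
              return 2, "medium"
          if final_score <= 10:
              return 3, "high"
          return 4, "very high"

      # Example: a final score of 12 maps to level 4, "very high",
      # as in the display example given later in this text.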
  • The NIOSH Lifting Equation (NLE) is a methodology for evaluating the risk of lifting work by calculating the recommended weight limit (RWL).
  • The recommended weight limit (RWL) is defined as the weight that a healthy worker can lift during the actual working time of a specific lifting operation without an increased risk of low back pain or excessive strain on the waist; as shown in Equation 1 below, it is determined by several work variables.
  • HM: horizontal multiplier
  • VM: vertical multiplier
  • DM: distance multiplier
  • AM: asymmetric multiplier
  • FM: frequency multiplier
  • CM: coupling multiplier
  • NLE calculates the lifting index (LI) as the ratio of the actual load weight to the recommended weight limit, as shown in Equation 2 below; when the lifting index (LI) is greater than 1, the workload exceeds the recommended limit and the risk of low back pain is considered elevated.
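  • Equations 1 and 2 are referenced above but not reproduced in this text. For reference, the sketch below uses the standard NIOSH formulation with a load constant (LC) of 23 kg; the exact form and coefficients shown in the figures may differ.

      def recommended_weight_limit(hm, vm, dm, am, fm, cm, lc=23.0):
          """Equation 1 (standard NIOSH form): RWL = LC * HM * VM * DM * AM * FM * CM, in kg.

          hm, vm, dm, am, fm, cm are the horizontal, vertical, distance, asymmetric,
          frequency and coupling multipliers, each between 0 and 1."""
          return lc * hm * vm * dm * am * fm * cm

      def lifting_index(actual_load_kg, rwl_kg):
          """Equation 2: LI = actual load weight / RWL; LI > 1 indicates elevated risk."""
          return actual_load_kg / rwl_kg

      # Example: with all multipliers at 0.85, RWL is about 23 * 0.85**6, i.e. roughly 8.7 kg,
      # so lifting a 12 kg load gives an LI of about 1.4 > 1, i.e. above the recommended limit.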
  • As described above, the determination unit 122 applies the synchronized data obtained from the data synchronization unit 121 to ergonomic evaluation methodologies such as RULA, REBA, and NLE, quantifies the risk level of the object's motion, and determines the likelihood of musculoskeletal disease according to the motion.
  • the determination unit 122 may select an appropriate ergonomic evaluation methodology according to the site of the object, and determine the possibility of disease occurrence using the selected evaluation methodology. For example, when the user selects the operator's neck and shoulders as the object through the user interface 140, the determination unit 122 may adopt a RULA methodology suitable for determining the upper limb. In addition, when the user selects the entire body of the worker as an object, the determination unit 122 may adopt a REBA methodology suitable for whole body determination.
  • the determination unit 122 may select an appropriate ergonomic evaluation methodology according to the operation pattern, and may determine a disease occurrence possibility using the selected evaluation methodology. For example, when the process performed by the object is a process in which a pushing or pulling operation is repeated, the determination unit 122 may adopt a suitable REBA methodology. In addition, when the process performed by the object is a process for carrying the workpiece, the determination unit 122 may adopt a suitable NLE methodology.
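  • A minimal sketch of the selection logic described in the two paragraphs above is shown below; the mapping rules and identifiers are illustrative assumptions rather than an exhaustive list from the description.

      def select_methodology(body_part=None, operation_pattern=None):
          """Pick an ergonomic evaluation methodology from {RULA, REBA, NLE}."""
          if body_part in ("neck", "shoulder", "upper_limb"):
              return "RULA"   # suited to upper-limb assessment
          if operation_pattern == "push_pull_repetitive":
              return "REBA"   # suited to repeated pushing or pulling
          if operation_pattern == "lifting_carrying":
              return "NLE"    # suited to lifting and carrying tasks
          if body_part == "whole_body":
              return "REBA"   # suited to whole-body assessment
          return "REBA"       # assumed default when nothing more specific applies

      # e.g. select_methodology(body_part="shoulder")                  -> "RULA"
      #      select_methodology(operation_pattern="lifting_carrying")  -> "NLE"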
  • the determination unit 122 may determine a disease occurrence possibility in correspondence with all the operation patterns, or may determine the disease occurrence possibility with respect to the selected operation pattern, which is based on a user input.
  • For example, suppose that the user checks the 3D image data or the synchronized data output through the display unit 142 and, by playing back, recording, and tracking the data, classifies the work into process 1, process 2, and process 3.
  • The determination unit 122 may then determine the likelihood of disease occurrence for all of process 1, process 2, and process 3.
  • the determination unit 122 may determine the possibility of the disease only for the process 2 based on the user input.
  • Alternatively, when the user selects a specific time point, the determination unit 122 may determine the likelihood of disease occurrence for process 2, the process that includes that time point.
  • the determination result determined by the determination unit 122 may be output through the display unit 142 for user confirmation.
  • For example, when the determination unit 122 uses the RULA methodology, the display unit 142 may output the calculated final score together with the action step table of FIG. 5 so that the user can check the result.
  • The display unit 142 may also output the action step and the action phrase corresponding to the calculated final score. For example, when the final score calculated by the determination unit 122 is 5, the display unit 142 may output the action step "3" and the phrase "need to change posture as soon as possible".
  • When the determination unit 122 uses the REBA methodology, the display unit 142 may output the decision table of FIG. 6 together with the calculated REBA final score, allowing the user to check the likelihood of disease occurrence and the related intermediate results.
  • The display unit 142 may also output the risk level corresponding to the calculated final score. For example, if the final score calculated by the determination unit 122 is 12, the display unit 142 may output the risk level "4" and the phrase "very high".
  • The above descriptions are merely examples of how the determination result may be output; as long as the user can confirm the determination result or the likelihood of disease occurrence, the display unit 142 may adopt any output method and is not limited to the above examples.
  • the storage unit 130 temporarily or non-temporarily stores data and algorithms for the operation of the apparatus 100 for determining the likelihood of occurrence of musculoskeletal disorders.
  • the storage unit 130 may store user commands input through the user interface 140.
  • the storage unit 130 may store the part of the object selected by the user, and may store operation patterns or processes classified by the user.
  • the storage unit 130 may store the process or time points of the object selected by the user.
  • the storage unit 130 may store operation information output from the sensing unit 110.
  • the storage unit 130 may store inertial data output from the inertial sensor and 3D image data output from the 3D sensor.
  • the storage unit 130 may store data in which inertia data and 3D image data are synchronized.
  • the storage unit 130 may store the corrected 3D image data and synchronized data.
  • the storage unit 130 may classify the 3D image data or the synchronized data by motion patterns and construct a DB.
  • the storage unit 130 may classify the corrected 3D image data or the synchronized data into DBs by operation patterns.
  • the storage unit 130 may store algorithms for determining a disease occurrence possibility based on the synchronized data.
  • The storage unit 130 may store algorithms of ergonomic evaluation methodologies such as RULA, REBA, and NLE, an algorithm for adopting the ergonomic evaluation methodology suitable for each set part of the object, and an algorithm for adopting the ergonomic evaluation methodology suitable for each operation pattern.
  • The storage unit 130 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, and an optical disk.
  • The apparatus 100 for determining the likelihood of occurrence of musculoskeletal disorders may also operate in connection with a web storage that performs a storage function on the Internet.
  • the apparatus 100 for determining the likelihood of occurrence of musculoskeletal disorders has been described based on the illustrated block diagram.
  • a method for determining the likelihood of occurrence of musculoskeletal disorders will be described with reference to the given flowchart.
  • FIG. 7 is a flowchart illustrating a method of determining the likelihood of occurrence of musculoskeletal disorders according to an embodiment.
  • inertial data and 3D image data are acquired from an inertial sensor and a 3D sensor that detects an operation of an object, in operation 310.
  • the inertial sensor is attached to the object to detect the motion of the object and outputs the detected motion information of the object, that is, inertial data.
  • the 3D sensor is not attached to the object, and photographs the working environment including the object or the object from the outside of the object and generates a color image and a distance image.
  • the color image and the distance image generated by capturing the object, that is, the 3D image data may include motion information of the object.
  • the inertial data and the 3D image data are transmitted to the control unit 120 through wired or wireless communication, and the control unit 120 obtains basic data for determining the likelihood of occurrence of musculoskeletal disorders.
  • the inertial data and the 3D image data thus obtained are synchronized (320).
  • The controller 120 may generate the synchronized data by synchronizing the inertial data with the 3D image data. Before performing the synchronization, the controller 120 may correct the 3D image data based on the inertial data; in particular, in an environment in which an error may occur in the 3D image data, such as shaking of the 3D sensor or occlusion of all or part of the object, the controller 120 may correct the 3D image data based on the inertial data. When the 3D image data is corrected, the controller 120 may generate the synchronized data based on the corrected 3D image data; in other words, the corrected 3D image data is synchronized with the inertial data to generate the synchronized data.
  • the synchronized data is displayed through the display unit 142 (330).
  • The synchronized data may be displayed automatically or based on a user's input command. For example, when the user presses a play button or touches a play icon through the user interface 140, the controller 120 may output a control command to display the synchronized data according to the user's display command.
  • the user may check the motion of the object as an image through the display and classify the motion of the object into a plurality of motion patterns.
  • The classification of the operation pattern may be performed by clicking or touching the display screen to place a mark, or by dragging.
  • For example, the user may input a classification command by marking the time point T1, the time point T2, and the time point T3 on the timeline of the display screen, so that the object's motion is classified into three processes.
  • an operation up to the time point T1 may be classified as a process 1
  • an operation from the time points T1 to T2 may be classified as a process 2
  • operations after the time point T2 may be classified as a process 3, respectively.
  • The controller 120 determines whether a classification command for the operation pattern has been input from the user; if a classification command has been input, the controller 120 performs the operations from 351 onward.
  • the controller 120 classifies the synchronized data by operation pattern (351).
  • Since the object's motion is classified into process 1, process 2, and process 3 according to the user's input, the control unit 120 stores the entire synchronized data as well as the synchronized data constituting process 1 (hereinafter referred to as 'synchronized data 1'), the synchronized data constituting process 2 (hereinafter referred to as 'synchronized data 2'), and the synchronized data constituting process 3 (hereinafter referred to as 'synchronized data 3'), respectively.
  • the synchronized data may be built in the storage unit 130 in a DB corresponding to the operation pattern.
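  • A sketch of how the synchronized data could be split into per-process segments from the user-marked time points; the data layout is the one assumed in the synchronization sketch above and is not prescribed by the description.

      def split_by_time_points(synced, marks):
          """Split synchronized samples into processes using user-marked time points.

          synced: list of dicts with a "t" timestamp key, sorted by time
          marks:  sorted list of marked time points, e.g. [t1, t2]
          Returns {"process 1": [...], "process 2": [...], ...}."""
          processes = {f"process {i + 1}": [] for i in range(len(marks) + 1)}
          for sample in synced:
              idx = sum(1 for m in marks if sample["t"] > m)  # marks that lie before this sample
              processes[f"process {idx + 1}"].append(sample)
          return processes

      # With marks = [t1, t2]: samples up to t1 form process 1, samples between t1 and t2
      # form process 2, and samples after t2 form process 3, mirroring the classification above.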
  • the user may select an operation pattern of the determination target through the user interface 140.
  • the user may select a single operation pattern, but may select a plurality of operation patterns.
  • For example, among process 1, process 2, and process 3, if the user wants the determination to be made for process 2, the user may select process 2 as the determination target by touching a time point included in process 2 on the timeline of the display screen, or by directly inputting 'process 2'.
  • the likelihood of occurrence of musculoskeletal disorder for the selected motion pattern is determined (371).
  • The controller 120 may make the determination using an ergonomic evaluation methodology, for example, RULA (Rapid Upper Limb Assessment), REBA (Rapid Entire Body Assessment), or NLE (NIOSH Lifting Equation).
  • Specifically, the controller 120 applies the synchronized data corresponding to process 2, that is, the synchronized data 2, to the ergonomic evaluation methodology.
  • The controller 120 applies the synchronized data 2 to ergonomic evaluation methodologies such as RULA, REBA, and NLE, quantifies the risk level imposed on the musculoskeletal system during process 2, and determines the likelihood of musculoskeletal disease based on the resulting score.
  • the controller 120 may determine a disease occurrence possibility by adopting an appropriate ergonomic evaluation methodology according to the selected operation pattern. For example, if process 2 is a process in which a pushing or pulling operation is repeated, the control unit 120 may adopt an appropriate REBA methodology.
  • the controller 120 may determine a disease occurrence possibility by selecting an appropriate ergonomic evaluation methodology according to the part of the object, not the selected motion pattern. For example, when the object is the neck and shoulders of the worker, the controller 120 may adopt a RULA methodology suitable for determining the upper limb.
  • The determination result may be output through the user interface 140 or the display unit 142 so that the user can confirm it (380).
  • On the other hand, when the user does not select a specific operation pattern, the controller 120 recognizes that the user wants to determine the likelihood of disease occurrence for all the operation patterns, that is, as if all the operation patterns had been selected.
  • In this case, the control unit 120 applies the synchronized data 1 to ergonomic evaluation methodologies such as RULA, REBA, and NLE, quantifies the risk level imposed on the musculoskeletal system during process 1, and determines the likelihood of musculoskeletal disease based on the resulting score.
  • the control unit 120 applies the synchronized data 2 and the synchronized data 3 to the ergonomic evaluation methodology, respectively, to determine the possibility of disease of the musculoskeletal system during the process 2 and the process 3.
  • the controller 120 may determine a disease occurrence possibility by adopting an appropriate ergonomic evaluation methodology for each operation pattern. For example, if process 2 is a process in which a pushing or pulling operation is repeated, the control unit 120 may adopt an appropriate REBA methodology.
  • For example, for processes 1 and 3, the control unit 120 may adopt the REBA methodology, which is suitable for judging repetitive pushing and pulling operations, and for process 2 it may adopt the NLE methodology, which is suitable for judging lifting operations.
  • the determination result is output through the user interface 140 according to step 380.
  • When the motion of the monitored object is a repetition of the same operation, for example, when only pushing and pulling work is performed repeatedly, the user does not need to input a classification command for classifying the operation pattern.
  • the controller 120 may recognize that the motion of the object is a single motion.
  • The controller 120 then applies the synchronized data, without classifying it, to ergonomic evaluation methodologies such as RULA, REBA, and NLE, thereby determining the risk level imposed on the musculoskeletal system or the likelihood of occurrence of musculoskeletal disease.
  • a determination result is output according to step 380.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Rheumatology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)

Abstract

Disclosed is a device for determining the likelihood of occurrence of a musculoskeletal disorder, which may comprise: a sensing unit comprising an inertial sensor (inertial measurement unit, IMU) attached to an object and a three-dimensional (3D) sensor for capturing an image of the object from outside the object; a data synchronization unit for generating synchronized data obtained by synchronizing inertial data output from the inertial sensor and 3D image data output from the 3D sensor; a determination unit for determining, on the basis of the synchronized data, the likelihood of occurrence of a musculoskeletal disorder due to a motion of the object; and a user interface for outputting the determination result. When such a device and method for determining the likelihood of occurrence of a musculoskeletal disorder are used, motion information of the object can be acquired relatively accurately, and methods for determining the likelihood of occurrence of disorders can be selectively applied according to the motion pattern of the object or the part of the object. In addition, since the determination of the likelihood of occurrence of a disorder is automated, the objectivity and reliability of the determination result can be improved.
PCT/KR2015/014128 2014-12-22 2015-12-22 Device and method for determining the likelihood of occurrence of musculoskeletal disorders WO2016105102A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2014-0186454 2014-12-22
KR20140186454 2014-12-22

Publications (1)

Publication Number Publication Date
WO2016105102A1 true WO2016105102A1 (fr) 2016-06-30

Family

ID=56151039

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/014128 WO2016105102A1 (fr) 2014-12-22 2015-12-22 Device and method for determining the likelihood of occurrence of musculoskeletal disorders

Country Status (2)

Country Link
KR (1) KR20160076488A (fr)
WO (1) WO2016105102A1 (fr)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113728394A (zh) * 2019-05-21 2021-11-30 史密夫和内修有限公司 身体活动执行和训练的评分度量
KR102337148B1 (ko) 2020-04-03 2021-12-08 성균관대학교산학협력단 작업부담평가 장치 및 방법
KR102628121B1 (ko) * 2023-06-19 2024-01-26 주식회사 누지 자세 교정 방법 및 그 장치


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100111460A (ko) * 2009-04-07 2010-10-15 연세대학교 산학협력단 모션 센서를 이용한 보행주기 검출시스템과 방법
KR20120053481A (ko) * 2010-11-17 2012-05-25 서울대학교산학협력단 작업자의 생체정보 측정시스템 및 신체활동량과 종합스트레스지수 예측모델 시스템
KR20120075410A (ko) * 2010-12-28 2012-07-06 한국건설기술연구원 근골격계 신체운동의 모니터링방법 및 모니터링장치
US20120265104A1 (en) * 2011-02-25 2012-10-18 Nilton Luis Menegon Posture observer for ergonomic observation, posture analysis and reconstruction
KR20140077183A (ko) * 2011-09-23 2014-06-23 오르소센서 척추 하중 및 위치 감지용 시스템 및 방법

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10055012B2 (en) 2016-08-08 2018-08-21 International Business Machines Corporation Virtual reality sensory construct
US10114460B2 (en) 2016-08-08 2018-10-30 International Business Machines Corporation Virtual reality sensory construct
US10168770B2 (en) 2016-08-08 2019-01-01 International Business Machines Corporation Virtual reality sensory construct

Also Published As

Publication number Publication date
KR20160076488A (ko) 2016-06-30

Similar Documents

Publication Publication Date Title
WO2016105102A1 (fr) Dispositif et procédé pour évaluer la possibilité d'apparition de trouble musculo-squelettique
MassirisFernández et al. Ergonomic risk assessment based on computer vision and machine learning
US9700242B2 (en) Motion information processing apparatus and method
WO2016028097A1 (fr) Dispositif pouvant être porté
JP6930995B2 (ja) 立体画像生成システム、立体画像生成方法及び立体画像生成プログラム
WO2021045367A1 (fr) Procédé et programme informatique visant à déterminer un état psychologique par un processus de dessin du bénéficiaire de conseils
CN109219426B (zh) 康复训练辅助控制装置以及计算机能够读取的记录介质
WO2020054954A1 (fr) Procédé et système pour fournir une rétroaction virtuelle en temps réel
JP2020141806A (ja) 運動評価システム
Yahya et al. Accurate shoulder joint angle estimation using single RGB camera for rehabilitation
JP2016035651A (ja) 在宅リハビリテーションシステム
JP6676321B2 (ja) 適応性評価装置、適応性評価方法
WO2021132862A1 (fr) Dispositif de mesure d'indice de fonction cardiopulmonaire à base de mouvement et appareil et procédé de prédiction de degré de sénescence
JP2005091085A (ja) 非接触型関節角度計測システム
KR20160035497A (ko) 스켈레톤 추출정보를 이용한 동작분석 기반의 체형분석 시스템
Huang et al. Image-recognition-based system for precise hand function evaluation
WO2023282451A1 (fr) Procédé et appareil d'entraînement à la concentration interne
WO2021215843A1 (fr) Procédé de détection de marqueur d'image buccale, et dispositif d'adaptation d'image buccale et procédé utilisant celui-ci
JP2020119198A (ja) 動作分析装置、動作分析方法、動作分析プログラム及び動作分析システム
WO2021161651A1 (fr) Dispositif d'analyse de charge de travail, procédé d'analyse de charge de travail, ainsi que programme
JP6744139B2 (ja) リハビリテーション支援制御装置及びコンピュータプログラム
WO2020235784A1 (fr) Procédé et dispositif de détection de nerf
He Ling et al. Occupational evaluation with Rapid Entire Body Assessment (REBA) via imaging processing in field
CN115153517B (zh) 计时起立行走测试的测试方法、装置、设备及存储介质
Lin et al. Development of a body motion interactive system with a weight voting mechanism and computer vision technology

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15873629

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15873629

Country of ref document: EP

Kind code of ref document: A1