WO2023013562A1 - Fatigue estimation system, fatigue estimation method, posture estimation device, and program - Google Patents

Fatigue estimation system, fatigue estimation method, posture estimation device, and program Download PDF

Info

Publication number
WO2023013562A1
Authority
WO
WIPO (PCT)
Prior art keywords
posture
subject
fatigue
estimation
body part
Prior art date
Application number
PCT/JP2022/029404
Other languages
French (fr)
Japanese (ja)
Inventor
Kazuki Hashimoto (橋本 一輝)
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIPマネジメント株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Priority to JP2023540319A priority Critical patent/JPWO2023013562A1/ja
Publication of WO2023013562A1 publication Critical patent/WO2023013562A1/en

Links

Images

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 — Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16 — Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state

Definitions

  • the present disclosure relates to a fatigue estimation system for estimating the degree of fatigue of a subject, a posture estimation device used in the estimation system, a fatigue estimation method, and a program.
  • Patent Document 1 discloses a fatigue determination device that determines the presence or absence of fatigue and the type of fatigue based on force measurement and bioelectrical impedance measurement.
  • The present disclosure provides a fatigue estimation system and the like that more appropriately estimates the posture of a subject and estimates the degree of fatigue.
  • A fatigue estimation system according to one aspect of the present disclosure includes: a plurality of imaging devices that each output an image in which a different part of the subject's body is captured; a posture estimation device that estimates the posture of the subject based on the plurality of images output by the plurality of imaging devices; and a fatigue estimation device that estimates and outputs the degree of fatigue of the subject based on the result of estimating the posture of the subject.
  • A posture estimation device according to one aspect of the present disclosure is the posture estimation device used in the fatigue estimation system described above.
  • A fatigue estimation method according to one aspect of the present disclosure is a fatigue estimation method executed by a fatigue estimation device, and includes: obtaining images from each of a plurality of imaging devices; combining, from among the plurality of images, one image in which one body part is captured with another image in which another body part connected to the one body part via at least one joint is captured; estimating, as the posture of the subject, the joint positions of the body parts including the one body part and the other body part; and estimating the degree of fatigue of the subject based on the result of estimating the posture, and outputting it to an output device connected to the fatigue estimation device.
  • a program according to one aspect of the present disclosure is a program for causing a computer to execute the fatigue estimation method described above.
  • a fatigue estimation system or the like according to one aspect of the present disclosure can more appropriately estimate the posture and estimate the degree of fatigue.
  • FIG. 1A is a first diagram for explaining estimation of fatigue level according to the embodiment.
  • FIG. 1B is a second diagram for explaining estimation of fatigue level according to the embodiment.
  • FIG. 1C is a third diagram for explaining estimation of fatigue level according to the embodiment.
  • FIG. 2 is a block diagram showing the functional configuration of the fatigue estimation system according to the embodiment.
  • FIG. 3 is a flowchart showing a fatigue level estimation method according to the embodiment.
  • 4A is a diagram showing a subject standing still in Posture A.
  • FIG. 4B is a diagram showing a subject standing still in Posture B.
  • FIG. 5A is a first diagram illustrating accumulation of estimated fatigue level of a subject according to the embodiment.
  • FIG. 5B is a second diagram illustrating accumulation of estimated fatigue level of the subject according to the embodiment.
  • FIG. 6 is a diagram illustrating an example of synthesizing joint positions according to the embodiment.
  • FIG. 7 is a diagram showing a display example of estimation results according to the embodiment.
  • Each figure is a schematic diagram and is not necessarily strictly illustrated. Moreover, in each figure, substantially the same configurations are given the same reference signs, and redundant description may be omitted or simplified.
  • the posture is estimated by capturing an image of the subject 11 (see FIG. 1A described later) whose fatigue level is to be estimated.
  • the degree of fatigue of the subject 11 is estimated from the estimated posture.
  • When only one imaging device 201 (see FIG. 1A described later) is used, obtaining the desired image of the subject 11 may be difficult.
  • In that case, the fatigue level of the subject 11 cannot be estimated.
  • In the present disclosure, by using a plurality of imaging devices 201 each capable of imaging at least part of the subject 11, the missing parts of the image of the subject 11 that are necessary for estimating the posture of the subject 11 are filled in.
  • This provides the fatigue estimation system 200, which enables estimation of the degree of fatigue of the subject 11 even when the posture of the subject 11 cannot be estimated with a single imaging device 201.
  • the load on at least one of the muscles and joints and the deteriorating blood flow are estimated from the estimated posture of the subject 11.
  • the posture estimation device used for posture estimation can also be used for estimating muscle load, joint load, and the degree of deterioration of blood flow.
  • The posture estimation device according to the present disclosure can therefore be applied as a device for estimating the posture of the subject 11 for various uses. That is, the posture estimation device of the present disclosure can obtain appropriate posture information on the subject 11 (2D skeletal information, 3D skeletal information, and feature quantities such as the angle between the neck and the spine and the angle between the spine and the lower leg) even from images captured by imaging devices 201 arranged such that each captures only a part of the body of the subject 11.
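The feature quantities mentioned above, such as the angle between two skeleton segments meeting at a joint, can be computed directly from 2D skeletal information. The sketch below is illustrative only: the joint names and coordinates are hypothetical, not part of the disclosure.

```python
import math

# Hypothetical 2D skeletal information: each joint is an (x, y) pixel
# coordinate. Names and values are assumptions for illustration.
joints = {
    "head": (100.0, 40.0),
    "neck": (100.0, 80.0),
    "spine_base": (105.0, 160.0),
}

def segment_angle(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Feature quantity analogous to the "angle between the neck and the spine".
neck_spine_angle = segment_angle(joints["head"], joints["neck"], joints["spine_base"])
```

A nearly straight head-neck-spine line yields an angle close to 180 degrees, while a forward-leaning head reduces it.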
  • FIG. 1A is a first diagram for explaining estimation of fatigue level according to the embodiment.
  • FIG. 1B is a second diagram for explaining estimation of fatigue level according to the embodiment.
  • FIG. 1C is a third diagram for explaining estimation of fatigue level according to the embodiment.
  • the fatigue estimation system 200 in the present disclosure is a system that estimates the degree of fatigue of the subject 11 using an image output by imaging the subject 11 using the imaging device 201 .
  • The imaging device 201 is not limited in its form as long as it is a camera that captures an image of the subject 11 and outputs the image; as shown in FIG. 1A, it is realized by, for example, a camera installed in the space where the subject 11 is present.
  • In the example shown in FIG. 1A, the subject 11 is in a posture of sitting on the chair 12.
  • the degree of fatigue of the subject 11 is estimated based on the fatigue accumulated by taking a static posture with a fixed posture among the fatigue of the subject 11 .
  • this estimates the fatigue accumulated by the load on at least one of muscles and joints and deteriorating blood flow (hereinafter also referred to as decreased blood flow) due to a fixed posture.
  • The subject 11 is in a static posture, such as sitting, lying down, or standing still, for at least a certain period of time.
  • The fixed period is, for example, the minimum period over which fatigue can be estimated in the fatigue estimation system 200, such as several seconds or several tens of seconds. Such a period depends on the processing capabilities of the imaging device 201 and the fatigue estimation device 100 (see FIG. 2 described later) that constitute the fatigue estimation system 200.
  • Examples of subjects 11 who take such a stationary posture include desk workers in offices, drivers who steer moving bodies, people who perform strength training under a load in a stationary posture, residents of facilities such as hospitals, and passengers and crew members of airplanes and the like.
  • An image captured and output by the imaging device 201 is processed by the fatigue estimation device 100 to estimate the posture of the subject 11 (for example, the joint positions 11a) as shown in FIG. 1B.
  • the estimated posture of the subject 11 is output as a rigid body link model as an example.
  • In the rigid body link model, straight skeletons are connected by joints indicated by black dots, and the posture of the subject 11 can be reproduced by the positional relationship between two skeletons connected by one joint.
  • the posture is estimated by image recognition, and output as the rigid body link model described above based on the positional relationship between the joints.
  • Each body part includes the muscles that pull the skeletons together and the joints that connect the skeletons so that their positional relationship can be changed.
  • The amount of load applied to at least one of the muscles and joints of each body part is calculated as an estimated value. Since the estimated value of the load on at least one of the muscles and joints of each body part accumulates as the duration of the stationary posture increases, the degree of fatigue caused by the subject 11 maintaining a still posture is calculated using the estimated value of the load and the duration.
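The calculation described above, load per unit time accumulated over the duration of the stationary posture, can be sketched minimally as follows; the numeric values are illustrative assumptions, not values from the disclosure.

```python
def accumulate_fatigue(load_per_unit_time, duration_units, initial_fatigue=0.0):
    """Add the estimated load value once for each unit of time
    that the static posture continues."""
    fatigue = initial_fatigue
    for _ in range(duration_units):
        fatigue += load_per_unit_time
    return fatigue

# Example: a posture imposing load 0.5 per unit time, held for 60 units,
# accumulates a fatigue degree of 0.5 * 60 = 30.0.
result = accumulate_fatigue(0.5, 60)
```

The same result could be computed in closed form as load multiplied by duration; the loop form mirrors the per-unit-time addition described in the text.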
  • "At least one of muscles and joints" is also expressed as "muscles and/or joints."
  • The degree of fatigue may also be estimated based on the estimated value of the blood flow of the subject 11 in addition to the estimated value of the load applied to the muscles and/or joints.
  • In the following, an example of estimating the fatigue level of the subject 11 using the estimated values of the load on the muscles and the load on the joints will mainly be described. By additionally using the blood flow, it is also possible to estimate the degree of fatigue of the subject 11 with higher accuracy.
  • the fatigue level of the subject 11 can also be estimated using an estimated value of any one of the amount of load on the muscles of the subject 11, the amount of load on the joints, and the amount of blood flow.
  • the estimation of the fatigue level of the subject 11 based on the posture of the subject 11 is not limited to the above example, and any existing technique for estimating the fatigue level can be applied.
  • In the fatigue estimation system 200, after the posture of the subject 11 is estimated, at least one of the amount of load on the muscles of the subject 11, the amount of load on the joints, and the blood flow is estimated based on the duration of the posture.
  • the fatigue estimation system 200 estimates the degree of fatigue of the subject 11 based on the estimated value of at least one of the estimated muscle load, joint load, and blood flow of the subject 11 .
  • the estimated value of the load amount may be simply referred to as the load amount or the estimated value.
  • the load is replaced with the blood flow, a large load is replaced with a decrease in blood flow, and a small load is replaced with an increase in blood flow.
  • the blood flow is information for quantifying the blood flow that deteriorates when the subject 11 maintains the posture.
  • the blood flow decreases, it means that the blood flow of the subject 11 is worsening, and can be used as an index of fatigue caused by the deterioration of the blood flow.
  • the blood flow may be obtained as an absolute numerical value at the time of measurement, or may be obtained as a relative numerical value between two different time points.
  • the degree of deterioration of the blood flow of the subject 11 can be estimated from the posture of the subject 11 and the relative numerical values of the blood flow at two points of time when the posture starts and ends.
  • Alternatively, the blood flow of the subject 11 can be estimated simply from the posture of the subject 11 and the duration of the posture.
  • At least one of the amount of load on the muscles, the amount of load on the joints, and the amount of blood flow is estimated from the posture of the subject 11 using the musculoskeletal model described above.
  • a method using actual measurement data can also be applied as a method for estimating the load on muscles, the load on joints, and the blood flow.
  • This measured data is a database constructed by accumulating measured values of load on muscles, load on joints, and blood flow, which are measured for each posture, in association with the posture.
  • In the fatigue estimation system 200 in this case, by inputting the estimated posture of the subject 11 into the database, the measured values of the load on the muscles, the load on the joints, and the blood flow in the corresponding posture can be obtained as output.
  • The actual measurement data may be constructed using actual measurement values for each individual in consideration of individual differences among subjects 11, or may be constructed so as to match each subject 11 by analysis processing such as statistical analysis or machine learning performed on big data obtained from an unspecified number of subjects.
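The database lookup described above can be sketched as a simple mapping from a discretized posture to its measured values. The posture keys, field names, and numbers below are hypothetical, chosen only to illustrate the lookup.

```python
# Hypothetical measured-data database: each discretized posture key maps to
# measured muscle load, joint load, and blood flow for that posture.
measured_db = {
    ("sitting", "forward_lean"): {"muscle_load": 0.8, "joint_load": 0.6, "blood_flow": 0.4},
    ("sitting", "upright"):      {"muscle_load": 0.3, "joint_load": 0.2, "blood_flow": 0.9},
}

def lookup_measured(posture_key):
    """Return the measured values for the estimated posture,
    or None if the posture is not covered by the database."""
    return measured_db.get(posture_key)

record = lookup_measured(("sitting", "upright"))
```

In practice such a table would be interpolated or matched approximately, since an estimated posture rarely coincides exactly with a recorded one.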
  • FIG. 2 is a block diagram showing the functional configuration of the fatigue estimation system according to the embodiment.
  • the fatigue estimation system 200 in the present disclosure includes a fatigue estimation device 100, multiple imaging devices 201, a display device 205, and a recovery device 206.
  • The fatigue estimation device 100 includes an acquisition unit 101, an identification unit 102, a posture estimation unit 105, a first calculation unit 106, a second calculation unit 107, a fatigue estimation unit 108, and an output unit 109.
  • the acquisition unit 101 is a communication module that is connected to each of the plurality of imaging devices 201 and acquires images of the subject 11 captured from each of the plurality of imaging devices 201 .
  • the connection between the acquisition unit 101 and the imaging device 201 is performed by wire or wirelessly, and the method of communication performed through the connection is not particularly limited.
  • The fatigue estimation device 100 may include, in addition to the communication module connected to the imaging devices 201 described above, a communication module that is connected to a timer and acquires the time from the timer.
  • the fatigue estimation device 100 may also include a communication module that is connected to the pressure sensor and acquires the pressure distribution from the pressure sensor.
  • the fatigue estimation device 100 may also include a communication module that is connected to the reception device and acquires personal information from the reception device.
  • the identification unit 102 is a processing unit realized by executing a predetermined program using a processor and memory.
  • The identification unit 102 is provided to realize an identification function for distinguishing one target person 11 from other persons. Details of the function of the identification unit 102 will be described later.
  • the posture estimation unit 105 is a processing unit realized by executing a predetermined program using a processor and memory. By the processing of the posture estimation unit 105, the posture of the subject 11 is estimated based on the image acquired by the acquisition unit 101, the additionally acquired pressure distribution, and the like.
  • The posture estimation unit 105 is an example of a posture estimation device. That is, the posture estimation device according to the present embodiment is implemented as the posture estimation unit 105 incorporated in the fatigue estimation device 100.
  • the first calculation unit 106 is a processing unit realized by executing a predetermined program using a processor and memory. Based on the estimated posture of the subject 11 and additionally acquired personal information, the load amount applied to each muscle and/or joint is calculated by the processing of the first calculation unit 106 .
  • The second calculation unit 107 is a processing unit realized by executing a predetermined program using a processor and memory. By the processing of the second calculation unit 107, the amount of recovery from fatigue in each muscle and/or joint is calculated based on the amount of change in the estimated posture of the subject 11.
  • the fatigue estimation unit 108 is a processing unit realized by executing a predetermined program using a processor and memory.
  • The fatigue estimation unit 108 uses the posture estimated by the posture estimation unit 105 and the time obtained from a timing device or an internal oscillator to estimate the degree of fatigue of the subject 11 based on the duration of the posture.
  • the output unit 109 is a communication module that is connected to the display device 205 and the recovery device 206 and outputs to the display device 205 and the recovery device 206 the content based on the fatigue level estimation result by the fatigue estimation device 100 .
  • the connection between the output unit 109 and the display device 205 or the recovery device 206 is performed by wire or wirelessly, and the method of communication performed through the connection is not particularly limited.
  • The output unit 109 also has a function of notifying the outside that there is a target person 11 whose degree of fatigue cannot be estimated. That is, the output unit 109 outputs the estimation result of the posture estimation unit 105 indicating that estimation of the posture of the subject 11 has failed.
  • the output unit 109 outputs the above estimation result as notification information for notifying the existence of the target person 11 whose posture estimation has failed.
  • Based on this notification information, the administrator or the like of the fatigue estimation system 200 can take measures such as estimating the fatigue level of the subject 11 whose posture estimation has failed using another system, or asking the subject directly about the fatigue level. In this manner, the output unit 109 is configured in advance so that the fatigue level of a target person 11 whose posture could not be estimated is not missed.
  • the imaging device 201 is a device that captures an image of the subject 11 and outputs an image, and is realized by a camera.
  • an existing camera such as a security camera or a fixed-point camera may be used in the space where the fatigue estimation system 200 is applied, or a dedicated camera may be newly provided.
  • Such an imaging device 201 is an example of an information output device that outputs an image as information about the position of the body part of the subject 11. The output information is therefore an image, and includes the positional relationship of the body parts of the subject 11 projected onto the imaging device.
  • The timer is a device that measures time, and is realized by a clock.
  • the time measured by the timer may be absolute time or relative elapsed time from a starting point.
  • The timing device may be realized in any form as long as it can measure the time between the point at which the target person 11 is detected to be still and the point at which the degree of fatigue is estimated (that is, the duration of the stationary posture).
  • A pressure sensor is a sensor having a detection surface, and measures the pressure applied to each of the unit detection surfaces into which the detection surface is divided.
  • the pressure sensor thus measures the pressure for each unit detection surface and outputs the pressure distribution on the detection surface.
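A pressure distribution of the kind described, one reading per unit detection surface, can be represented as a simple grid. The readings below are hypothetical; the sketch merely normalizes them so the grid sums to one, which is one plausible way to treat them as a distribution.

```python
# Hypothetical per-unit-surface pressure readings on the detection surface.
raw_readings = [
    [0.0, 1.2, 1.5, 0.0],
    [0.0, 2.1, 2.4, 0.1],
    [0.0, 0.3, 0.4, 0.0],
]

def pressure_distribution(readings):
    """Normalize per-unit pressures so they sum to 1,
    giving a distribution over the detection surface."""
    total = sum(sum(row) for row in readings)
    return [[p / total for p in row] for row in readings]

dist = pressure_distribution(raw_readings)
```

A bias in this distribution (for example, more weight on one side of a chair's seat) is what the posture estimation unit can use to correct the estimated posture.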
  • the pressure sensor is provided so that the subject 11 is positioned on the detection surface.
  • the pressure sensors are provided on the seat and backrest of the chair on which the subject 11 sits.
  • the pressure sensor may have a marker attached on the detection surface, and the subject 11 may be guided onto the detection surface by a display such as "Please sit on the marker.”
  • The pressure sensor may also output the pressure distribution of the target person 11 standing on the floor. Since the pressure distribution is used to improve the accuracy of estimating the degree of fatigue, the fatigue estimation system 200 may be implemented without the pressure sensor if sufficient accuracy is ensured.
  • the reception device is a user interface that receives input of the personal information of the subject 11, and is realized by an input device such as a touch panel or keyboard.
  • the personal information includes at least one of age, sex, height, weight, muscle mass, stress level, body fat percentage, and exercise proficiency.
  • The age of the target person 11 may be a specific numerical value, or an age group divided into ten-year bands, such as teens, 20s, and 30s.
  • The age range may also be divided into two divisions based on a predetermined age, such as under that age and that age or older, or other age groupings may be used.
  • The gender of the subject 11 is selected from male and female as the one applicable to the subject 11.
  • As the height and weight, the numerical values of the height and weight of the subject 11 are respectively accepted.
  • As the muscle mass, the muscle composition ratio of the subject 11 measured using a body composition meter or the like is accepted.
  • the stress level of the subject 11 is selected by the subject 11 himself/herself from options such as high, medium, and low as the degree of subjective stress felt by the subject 11 .
  • the body fat percentage of the subject 11 is the ratio of the body fat weight to the body weight of the subject 11, and is expressed, for example, as a percentage of 100.
  • the subject's 11 exercise proficiency may be quantified by a score obtained when the subject 11 executes a predetermined exercise program, or may be the status of the exercise that the subject 11 usually undertakes.
  • the former is quantified by, for example, the time required to perform ten spins, the time required to run 50 meters, the flight distance of a long throw, and the like.
  • The latter is quantified by, for example, how many days a week or how many hours the subject exercises. Since the personal information is used to improve the accuracy of estimating the degree of fatigue, the fatigue estimation system 200 may be implemented without the reception device if sufficient accuracy is ensured.
  • the display device 205 is a device for displaying the content based on the fatigue level estimation results output by the output unit 109 .
  • the display device 205 displays an image showing the content based on the result of estimating the degree of fatigue using a display panel such as a liquid crystal panel or an organic EL (Electro Luminescence) panel.
  • The contents displayed by the display device 205 will be described later. Further, if the fatigue estimation system 200 is configured only to reduce the degree of fatigue of the subject 11 using the recovery device 206, only the recovery device 206 needs to be provided, and the display device 205 is not required.
  • The recovery device 206 is a device that reduces the degree of fatigue of the subject 11 by promoting the blood circulation of the subject 11. Specifically, the recovery device 206 actively changes the posture of the seated subject 11 by applying voltage, pressure, vibration, or heat, or by changing the arrangement of each part of the chair 12 with a mechanism provided in the chair 12. As a result, the recovery device 206 changes the load on at least one of the muscles and joints of the subject 11 and promotes blood circulation. From the viewpoint of blood flow, promoting blood circulation in this way reduces the influence of the deterioration of blood flow caused by the subject 11 remaining in the still posture, and the degree of fatigue is recovered.
  • The recovery device 206 is attached to or brought into contact with the appropriate body part of the subject 11 in advance, depending on the configuration of the device.
  • If the fatigue estimation system 200 is configured only to display the fatigue level estimation result to the subject 11, only the display device 205 needs to be provided, and the recovery device 206 is not essential.
  • FIG. 3 is a flowchart showing a fatigue level estimation method according to the embodiment.
  • The fatigue estimation system 200 first acquires the personal information of the subject 11. The personal information is input to the reception device by the target person 11 himself/herself or by a manager who manages the degree of fatigue of the target person 11.
  • the input personal information of the subject 11 is stored in a storage device or the like (not shown), and is read out and used when estimating the degree of fatigue.
  • the fatigue estimation system 200 detects the subject 11 using the imaging device 201 .
  • The target persons 11 are detected by counting the persons within the angle of view, so that all persons within the angle of view of the camera serving as the imaging device 201 are treated as target persons 11. That is, the operation of acquiring the number of persons to be detected (A) is performed (S101). Next, among these persons, the number of persons (B) whose postures can be estimated without image processing, that is, with only one image, is calculated (S102). Then, the fatigue estimation device 100 determines whether or not A and B match (S103).
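The branch at S103 can be sketched as a single comparison; the function name is an illustrative assumption, not terminology from the disclosure.

```python
def needs_multi_camera(num_detected, num_estimable_from_one_image):
    """S103: if every detected person's posture can be estimated from a
    single image (A == B), single-image estimation suffices; otherwise
    images from multiple imaging devices must be combined."""
    return num_detected != num_estimable_from_one_image

# Three people in view (S101) but only two estimable from one image (S102):
# A != B, so the system proceeds to combine images from several cameras.
combine = needs_multi_camera(3, 2)
```

When A and B match, the flow instead proceeds directly to posture estimation from one image per person (S108).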
  • When the fatigue estimation device 100 determines that A and B match (Yes in S103), that is, when the postures of all persons to be detected can be estimated from a single image, one image is obtained for each person.
  • the posture of the subject 11 is estimated from the image (here, the joint positions are estimated) (S108).
  • the fatigue estimation unit 108 estimates the fatigue level of the subject 11 (S109).
  • Steps S108 and S109 are performed as follows.
  • the fatigue estimation system 200 acquires the image output by the imaging device 201 by the acquisition unit 101 .
  • the estimation device 100 estimates the posture of the subject 11 .
  • the pressure distribution applied to the detection surface is obtained from the pressure sensor.
  • the posture estimation unit 105 estimates the posture of the subject 11 based on the acquired image and pressure distribution.
  • The pressure distribution is used, for example, when a biased pressure is applied, to correct the estimated posture so as to reflect that bias.
  • the first calculator 106 calculates the amount of load on each muscle and/or joint of the subject 11 from the posture estimation result.
  • the personal information obtained in advance is used to correct and calculate the load amount.
  • the estimation of the posture of the subject 11 is as described with reference to FIG. 1B, and the calculation of the amount of load is as described with reference to FIG. 1C, so detailed description thereof will be omitted.
  • The correction may be based on the sex of the subject 11: if the subject 11 is male, the load amount may be made smaller, and if female, larger. Alternatively, the smaller the height and weight of the subject 11, the smaller the load amount, and the larger the height and weight, the larger the load amount.
  • Further, the larger the muscle mass composition ratio of the subject 11, the smaller the load amount may be made, and the smaller the muscle mass composition ratio, the larger the load amount.
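The corrections described above can be sketched as simple scaling of the posture-derived load. All factors below are illustrative assumptions; the disclosure does not specify numeric correction values.

```python
def corrected_load(base_load, sex=None, muscle_ratio=None):
    """Scale the posture-derived load amount using personal information.
    Correction factors are hypothetical, for illustration only."""
    load = base_load
    if sex == "male":
        load *= 0.9   # smaller load for male subjects (assumed factor)
    elif sex == "female":
        load *= 1.1   # larger load for female subjects (assumed factor)
    if muscle_ratio is not None:
        # Larger muscle composition ratio -> smaller load, and vice versa.
        load *= 1.0 / (0.5 + muscle_ratio)
    return load

adjusted = corrected_load(1.0, sex="male", muscle_ratio=0.5)
```

In a real system these factors would come from the measured-data database or from statistical analysis of subject populations, not from fixed constants.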
  • the duration of the stationary posture of the subject 11 is measured based on the time acquired from the timing device.
  • The fatigue estimation unit 108 adds the load amount calculated above each time the duration passes a unit of time, and estimates the degree of fatigue of the subject 11 at that point. These processes continue until the target person 11 leaves the stationary state. Specifically, whether or not the stationary state has been canceled is determined based on whether or not the posture estimated by the posture estimation unit 105 has changed from the static posture.
  • the fatigue estimation unit 108 estimates the fatigue level of the subject 11 using a fatigue level increasing function having a gradient corresponding to the calculated load amount with respect to the duration. Therefore, the greater the calculated load amount, the greater the degree of fatigue of the subject 11 that increases per unit time.
  • the fatigue level of the subject 11 is initialized (set to 0) at the start timing of the stationary posture, which is the starting point.
  • the posture estimation unit 105 calculates the amount of change in posture from the original stationary state posture to the current posture.
  • the amount of change in posture is calculated for each individual muscle and/or joint, similar to the amount of load described above.
  • When the posture changes in this way, the load on at least one of the muscles and joints changes, and in terms of blood flow, the blood flow that had deteriorated is temporarily relieved, so the fatigue level of the subject 11 turns to recovery. The degree of fatigue reduced by recovery is related to the amount of change in posture.
  • the second calculation unit 107 calculates a recovery amount, which is the degree of recovery from fatigue, based on the amount of change in posture.
  • In addition, the change time, which is the time during which the change in posture of the subject 11 continues, is measured.
  • the relationship between the recovery amount and the change time is the same as the relationship between the load amount and the duration time, and the recovery amount of the subject 11 is integrated as long as the posture change continues.
  • The fatigue estimation unit 108 estimates the fatigue level of the subject 11 by subtracting the recovery amount each time a unit of time passes while the posture of the subject 11 is changing.
  • The process of calculating the amount of recovery, measuring the change time, and estimating the degree of fatigue of the subject 11 continues until the posture of the subject 11 becomes stationary again. Specifically, it is determined whether or not the posture estimated by the posture estimation unit 105 is a static posture. While the target person 11 is not detected to be stationary, the amount of recovery is calculated, the change time is measured, and the amount of recovery is subtracted, so that the fatigue level of the target person 11 continues to recover as long as the change in posture continues.
  • The fatigue estimation unit 108 estimates the fatigue level of the subject 11 using a fatigue level decreasing function whose slope with respect to the change time corresponds to the calculated recovery amount. Since the recovery amount depends on the amount of change in posture, the greater the amount of change in posture, the more the fatigue level of the subject 11 decreases per unit time.
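The two phases, accumulation during a static posture and recovery during a posture change, can be combined in one minimal sketch. The slopes and interval lengths are hypothetical; flooring the fatigue level at zero is an added assumption, since the disclosure does not state how far recovery can proceed.

```python
def estimate_fatigue(intervals):
    """Each interval is (mode, slope, units): 'static' adds the load
    amount per unit time (increasing function), 'changing' subtracts the
    recovery amount per unit time (decreasing function, floored at 0)."""
    fatigue = 0.0
    for mode, slope, units in intervals:
        for _ in range(units):
            if mode == "static":
                fatigue += slope
            else:  # posture is changing: fatigue turns to recovery
                fatigue = max(0.0, fatigue - slope)
    return fatigue

# Hold a posture with load 0.5 for 40 units, then change posture with
# recovery 0.8 for 10 units: 0.5 * 40 - 0.8 * 10 = 12.0 (approximately).
level = estimate_fatigue([("static", 0.5, 40), ("changing", 0.8, 10)])
```

Appending a further static interval with a different slope would model standing still in a new posture, as in the posture A / posture B example that follows.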
  • When the subject 11 becomes stationary again, the posture and fatigue level are estimated anew for the new static posture.
  • In the fatigue estimation system 200, since the fatigue level of the subject 11 is calculated from images in consideration of the duration of the stationary posture, the burden on the subject 11 is small, and the degree of fatigue of the subject 11 can be estimated more accurately.
  • FIG. 4A is a diagram showing a subject standing still in Posture A.
  • FIG. 4B is a diagram showing the subject standing still in posture B.
  • a subject 11 shown in FIGS. 4A and 4B is in a stationary posture in a seated position on a chair 12, similar to that shown in FIG. 1A.
  • In FIGS. 4A and 4B, tables, PCs, and the like are actually present but are not shown; only the subject 11 and the chair 12 are depicted here.
  • the stationary posture of the subject 11 shown in FIG. 4A is posture A in which the load on the shoulders is relatively large.
  • the stationary posture of the subject 11 shown in FIG. 4B is posture B in which the load on the shoulders is relatively small.
  • FIG. 5A is a first diagram illustrating accumulation of estimated fatigue level of a subject according to the embodiment. Also, FIG. 5B is a second diagram for explaining the accumulation of the subject's fatigue level estimated according to the embodiment.
  • Posture A is a posture with a larger load than posture B. Therefore, for certain muscles of the subject 11 (here, muscles related to shoulder movement), the load amount of posture A (the slope of the straight line for posture A) is greater than the load amount of posture B (the slope of the straight line for posture B). Consequently, when standing still in posture A, the subject 11 accumulates a larger fatigue level in a shorter time than when standing still in posture B.
  • While the subject 11 is stationary in posture A, the fatigue level of the subject 11 is estimated as accumulating (increasing) according to an increasing function with a positive slope, and the accumulation turns to recovery (decrease) at the change point where the subject 11 begins to change posture.
  • While the posture is changing, the fatigue level of the subject 11 follows a decreasing function with a negative slope corresponding to the amount of change in posture, and recovers (decreases) by the amount shown as the width of change.
  • When the subject 11 becomes stationary in posture B, the fatigue level of the subject 11 is estimated as accumulating (increasing) according to an increasing function with a positive slope corresponding to the load amount of posture B.
  • the degree of fatigue of the subject 11 is estimated in which accumulation and recovery are reflected depending on whether the posture of the subject 11 is stationary or changed.
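The accumulation-and-recovery behavior described above (fatigue increasing with a slope given by the load amount while the posture is stationary, and decreasing with a slope given by the recovery amount while the posture is changing) can be sketched as a simple time integration. This is an illustrative sketch only, not the patented implementation; the function name, event format, and all rate values are hypothetical.

```python
def estimate_fatigue(events):
    """Integrate fatigue over a sequence of (state, rate, duration) events.

    'static'   -> fatigue increases by rate * duration (rate = load amount)
    'changing' -> fatigue decreases by rate * duration (rate = recovery amount)
    Fatigue is clamped at zero here as an illustrative assumption.
    """
    fatigue = 0.0
    for state, rate, duration in events:
        if state == "static":
            fatigue += rate * duration   # increasing function, slope = load amount
        else:  # "changing"
            fatigue -= rate * duration   # decreasing function, slope = recovery amount
            fatigue = max(fatigue, 0.0)
    return fatigue

# Posture A (high load) for 10 time units, a posture change for 2 units,
# then posture B (lower load) for 10 units:
timeline = [("static", 3.0, 10), ("changing", 2.5, 2), ("static", 1.0, 10)]
print(estimate_fatigue(timeline))  # 3.0*10 - 2.5*2 + 1.0*10 = 35.0
```

The piecewise-linear shape mirrors FIGS. 5A and 5B: the slope while static depends on the posture's load amount, and the slope while changing depends on the recovery amount.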
  • In step S103, when the fatigue estimation apparatus 100 determines that the two numbers do not match (No in S103), that is, when the persons to be detected include a person whose posture cannot be estimated from one image, such a person is identified (S104). Then, the posture estimation unit 105 selects and acquires a plurality of images that can be used for estimating the posture of the identified person (hereinafter, a subject 11 whose fatigue level is estimated through synthesis of joint positions) (S105). If the posture of the subject 11 cannot be estimated even using a plurality of images (No in S106), the process of estimating the fatigue level of the subject 11 ends. At this time, for example, the notification information described as a function of the output unit 109 is output.
  • FIG. 6 is a diagram illustrating an example of synthesizing joint positions according to the embodiment.
  • FIG. 6 shows information generated during estimation of the posture (right end) of the subject 11 based on the acquired images (left end).
  • posture estimation section 105 acquires a plurality of images (one image 90a and another image 90b here) in step S105.
  • the one image 90a and the other image 90b are images obtained by imaging one body part and another body part, which are part of the body part of the subject 11, respectively.
  • One body part and another body part are body parts that share at least one joint with each other. Therefore, when no combination of images sharing at least one joint can be identified among the plurality of images, the result of step S106 is No, and the process ends.
  • the posture estimation unit 105 estimates the joint positions of the body parts of the subject 11 that are partially reflected in each of the plurality of images (S107).
  • a joint position 11c in one body part reflected in one image 90a is estimated, and first information 90c including this is generated.
  • a joint position 11d of another body part reflected in another image 90b is estimated, and second information 90d including this is generated.
  • The joint position 11c of the one body part and the joint position 11d of the other body part are then combined to estimate the joint positions 11a of the body parts of the subject 11 that include the one body part and the other body part.
  • the posture estimation unit 105 can estimate the posture of the subject 11 using a plurality of images in which a part of the body part of the subject 11 is captured.
  • Synthesis of the joint positions of the subject 11 requires two or more common coordinates between the joint position 11c in at least one body part and the joint position 11d in another body part.
  • One of the common coordinates is the joint position of a shared joint, which is one joint shared between one body part and another body part.
  • That is, in a situation in which two or more joints are shared between one body part and another body part, the posture estimation unit 105 adjusts the joint position 11c in the one body part and the joint position 11d in the other body part so that the direction and magnitude of the vector extending from one shared joint to the other shared joint match between the two.
  • Then, the posture estimation unit 105 superimposes the joint position 11c of the one body part and the joint position 11d of the other body part with the shared joint as a reference while their posture (orientation) and scale match, thereby synthesizing them and estimating the joint positions 11a of the body parts of the subject 11.
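The alignment just described can be sketched in a few lines: given two 2D joint sets that share two joints, a rotation and a scale factor are computed from the vector between the shared joints, and the second set is mapped into the first set's frame with one shared joint as the reference point. This is an illustrative sketch under simplifying assumptions (2D coordinates, exactly two shared joints); the joint names and coordinates are hypothetical, not part of the patent.

```python
import math

def align_and_merge(joints_a, joints_b, shared):
    """Merge two 2D joint dicts {name: (x, y)} that share the two joints
    named in `shared`. joints_b is rotated, scaled, and translated so that
    the vector between its shared joints matches joints_a, then the two
    dicts are superimposed with the first shared joint as reference."""
    j1, j2 = shared
    ax = joints_a[j2][0] - joints_a[j1][0]
    ay = joints_a[j2][1] - joints_a[j1][1]
    bx = joints_b[j2][0] - joints_b[j1][0]
    by = joints_b[j2][1] - joints_b[j1][1]
    # rotation and scale that map the shared-joint vector of b onto that of a
    ang = math.atan2(ay, ax) - math.atan2(by, bx)
    scale = math.hypot(ax, ay) / math.hypot(bx, by)
    cos_a, sin_a = math.cos(ang), math.sin(ang)
    ox, oy = joints_b[j1]          # shared joint used as the reference point
    tx, ty = joints_a[j1]
    merged = dict(joints_a)        # keep joints_a's coordinates for shared joints
    for name, (x, y) in joints_b.items():
        dx, dy = x - ox, y - oy
        merged.setdefault(name, (tx + scale * (cos_a * dx - sin_a * dy),
                                 ty + scale * (sin_a * dx + cos_a * dy)))
    return merged

# Two fragments sharing the neck and shoulder joints, seen at different
# orientations and scales by two cameras:
upper = {"neck": (0.0, 0.0), "shoulder": (1.0, 0.0), "elbow": (2.0, 0.5)}
lower = {"neck": (10.0, 10.0), "shoulder": (10.0, 12.0), "hip": (12.0, 10.0)}
body = align_and_merge(upper, lower, ("neck", "shoulder"))
```

After merging, `body` contains the elbow from the first fragment and the hip from the second, all expressed in the first fragment's coordinate frame.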
  • Alternatively, coordinates that appear in both the one image 90a and the other image 90b may be used as the reference coordinates.
  • For example, both the one image 90a and the other image 90b show a common object marker M.
  • The object marker M may be a dedicated object, or any object that exists in the space and can be identified as the same object by the multiple imaging devices 201 may be used. If the spatial coordinates of the object marker M are used as the reference coordinates, the posture (orientation) and scale can be matched, and the shared joint can be used as a reference for superimposing and combining.
  • the posture estimation unit 105 may calculate the spatial coordinates Mc of the object marker M from one image 90a, and the spatial coordinates Md of the object marker M from the other image 90b.
  • a line marker L may also be used as another example of the marker.
  • the line marker L is realized by attaching a tape or the like to any location in the space.
  • Since both ends of the line marker L, which extends between those two ends, can serve as common coordinates, the same effect as above can be obtained even if a shared joint does not exist.
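A marker shared between the two images can also anchor the combination in a simpler way: expressing every joint relative to the marker position puts both joint sets in a common frame. The sketch below assumes, purely for illustration, that the two cameras already agree in orientation and scale so that only a translation is needed; a real system would also correct orientation and scale as described above. All names and coordinates are hypothetical.

```python
def to_marker_frame(joints, marker):
    """Express 2D joint coordinates {name: (x, y)} relative to a marker
    position, so joints detected in different images share a common origin."""
    mx, my = marker
    return {name: (x - mx, y - my) for name, (x, y) in joints.items()}

# Each image reports the marker M at a different pixel position:
arm = to_marker_frame({"elbow": (5.0, 4.0)}, marker=(3.0, 3.0))
leg = to_marker_frame({"knee": (12.0, 9.0)}, marker=(10.0, 10.0))
merged = {**arm, **leg}   # {"elbow": (2.0, 1.0), "knee": (2.0, -1.0)}
```

With a line marker, the two endpoints give two common coordinates, which additionally fixes orientation and scale the same way two shared joints would.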
  • the fatigue estimation system 200 can distinguish the target person 11 from other persons staying in the same space as the target person 11 based on the positional relationship between the marker and the target person 11 .
  • As described above, processing such as superimposing the joint positions of individual body parts of the subject 11 is performed on a plurality of images. Therefore, when another person (for example, the other person 11z in FIG. 6) is staying in the same space as the subject 11, if the joint positions of one body part of the subject 11 were superimposed on the joint positions of a body part of the other person, the posture of the subject 11 could not be estimated correctly.
  • The identification unit 102 is therefore provided to identify the subject 11 and to reliably superimpose the joint positions of one part of the subject 11 on the joint positions of another part of the same subject 11. To this end, the identification unit 102 detects the relative position of the subject 11 with respect to the marker in the image acquired from the imaging device 201 (for example, the subject 11 is located at a certain distance in a certain direction from the marker), and recognizes that a person located at a different distance or in a different direction is the other person 11z. Since this process only requires spatial resolution sufficient to distinguish one person from another, its processing load is lower than that of estimating each joint position, and it can be easily implemented.
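The marker-relative identification can be sketched as a simple offset comparison: among the detected person positions, the one whose offset from the marker matches the subject's known relative position is treated as the subject 11, and the rest as other persons. The coordinates, tolerance, and function name are hypothetical illustrations, not the patented implementation.

```python
def identify_subject(people, marker, expected_offset, tol=0.5):
    """Pick, from detected person positions {pid: (x, y)}, the one whose
    offset from the marker matches the subject's expected relative position
    within a tolerance. Returns the person id, or None if no one matches."""
    mx, my = marker
    ex, ey = expected_offset
    for pid, (x, y) in people.items():
        if abs((x - mx) - ex) <= tol and abs((y - my) - ey) <= tol:
            return pid
    return None

# Subject is expected 1 unit to the right of the marker; person "B" is farther away:
people = {"A": (4.0, 2.0), "B": (8.0, 2.0)}
subject = identify_subject(people, marker=(3.0, 2.0), expected_offset=(1.0, 0.0))
print(subject)  # A
```

As the description notes, this only needs person-level spatial resolution, so it is far cheaper than estimating every joint position.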
  • In this way, even images each showing only a part of the body of the subject 11 can be used for estimating the fatigue level: by combining a plurality of such images so that they complement each other, the posture of the subject 11 can be estimated.
  • However, when the joint positions of body parts consisting of one part and another part of the body of the subject 11 are estimated as the posture of the subject 11, posture estimation may not be possible for parts included in neither. For example, the posture of an invisible part that is not captured in any of the plurality of images cannot be estimated by the above processing.
  • Therefore, the posture estimation unit 105 may be configured to estimate, as the posture of the subject 11, the joint positions of the body parts of the subject 11 including invisible parts that are not captured in the plurality of images.
  • As described above, the posture estimation unit 105 estimates the posture of the subject 11 through processing such as synthesis using a plurality of images in which parts of the body of the subject 11 are captured (S108). Then, in step S109, the fatigue estimation unit 108 estimates the fatigue level of the subject 11 as described above.
  • FIG. 7 is a diagram showing a display example of estimation results according to the embodiment.
  • the result of estimating the degree of fatigue of the subject 11 can be displayed using the display device 205 and fed back.
  • a display device 205 integrally displays a doll simulating the subject 11 and fatigue levels of the subject 11 in the shoulders, back, and waist.
  • the fatigue level of the shoulder is indicated as "stiff shoulder”
  • the fatigue of the back is indicated as “back pain”
  • the fatigue of the lower back is indicated as "low back pain”.
  • Here, the fatigue levels of the subject 11 at three locations are displayed at once, and the fatigue levels at these three locations are estimated from images captured at one time. That is, the estimation device 100 estimates, from one posture of the subject 11, the fatigue level of the muscles and/or joints in each of a plurality of body parts including the first part (e.g., shoulder), the second part (e.g., back), and the third part (e.g., waist) of the subject 11. Therefore, even if the posture of the subject 11 is constant, the fatigue level accumulated in the muscles and/or joints, which differs for each body part, can be estimated.
  • the load amount is calculated for each muscle and/or joint of the subject 11. Therefore, if there is no processing resource limitation, each muscle and/or joint fatigue can be estimated. Therefore, there is no limit to the number of body parts whose fatigue levels are estimated from images captured at one time, and the number may be one, two, or four or more.
  • With this, the estimation device 100 calculates the load amount for each of a plurality of body parts, and can estimate, from one posture of the subject 11, the fatigue level of the first part (the above degree of stiff shoulders) from the load amount calculated for the first part, the fatigue level of the second part (the above degree of back pain) from the load amount calculated for the second part, and the fatigue level of the third part (the above degree of low back pain) from the load amount calculated for the third part.
  • the degree of stiff shoulders is estimated from the amount of load on the trapezius muscle
  • the degree of back pain is estimated from the degree of fatigue of the latissimus dorsi muscle
  • the degree of low back pain is estimated from the amount of load on the lumbar paraspinal muscles.
  • One fatigue level may be estimated from the load of one muscle and/or joint, or from the combined loads of a plurality of muscles and/or joints.
  • For example, the degree of stiff shoulders (that is, one fatigue level of the shoulder) may be estimated from the average value of the loads of the trapezius muscle, the levator scapulae muscle, the rhomboid muscle, and the deltoid muscle.
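The averaging of several muscle loads into one per-part fatigue level can be sketched as follows. The muscle-to-part mapping uses the muscles named in the description for illustration; the load values, dictionary layout, and function name are hypothetical.

```python
# Hypothetical mapping from body part to the muscles whose loads contribute
# to that part's fatigue level (muscle names taken from the description).
PART_MUSCLES = {
    "shoulder": ["trapezius", "levator_scapulae", "rhomboid", "deltoid"],
    "back": ["latissimus_dorsi"],
    "waist": ["lumbar_paraspinal"],
}

def part_fatigue(loads, part):
    """One fatigue level per body part: the average of that part's muscle loads."""
    muscles = PART_MUSCLES[part]
    return sum(loads[m] for m in muscles) / len(muscles)

loads = {"trapezius": 40.0, "levator_scapulae": 30.0,
         "rhomboid": 20.0, "deltoid": 10.0,
         "latissimus_dorsi": 35.0, "lumbar_paraspinal": 50.0}
print(part_fatigue(loads, "shoulder"))  # (40 + 30 + 20 + 10) / 4 = 25.0
```

The same pattern extends to any number of parts, matching the statement that the number of body parts estimated at once is not limited to three.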
  • the degree of fatigue estimated in this way may be indicated as a relative position on a reference meter with a minimum value of 0 and a maximum value of 100 as shown.
  • a reference value is provided at a predetermined position on the reference meter.
  • Such a reference value is set at (or around) the relative position of the fatigue level at which subjective symptoms such as pain may occur in a typical subject 11, quantified in advance by an epidemiological survey or the like. Accordingly, a different reference value may be set for the fatigue level of each body part.
  • the display device 205 may display a warning to the target person 11 as an estimation result when the estimated fatigue level of the target person 11 reaches the reference value.
  • the reference value here is an example of the first threshold.
  • the display device 205 may display a specific coping method such as "Recommend taking a break" as shown in the drawing.
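The reference-meter feedback described above can be sketched as a clamp-and-compare: the fatigue level is kept within the 0–100 meter, and a warning with a coping suggestion is returned once it reaches the reference value (the first threshold). The threshold value of 70 and the function name are hypothetical; the warning text is the example given in the description.

```python
def feedback(fatigue, first_threshold=70.0):
    """Clamp a fatigue level to the 0-100 reference meter and return a
    warning message once it reaches the reference value (first threshold)."""
    level = max(0.0, min(100.0, fatigue))
    if level >= first_threshold:
        return level, "Recommend taking a break"
    return level, None

print(feedback(85.0))   # (85.0, 'Recommend taking a break')
print(feedback(30.0))   # (30.0, None)
print(feedback(120.0))  # (100.0, 'Recommend taking a break')
```

Since different body parts may have different reference values, a real system would pass a per-part threshold rather than a single default.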
  • A configuration in which the fatigue estimation system 200 actively recovers the fatigue level of the subject 11 is also conceivable. Specifically, the recovery device 206 shown in FIG. 2 operates to recover the fatigue level of the subject 11.
  • the specific configuration of the recovery device 206 is as described above, so a description thereof will be omitted.
  • the reference value here is an example of the third threshold, and may be the same as or different from either the first threshold or the second threshold.
  • As described above, the fatigue estimation system 200 in the present embodiment includes: a plurality of imaging devices 201, each outputting an image in which a different part of the body of the subject 11 is captured; a posture estimation device (posture estimation unit 105) that estimates the posture of the subject 11 based on the plurality of images output by the plurality of imaging devices 201; and a fatigue estimation device 100 that estimates and outputs the fatigue level of the subject 11 based on the result of estimating the posture of the subject 11.
  • With this, even when an image showing the whole body of the subject 11 cannot be obtained in one image due to conditions such as the arrangement position of the imaging devices 201, the posture of the subject 11 can be appropriately estimated by using a combination of different images. The fatigue level of the subject 11 can then be estimated more appropriately based on the posture appropriately estimated by the posture estimation unit 105.
  • The posture estimation unit 105 may estimate the joint position 11c in one body part based on the one image 90a, among the plurality of images, in which the one body part is captured; estimate the joint position 11d in another body part based on the other image 90b in which the other body part, sharing at least one joint with the one body part, is captured; and combine the estimated joint position 11c in the one body part with the estimated joint position 11d in the other body part to estimate, as the posture of the subject 11, the joint positions 11a of the body parts including the one body part and the other body part.
  • With this, the joint positions 11a of the body parts can be estimated using, among the plurality of images, the one image 90a in which one body part is captured and the other image 90b in which another body part is captured. Specifically, the joint position 11c in the one body part is estimated from the one image 90a, the joint position 11d in the other body part is estimated from the other image 90b, and these are combined to obtain the joint positions of the body parts including the one body part and the other body part.
  • Further, based on the relative positions of a shared joint, which is a joint shared between one body part and another body part, and reference coordinates shared in the plurality of images, at least one of the relative posture and the relative scale between the joint position 11c in the one body part and the joint position 11d in the other body part may be determined, and the shared joint and the reference coordinates may be superimposed with the determined relative posture and/or relative scale applied between the joint position 11c in the one body part and the joint position 11d in the other body part.
  • With this, the joint position 11c in one body part obtained from the one image 90a and the joint position 11d in another body part obtained from the other image 90b can be synthesized after aligning the relative posture (for example, by the amount of rotation that eliminates the difference in relative orientation) and the relative scale (for example, by the scaling ratio that cancels the relative scale difference).
  • The reference coordinates may be the joint position of a joint different from the shared joint.
  • With this, by using the joint position of a joint different from the shared joint as the reference coordinates and superimposing based on the relative position between the shared joint and the reference coordinates, the joint position 11c in one body part from the one image 90a and the joint position 11d in another body part from the other image 90b can be synthesized.
  • the reference coordinates may be spatial coordinates of a marker placed at a position captured in any of the plurality of images.
  • With this, by using the spatial coordinates of the marker as the reference coordinates, the joint position 11c in one body part obtained from the one image 90a and the joint position 11d in another body part obtained from the other image 90b can be synthesized.
  • an identification unit 102 may be provided that identifies the target person 11 from another person 11z staying in the same space as the target person 11 based on the positional relationship between the marker and the target person 11.
  • With this, the target person 11 can be identified from other persons 11z staying in the same space, and the posture of only the target person 11 can be estimated. As a result, the fatigue level can be estimated only for the target person 11 even when another person 11z is present.
  • When the posture estimation unit 105 fails to estimate the posture of the target person 11, it outputs an estimation result indicating the failure, and the fatigue estimation system 200 may further include an output unit 109 that, based on that estimation result, outputs notification information for notifying the presence of a target person 11 whose posture estimation has failed.
  • Further, the posture estimation unit 105 may estimate the joint positions of partial body parts of the subject 11 based on the plurality of images, input the estimated joint positions of the partial body parts into a machine learning model trained on whole-body joint positions of the human body, and thereby estimate, as the posture of the subject 11, the joint positions 11a of the body parts of the subject 11 including invisible parts that are not captured in the plurality of images.
  • With this, the joint positions 11a can be estimated even for body parts including invisible parts of the subject 11 that cannot be estimated from any combination of the plurality of images.
  • Further, the posture estimation unit 105 may acquire the number of persons to be detected in the area, calculate the number of persons, among those to be detected, whose postures can be estimated using only one image output from one imaging device 201 that captures the area, and, when the number of persons to be detected differs from the number of persons whose postures can be estimated, identify the subject 11 among the persons whose postures cannot be estimated and estimate the posture of the subject 11.
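The count comparison that triggers multi-image synthesis (the check corresponding to step S103) can be sketched as follows; the function name and data shapes are hypothetical.

```python
def needs_multi_image(num_detected, single_image_poses):
    """Compare the number of persons to detect in the area with the number
    whose posture was estimated from one image. A positive result is the
    number of persons who need multi-image joint-position synthesis."""
    return num_detected - len(single_image_poses)

# 3 persons in the area, but single-image estimation succeeded for only 2,
# so 1 person's posture must be estimated by combining images:
print(needs_multi_image(3, ["person1", "person2"]))  # 1
print(needs_multi_image(2, ["person1", "person2"]))  # 0
```

When the result is zero the counts match (Yes in S103) and no synthesis is needed; when positive, the remaining persons are identified and handled from step S104 onward.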
  • the posture estimation device is posture estimation section 105 described above.
  • With this, even when an image showing the entire body of the subject 11 cannot be obtained in one image due to conditions such as the arrangement position of the imaging devices 201, and it is therefore difficult to estimate the posture of the subject 11 from one image, a combination of different images can be used to appropriately estimate the posture of the subject 11.
  • The fatigue estimation method in the present embodiment is a fatigue estimation method executed by the fatigue estimation apparatus 100, in which: the image is acquired from each of a plurality of imaging devices that each output an image in which a different part of the body of the subject 11 is captured; based on the one image 90a, among the plurality of images, in which one body part is captured, and the other image 90b in which another body part sharing at least one joint with the one body part is captured, the joint positions 11a of the body parts including the one body part and the other body part are estimated as the posture of the subject 11; and based on the result of estimating the posture of the subject 11, the fatigue level of the subject 11 is estimated and output to an output device connected to the fatigue estimation apparatus 100.
  • the program in the present embodiment is a program for causing a computer to execute the fatigue estimation method described above.
  • the processing executed by a specific processing unit may be executed by another processing unit.
  • the order of multiple processes may be changed, and multiple processes may be executed in parallel.
  • For example, the fatigue estimation system or posture estimation device in the present disclosure may be implemented as multiple devices each having a portion of the components, or as a single device having all of the components. Also, some functions of one component may be implemented as functions of another component, and each function may be distributed among the components in any way.
  • the present disclosure includes any form having a configuration that includes substantially all of the functions that can realize the fatigue estimation system or the posture estimation device of the present disclosure.
  • each component may be realized by executing a software program suitable for each component.
  • Each component may be realized by reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory by a program execution unit such as a CPU or processor.
  • each component may be realized by hardware.
  • each component may be a circuit (or integrated circuit). These circuits may form one circuit as a whole, or may be separate circuits. These circuits may be general-purpose circuits or dedicated circuits.
  • In the above embodiment, the posture of the subject is estimated from the image using a rigid link model generated by image recognition, the load amount is calculated from the estimated posture of the subject, and the fatigue level is estimated based on the load amount and the duration time; however, the method of estimating the fatigue level is not limited to this. Any existing method may be used to estimate the posture of the subject from the image, and any existing method may be used to estimate the load amount from the posture of the subject.
  • In the above embodiment, the increasing function and the decreasing function are described as linear functions, but the present disclosure is not limited to this.
  • the increasing function may be a curvilinear function as long as the fatigue level increases with time.
  • the decreasing function may be a curvilinear function as long as it is a function that decreases the degree of fatigue over time.
  • the estimation device described above uses the load on muscles, the load on joints, and the blood flow estimated from the posture of the subject to estimate the degree of fatigue of the subject. As described above, it is also possible to correct the estimated value using the value measured using the measuring device to achieve more accurate estimation of the degree of fatigue. Specifically, the estimating device acquires a measured value corresponding to the estimated value, which is a measured value based on the measurement result of measuring the subject by the measuring device.
  • The measurement device is, for example, an electromyograph, a muscle hardness meter, a pressure gauge, or a near-infrared spectrometer, and can obtain measured values regarding the load amount on muscles, the load amount on joints, and the blood flow amount.
  • For example, an electromyograph can estimate the muscle movement corresponding to the electric potential it measures.
  • A value obtained by estimating the muscle movement can thus be acquired as a measured value. Since this value can be converted into the load amount on the muscle, the estimated value of the load amount on the muscle can be corrected with the measured value.
  • The correction here means, for example, taking the average of the estimated value and the measured value, selecting one of the estimated value and the measured value, or applying the estimated value to a correlation function between the estimated value and the measured value.
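The three correction strategies just listed can be sketched in one small function. The linear form of the correlation function, the parameter values, and the function name are hypothetical illustrations; the description does not specify the form of the correlation function.

```python
def correct(estimated, measured, method="average", corr=None):
    """Correct an estimated load value with a sensor measurement using one of
    the three strategies named in the description: averaging the two values,
    selecting one of them, or applying a correlation function fitted
    beforehand (assumed linear here: corrected = a * estimated + b)."""
    if method == "average":
        return (estimated + measured) / 2.0
    if method == "select":
        return measured  # e.g. prefer the measurement when it is reliable
    if method == "correlation":
        a, b = corr
        return a * estimated + b
    raise ValueError(method)

print(correct(40.0, 50.0))                                 # 45.0
print(correct(40.0, 50.0, "select"))                       # 50.0
print(correct(40.0, 50.0, "correlation", corr=(2.0, 1.0))) # 81.0
```

The "select" branch mirrors the later remark that the measured blood flow may be used when the estimated blood flow has low reliability.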
  • a muscle hardness meter can estimate muscle hardness from the stress when pressure is applied to the muscle. Since the estimated muscle hardness value can be converted into the amount of load on the muscle, it can be used to correct the estimated value in the same manner as described above.
  • The pressure gauge can obtain a measured value of the pressure applied to a body part of the subject. Such pressure parameters can be input into the musculoskeletal model. By inputting additional parameters such as pressure, the estimation accuracy of the musculoskeletal model improves, and the estimated value obtained using the musculoskeletal model can be corrected with higher accuracy.
  • a near-infrared spectrometer can obtain spectroscopic measurement values of the subject's blood flow.
  • The estimated value may be corrected by combining it with the blood flow rate measured by the near-infrared spectrometer.
  • the measured blood flow may be used when the estimated blood flow has low reliability.
  • the fatigue estimation system described in the above embodiment may be used to configure a fatigue factor identification system that identifies the subject's fatigue factors.
  • Conventional devices or systems that estimate the degree of fatigue as a "degree of stiff shoulders" or "degree of low back pain" had difficulty identifying the muscles and joints in which fatigue accumulates (that is, the factor parts) and the posture causing the fatigue. By using the fatigue estimation system according to the present disclosure, this problem can be addressed.
  • body parts where fatigue is likely to accumulate are identified as fatigue factor parts in the static posture taken by the subject.
  • The fatigue factor identification system may simply identify the fatigue factor part in one static posture taken by the subject, or may identify, among a plurality of static postures taken by the subject, the fatigue factor posture in which the estimated fatigue amount at the fatigue factor part is largest.
  • a recommended posture that replaces the specified fatigue-causing posture may be presented, and a fatigue degree recovery operation using a recovery device may be performed on the fatigue-causing portion in the fatigue-causing posture.
  • the fatigue factor identification system includes the fatigue estimation system described in the above embodiment and a storage device for storing information on the estimated degree of fatigue.
  • The storage device may be implemented using, for example, a semiconductor memory, or a storage unit of a device constituting the fatigue estimation system may be used.
  • the present disclosure may be implemented as a fatigue estimation method executed by a fatigue estimation system or an estimation device.
  • Further, the present disclosure may be implemented as a program for causing a computer to execute such a fatigue estimation method, or as a computer-readable non-transitory recording medium on which such a program is recorded.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Hospice & Palliative Care (AREA)
  • Pathology (AREA)
  • Developmental Disabilities (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Physics & Mathematics (AREA)
  • Child & Adolescent Psychology (AREA)
  • Biophysics (AREA)
  • Educational Technology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A fatigue estimation system (200) comprises: a plurality of imaging devices (201) that individually output images in which different parts of the body of a subject (11) are imaged; a posture estimation device (posture estimation unit (105)) that estimates the posture of the subject (11) on the basis of a plurality of images separately outputted by the plurality of imaging devices (201); and a fatigue estimation device (100) that estimates and outputs a level of fatigue of the subject (11) on the basis of the estimation result of the posture of the subject (11).

Description

Fatigue estimation system, fatigue estimation method, posture estimation device, and program
The present disclosure relates to a fatigue estimation system for estimating the degree of fatigue of a subject, a posture estimation device used in the estimation system, a fatigue estimation method, and a program.
In recent years, there have been cases where accumulated fatigue leads to poor physical condition, injuries, and accidents. In response, attention has turned to technologies that prevent such outcomes by estimating the degree of fatigue. For example, as a fatigue estimation system, Patent Document 1 discloses a fatigue determination device that determines the presence and type of fatigue based on force measurement and bioelectrical impedance measurement.
JP 2017-023311 A
However, with conventional fatigue determination devices such as that of Patent Document 1, the estimated posture may be inappropriate, making fatigue estimation impossible. The present disclosure therefore provides a fatigue estimation system and the like that estimate the posture more appropriately and then estimate the degree of fatigue.
A fatigue estimation system according to one aspect of the present disclosure includes: a plurality of imaging devices that each output an image capturing a different part of the body of a subject; a posture estimation device that estimates the posture of the subject on the basis of the plurality of images output by the plurality of imaging devices; and a fatigue estimation device that estimates and outputs the degree of fatigue of the subject on the basis of the result of estimating the posture of the subject.
A posture estimation device according to one aspect of the present disclosure is the posture estimation device described above.
A fatigue estimation method according to one aspect of the present disclosure is a fatigue estimation method executed by a fatigue estimation device. The method includes: acquiring the images from each of a plurality of imaging devices that each output an image capturing a different part of the body of a subject; estimating, as the posture of the subject, joint positions of body parts including one body part and another body part that shares at least one joint with the one body part, on the basis of one image, among the plurality of images, in which the one body part is captured and another image, among the plurality of images, in which the other body part is captured; and estimating the degree of fatigue of the subject on the basis of the result of estimating the posture of the subject and causing an output device connected to the fatigue estimation device to output the estimated degree of fatigue.
A program according to one aspect of the present disclosure is a program for causing a computer to execute the fatigue estimation method described above.
A fatigue estimation system and the like according to one aspect of the present disclosure can estimate the posture more appropriately and thereby estimate the degree of fatigue.
FIG. 1A is a first diagram for explaining estimation of the degree of fatigue according to the embodiment.
FIG. 1B is a second diagram for explaining estimation of the degree of fatigue according to the embodiment.
FIG. 1C is a third diagram for explaining estimation of the degree of fatigue according to the embodiment.
FIG. 2 is a block diagram showing the functional configuration of the fatigue estimation system according to the embodiment.
FIG. 3 is a flowchart showing a method of estimating the degree of fatigue according to the embodiment.
FIG. 4A is a diagram showing a subject standing still in posture A.
FIG. 4B is a diagram showing a subject standing still in posture B.
FIG. 5A is a first diagram illustrating accumulation of the estimated degree of fatigue of the subject according to the embodiment.
FIG. 5B is a second diagram illustrating accumulation of the estimated degree of fatigue of the subject according to the embodiment.
FIG. 6 is a diagram illustrating an example of combining joint positions according to the embodiment.
FIG. 7 is a diagram showing a display example of estimation results according to the embodiment.
Hereinafter, embodiments will be specifically described with reference to the drawings. Each of the embodiments described below shows a comprehensive or specific example. The numerical values, shapes, materials, components, arrangement positions and connection forms of the components, steps, order of the steps, and the like shown in the following embodiments are examples and are not intended to limit the present disclosure. Among the components in the following embodiments, components not recited in the independent claims are described as optional components.
Each figure is a schematic diagram and is not necessarily drawn precisely. In each figure, substantially identical configurations are denoted by the same reference signs, and overlapping description may be omitted or simplified.
(Embodiment)
[Overview]
In the fatigue estimation system 200 (see FIG. 2 described later) according to the present embodiment, an image of the subject 11 (see FIG. 1A described later), whose degree of fatigue is to be estimated, is captured, the posture of the subject 11 is estimated from the image, and the degree of fatigue of the subject 11 is estimated from the estimated posture. Here, depending on the positional relationship between the subject 11 and the imaging device 201 (see FIG. 1A described later) used to acquire the image, an obstruction or the like may be present between them, and it may be difficult to obtain the image of the subject 11 needed to estimate the posture of the subject 11. As a result, in some cases the degree of fatigue of the subject 11 cannot be estimated.
Therefore, in the present disclosure, a plurality of imaging devices 201, each capable of imaging at least part of the subject 11, are used to compensate for missing portions of the image of the subject 11 needed to estimate the posture of the subject 11. This provides a fatigue estimation system 200 capable of estimating the degree of fatigue of the subject 11 even when the posture of the subject 11 cannot be estimated with a single imaging device 201. Although described in detail later, in the present disclosure, to estimate the fatigue of the subject 11, the load on at least one of the muscles and joints and the deterioration of blood flow are estimated from the estimated posture of the subject 11.
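Although the disclosure does not specify an algorithm at this level of detail, the idea of compensating for missing body parts using images that share at least one joint can be sketched as follows. Everything in this snippet (the function name, the 2-D keypoint representation, and the translation-only alignment on a shared joint) is an illustrative assumption, not the patented method.

```python
# Hypothetical sketch: merge partial 2-D skeletons from two cameras that
# both observe at least one common joint, by translating the second view
# so that the shared joint coincides with the first view.

def merge_skeletons(view_a: dict, view_b: dict) -> dict:
    """Merge two partial skeletons {joint_name: (x, y)}."""
    shared = set(view_a) & set(view_b)
    if not shared:
        raise ValueError("no shared joint; skeletons cannot be merged")
    ref = sorted(shared)[0]              # pick a deterministic shared joint
    ax, ay = view_a[ref]
    bx, by = view_b[ref]
    dx, dy = ax - bx, ay - by            # translation aligning the shared joint
    merged = dict(view_a)
    for name, (x, y) in view_b.items():
        merged.setdefault(name, (x + dx, y + dy))
    return merged

# Camera 1 sees the upper body, camera 2 the lower body; both see the hip.
upper = {"neck": (0.0, 0.0), "hip": (0.0, 2.0)}
lower = {"hip": (5.0, 5.0), "knee": (5.0, 7.0), "ankle": (5.0, 9.0)}
full = merge_skeletons(upper, lower)
print(full["knee"])  # knee expressed in camera-1 coordinates: (0.0, 4.0)
```

In practice a real system would align views with a calibrated camera model rather than a plain translation; the sketch only shows how two partial keypoint sets sharing one joint can yield a single skeleton.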
Accordingly, in the present disclosure, the posture estimation device used for posture estimation can also be used to estimate muscle load, joint load, and the degree of deterioration of blood flow. The posture estimation device of the present disclosure can also be applied as a device for estimating the posture of the subject 11 for any other purpose. That is, even from images captured by imaging devices 201 arranged such that each imaging device 201 captures only part of the body of the subject 11, the posture estimation device of the present disclosure can generate appropriate information on the posture of the subject 11 (two-dimensional skeleton information, three-dimensional skeleton information, and feature quantities such as the angle between the neck and the spine and the angle between the spine and the lower leg), and this information can be applied to any purpose.
[Fatigue estimation system]
The overall configuration of the fatigue estimation system 200 according to the embodiment will be described below. FIG. 1A is a first diagram for explaining estimation of the degree of fatigue according to the embodiment. FIG. 1B is a second diagram for explaining estimation of the degree of fatigue according to the embodiment. FIG. 1C is a third diagram for explaining estimation of the degree of fatigue according to the embodiment.
In the embodiment, the fatigue estimation system 200 of the present disclosure is a system that estimates the degree of fatigue of the subject 11 using an image output by imaging the subject 11 with an imaging device 201. The imaging device 201 may take any form as long as it is a camera that images the subject 11 and outputs an image, and is realized by, for example, a fixed camera installed on a wall surface, ceiling, or the like of a building, as shown in FIG. 1A.
Here, the subject is in a posture seated on a chair 12. The fatigue estimation system 200 of the present disclosure estimates the degree of fatigue of the subject 11 on the basis of the fatigue that, among the fatigue of the subject 11, accumulates when the subject holds a fixed, stationary posture. That is, the system estimates the fatigue accumulated through the load on at least one of the muscles and joints and the deteriorating blood flow (hereinafter also referred to as a decrease in blood flow) caused by the fixed posture. Accordingly, the subject 11 is in a stationary posture, remaining still in a sitting, lying, or standing position for at least a fixed period. The fixed period is the minimum period over which the fatigue estimation system 200 can estimate fatigue, for example several seconds or several tens of seconds. This period is determined depending on the processing capabilities of the imaging devices 201 and the fatigue estimation device 100 (see FIG. 2 described later) that constitute the fatigue estimation system 200.
Examples of the subject 11 who takes such a stationary posture include a desk worker in an office, a driver steering a moving body, a person performing strength training using a load in a stationary posture, a resident of a facility such as a hospital, and a passenger or crew member of an airplane or the like.
The image captured and output by the imaging device 201 is processed by the estimation device 100, and the posture of the subject 11 (for example, joint positions 11a) is estimated as shown in FIG. 1B. The estimated posture of the subject 11 is output, as an example, as a rigid link model. Specifically, as shown in FIG. 1B, skeletal segments indicated by straight lines are connected by joints indicated by black dots, and the posture of the subject 11 can be reproduced from the positional relationship between the two skeletal segments connected by a single joint. The posture is estimated by image recognition and is output as the rigid link model described above on the basis of the positional relationships between the joints.
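A rigid link model yields posture features such as the joint angles mentioned above (for example, the angle between the neck and the spine). As a hypothetical illustration only, the angle at a joint can be computed from the estimated positions of the two segments it connects; the function and coordinates below are assumptions, not part of the disclosure.

```python
import math

def joint_angle(p_prev, p_joint, p_next):
    """Angle (degrees) at p_joint between the segments p_joint->p_prev
    and p_joint->p_next of a rigid link model."""
    v1 = (p_prev[0] - p_joint[0], p_prev[1] - p_joint[1])
    v2 = (p_next[0] - p_joint[0], p_next[1] - p_joint[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Three illustrative keypoints laid out in a right angle.
print(joint_angle((0, 0), (0, 1), (1, 1)))  # ≈ 90.0
```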
By fitting the estimated rigid link model to a musculoskeletal model 11b such as that shown in FIG. 1C, for each body part, that is, for the muscles that pull the skeletal segments toward each other and the joints that connect the skeletal segments so that their positional relationship can change, the load applied to at least one of the muscles and joints of the individual body part in order to maintain the positional relationship corresponding to the estimated posture is calculated as an estimated value. Since this estimated load on at least one of the muscles and joints of each body part accumulates as the duration of the stationary posture grows longer, the degree of fatigue caused by the subject 11 maintaining the stationary posture is calculated from the estimated load and the duration. In the following description, "at least one of the muscles and joints" is also expressed as "the muscles and/or joints".
In the present embodiment, the degree of fatigue can also be estimated on the basis of an estimated value of the blood flow of the subject 11 in addition to the estimated loads on the muscles and/or joints described above. The following description focuses on an example in which the degree of fatigue of the subject 11 is estimated using the estimated loads on the muscles and on the joints, but the estimated blood flow may be combined with these to estimate the degree of fatigue of the subject 11 with higher accuracy. Furthermore, the degree of fatigue of the subject 11 can also be estimated using the estimated value of any one of the load on the muscles, the load on the joints, and the blood flow of the subject 11. In the present disclosure, the estimation of the degree of fatigue of the subject 11 based on the posture of the subject 11 is not limited to the above example, and any existing technique for estimating the degree of fatigue can be applied.
As an example, in the present embodiment, after estimating the posture of the subject 11, the fatigue estimation system 200 estimates at least one of the load on the muscles, the load on the joints, and the blood flow of the subject 11 on the basis of the duration of the posture. The fatigue estimation system 200 then estimates the degree of fatigue of the subject 11 on the basis of at least one of the estimated load on the muscles, the estimated load on the joints, and the estimated blood flow. Hereinafter, for simplicity, the estimated value of the load may be referred to simply as the load or the estimated value. When the estimated values include an estimated blood flow, the load may be read as the blood flow, a large load may be read as a decrease in blood flow, and a small load may be read as an increase in blood flow.
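The relation "estimated load accumulating over the duration of the stationary posture" can be sketched numerically. The linear accumulation model and the load values below are assumptions made purely for illustration; the disclosure does not give a concrete formula at this point.

```python
# Hypothetical fatigue-accumulation sketch: per-body-part load (arbitrary
# units) multiplied by the duration (seconds) of the stationary posture.

def accumulate_fatigue(loads: dict, duration_s: float) -> dict:
    """Return accumulated fatigue per body part for one stationary posture."""
    return {part: load * duration_s for part, load in loads.items()}

loads = {"neck": 0.8, "lower_back": 1.5, "knee": 0.3}  # assumed values
fatigue = accumulate_fatigue(loads, duration_s=600)     # 10 minutes seated
print(fatigue["lower_back"])  # 900.0
```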
As described above, the blood flow is information for quantifying the blood flow that deteriorates as the subject 11 maintains a posture. A lower blood flow means that the blood flow of the subject 11 has deteriorated further, and it can be used as an index of the fatigue caused by the deterioration of blood flow. The blood flow may be acquired as an absolute value at the time of measurement, or as a value relative to the value at a different point in time. For example, the degree of deterioration of the blood flow of the subject 11 can be estimated from the posture of the subject 11 and the relative blood flow values at two points in time, the start and the end of the posture. Furthermore, since there is a correlation between the posture of the subject 11 and the duration of the posture on the one hand and the deterioration of blood flow on the other, the blood flow of the subject may simply be estimated from the posture of the subject 11 and the duration of the posture.
In the following description, at least one of the load on the muscles, the load on the joints, and the blood flow is estimated from the posture of the subject 11 using the musculoskeletal model described above. However, as a method of estimating the load on the muscles, the load on the joints, and the blood flow from the posture, a method using measured data can also be applied instead of the musculoskeletal model. The measured data form a database constructed by accumulating, in association with each posture, the measured values of the load on the muscles, the load on the joints, and the blood flow obtained for that posture. In the fatigue estimation system 200 in this case, by inputting the estimated posture of the subject 11 into the database, the measured values of the load on the muscles, the load on the joints, and the blood flow for the corresponding posture can be obtained as output.
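The database alternative can be sketched as a nearest-match lookup over stored postures. The feature encoding (a tuple of joint angles), the distance metric, and every stored value below are assumptions for illustration only.

```python
# Hypothetical lookup sketch: a posture is encoded as a tuple of joint
# angles (degrees), and the database returns the measured values stored
# for the closest posture by squared Euclidean distance.

DATABASE = {
    # (neck-spine angle, spine-lower-leg angle): measured values (assumed)
    (160.0, 90.0): {"muscle_load": 1.2, "joint_load": 0.9, "blood_flow": 0.7},
    (175.0, 170.0): {"muscle_load": 0.4, "joint_load": 0.3, "blood_flow": 1.0},
}

def lookup(posture):
    """Return the measured values stored for the posture closest to `posture`."""
    key = min(DATABASE,
              key=lambda k: sum((a - b) ** 2 for a, b in zip(k, posture)))
    return DATABASE[key]

print(lookup((158.0, 95.0))["muscle_load"])  # 1.2 (closest to the seated entry)
```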
The measured data may be constructed using measured values for each individual, taking individual differences among subjects 11 into account, or may be constructed by qualifying big data obtained from an unspecified large number of test subjects so as to fit each subject 11 through analysis processing such as statistical analysis or machine learning.
Next, the functional configuration of the fatigue estimation system 200 of the present disclosure will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the functional configuration of the fatigue estimation system according to the embodiment.
As shown in FIG. 2, the fatigue estimation system 200 of the present disclosure includes a fatigue estimation device 100, a plurality of imaging devices 201, a display device 205, and a recovery device 206.
The estimation device 100 includes an acquisition unit 101, an identification unit 102, a posture estimation unit 105, a first calculation unit 106, a second calculation unit 107, a fatigue estimation unit 108, and an output unit 109.
The acquisition unit 101 is a communication module that is connected to each of the plurality of imaging devices 201 and acquires, from each of the plurality of imaging devices 201, an image in which the subject 11 is captured. The connection between the acquisition unit 101 and the imaging devices 201 may be wired or wireless, and the method of communication performed over the connection is not particularly limited.
In addition to the imaging devices 201 described above, the fatigue estimation device 100 may include a communication module that is connected to a timing device and acquires the time from the timing device.
The fatigue estimation device 100 may also include a communication module that is connected to a pressure sensor and acquires a pressure distribution from the pressure sensor.
The fatigue estimation device 100 may also include a communication module that is connected to a reception device and acquires personal information from the reception device.
The identification unit 102 is a processing unit realized by executing a predetermined program using a processor and memory. The identification unit 102 is provided to realize an identification function for distinguishing one subject 11 from other persons. The details of the function of the identification unit 102 will be described later.
The posture estimation unit 105 is a processing unit realized by executing a predetermined program using a processor and memory. Through the processing of the posture estimation unit 105, the posture of the subject 11 is estimated on the basis of the images acquired by the acquisition unit 101, the additionally acquired pressure distribution, and the like. The posture estimation unit 105 is an example of a posture estimation device. That is, the posture estimation device according to the present embodiment is realized as the posture estimation unit 105 built into the fatigue estimation device 100.
The first calculation unit 106 is a processing unit realized by executing a predetermined program using a processor and memory. Through the processing of the first calculation unit 106, the load applied to the individual muscles and/or joints is calculated on the basis of the estimated posture of the subject 11 and the additionally acquired personal information.
The second calculation unit 107 is a processing unit realized by executing a predetermined program using a processor and memory. Through the processing of the second calculation unit 107, the amount of recovery from fatigue in the individual muscles and/or joints is calculated on the basis of the amount of change in the estimated posture of the subject 11.
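The interplay between accumulated load (first calculation unit 106) and recovery driven by posture change (second calculation unit 107) can be sketched as a simple balance. The proportionality constant and the clamping at zero are assumptions for illustration; the disclosure does not specify this formula.

```python
# Hypothetical sketch: recovery proportional to the magnitude of the
# posture change is subtracted from the accumulated fatigue, which is
# assumed never to go below zero.

def net_fatigue(accumulated: float, posture_change: float, k: float = 2.0) -> float:
    """Accumulated fatigue minus a recovery amount k * posture_change."""
    recovery = k * posture_change
    return max(0.0, accumulated - recovery)

print(net_fatigue(100.0, posture_change=10.0))  # 80.0
print(net_fatigue(5.0, posture_change=10.0))    # 0.0 (fatigue is non-negative)
```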
The fatigue estimation unit 108 is a processing unit realized by executing a predetermined program using a processor and memory. Using the posture estimated by the posture estimation unit 105 and the time acquired from a timing device, an internal oscillator, or the like, the fatigue estimation unit 108 estimates the degree of fatigue of the subject 11 on the basis of the duration of the estimated posture.
The output unit 109 is a communication module that is connected to the display device 205 and the recovery device 206 and outputs, to the display device 205 and the recovery device 206, content based on the fatigue estimation result of the fatigue estimation device 100. The connection between the output unit 109 and the display device 205 or the recovery device 206 may be wired or wireless, and the method of communication performed over the connection is not particularly limited.
When the posture estimation unit 105 fails to estimate the posture of a subject 11, there is a subject 11 whose degree of fatigue cannot be estimated, and the output unit 109 also has a function of notifying the outside to that effect. That is, the output unit 109 outputs the estimation result of the posture estimation unit 105 indicating that estimation of the posture of the subject 11 has failed.
At this time, the output unit 109 outputs the above estimation result as notification information for announcing the existence of a subject 11 whose posture estimation has failed. On the basis of this notification information, an administrator or the like of the fatigue estimation system 200 can take measures such as estimating the degree of fatigue of the subject 11 whose posture estimation has failed using another system, or asking the subject directly. In this way, the output unit 109 is configured in advance so that the degree of fatigue of a subject 11 whose posture estimation has failed is not overlooked.
As described above, the imaging device 201 is a device that images the subject 11 and outputs an image, and is realized by a camera. As the imaging device 201, a camera already present in the space to which the fatigue estimation system 200 is applied, such as a security camera or a fixed-point camera, may be used, or a dedicated camera may be newly provided. Such an imaging device 201 is an example of an information output device that outputs an image as information on the positions of the body parts of the subject 11. The output information is therefore an image, that is, information including the positional relationships of the body parts of the subject 11 as projected onto the image sensor.
The timing device is a device that measures time and is realized by a clock. The time measured by the timing device may be an absolute time or an elapsed time from a relative starting point. The timing device may be realized in any form as long as it can measure the time between two points: the point at which the subject 11 is detected to be stationary and the point at which the degree of fatigue is estimated (that is, the duration of the stationary posture).
The pressure sensor is a sensor having a detection surface, and measures the pressure applied to each of the unit detection surfaces into which the detection surface is divided. The pressure sensor thus measures the pressure for each unit detection surface and outputs the pressure distribution over the detection surface. The pressure sensor is provided so that the subject 11 is positioned on the detection surface.
For example, the pressure sensors are provided on the seat surface and backrest of the chair on which the subject 11 sits. Alternatively, for example, a marker may be attached to the detection surface of the pressure sensor, and the subject 11 may be guided onto the detection surface by a display such as "Please sit on the marker." By guiding the subject 11 onto the detection surface of a pressure sensor provided on part of the floor in this way, the pressure sensor may output the pressure distribution of the subject 11 on the floor. Since the pressure distribution is used for the purpose of improving the accuracy of fatigue estimation, the fatigue estimation system 200 may be realized without the pressure sensor if sufficient accuracy is ensured.
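As a hypothetical example of how a pressure distribution over unit detection surfaces could supplement the posture estimate, the centre of pressure can be computed as a pressure-weighted average over the grid of readings. The grid values and the grid-index coordinates are assumptions for illustration.

```python
# Hypothetical sketch: centre of pressure (row, col) of a 2-D grid of
# unit-detection-surface readings, as a pressure-weighted average.

def center_of_pressure(grid):
    """Return (row, col) of the centre of pressure of a 2-D reading grid."""
    total = sum(sum(row) for row in grid)
    if total == 0:
        raise ValueError("no pressure detected on the surface")
    r = sum(i * sum(row) for i, row in enumerate(grid)) / total
    c = sum(j * p for row in grid for j, p in enumerate(row)) / total
    return r, c

# A subject leaning toward one edge of a seat sensor (assumed readings).
grid = [[0, 1, 3],
        [0, 2, 6]]
print(center_of_pressure(grid))
```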
The reception device is a user interface that receives input of the personal information of the subject 11, and is realized by an input device such as a touch panel or a keyboard. The personal information includes at least one of age, sex, height, weight, muscle mass, stress level, body fat percentage, and exercise proficiency. The age of the subject 11 may be a specific numerical value, an age band divided into ten-year increments such as teens, twenties, and thirties, two age bands divided at a predetermined age such as 59 or younger and 60 or older, or some other form.
The sex of the subject 11 is whichever of the two options, male or female, applies to the subject 11. As the height and weight, the numerical values of the height and weight of the subject 11 are respectively received. As the muscle mass, the muscle composition ratio of the subject 11 measured using a body composition meter or the like is received. The stress level of the subject 11 is selected by the subject 11 from options such as high, medium, and low as the degree of subjective stress felt by the subject 11.
The body fat percentage of the subject 11 is the ratio of the weight of body fat to the body weight of the subject 11, expressed, for example, as a percentage.
The exercise proficiency of the subject 11 may be quantified by a score obtained when the subject 11 performs a predetermined exercise program, or may be the state of the exercise that the subject 11 usually undertakes. In the former case, it is quantified by, for example, the time required to perform ten back-extension exercises, the time required to run 50 m, or the distance of a long throw. In the latter case, it is quantified by, for example, how many days per week or how many hours the subject exercises. Since the personal information is used for the purpose of improving the accuracy of fatigue estimation, the fatigue estimation system 200 may be realized without the reception device if sufficient accuracy is ensured.
 The display device 205 is a device for displaying content based on the fatigue estimation result output by the output unit 109. The display device 205 displays an image showing this content on a display panel such as a liquid crystal panel or an organic EL (electroluminescence) panel. The content displayed by the display device 205 will be described later. If the fatigue estimation system 200 is configured only to reduce the subject 11's fatigue level using the recovery device 206, it need only include the recovery device 206, and the display device 205 is not essential.
 The recovery device 206 is a device that reduces the subject 11's fatigue level by promoting the subject 11's blood circulation. Specifically, the recovery device 206 actively changes the posture of the seated subject 11 by applying voltage, pressure, vibration, or heat, or by changing the arrangement of the parts of the chair 12 via a mechanism provided in the chair 12. The recovery device 206 thereby changes the manner of loading on the subject 11's muscles and/or joints and promotes blood circulation. From the standpoint of blood flow as well, promoting circulation in this way reduces the blood-flow deterioration caused by the subject 11 remaining in a stationary posture, allowing the fatigue level to recover. Depending on its configuration, the recovery device 206 is attached to, or placed in contact with, an appropriate body part of the subject 11 in advance.
 When the subject 11's blood circulation is promoted by heating, the entire space around the subject 11 is heated, so in such a case the device need not be attached to or in contact with a body part of the subject 11. Conversely, if the fatigue estimation system 200 is configured only to display the fatigue estimation result to the subject 11, it need only include the display device 205, and the recovery device 206 is not essential.
 [Operation]
 Next, estimation of the subject 11's fatigue level using the fatigue estimation system 200 according to the embodiment will be described with reference to FIGS. 3 to 6. FIG. 3 is a flowchart showing the fatigue estimation method according to the embodiment.
 The fatigue estimation system 200 first acquires the subject 11's personal information. The personal information is entered into the reception device by the subject 11 himself or herself, or by an administrator who manages the subject 11's fatigue level. The entered personal information is stored in a storage device or the like (not shown) and is read out and used when the fatigue level is estimated.
 The fatigue estimation system 200 detects the subject 11 with the imaging device 201. The subject 11 is detected by counting the persons within the angle of view of the camera serving as the imaging device 201, treating every person within the angle of view as a subject 11. In other words, an operation of acquiring the number of persons to be detected (A) is performed first (S101). Next, among these persons, the number (B) of persons whose posture can be estimated without additional image processing, that is, from a single image alone, is calculated (S102). The fatigue estimation device 100 then determines whether A and B match (S103). If the fatigue estimation device 100 determines that A and B match (Yes in S103), that is, if the posture of every person to be detected can be estimated from a single image, the posture of each subject 11 (here, the joint positions) is estimated from the respective single image (S108). The fatigue estimation unit 108 then estimates the subject 11's fatigue level (S109).
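The branching in steps S101 to S103 can be sketched as follows. This is only an illustrative outline: the helper flag `single_image_ok` and the function name `dispatch` are hypothetical stand-ins for the detection and posture-estimation components described above, not names used in the disclosure.

```python
def dispatch(persons):
    """Steps S101-S103: decide whether every detected person's posture
    can be estimated from a single image (hypothetical sketch)."""
    a = len(persons)                                       # S101: detected count A
    b = sum(1 for p in persons if p["single_image_ok"])    # S102: count B
    if a == b:                                             # S103: A == B ?
        return "estimate_each_from_single_image"           # -> S108, S109
    return "identify_and_use_multiple_images"              # -> S104 onward
```

When A and B differ, the system falls through to the multi-image path described later in the flowchart.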
 Steps S108 and S109 are performed as follows. First, the fatigue estimation system 200 acquires, via the acquisition unit 101, the image output by the imaging device 201. When it is detected in the acquired image that the subject 11 is stationary (in a stationary posture), the estimation device 100 estimates the subject 11's posture. Specifically, the pressure distribution applied to the detection surface is first acquired from the pressure sensor.
 The posture estimation unit 105 estimates the subject 11's posture based on the acquired image and pressure distribution. The pressure distribution is used, for example, when an uneven pressure is applied, to correct the estimated posture so that it reproduces that unevenness. Next, the first calculation unit 106 calculates the load amount on each individual muscle and/or joint of the subject 11 from the posture estimation result. At this time, the load amount is corrected using the personal information acquired in advance. Since the estimation of the subject 11's posture was described with reference to FIG. 1B, and the calculation of the load amount with reference to FIG. 1C, detailed description is omitted here.
 In the correction of the load amount using personal information, for example, the closer the subject 11's age is to the peak age of muscle development, the smaller the load amount, and the further from that peak age, the larger the load amount. Such a peak may depend on the subject 11's sex. The load amount may also be set smaller if the subject 11 is male and larger if female. Furthermore, the smaller the subject 11's height and weight, the smaller the load amount, and the larger the height and weight, the larger the load amount may be set.
 Likewise, the larger the subject 11's muscle composition ratio, the smaller the load amount, and the smaller that ratio, the larger the load amount may be set. The lower the subject 11's stress level, the smaller the load amount, and the higher the stress level, the larger the load amount may be set. The higher the subject 11's body fat percentage, the larger the load amount, and the lower the percentage, the smaller the load amount may be set. Further, the higher the subject 11's exercise proficiency, the smaller the load amount, and the lower the proficiency, the larger the load amount may be set.
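The corrections above can be summarized as multiplicative factors on a base load amount. The following sketch is only illustrative: the disclosure specifies only the direction of each correction, so every numeric factor here, and the helper name `corrected_load`, is an assumption.

```python
def corrected_load(base_load, age, peak_age=25, sex="male",
                   muscle_ratio=0.35, stress="low"):
    """Scale a base load amount in the directions described above.
    All numeric coefficients are illustrative assumptions."""
    factor = 1.0
    factor *= 1.0 + 0.01 * abs(age - peak_age)   # further from peak age -> larger load
    factor *= 1.1 if sex == "female" else 1.0    # example sex-based adjustment
    factor *= 1.0 - 0.5 * muscle_ratio           # larger muscle ratio -> smaller load
    factor *= {"low": 1.0, "medium": 1.1, "high": 1.2}[stress]
    return base_load * factor
```

A real system would fit such coefficients to data rather than fix them by hand; the point is only that each personal-information item shifts the load monotonically in the stated direction.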
 Here, the duration of the subject 11's stationary posture is measured based on the time acquired from the timing device. The fatigue estimation unit 108 adds the load amount calculated above each time the duration advances by a unit time, thereby estimating the subject 11's fatigue level at that point. This processing continues until the subject 11's stationary state is released. Specifically, whether the stationary state has been released is determined by whether the posture estimated by the posture estimation unit 105 has changed from the current stationary posture.
 If it is not determined that the stationary state has been released, the duration continues to be measured and the load amount continues to be added, so the subject 11's fatigue level accumulates for as long as the stationary posture continues. In other words, the fatigue estimation unit 108 estimates the subject 11's fatigue level using an increasing function of fatigue over the duration whose slope corresponds to the calculated load amount. Accordingly, the larger the calculated load amount, the more the subject 11's fatigue level increases per unit time. In this accumulation, the subject 11's fatigue level is initialized (set to 0) at the starting point, that is, at the start of the stationary posture.
 On the other hand, if it is determined that the stationary state has been released, the posture estimation unit 105 calculates the amount of posture change from the original stationary posture to the changed current posture. The amount of posture change is calculated for each individual muscle and/or joint, in the same way as the load amount described above. When the posture changes in this way, the load on the muscles and/or joints changes and, from the standpoint of blood flow, the deteriorated blood flow is temporarily relieved, so the subject 11's fatigue level turns toward recovery. The amount of fatigue reduced by this recovery is related to the amount of posture change. Accordingly, the second calculation unit 107 calculates the recovery amount, that is, the degree of recovery from fatigue, based on the amount of posture change.
 Based on the time acquired from the timing device, the change time, that is, the time during which the subject 11's posture change continues, is measured. The relationship between the recovery amount and the change time is analogous to that between the load amount and the duration: the subject 11's recovery amount is accumulated for as long as the posture change continues. In other words, while the subject 11's posture is changing, the fatigue estimation unit 108 estimates the subject 11's fatigue level by subtracting the recovery amount each time a unit time elapses.
 The calculation of the recovery amount, the measurement of the change time, and the estimation of the subject 11's fatigue level continue until the subject 11's posture becomes stationary again. Specifically, it is determined whether the posture estimated by the posture estimation unit 105 is a stationary posture. While the subject 11 is not detected to be stationary, the recovery amount is calculated, the change time is measured, and the recovery amount is subtracted, so the subject 11's fatigue level continues to recover for as long as the posture change continues.
 In other words, the fatigue estimation unit 108 estimates the subject 11's fatigue level using a decreasing function of fatigue over the change time whose slope corresponds to the calculated recovery amount. Since the recovery amount depends on the amount of posture change, the larger the posture change, the more the subject 11's fatigue level decreases per unit time.
 On the other hand, when the subject 11 is detected to be stationary, the posture and fatigue level are estimated anew for the new stationary posture. In this way, the fatigue estimation system 200 calculates the subject 11's fatigue level from images while taking into account the duration of each stationary posture, so the fatigue level can be estimated with little burden on the subject 11 and with higher accuracy.
 The above is explained more concretely with reference to FIGS. 4A to 5B. FIG. 4A is a diagram showing a subject standing still in posture A, and FIG. 4B is a diagram showing the subject standing still in posture B.
 The subject 11 shown in FIGS. 4A and 4B is in a stationary seated posture on the chair 12, as in FIG. 1A. Although a table, a PC, and the like (not shown) are actually present in FIGS. 4A and 4B, only the subject 11 and the chair 12 are depicted here. The stationary posture of the subject 11 shown in FIG. 4A is posture A, in which the load on the shoulders is relatively large, whereas the stationary posture shown in FIG. 4B is posture B, in which the load on the shoulders is relatively small.
 The fatigue level estimated for the subject 11 standing still in posture A or posture B accumulates as shown in FIGS. 5A and 5B. FIG. 5A is a first diagram explaining the accumulation of the subject's estimated fatigue level according to the embodiment, and FIG. 5B is a second such diagram.
 As shown in FIG. 5A, when the subject 11 remains stationary in posture A of FIG. 4A or posture B of FIG. 4B, the subject 11's fatigue level is expressed by a linear function whose slope is the load amount calculated from the posture.
 As noted above, posture A imposes a larger load than posture B. Therefore, for a given muscle of the subject 11 (here, a muscle involved in shoulder movement), the load amount for posture A (the slope of the posture A line) is larger than that for posture B (the slope of the posture B line). Consequently, in posture A the subject 11 accumulates more fatigue in a shorter time than when standing still in posture B.
 On the other hand, as shown in FIG. 5B, when the subject 11's posture changes from posture A of FIG. 4A to posture B of FIG. 4B, the subject 11's fatigue level is expressed by a function connecting a linear function whose slope is the load amount calculated from the posture with a linear function whose slope corresponds to the amount of posture change.
 Therefore, for a given muscle of the subject 11, while the subject is stationary in posture A, the fatigue level is estimated as accumulating (being added) according to an increasing function with a positive slope corresponding to the load amount of posture A, as in FIG. 5A, and at the change point where the subject 11 begins to change posture, the accumulation (addition) turns to recovery (decrease). According to a decreasing function with a negative slope corresponding to the amount of posture change, the subject 11's fatigue level recovers (decreases) by the amount shown as the change width in the figure during the period, shown as the change time, in which the posture change continues. After the change point at which the subject 11 becomes stationary again in posture B, the fatigue level is once more estimated as accumulating (being added) according to an increasing function with a positive slope corresponding to the load amount of posture B.
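The piecewise-linear behavior of FIG. 5B can be sketched numerically as below. The specific slopes and durations are illustrative assumptions, as is the floor at zero (the text initializes fatigue to 0 but does not state what happens if recovery would drive it negative).

```python
def fatigue_curve(segments, dt=1.0):
    """Integrate fatigue over a sequence of (slope, duration) segments.

    A stationary segment has a positive slope (the load amount of the
    posture); a posture-change segment has a negative slope (the recovery
    amount). Returns the fatigue level sampled after each unit time.
    """
    fatigue, samples = 0.0, []
    for slope, duration in segments:
        for _ in range(int(duration / dt)):
            fatigue = max(0.0, fatigue + slope * dt)  # assumed floor at zero
            samples.append(fatigue)
    return samples

# Posture A (load 2.0) for 5 units, posture change (recovery 3.0) for
# 2 units, then posture B (load 1.0) for 5 units -- values illustrative.
curve = fatigue_curve([(2.0, 5), (-3.0, 2), (1.0, 5)])
```

With these example values the curve rises to 10 during posture A, drops by the change width 6 during the change time, and then rises again with posture B's smaller slope.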
 In this way, the fatigue estimation system 200 of the present embodiment estimates the subject 11's fatigue level with accumulation and recovery reflected according to whether the subject 11's posture is stationary or changing.
 Returning to the flowchart of FIG. 3, if the fatigue estimation device 100 determines in step S103 that A and B do not match (No in S103), that is, if the persons to be detected include someone whose posture cannot be estimated from a single image, such persons are identified (S104). The posture estimation unit 105 then selects and acquires a plurality of images usable for estimating the posture of each identified person (hereinafter, a subject 11 whose fatigue level is estimated via synthesis of joint positions) (S105). If the subject 11's posture still cannot be estimated even using the plurality of images (No in S106), the fatigue estimation processing for that subject 11 ends. In this case, for example, the notification information described as a function of the output unit 109 is output.
 For a subject 11 whose posture can be estimated using the plurality of images (Yes in S106), the posture is estimated as illustrated in FIG. 6. FIG. 6 is a diagram explaining an example of synthesizing joint positions according to the embodiment, and shows the information generated while the subject 11's posture (right end) is estimated from the plurality of acquired images (left end).
 In FIG. 6, from the left side toward the center, the image processing is divided into upper and lower rows: the upper row shows one image 90a of the plurality of images and the joint positions 11c of one body part estimated from the image 90a, and the lower row likewise shows another image 90b of the plurality of images and the joint positions 11d of another body part estimated from the image 90b. First, in step S105, the posture estimation unit 105 acquires the plurality of images (here, the one image 90a and the other image 90b). The one image 90a and the other image 90b are images in which one body part and another body part, each being a portion of the subject 11's body, are respectively captured.
 The one body part and the other body part are body parts that share at least one joint with each other. Therefore, if no combination of images sharing at least one joint can be identified, step S106 results in No and the processing ends.
 As shown in FIG. 3, the posture estimation unit 105 estimates, for each of the plurality of images, the joint positions of the partially captured body part of the subject 11 (S107). In the example of FIG. 6, the joint positions 11c of the one body part captured in the one image 90a are estimated from that image, and first information 90c containing them is generated. Similarly, the joint positions 11d of the other body part captured in the other image 90b are estimated from that image, and second information 90d containing them is generated. The joint positions 11c of the one body part and the joint positions 11d of the other body part are then synthesized to estimate the joint positions 11a of the subject 11's body parts including both the one body part and the other body part. In this way, the posture estimation unit 105 can estimate the subject 11's posture using a plurality of images in each of which only part of the subject 11's body is captured.
 Here, the synthesis of joint positions is explained, still referring to FIG. 6. Synthesizing the subject 11's joint positions requires at least two coordinates common to the joint positions 11c of the one body part and the joint positions 11d of the other body part. One of these common coordinates is the joint position of a shared joint, that is, one of the joints shared between the one body part and the other body part. If, in addition to the joint position of the shared joint, reference coordinates shared across the plurality of images are used, the scales of the images and the orientations of the body parts within them can be matched.
 As the reference coordinates, the joint position of a joint that is shared between the one body part and the other body part but differs from the shared joint can be used. In this case, that is, in a situation where two or more joints are shared between the one body part and the other body part, the posture estimation unit 105 adjusts the joint positions 11c of the one body part and the joint positions 11d of the other body part so that the direction and magnitude of the vector extending from one of these joints to the other coincide between the images. Then, with the orientations and scales matched, the posture estimation unit 105 superimposes and synthesizes the joint positions 11c and 11d with the shared joint as the reference, thereby estimating the joint positions 11a of the subject 11's body parts.
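Aligning the two partial skeletons amounts to finding a 2D similarity transform (scale, rotation, translation) that maps the two joints shared by both images onto each other, then carrying the remaining joints through it. The sketch below is an illustration of that idea under assumed 2D joint coordinates; the joint names and the function `merge_joints` are hypothetical.

```python
def merge_joints(joints_a, joints_b, shared=("shoulder", "elbow")):
    """Map joints_b into joints_a's coordinate frame using two shared
    joints, then merge the two partial skeletons.

    Joints are {name: (x, y)}. Points are encoded as complex numbers, so
    one complex ratio captures both the scale and the rotation that make
    the shared segment coincide in direction and magnitude.
    """
    p0, p1 = (complex(*joints_a[s]) for s in shared)
    q0, q1 = (complex(*joints_b[s]) for s in shared)
    transform = (p1 - p0) / (q1 - q0)              # scale + rotation
    merged = dict(joints_a)
    for name, (x, y) in joints_b.items():
        z = p0 + transform * (complex(x, y) - q0)  # translate via the shared joint
        merged.setdefault(name, (z.real, z.imag))
    return merged
```

With three or more shared points, a least-squares fit over all of them would be more robust; two points are the minimum the text calls for.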
 Besides the above, a marker appearing in both the one image 90a and the other image 90b may be used as the reference coordinates. In the example of FIG. 6, an object marker M appearing in both the one image 90a and the other image 90b is shown. The object marker M may be a dedicated object, or it may be any object present in the space that the plurality of imaging devices 201 can identify as the same object. If the spatial coordinates of the object marker M are used as the reference coordinates, the orientations and scales can be matched and the joint positions can be superimposed and synthesized with the shared joint as the reference, as in the example above. For this purpose, the posture estimation unit 105 calculates the spatial coordinates Mc of the object marker M from the one image 90a, and the spatial coordinates Md of the object marker M from the other image 90b.
 As another example of a marker, a line marker L may be used. The line marker L is realized, for example, by affixing tape or the like somewhere in the space. Since the line marker L has two end points at fixed positions and extends between them, when a line marker L is present, the same effect as above can be obtained even if the captured body parts of the subject 11 share no joint.
 A further effect can be obtained by exploiting the fact that these markers exist in real space. That is, the fatigue estimation system 200 can distinguish the subject 11 from other persons staying in the same space based on the positional relationship between the marker and the subject 11. In the present embodiment, processing such as superimposing the joint positions of the subject 11's individual body parts is performed across a plurality of images. Therefore, if another person (for example, the other person 11z in FIG. 6) is staying in the same space as the subject 11 and the joint positions of one of the subject 11's body parts were superimposed on the joint positions of a body part of that other person, the subject 11's posture could not be estimated correctly.
 Therefore, by providing the identification unit 102, the fatigue estimation system 200 identifies the subject 11 and can reliably superimpose the joint positions of one part of the subject 11 on the joint positions of another part of the same subject 11. To this end, the identification unit 102 identifies, in the images acquired from the imaging devices 201, the relative positions with respect to the marker: for example, that the subject 11 is at a position advanced by a certain distance in a certain direction from the marker, while the other person 11z is at a position advanced by a different distance in a different direction. Since this processing only needs to discriminate at the spatial resolution of one person, it imposes less processing load than estimating each joint position and can be implemented easily.
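Matching detections across cameras by their offset from the shared marker can be sketched as a coarse association, as below. This is an assumed implementation: the detection format, the tolerance value, and the function name are all illustrative, and the disclosure only states that person-level resolution suffices.

```python
def match_by_marker_offset(dets_cam1, dets_cam2, marker1, marker2, tol=0.3):
    """Associate person detections from two cameras whose marker-relative
    offsets (assumed to be in a shared real-world scale) roughly coincide.

    Each detection is (person_id, (x, y)); each marker is (x, y) in the
    same camera's coordinates. Only person-level resolution is needed,
    so a single coarse distance threshold suffices.
    """
    def offsets(dets, marker):
        mx, my = marker
        return {pid: (x - mx, y - my) for pid, (x, y) in dets}

    o1, o2 = offsets(dets_cam1, marker1), offsets(dets_cam2, marker2)
    pairs = {}
    for pid1, (dx1, dy1) in o1.items():
        for pid2, (dx2, dy2) in o2.items():
            if (dx1 - dx2) ** 2 + (dy1 - dy2) ** 2 <= tol ** 2:
                pairs[pid1] = pid2
    return pairs
```

Only detections that land within the tolerance of the same marker-relative offset are treated as the same person, so the subject 11's partial skeletons are never merged with those of the other person 11z.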
 As described above, in the present embodiment, even images in which only part of the subject 11's body is captured can, by combining a plurality of images so that they complement one another, be used to estimate a posture adequate for estimating the fatigue level.
 Also, when the joint positions of the body parts including one part of the subject 11's body and another part have been estimated as the subject 11's posture as above, there are cases where the posture of remaining parts, included in neither the one part nor the other part, cannot be estimated. For example, the posture of an invisible part that appears in none of the plurality of images, that is, a part captured in no image, cannot be estimated by the above processing.
 In that case, for example, the joint positions of the one part and the other part of the subject 11's body estimated as above may be input to a machine learning model trained on the joint positions of the whole human body, and the resulting whole-body joint positions may be estimated as the subject 11's whole-body posture. Alternatively, rather than the whole body, the subject 11's posture may be estimated by adding only those invisible parts whose model outputs have sufficient reliability. In this way, the posture estimation unit 105 may be configured to estimate, as the subject 11's posture, the joint positions of body parts including invisible parts not captured in any of the plurality of images.
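One minimal stand-in for such a learned model is a nearest-neighbor lookup over a database of whole-body poses: fill each invisible joint from the stored pose whose visible joints best match. This is purely a hypothetical sketch with assumed names and data; the disclosure says only "a machine learning model trained on whole-body joint positions" and does not specify the model.

```python
def complete_pose(visible, pose_db):
    """Fill invisible joints from the stored whole-body pose whose
    visible joints are closest (a crude stand-in for a trained model).

    `visible` maps joint name -> (x, y); `pose_db` is a list of complete
    {joint: (x, y)} poses assumed to share the same coordinate frame.
    """
    def dist(pose):
        return sum((pose[j][0] - x) ** 2 + (pose[j][1] - y) ** 2
                   for j, (x, y) in visible.items())

    best = min(pose_db, key=dist)   # most similar whole-body example
    completed = dict(best)
    completed.update(visible)       # keep the directly estimated joints as-is
    return completed
```

A trained regression or pose-prior model would replace the lookup, and a per-joint confidence from that model could gate which invisible joints are added, as the text suggests.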
 Returning again to FIG. 3, as described above, the posture estimation unit 105 estimates the subject 11's posture from a plurality of images in each of which part of the subject 11's body is captured, via processing such as synthesis (S108). Then, proceeding to step S109, the fatigue estimation unit 108 estimates the subject 11's fatigue level as already described.
 Next, an example of the output of the output unit 109 based on the estimated degree of fatigue will be described. FIG. 7 is a diagram showing a display example of estimation results according to the embodiment.
 As shown in FIG. 7, the fatigue estimation system 200 can display the estimation result of the fatigue degree of the subject 11 on the display device 205 as feedback. Specifically, by visualizing the fatigue degree of the subject 11 as shown in FIG. 7, it is possible to grasp visually how fatigued the subject 11 is. In the figure, the display device 205 displays together a figure modeled on the subject 11 and the fatigue degrees of the shoulders, back, and lower back of the subject 11. To make the fatigue degrees intuitive for the subject 11, the shoulder fatigue degree is labeled the "stiff-shoulder level", the back fatigue degree the "back-pain level", and the lower-back fatigue degree the "low-back-pain level".
 In the display in the figure, the fatigue degrees at three locations on the subject 11 are shown at once, and all three are estimated from images captured at the same time. That is, the estimation device 100 estimates, from a single posture of the subject 11, the fatigue degree of the muscles and/or joints in each of a plurality of body parts including a first part (e.g., the shoulders), a second part (e.g., the back), and a third part (e.g., the lower back) of the subject 11. Even when the posture of the subject 11 is held constant, the fatigue accumulated in the muscles and/or joints differs from one body part to another, and the fatigue estimation system 200 can estimate these differing fatigue degrees simultaneously and individually.
 As described with reference to FIG. 1C, in the present embodiment the load amount is calculated for each individual muscle and/or joint of the subject 11, so that, in the absence of processing-resource constraints, the fatigue degree of every single muscle and/or joint can be estimated. Accordingly, there is no limit on the number of body parts whose fatigue degrees are estimated from images captured at one time; it may be one, two, or four or more.
 The estimation device 100 calculates a load amount for each of the plurality of body parts and, for a single posture of the subject 11, can estimate the fatigue degree of the first part from the load amount calculated for the first part (the stiff-shoulder level above), the fatigue degree of the second part from the load amount calculated for the second part (the back-pain level above), and the fatigue degree of the third part from the load amount calculated for the third part (the low-back-pain level above).
 In the example in the figure, the stiff-shoulder level is estimated from the load on the trapezius, the back-pain level from the load on the latissimus dorsi, and the low-back-pain level from the load on the lumbar paraspinal muscles. One fatigue degree may thus be estimated from the load on a single muscle and/or joint, but one fatigue degree may also be estimated from the combined loads of a plurality of muscles and/or joints. For example, the stiff-shoulder level (that is, a single fatigue degree for the shoulders) may be estimated from the average of the loads on the trapezius, the levator scapulae, the rhomboids, and the deltoid. Furthermore, rather than using a simple average, a more realistic fatigue estimate may be obtained by weighting the loads of the muscles and/or joints that particularly strongly affect the fatigue degree of the body part in question, according to the magnitude of their influence.
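 The weighted combination described above can be sketched as follows. The muscle names follow the example in the figure, while the load values and influence weights are hypothetical numbers used purely for illustration.

```python
# Illustrative sketch: one shoulder fatigue degree ("stiff-shoulder level")
# combined from several muscle loads. Equal weights reproduce the simple
# average; unequal weights express each muscle's assumed influence.

def combined_fatigue(loads, weights):
    """loads, weights: dict muscle_name -> load amount / influence weight.
    Returns the weighted average of the loads."""
    total_weight = sum(weights[m] for m in loads)
    return sum(loads[m] * weights[m] for m in loads) / total_weight

shoulder_loads = {"trapezius": 60.0, "levator_scapulae": 40.0,
                  "rhomboids": 30.0, "deltoid": 20.0}
equal = {m: 1.0 for m in shoulder_loads}       # simple average
weighted = {"trapezius": 0.4, "levator_scapulae": 0.3,
            "rhomboids": 0.2, "deltoid": 0.1}  # influence-weighted
```

 With the values above, the simple average yields 37.5 while the influence-weighted combination yields 44.0, illustrating how weighting shifts the estimate toward the dominant muscle.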
 The fatigue degrees estimated in this way may each be shown, as illustrated, as a relative position on a reference meter whose minimum value is 0 and whose maximum value is 100. A reference value is provided at a predetermined position on the reference meter. Such a reference value is set at (or around) the relative fatigue position at which subjective symptoms such as pain can appear in a typical subject 11, quantified in advance by epidemiological surveys or the like. Accordingly, a different reference value may be set for the fatigue degree of each body part.
 Furthermore, the display device 205 may display a warning to the subject 11 as an estimation result when the estimated fatigue degree of the subject 11 reaches the reference value. The reference value here is an example of a first threshold. In the figure, as an example of such a warning, "Your stiff-shoulder level exceeds the reference value." is shown at the bottom of the display device 205. In connection with such a warning, the display device 205 may also display a specific countermeasure, such as "A break is recommended.", as shown alongside in the figure.
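 The reference-meter comparison and warning described above can be sketched as follows; the per-part reference values (examples of the first threshold) are hypothetical numbers.

```python
# Illustrative sketch: compare each body part's fatigue degree on the
# 0-100 reference meter against its per-part reference value and emit
# a warning once the reference value is reached.

REFERENCE_VALUES = {"stiff_shoulder": 70.0,
                    "back_pain": 75.0,
                    "low_back_pain": 65.0}  # hypothetical first thresholds

def check_warnings(fatigue_degrees):
    """fatigue_degrees: dict label -> degree on the 0-100 meter."""
    warnings = []
    for label, degree in fatigue_degrees.items():
        if degree >= REFERENCE_VALUES[label]:  # reached the reference value
            warnings.append(
                f"{label} exceeds the reference value; a break is recommended.")
    return warnings
```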
 In addition to the configuration described above, in which the estimation result is displayed to the subject 11 to prompt the subject 11 to deal with accumulated fatigue, a configuration in which the fatigue estimation system 200 actively reduces the fatigue of the subject 11 is also conceivable. Specifically, the recovery device 206 shown in FIG. 2 operates to reduce the fatigue degree of the subject 11. The specific configuration of the recovery device 206 has already been described and is omitted here. When the estimated fatigue degree of the subject 11 reaches a reference value, the recovery device 206 operates to change the load on at least one of the muscles and joints of the subject 11 and to promote blood circulation, thereby lowering the fatigue degree of the subject. The reference value here is an example of a third threshold, and may be the same as or different from either the first threshold or the second threshold.
 [Effects, etc.]
 As described above, the fatigue estimation system 200 according to the present embodiment includes: a plurality of imaging devices 201, each of which outputs an image in which a different part of the body of the subject 11 is captured; a posture estimation device (posture estimation unit 105) that estimates the posture of the subject 11 based on the plurality of images output by the plurality of imaging devices 201; and a fatigue estimation device 100 that estimates and outputs the fatigue degree of the subject 11 based on the estimation result of the posture of the subject 11.
 In such a fatigue estimation system 200, in situations where, owing to conditions such as the placement of the imaging devices 201, no single image captures the whole body of the subject 11 and estimating the posture of the subject 11 from a single image is therefore difficult, the posture estimation unit 105 can appropriately estimate the posture of the subject 11 by using another image in combination. The fatigue degree of the subject 11 can then be estimated more appropriately based on the posture appropriately estimated by the posture estimation unit 105.
 Further, for example, the posture estimation unit 105 may estimate the joint positions 11c of one body part based on one image 90a, of the plurality of images, in which the one body part is captured; estimate the joint positions 11d of another body part, which shares at least one joint with the one body part, based on another image 90b, of the plurality of images, in which the other body part is captured; and synthesize the estimated joint positions 11c of the one body part and the estimated joint positions 11d of the other body part, thereby estimating, as the posture of the subject 11, the joint positions 11a of the body region including the one body part and the other body part.
 According to this, the joint positions 11a of the body region can be estimated using the one image 90a, of the plurality of images, in which the one body part is captured and the other image 90b in which the other body part is captured. Here, the joint positions 11c of the one body part are estimated from the one image 90a, the joint positions 11d of the other body part are estimated from the other image 90b, and these are synthesized, so that the joint positions 11a of the body region including the one body part and the other body part can be estimated. In this way, it is possible to estimate joint positions 11a that include both the other body part, which is absent from the one image 90a, and the one body part, which is absent from the other image 90b.
 Further, for example, in synthesizing the estimated joint positions 11c of the one body part and the estimated joint positions 11d of the other body part, at least one of a relative pose and a relative scale between the joint positions 11c of the one body part and the joint positions 11d of the other body part may be determined based on the relative position between a shared joint (a joint shared by the one body part and the other body part) and a reference coordinate shared across the plurality of images, and the shared joint and the reference coordinate may then be superimposed with the determined relative pose and/or relative scale applied.
 According to this, by superimposing the relative positions of the shared joint and the reference coordinate, the joint positions 11c of the one body part from the one image 90a and the joint positions 11d of the other body part from the other image 90b can be synthesized. Depending on the placement of the imaging devices 201, a situation can arise in which at least one of the pose and the scale of the joint positions 11c and of the joint positions 11d differs between the two images. In the above aspect, the synthesis can be performed after such differences are adjusted by applying the relative pose (e.g., a rotation amount that cancels the difference in relative orientation) and the relative scale (e.g., a scaling ratio that cancels the difference in relative size) between the joint positions 11c and the joint positions 11d.
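 The alignment described above can be sketched in two dimensions as follows. This is an illustrative reading of the synthesis, not the claimed implementation: the relative scale is taken as the ratio of the reference-to-shared-joint distances in the two images, and the relative pose as the angle between the corresponding vectors.

```python
import math

# Illustrative 2D sketch: align the joints estimated from image 90b to the
# coordinate frame of image 90a, using one shared joint and one reference
# coordinate (e.g. a marker) visible in both images, then merge them.

def merge_skeletons(joints_a, joints_b, shared, ref_a, ref_b):
    """joints_a/joints_b: dict joint_name -> (x, y) from each image.
    shared: name of the joint common to both body parts.
    ref_a/ref_b: the reference coordinate in each image's frame."""
    # vector from the reference coordinate to the shared joint, per image
    vax, vay = joints_a[shared][0] - ref_a[0], joints_a[shared][1] - ref_a[1]
    vbx, vby = joints_b[shared][0] - ref_b[0], joints_b[shared][1] - ref_b[1]
    # relative scale: ratio of the two vector lengths
    scale = math.hypot(vax, vay) / math.hypot(vbx, vby)
    # relative pose: rotation that cancels the orientation difference
    angle = math.atan2(vay, vax) - math.atan2(vby, vbx)
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    merged = dict(joints_a)
    for name, (x, y) in joints_b.items():
        # rotate and scale about the reference coordinate of image 90b,
        # then translate so the two reference coordinates coincide
        dx, dy = x - ref_b[0], y - ref_b[1]
        merged.setdefault(name, (ref_a[0] + scale * (cos_a * dx - sin_a * dy),
                                 ref_a[1] + scale * (sin_a * dx + cos_a * dy)))
    return merged
```

 After the transform, the shared joint and the reference coordinate from both images coincide, and joints present only in image 90b are carried over into the merged set.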
 Further, for example, the reference coordinate may be the joint position of a joint different from the shared joint.
 According to this, by using the joint position of a joint different from the shared joint as the reference coordinate and superimposing the relative position between the shared joint and the reference coordinate, the joint positions 11c of the one body part from the one image 90a and the joint positions 11d of the other body part from the other image 90b can be synthesized.
 Further, for example, the reference coordinate may be the spatial coordinate of a marker placed at a position that is captured in every one of the plurality of images.
 According to this, by using as the reference coordinate the spatial coordinate of a marker placed at a position captured in every one of the plurality of images, and superimposing the relative position between the shared joint and the reference coordinate, the joint positions 11c of the one body part from the one image 90a and the joint positions 11d of the other body part from the other image 90b can be synthesized.
 Further, for example, the system may further include an identification unit 102 that identifies the subject 11 among other persons 11z staying in the same space as the subject 11, based on the positional relationship between the marker and the subject 11.
 According to this, based on the positional relationship between the marker and the subject 11, the subject 11 can be identified among other persons 11z staying in the same space, and posture estimation can be performed only for the subject 11. As a result, the fatigue degree can be estimated only for the subject 11 even when other persons 11z are present.
 Further, for example, when the posture estimation unit 105 fails to estimate the posture of the subject 11, it may output an estimation result indicating that the estimation of the posture of the subject 11 has failed, and the fatigue estimation system 200 may further include an output unit 109 that outputs, based on that estimation result, notification information for reporting the presence of a subject 11 whose posture could not be estimated.
 According to this, the presence of a subject 11 whose posture estimation has failed can be known based on the notified information.
 Further, for example, the posture estimation unit 105 may estimate the joint positions of some body parts of the subject 11 based on the plurality of images, input the estimated joint positions of those body parts into a machine learning model trained on the joint positions of the whole human body, and estimate, as the posture of the subject 11, the joint positions 11a of body parts including invisible parts of the subject 11 that are not captured in any of the plurality of images.
 According to this, by using a machine learning model trained on the joint positions of the whole human body, the joint positions 11a can be estimated even for body parts including invisible parts of the subject 11 that cannot be estimated from any combination of the plurality of images.
 Further, for example, the posture estimation unit 105 may additionally acquire the number of persons to be detected in the area; calculate, among those persons, the number of posture-estimable persons, that is, persons whose posture can be estimated from only a single image output by one of the imaging devices 201 capturing the area; and, when the number of persons to be detected differs from the number of posture-estimable persons, estimate the posture of a person who is not posture-estimable, treating that person as the subject 11.
 According to this, the above posture estimation using a plurality of images can be performed only when the number of persons to be detected in the area differs from the number of posture-estimable persons, that is, only when the persons to be detected include a subject 11 whose posture cannot be estimated without combining a plurality of images. Processing resources for that posture estimation can therefore be reduced, and the fatigue estimation system 200 can be realized with a simple configuration.
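 The selection logic described above can be sketched as follows; the person identifiers are hypothetical.

```python
# Illustrative sketch: run the multi-image synthesis only for persons in
# the area whose posture no single camera image suffices to estimate.

def select_multi_image_targets(detected_ids, single_image_estimable_ids):
    """detected_ids: all persons detected in the area.
    single_image_estimable_ids: persons for whom one image suffices.
    Returns the persons requiring the combined multi-image estimation."""
    if len(detected_ids) == len(single_image_estimable_ids):
        return set()  # everyone is covered by single-image estimation
    return set(detected_ids) - set(single_image_estimable_ids)
```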
 The posture estimation device according to the present embodiment is the posture estimation unit 105 described above.
 According to this, in situations where, owing to conditions such as the placement of the imaging devices 201, no single image captures the whole body of the subject 11 and estimating the posture of the subject 11 from a single image is therefore difficult, the posture of the subject 11 can be appropriately estimated by using another image in combination.
 The fatigue estimation method according to the present embodiment is a fatigue estimation method executed by the fatigue estimation device 100, in which: an image is acquired from each of a plurality of imaging devices 201, each of which outputs an image in which a different part of the body of the subject 11 is captured; the joint positions 11a of a body region including one body part and another body part are estimated as the posture of the subject 11, based on one image 90a, of the plurality of images, in which the one body part is captured and another image 90b, of the plurality of images, in which the other body part, which shares at least one joint with the one body part, is captured; and the fatigue degree of the subject 11 is estimated based on the estimation result of the posture of the subject 11 and output to an output device connected to the fatigue estimation device 100.
 According to this, the same effects as those of the fatigue estimation system 200 described above can be obtained.
 The program according to the present embodiment is a program for causing a computer to execute the fatigue estimation method described above.
 According to this, the same effects as those of the fatigue estimation system 200 described above can be obtained using a computer.
 (Other Embodiments)
 Although the embodiments have been described above, the present disclosure is not limited to the above embodiments.
 For example, in the above embodiments, processing executed by a particular processing unit may be executed by a different processing unit. Moreover, the order of a plurality of processes may be changed, and a plurality of processes may be executed in parallel.
 The fatigue estimation system or posture estimation device of the present disclosure may be realized by a plurality of devices each having some of the plurality of components, or by a single device having all of the components. Some functions of one component may be realized as functions of another component, and the functions may be distributed among the components in any manner. Any form having a configuration that provides substantially all the functions capable of realizing the fatigue estimation system or the posture estimation device of the present disclosure is included in the present disclosure.
 In the above embodiments, each component may be realized by executing a software program suited to that component. Each component may be realized by a program execution unit, such as a CPU or a processor, reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
 Each component may also be realized by hardware. For example, each component may be a circuit (or an integrated circuit). These circuits may together constitute a single circuit or may be separate circuits, and each circuit may be a general-purpose circuit or a dedicated circuit.
 General or specific aspects of the present disclosure may be realized as a system, a device, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM, or as any combination of a system, a device, a method, an integrated circuit, a computer program, and a recording medium.
 In the above embodiments, the posture of the subject is estimated from images using a rigid-link model generated by image recognition, the load amount is calculated from the estimated posture of the subject, and the fatigue degree of the subject is estimated based on the load amount and its duration; however, the method of estimating the fatigue degree is not limited to this. Any existing method may be used to estimate the posture of the subject from images, and any existing method may be used to estimate the load amount from the posture of the subject.
 In the above embodiments, the increasing function and the decreasing function are described as straight-line linear functions, but they are not limited to this. The increasing function may be a curvilinear function as long as the fatigue degree increases with the passage of time, and the decreasing function may be a curvilinear function as long as the fatigue degree decreases with the passage of time.
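 The linear and curvilinear forms described above can be sketched as follows; the rates, limit, and time constant are hypothetical parameters for illustration only.

```python
import math

# Illustrative sketch: the embodiment's straight-line accumulation next
# to a curvilinear (saturating) alternative, plus a linear recovery.

def fatigue_linear(t, rate=1.0):
    """Fatigue grows in proportion to how long the posture is held."""
    return rate * t

def fatigue_saturating(t, limit=100.0, tau=30.0):
    """Curvilinear increase: climbs quickly at first, then levels off
    toward `limit` as the duration t grows (time constant tau)."""
    return limit * (1.0 - math.exp(-t / tau))

def recovery_linear(f0, t, rate=2.0):
    """Fatigue decreases over rest time t, never dropping below zero."""
    return max(0.0, f0 - rate * t)
```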
 The estimation device described above estimates the fatigue degree of the subject using estimates of the muscle load, the joint load, and the blood flow derived from the posture of the subject; the estimates may also be corrected using values measured by a measurement device to achieve more accurate fatigue estimation. Specifically, the estimation device acquires a measured value that is based on the result of measuring the subject with the measurement device and that corresponds to the estimate.
 Such a measurement device is, for example, an electromyograph, a muscle hardness meter, a pressure gauge, or a near-infrared spectrometer, and can obtain measured values relating to the muscle load, the joint load, and the blood flow. For example, an electromyograph can estimate, from measured electric potentials, the muscle activity corresponding to those potentials; that is, a value estimating the muscle activity can be obtained as a measured value. Since such a value can be converted into a muscle load, the estimated muscle load can be corrected using the measured value. The correction here may be, for example, taking the average of the estimate and the measured value, selecting either the estimate or the measured value, or substituting the estimate into a correlation function between the estimate and the measured value.
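 The three corrections mentioned above can be sketched as follows; the reliability threshold and the coefficients of the correlation function are hypothetical values.

```python
# Illustrative sketch of the three corrections: averaging the estimate
# with the measured value, selecting one of the two by reliability, and
# substituting the estimate into a pre-fitted correlation function.

def correct_by_average(estimate, measured):
    return (estimate + measured) / 2.0

def correct_by_selection(estimate, measured, estimate_reliability):
    # keep the estimate only when it is deemed reliable enough
    return estimate if estimate_reliability >= 0.5 else measured

def correct_by_correlation(estimate, slope=0.9, intercept=5.0):
    # map the estimated load to the measured-load scale via a linear
    # correlation function fitted in advance (coefficients hypothetical)
    return slope * estimate + intercept
```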
 A muscle hardness meter can estimate the hardness of a muscle from the stress produced when pressure is applied to the muscle. Since the estimated hardness can be converted into a muscle load, it can be used to correct the estimate in the same manner as above.
 A pressure gauge can obtain, as a measured value, the pressure applied to a body part of the subject. Such pressure parameters can be input into the musculoskeletal model. Inputting additional parameters such as pressure improves the estimation accuracy of the musculoskeletal model, so the estimates obtained with the musculoskeletal model can be corrected with higher accuracy.
 A near-infrared spectrometer can obtain spectroscopic measurements of the blood flow of the subject. When the estimates do not include the blood flow, as in the above embodiment, the estimates may be corrected by incorporating the blood flow measured by the near-infrared spectrometer. Even when the estimates include the blood flow, the measured blood flow may be used, for example, when the reliability of the estimated blood flow is low.
 By thus using measured values corresponding to the estimates, obtained from a different aspect, to correct the estimates for higher accuracy, the fatigue degree of the subject can be estimated more accurately.
 The fatigue estimation system described in the above embodiments may also be used to configure a fatigue-factor identification system that identifies the factors behind the fatigue of the subject. With conventional devices or systems that estimate the degree of fatigue as a "stiff-shoulder level", a "low-back-pain level", or the like, it has been difficult to identify the use of muscles and joints (that is, the causative postures) behind such levels. Using the fatigue estimation system of the present disclosure addresses this problem.
 That is, the fatigue-factor identification system of the present disclosure identifies, in a static posture taken by the subject, body parts where fatigue tends to accumulate (body parts with large estimated amounts promoting various types of fatigue) as fatigue-factor parts. The fatigue-factor identification system may simply identify the fatigue-factor part in a single static posture taken by the subject, or it may identify, among a plurality of static postures taken by the subject, the fatigue-factor posture with the largest estimated amount at the fatigue-factor part. It may also present a recommended posture to replace the fatigue-factor posture identified in this way, or perform a fatigue-recovery operation using the recovery device on the fatigue-factor part in the fatigue-factor posture.
 The fatigue-factor identification system includes the fatigue estimation system described in the above embodiments and a storage device for storing information on the estimated fatigue degree. Such a storage device is realized using, for example, a semiconductor memory; the main storage units constituting the fatigue estimation system may be used, or a storage device communicatively connected to the estimation device may be newly provided.
 The present disclosure may also be implemented as a fatigue estimation method executed by the fatigue estimation system or the estimation device. The present disclosure may further be implemented as a program for causing a computer to execute such a fatigue estimation method, or as a computer-readable non-transitory recording medium on which such a program is recorded.
 Forms obtained by applying various modifications conceivable to a person skilled in the art to the embodiment, and forms realized by arbitrarily combining the components and functions of each embodiment without departing from the spirit of the present disclosure, are also included in the present disclosure.
 11 Subject
 11a Joint position
 11c Joint position in one body part
 11d Joint position in another body part
 11z Other person
 90a One image
 90b Another image
 100 Fatigue estimation device
 101 Acquisition unit
 102 Identification unit
 105 Posture estimation unit (posture estimation device)
 109 Output unit
 200 Fatigue estimation system
 201 Imaging device
 Mc, Md Spatial coordinates

Claims (12)

  1.  A fatigue estimation system comprising:
     a plurality of imaging devices each of which outputs an image in which a different part of a subject's body is captured;
     a posture estimation device that estimates a posture of the subject based on the plurality of images output by the plurality of imaging devices; and
     a fatigue estimation device that estimates and outputs a degree of fatigue of the subject based on a result of estimating the posture of the subject.
  2.  The fatigue estimation system according to claim 1, wherein the posture estimation device:
     estimates joint positions in one body part based on one image, among the plurality of images, in which the one body part is captured;
     estimates joint positions in another body part, which shares at least one joint with the one body part, based on another image, among the plurality of images, in which the other body part is captured; and
     combines the estimated joint positions in the one body part and the estimated joint positions in the other body part to estimate, as the posture of the subject, joint positions of body parts including the one body part and the other body part.
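Although the claim itself is implementation-agnostic, the synthesis in claim 2 can be illustrated with a minimal sketch: two cameras each see a different body part, and the two joint sets share one joint ("hip" below is a hypothetical example). Translating one set so that its copy of the shared joint coincides with the other's yields one combined skeleton:

```python
# Merge two {joint: (x, y, z)} dicts estimated from different images by
# aligning them at the joint both body parts share.

def merge_joint_sets(part_a, part_b, shared):
    ax, ay, az = part_a[shared]
    bx, by, bz = part_b[shared]
    dx, dy, dz = ax - bx, ay - by, az - bz  # offset between the two frames
    merged = dict(part_a)
    for joint, (x, y, z) in part_b.items():
        # keep part_a's copy of the shared joint; translate the rest
        merged.setdefault(joint, (x + dx, y + dy, z + dz))
    return merged

upper = {"shoulder": (0.0, 1.5, 0.0), "hip": (0.0, 1.0, 0.0)}
lower = {"hip": (2.0, 3.0, 0.0), "knee": (2.0, 2.5, 0.0)}
skeleton = merge_joint_sets(upper, lower, shared="hip")
# knee lands at hip_a + (knee_b - hip_b) = (0.0, 0.5, 0.0)
```

This translation-only version assumes both images already use the same scale and orientation; claim 3 relaxes that assumption.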
  3.  The fatigue estimation system according to claim 2, wherein, in combining the estimated joint positions in the one body part and the estimated joint positions in the other body part,
     at least one of a relative orientation and a relative scale between the joint positions in the one body part and the joint positions in the other body part is determined based on relative positions of a shared joint, which is a joint shared between the one body part and the other body part, and reference coordinates shared among the plurality of images, and
     the shared joint and the reference coordinates are superimposed in a state in which the determined at least one of the relative orientation and the relative scale is applied between the joint positions in the one body part and the joint positions in the other body part.
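The scale part of claim 3 can be sketched as follows (a 2D sketch under assumed coordinates; relative orientation would be handled analogously with a rotation). Each camera reports coordinates in its own scale, and the distance from the shared joint to a reference coordinate visible in both images, such as the marker of claim 5, gives the scale factor between the two frames:

```python
import math

def align_with_scale(part_a, part_b, shared, ref_a, ref_b):
    """Rescale part_b so its shared-joint-to-reference distance matches
    part_a's, then translate it onto part_a's shared joint."""
    scale = math.dist(part_a[shared], ref_a) / math.dist(part_b[shared], ref_b)
    sx, sy = part_b[shared]
    ax, ay = part_a[shared]
    merged = dict(part_a)
    for joint, (x, y) in part_b.items():
        merged.setdefault(joint, (ax + scale * (x - sx), ay + scale * (y - sy)))
    return merged

upper = {"shoulder": (0.0, 1.0), "hip": (0.0, 0.0)}
lower = {"hip": (0.0, 0.0), "knee": (0.0, -2.0)}
marker_a = (1.0, 0.0)   # reference coordinate as seen by camera A
marker_b = (2.0, 0.0)   # same marker in camera B's (larger) scale
skeleton = align_with_scale(upper, lower, "hip", marker_a, marker_b)
# knee is brought from camera B's scale into camera A's: (0.0, -1.0)
```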
  4.  The fatigue estimation system according to claim 3, wherein the reference coordinates are a joint position of a joint different from the shared joint.
  5.  The fatigue estimation system according to claim 3, wherein the reference coordinates are spatial coordinates of a marker arranged at a position captured in each of the plurality of images.
  6.  The fatigue estimation system according to claim 5, further comprising an identification unit that distinguishes the subject from other persons present in the same space as the subject, based on a positional relationship between the marker and the subject.
  7.  The fatigue estimation system according to any one of claims 1 to 6, wherein
     the posture estimation device outputs, when estimation of the posture of the subject fails, the estimation result indicating that estimation of the posture of the subject has failed, and
     the fatigue estimation system further comprises an output unit that outputs, based on the estimation result indicating that estimation of the posture of the subject has failed, notification information for notifying of the presence of the subject whose posture failed to be estimated.
  8.  The fatigue estimation system according to any one of claims 1 to 7, wherein the posture estimation device:
     estimates joint positions in some body parts of the subject based on the plurality of images;
     inputs the estimated joint positions in the some body parts of the subject into a machine learning model that has learned joint positions of the whole human body; and
     estimates, as the posture of the subject, joint positions of body parts including invisible parts that are not captured in any of the plurality of images among the body parts of the subject.
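Claim 8 does not fix a model architecture, so the following sketch substitutes a stand-in for the learned model: a nearest-neighbor lookup over stored full-body poses plays the role of the "machine learning model that has learned joint positions of the whole human body," filling in joints that no camera captured. The joint names and poses are hypothetical:

```python
# Complete invisible joints from visible ones using a stored pose prior.
FULL_BODY = ["neck", "shoulder", "hip", "knee"]

def complete_pose(visible, stored_poses):
    """Fill in joints missing from `visible` (the invisible parts) using
    the stored full-body pose closest to the visible joints."""
    def err(pose):  # sum of squared distances over the visible joints
        return sum(
            (pose[j][0] - x) ** 2 + (pose[j][1] - y) ** 2
            for j, (x, y) in visible.items()
        )
    best = min(stored_poses, key=err)
    return {j: visible.get(j, best[j]) for j in FULL_BODY}

stored = [
    {"neck": (0, 4), "shoulder": (0, 3), "hip": (0, 2), "knee": (0, 1)},
    {"neck": (5, 4), "shoulder": (5, 3), "hip": (5, 2), "knee": (5, 1)},
]
visible = {"neck": (0.1, 4.0), "shoulder": (0.0, 3.1)}
full = complete_pose(visible, stored)  # hip and knee filled from 1st pose
```

A trained regression network would replace the lookup in practice; the interface — partial joints in, full-body joints out — is what the claim specifies.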
  9.  The fatigue estimation system according to any one of claims 1 to 8, wherein the posture estimation device further:
     obtains the number of detection target persons in an area;
     calculates the number of posture-estimable persons, among the detection target persons, whose posture can be estimated from only one image output from one of the imaging devices capturing the area; and
     when the number of detection target persons differs from the number of posture-estimable persons, estimates the posture of the subject, treating a person who is not a posture-estimable person among the detection target persons as the subject.
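The comparison in claim 9 can be sketched as follows. Detection and single-image pose estimation are hypothetical stand-ins here (a list of person identifiers and a predicate); the point is the count comparison: only when some detected person cannot be handled by a single camera does the multi-image estimation of the earlier claims get invoked for that person:

```python
def select_multi_camera_subjects(detected, single_image_ok):
    """Return the detected persons needing multi-image posture estimation."""
    estimable = [p for p in detected if single_image_ok(p)]
    if len(detected) == len(estimable):
        return []  # counts match: every posture recoverable from one image
    return [p for p in detected if p not in estimable]

detected = ["alice", "bob", "carol"]
# Suppose "bob" is partially occluded in every single camera view:
subjects = select_multi_camera_subjects(detected, lambda p: p != "bob")
# subjects == ["bob"]
```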
  10.  The posture estimation device according to any one of claims 1 to 9.
  11.  A fatigue estimation method executed by a fatigue estimation device, the method comprising:
     acquiring images from each of a plurality of imaging devices, each of which outputs an image in which a different part of a subject's body is captured;
     estimating, as a posture of the subject, joint positions of body parts including one body part and another body part, based on one image, among the plurality of images, in which the one body part is captured and another image, among the plurality of images, in which the other body part sharing at least one joint with the one body part is captured; and
     estimating a degree of fatigue of the subject based on a result of estimating the posture of the subject, and causing an output device connected to the fatigue estimation device to output the degree of fatigue.
  12.  A program for causing a computer to execute the fatigue estimation method according to claim 11.
PCT/JP2022/029404 2021-08-04 2022-07-29 Fatigue estimation system, fatigue estimation method, posture estimation device, and program WO2023013562A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023540319A JPWO2023013562A1 (en) 2021-08-04 2022-07-29

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-128062 2021-08-04
JP2021128062 2021-08-04

Publications (1)

Publication Number Publication Date
WO2023013562A1 true WO2023013562A1 (en) 2023-02-09

Family

ID=85154746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/029404 WO2023013562A1 (en) 2021-08-04 2022-07-29 Fatigue estimation system, fatigue estimation method, posture estimation device, and program

Country Status (2)

Country Link
JP (1) JPWO2023013562A1 (en)
WO (1) WO2023013562A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007310707A (en) * 2006-05-19 2007-11-29 Toshiba Corp Apparatus and method for estimating posture
CN107577451A (en) * 2017-08-03 2018-01-12 中国科学院自动化研究所 Multi-Kinect human skeleton coordinate transformation method, processing device, and readable storage medium
KR20180094253A (en) * 2017-02-15 2018-08-23 연세대학교 산학협력단 Apparatus and Method for Estimating Pose of User
US20210104069A1 (en) * 2019-10-07 2021-04-08 Sony Corporation Camera calibration method using human joint points
WO2021112096A1 (en) * 2019-12-06 2021-06-10 パナソニックIpマネジメント株式会社 Fatigue estimation system, estimation device, and fatigue estimation method


Also Published As

Publication number Publication date
JPWO2023013562A1 (en) 2023-02-09

Similar Documents

Publication Publication Date Title
JP7133779B2 (en) Fatigue estimation system, estimation device, and fatigue estimation method
US11182599B2 (en) Motion state evaluation system, motion state evaluation device, motion state evaluation server, motion state evaluation method, and motion state evaluation program
Bonnechere et al. Determination of the precision and accuracy of morphological measurements using the Kinect™ sensor: comparison with standard stereophotogrammetry
KR101488130B1 (en) Running form diagnostic system and method for scoring running form
US8878922B2 (en) Video image information processing apparatus and video image information processing method
JP6433805B2 (en) Motor function diagnosis apparatus and method, and program
KR102165429B1 (en) Body shape analysis method and apparatus
Mallare et al. Sitting posture assessment using computer vision
CN104598012B Interactive advertising device and working method thereof
CN115004265A (en) Information processing apparatus and determination result output method
WO2023013562A1 (en) Fatigue estimation system, fatigue estimation method, posture estimation device, and program
KR102310964B1 (en) Electronic Device, Method, and System for Diagnosing Musculoskeletal Symptoms
CN117115922A (en) Seat body forward-bending evaluation method, system, electronic equipment and storage medium
WO2023100718A1 (en) Fatigue estimation system, estimation device, and fatigue estimation method
JP5427679B2 (en) Floor reaction force measurement system and method
WO2023120064A1 (en) Fatigue estimation device, fatigue estimation system, and fatigue estimation method
JP7298561B2 (en) Information processing system, information processing device, and program
WO2023223880A1 (en) Posture evaluation device, private room booth, and posture evaluation method
WO2021241347A1 (en) Fatigue inference system, fatigue inference method, and program
US20240359081A1 (en) Information processing apparatus, method, and non-transitory computer-readable program
Ravera et al. A regularized functional method to determine the hip joint center of rotation in subjects with limited range of motion
KR102411882B1 (en) Untact physical fitness measurement system using images
WO2024219429A1 (en) Exercise evaluation device, exercise evaluation method, and exercise evaluation program
US20240206769A1 (en) Information processing apparatus, information processing method, and non-transitory computer readable medium
Nicolau et al. Database generation for markerless tracking based on deep learning networks

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22852981

Country of ref document: EP

Kind code of ref document: A1

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2023540319

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22852981

Country of ref document: EP

Kind code of ref document: A1