WO2022244222A1 - Estimation device, estimation system, estimation method, and recording medium - Google Patents
- Publication number
- WO2022244222A1 (PCT/JP2021/019305)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- feature amount
- foot
- estimation
- pronation
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
Definitions
- the present disclosure relates to an estimation device or the like that estimates a physical state based on sensor data measured while walking.
- there is known a device that analyzes the user's gait by mounting a load measuring device or an inertial measurement device on footwear such as shoes. If the physical condition can be estimated based on information about the gait, appropriate countermeasures can be taken according to the signs that appear in walking.
- Patent Document 1 discloses a system that determines a user's motion characteristics using measured values measured by a motion sensor attached to the back of footwear.
- the system of Patent Document 1 uses acceleration in three directions, and angular velocity and angle about three axes, to calculate intensity metrics such as pronation and pronation excursion.
- Patent Document 2 discloses a method of analyzing sole pressure data and evaluating the presence or absence of foot-related abnormalities.
- sole pressure data for a predetermined period of time is acquired by a pressure sensor provided in the insole of the shoe.
- based on the acquired sole pressure data, pronation/supination of the foot is evaluated.
- Patent Document 3 discloses a walking motion analysis device that analyzes walking motions using images of pedestrians photographed from multiple directions.
- the device of Patent Document 3 generates a silhouette image from the difference between the pixel values at the pedestrian's position in the captured image and the pixel values at the corresponding positions in a background image.
- the apparatus of Patent Document 3 constructs a three-dimensional human model using the generated silhouette image, and acquires the angle of each joint, the length between the joints, and the movement distance of the joints as parameters.
- the device of Patent Literature 3 analyzes the walking state by comparing a parameter string extracted for each walking cycle calculated from the parameters with dictionary data generated in advance indicating actions during walking.
- Patent Literature 3 discloses that a parameter string of a healthy person is stored as dictionary data for each person attribute such as height, weight, age, and sex.
- intensity metrics such as pronation and pronation excursion can be calculated based on waveform features of time-series data of measured values measured by a motion sensor.
- however, the judgment according to the calculated intensity metrics had to be made by an expert.
- in the method of Patent Document 2, it is possible to determine whether there is an abnormality in pronation/supination of the foot based on the sole pressure data measured by the pressure sensor. Normally, however, the presence or absence of such an abnormality cannot be determined accurately unless the determination takes attributes such as gender and age into account. Since the method of Patent Document 2 cannot make the determination according to such attributes, an accurate determination result cannot always be obtained. Furthermore, because the determination is based on sole pressure measured by a pressure sensor, features appearing during the swing phase could not be extracted.
- in the method of Patent Document 3, since the dictionary data used as the reference for analysis is classified by person attribute, the walking state can be analyzed according to the pedestrian's attributes by comparing the parameter string with the dictionary data closest to those attributes. However, the technique of Patent Document 3 requires images captured from multiple directions in order to analyze the walking state. Moreover, since the attributes of healthy persons are used as the standard for analysis, the degree of abnormality due to injury or disease cannot be verified.
- An object of the present disclosure is to provide an estimation device or the like that can estimate a physical state according to attributes based on sensor data measured while walking.
- an estimation device according to one aspect of the present disclosure includes a feature quantity extraction unit that extracts, from a walking waveform extracted from time-series data of sensor data based on the movement of the user's feet, a feature quantity corresponding to the user's attribute in a section in which features of the physical state corresponding to that attribute appear, and an estimation unit that estimates the user's physical state using the feature quantity extracted according to the user's attribute.
- in an estimation method according to one aspect of the present disclosure, a computer extracts, from a walking waveform extracted from time-series data of sensor data based on the movement of the user's feet, a feature quantity corresponding to the user's attribute in a section in which features of the physical state corresponding to that attribute appear, and estimates the user's physical state using the feature quantity extracted according to the user's attribute.
- a program according to one aspect of the present disclosure causes a computer to execute a process of extracting, from a walking waveform extracted from time-series data of sensor data based on the movement of the user's feet, a feature quantity corresponding to the user's attribute in a section in which features of the physical state corresponding to that attribute appear, and a process of estimating the user's physical state using the feature quantity extracted according to the user's attribute.
- according to the present disclosure, it is possible to provide an estimation device or the like capable of estimating a physical state according to attributes based on sensor data measured while walking.
- FIG. 1 is a block diagram showing an example of the configuration of an estimation system according to the first embodiment.
- It is a conceptual diagram showing an example of arranging the data acquisition device of the estimation system according to the first embodiment in footwear.
- FIG. 2 is a conceptual diagram for explaining the relationship between a local coordinate system and a world coordinate system set in the data acquisition device of the estimation system according to the first embodiment;
- FIG. 2 is a conceptual diagram for explaining a human body plane;
- FIG. 4 is a conceptual diagram for explaining a walking event;
- FIG. 2 is a conceptual diagram for explaining pronation/supination of the foot;
- FIG. 2 is a conceptual diagram for explaining a foot pressure center locus index (CPEI: Center of Pressure Excursion Index) derived from a foot pressure distribution;
- FIG. 4 is a conceptual diagram for explaining the difference in CPEI according to the degree of pronation/supination of the foot;
- FIG. 4 is a conceptual diagram for explaining an example of the correspondence relationship between the foot pressure center trajectory and the walking cycle.
- It is a block diagram showing an example of the configuration of the data acquisition device of the estimation system according to the first embodiment.
- It is a block diagram showing an example of the configuration of the estimation device of the estimation system according to the first embodiment.
- FIG. 4 is a conceptual diagram for explaining an example of an inference model for each attribute, which is used by the estimation device of the estimation system according to the first embodiment;
- FIG. 4 is a conceptual diagram showing an example of learning of an inference model used by the estimation device of the estimation system according to the first embodiment;
- FIG. 4 is a conceptual diagram showing an example of estimation of the pronation/supination degree of the foot by the estimation device of the estimation system according to the first embodiment.
- It is a flowchart for explaining an example of the operation of the estimation device of the estimation system according to the first embodiment.
- FIG. 2 is a conceptual diagram showing an example of arranging the data acquisition device and the pressure sensor of the estimation system according to the first embodiment in footwear.
- It is a graph showing the correlation coefficient between the CPEI (estimated value) estimated based on the sensor data measured by the data acquisition device of the estimation system according to the first embodiment and the CPEI (true value) measured by the pressure sensor.
- FIG. 5 is a graph showing the correlation between CPEI (estimated value) estimated by the estimation device of the estimation system of the first embodiment and CPEI (true value) based on the measurement result of the pressure sensor;
- It is a graph showing the result of Z-score verification of the correlation between the CPEI (estimated value) estimated by the estimation device of the estimation system of the first embodiment and the CPEI (true value) based on the measurement result of the pressure sensor.
- It is a graph showing the correlation coefficient between the CPEI (estimated value) estimated based on the sensor data measured by the data acquisition device of the estimation system according to the first embodiment and the CPEI (true value) measured by the pressure sensor.
- FIG. 4 is a conceptual diagram showing an example of displaying information based on an estimation result regarding the degree of pronation/supination of the foot estimated by the estimation device of the estimation system according to the first embodiment on the display unit of the mobile terminal.
- FIG. 10 is a conceptual diagram showing another example of displaying information based on the estimation result regarding the degree of pronation/supination of the foot estimated by the estimation device of the estimation system according to the first embodiment on the display unit of the mobile terminal.
- FIG. 4 is a conceptual diagram showing an example of transmitting, to a data center, data based on an estimation result regarding the degree of pronation/supination of the foot estimated by the estimation device of the estimation system according to the first embodiment.
- It is a block diagram showing an example of the configuration of the estimation device according to the second embodiment.
- It is a block diagram showing an example of the hardware configuration that realizes the estimation device according to each embodiment.
- the estimation system of this embodiment measures the features (also called gait) included in the user's walking pattern, and estimates the physical state of the user by analyzing the measured gait.
- the degree of pronation/supination of the foot is estimated based on the sensor data regarding the movement of the foot, according to the user's attributes such as gender and age.
- the physical condition estimated by the method of the present embodiment is not limited to the degree of pronation/supination of the foot, and may include conditions that vary with attributes such as gender and age, such as hallux valgus, bow legs/knock knees, and degree of obesity.
- FIG. 1 is a block diagram showing the configuration of an estimation system 1 of this embodiment.
- the estimation system 1 includes a data acquisition device 11 and an estimation device 12 .
- the data acquisition device 11 and the estimation device 12 may be wired or wirelessly connected.
- the data acquisition device 11 and the estimation device 12 may be configured as a single device.
- the estimation system 1 may be configured with only the estimation device 12 excluding the data acquisition device 11 from the configuration of the estimation system 1 .
- the data acquisition device 11 is installed on the foot.
- the data acquisition device 11 is installed on footwear such as shoes.
- the data acquisition device 11 includes an acceleration sensor and an angular velocity sensor.
- the data acquisition device 11 measures, as physical quantities related to the movement of the foot of the user wearing the footwear, the acceleration measured by the acceleration sensor (also referred to as spatial acceleration) and the angular velocity measured by the angular velocity sensor (also referred to as spatial angular velocity).
- the physical quantities related to the movement of the foot measured by the data acquisition device 11 include velocity, angle, and position (trajectory) calculated by integrating acceleration and angular velocity.
- the data acquisition device 11 converts the measured physical quantity into digital data (also called sensor data).
- the data acquisition device 11 transmits the converted sensor data to the estimation device 12 .
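The physical quantities listed above include angle and velocity obtained by integrating the measured angular velocity and acceleration. As a minimal, hedged sketch (not the patent's implementation), such an integration could use the trapezoidal rule over uniformly sampled sensor data; the function name and sampling interval below are assumptions for illustration:

```python
# Hypothetical sketch: integrating uniformly sampled angular velocity
# (deg/s) into an angle (deg) with the trapezoidal rule. The same scheme
# would apply to integrating acceleration into velocity.

def integrate_trapezoid(samples, dt):
    """Integrate a uniformly sampled signal with sampling interval dt (s)."""
    total = 0.0
    for prev, cur in zip(samples, samples[1:]):
        total += 0.5 * (prev + cur) * dt  # trapezoid area per interval
    return total

# Example: constant angular velocity of 10 deg/s sampled at 100 Hz for 1 s
angle = integrate_trapezoid([10.0] * 101, dt=0.01)  # about 10 degrees
```

In practice an IMU pipeline would also need drift correction, but that is outside this sketch.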
- the data acquisition device 11 is connected to the estimation device 12 via a mobile terminal (not shown) carried by the user.
- a mobile terminal is a communication device that can be carried by a user.
- a mobile terminal is a mobile communication device having a communication function, such as a smart phone, a smart watch, or a mobile phone.
- the mobile terminal receives sensor data regarding the movement of the user's foot from the data acquisition device 11 .
- the mobile terminal transmits the received sensor data to a server or the like in which the estimation device 12 is implemented.
- the function of the estimation device 12 may be implemented by application software or the like installed in the mobile terminal. In that case, the mobile terminal processes the received sensor data using application software or the like installed therein.
- the data acquisition device 11 is realized, for example, by an inertial measurement device including an acceleration sensor and an angular velocity sensor.
- An example of an inertial measurement device is an IMU (Inertial Measurement Unit).
- the IMU includes an acceleration sensor that measures acceleration along three axes and an angular velocity sensor that measures angular velocity around three axes.
- the data acquisition device 11 may be realized by an inertial measurement device such as a VG (Vertical Gyro) or an AHRS (Attitude Heading Reference System).
- the data acquisition device 11 may be realized by GPS/INS (Global Positioning System/Inertial Navigation System).
- FIG. 2 is a conceptual diagram showing an example of arranging the data acquisition device 11 inside the shoe 100.
- the data acquisition device 11 is placed at a position corresponding to the back side of the arch of the foot.
- the data acquisition device 11 is placed on an insole that is inserted into the shoe 100 .
- the data acquisition device 11 is arranged on the bottom surface of the shoe 100 .
- the data acquisition device 11 is embedded in the body of the shoe 100 .
- the data acquisition device 11 may be removable from the shoe 100 or may not be removable from the shoe 100 .
- the data acquisition device 11 may be installed at a position other than the back side of the arch as long as it can acquire sensor data relating to the movement of the foot.
- the data acquisition device 11 may be installed on a sock worn by the user or an accessory such as an anklet worn by the user. Also, the data acquisition device 11 may be attached directly to the foot or embedded in the foot.
- FIG. 2 shows an example in which the data acquisition device 11 is installed on the shoe 100 on the right foot side, but the data acquisition device 11 may be installed on the shoes 100 on both feet. If the data acquisition devices 11 are installed in the shoes 100 for both feet, the physical condition can be estimated based on the movement of the feet for both feet.
- FIG. 3 is a conceptual diagram for explaining the local coordinate system (x-axis, y-axis, z-axis) set in the data acquisition device 11 when the data acquisition device 11 is installed on the back side of the foot arch, and the world coordinate system (X-axis, Y-axis, Z-axis).
- in the world coordinate system (X-axis, Y-axis, Z-axis), when the user is standing upright, the lateral direction of the user is set to the X-axis direction (rightward is positive), the front direction of the user (the direction of travel) to the Y-axis direction (forward is positive), and the vertical direction to the Z-axis direction (vertically upward is positive).
- a local coordinate system consisting of x, y, and z directions with reference to the data acquisition device 11 is set.
- FIG. 4 is a conceptual diagram for explaining the plane set for the human body (also called the human body plane).
- a sagittal plane that divides the body into left and right, a coronal plane that divides the body into front and back, and a horizontal plane that divides the body into upper and lower parts are defined.
- the world coordinate system and the local coordinate system match in the upright state as shown in FIG.
- rotation in the sagittal plane with the x-axis as the rotation axis is roll
- rotation in the coronal plane with the y-axis as the rotation axis is pitch
- rotation in the horizontal plane with the z-axis as the rotation axis is yaw.
- the rotation angle in the sagittal plane with the x-axis as the rotation axis is the roll angle, the rotation angle in the coronal plane with the y-axis as the rotation axis is the pitch angle, and the rotation angle in the horizontal plane with the z-axis as the rotation axis is the yaw angle.
- for the pitch angle, rotation in the abduction direction (counterclockwise rotation about the y-axis) is defined as positive, and rotation in the adduction direction (clockwise rotation about the y-axis) as negative.
- FIG. 5 is a conceptual diagram for explaining the step cycle based on the right foot.
- the horizontal axis of FIG. 5 is the walking cycle normalized so that one walking cycle of the right foot, starting when the heel of the right foot touches the ground and ending when the heel of the right foot next touches the ground, is 100%.
- One walking cycle of one leg is roughly divided into a stance phase in which at least part of the sole of the foot is in contact with the ground, and a swing phase in which the sole of the foot is separated from the ground.
- normalization is performed so that the stance phase accounts for 60% and the swing phase accounts for 40%.
- the stance phase is further subdivided into initial stance T1, mid-stance T2, terminal stance T3, and pre-swing T4.
- the swing phase is further subdivided into early swing phase T5, middle swing phase T6, and final swing phase T7.
- the walking waveform for one step cycle does not have to start from the time when the heel touches the ground.
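The normalization described above (one stride mapped to a 0-100% gait-cycle axis) could be sketched as a linear resampling of the samples between two successive heel strikes. This is an illustrative implementation under assumed conventions (101 output points, one per percent), not the patent's method:

```python
# Illustrative sketch: resample one walking cycle (the raw samples between
# two heel strikes) onto a normalized 0-100% gait-cycle axis using linear
# interpolation. 101 points (one per percent) is an assumption.

def normalize_gait_cycle(waveform, n_points=101):
    """Linearly resample `waveform` (one stride) to `n_points` samples."""
    m = len(waveform)
    out = []
    for i in range(n_points):
        pos = i * (m - 1) / (n_points - 1)  # fractional index into waveform
        lo = int(pos)
        hi = min(lo + 1, m - 1)
        frac = pos - lo
        out.append(waveform[lo] * (1 - frac) + waveform[hi] * frac)
    return out

cycle = normalize_gait_cycle(list(range(50)))  # 50 raw samples -> 101 points
```

After this step, "30% of the walking cycle" always refers to the same index (30) regardless of the original stride duration.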
- FIG. 5(a) represents an event (heel strike) in which the heel of the right foot touches the ground (HS: Heel Strike).
- FIG. 5B shows an event in which the toe of the left foot leaves the ground while the sole of the right foot touches the ground (OTO: Opposite Toe Off).
- FIG. 5(c) shows an event in which the heel of the right foot is lifted (heel rise) while the sole of the right foot is in contact with the ground (HR: Heel Rise).
- FIG. 5(d) shows an event in which the heel of the left foot touches the ground (opposite heel strike) (OHS: Opposite Heel Strike).
- FIG. 5(e) represents an event (toe off) in which the toe of the right foot leaves the ground while the sole of the left foot touches the ground (TO: Toe Off).
- FIG. 5(f) represents an event (foot crossing) in which the left foot and the right foot cross each other while the sole of the left foot is in contact with the ground (FA: Foot Adjacent).
- FIG. 5(g) represents an event (tibia vertical) in which the tibia of the right foot becomes almost vertical to the ground while the sole of the left foot is in contact with the ground (TV: Tibia Vertical).
- FIG. 5(h) represents an event (heel strike) in which the heel of the right foot touches the ground (HS: Heel Strike).
- FIG. 5(h) corresponds to the end point of the walking cycle starting from FIG. 5(a) and the starting point of the next walking cycle.
- FIG. 6 is a conceptual diagram for explaining pronation/supination of the foot.
- Pronation/supination of the foot is a triplanar movement that includes movements in the coronal, sagittal and horizontal planes simultaneously.
- pronation/supination of the foot is considered as coronal motion of the subtalar joint.
- the sensor data in the local coordinate system are transformed into the world coordinate system when calculating the pitch angle.
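As a rough sketch of this local-to-world transformation, a vector measured in the sensor frame can be rotated into the world frame with a rotation matrix. Only a single rotation about the y-axis is shown; a full implementation would compose the roll, pitch, and yaw rotations estimated from the IMU, which are assumed here:

```python
import math

# Minimal sketch (details assumed): rotating a 3-vector measured in the
# sensor's local frame into the world frame by a rotation about the y-axis.

def rotate_about_y(vec, angle_rad):
    """Rotate the 3-vector (x, y, z) about the y-axis by angle_rad."""
    x, y, z = vec
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x + s * z, y, -s * x + c * z)

# Example: a unit x-vector rotated by 90 degrees about y
world = rotate_about_y((1.0, 0.0, 0.0), math.pi / 2)
```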
- Supination is a movement of the foot consisting of adduction, plantarflexion and varus.
- Pronation is a movement consisting of abduction, dorsiflexion, and valgus.
- a foot that pronates to a large extent and is fixed in that state is called a pronated foot.
- a foot that supinates to a large extent and is fixed in that state is called a supinated foot.
- FIG. 7 is a conceptual diagram for explaining CPEI.
- FIG. 7 shows a center of pressure (CoP) trajectory superimposed on the foot pressure distribution.
- the foot pressure center trajectory is the trajectory connecting, along the Y-axis direction from the heel contact point (start point) to the toe-off point (end point), the points of maximum foot pressure (maximum load centers) on the lines obtained by cutting the foot pressure distribution on the floor surface (in the XY plane) with coronal planes (ZX planes). A straight line connecting the start point and the end point is called the central axis (also called the construction line).
- a trapezoid whose bases are perpendicular to the central axis is drawn so as to enclose the outline of the foot, and the front one-third of the foot is cut with a cutting line parallel to the bases of the trapezoid.
- let A be the intersection of the cutting line with the medial (inner) side of the foot, and D be the intersection with the lateral (outer) side of the foot.
- B be the intersection point of the central axis and the cutting line
- C be the intersection point of the foot pressure center locus and the cutting line.
- the line segment BC is called CPE (Center of Pressure Excursion).
- a line segment AD corresponds to the foot width.
- as shown in Equation (1), the ratio of the CPE (the length of line segment BC) to the foot width (the length of line segment AD) corresponds to the foot pressure center trajectory index CPEI.
- CPEI = CPE / foot width × 100 … (1)
- the start point and end point, the way the trapezoid encloses the foot, the way the cutting line is drawn, and the like, which are used to derive the CPEI, are only examples and are not limited to the above definitions.
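Given the points A, B, C, and D defined above, Equation (1) can be sketched directly. The coordinates below are invented for the example; the only facts taken from the text are that |AD| is the foot width, |BC| is the CPE, and CPEI = CPE / foot width × 100:

```python
# Hedged sketch of Equation (1): A and D are the medial and lateral ends of
# the cutting line (|AD| = foot width), B is its intersection with the
# central axis, and C its intersection with the foot pressure center
# trajectory. Points are (x, y) pairs in arbitrary but consistent units.

def cpei(a, b, c, d):
    """CPEI = |BC| / |AD| * 100."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    return dist(b, c) / dist(a, d) * 100.0

# Example: a 10 cm foot width with a 1.5 cm CPE
value = cpei(a=(0.0, 0.0), b=(6.0, 0.0), c=(4.5, 0.0), d=(10.0, 0.0))
```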
- FIG. 8 is a conceptual diagram for comparing the CPEI for supinated, normal, and pronated feet.
- in FIG. 8, the CPE is the length of line segment BC, and the foot width is the length of line segment AD.
- a CPEI of 20 or more is classified as supination
- a CPEI of 9-20 is classified as normal
- a CPEI of 9 or less is classified as pronation.
- the criteria based on the above CPEI values are only an example, and the criteria for the degree of pronation/supination of the foot are not limited to those described above.
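The example classification above can be sketched as a simple threshold function. The thresholds (20 and 9) are the example values from the text, not fixed criteria, and the boundary handling here is one possible reading of "20 or more", "9-20", and "9 or less":

```python
# Sketch of the example classification: CPEI >= 20 -> supination,
# 9 < CPEI < 20 -> normal, CPEI <= 9 -> pronation. Thresholds are the
# illustrative values from the text, not normative criteria.

def classify_cpei(cpei_value):
    if cpei_value >= 20:
        return "supination"
    if cpei_value > 9:
        return "normal"
    return "pronation"

labels = [classify_cpei(v) for v in (25.0, 15.0, 5.0)]
```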
- the estimating device 12 estimates the degree of pronation/supination of the foot using the time-series data of the pitch angle, which is the rotation angle of the foot about the Y-axis. Specifically, the estimating device 12 estimates the degree of pronation/supination of the foot using the feature amount extracted from the time-series data of the pitch angle based on the criterion according to the attribute.
- in addition to the pitch angle, any walking waveform that expresses the characteristics of the physical condition may be used.
- the estimation device 12 estimates the degree of pronation/supination of the foot using an estimation model generated for each attribute. An inference model for each attribute will be described later.
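One rough way to organize the per-attribute estimation models mentioned above is a lookup from attribute to model, with dispatch on the user's attribute at estimation time. The model form (a linear function of a single feature value) and the coefficients below are invented for illustration and are not from the patent:

```python
# Minimal sketch (model form and coefficients assumed): one estimation
# model per attribute, selected by the user's attribute at inference time.

MODELS = {
    "male": lambda feature: 2.0 * feature + 1.0,    # assumed coefficients
    "female": lambda feature: 1.5 * feature + 3.0,  # assumed coefficients
}

def estimate_cpei(feature, attribute):
    """Estimate a CPEI-like value from a feature using the attribute's model."""
    return MODELS[attribute](feature)

est = estimate_cpei(5.0, "male")
```

A real system would replace the lambdas with models trained per attribute group (e.g., by regression on labeled gait data).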
- FIG. 9 is a conceptual diagram for explaining the correspondence between the foot pressure center trajectory and the walking cycle.
- FIG. 9 shows the foot pressure distribution measured for a subject, the central axis of the foot pressure distribution, and the locus of the foot pressure center.
- the left side of the foot pressure distribution shows the walking cycle corresponding to the foot pressure center trajectory. At 15 to 25% of the walking cycle, the sole of the right foot is fully in contact with the ground (foot flat).
- the full sole contact means that the entire contact surface of the sole is in contact with the ground.
- the pitch angle is 0 degrees when the sole is fully grounded (foot flat). 30% of the walking cycle corresponds to the heel-lift timing.
- in a supinated foot, the contact area between the sole and the ground is biased toward the outside of the foot, resulting in a steep curve of the foot pressure center trajectory.
- the pitch angle becomes smaller at the end of stance, which is 30 to 50% of the walking cycle. If the degree of supination is excessive, the pitch angle may become negative.
- in a pronated foot, the contact portion between the sole and the ground is biased toward the medial side of the foot, resulting in a gentle curve of the foot pressure center trajectory. In this case, there is a tendency toward abduction, and the pitch angle increases at the end of stance, i.e., 30 to 50% of the walking cycle.
- it became clear that the feature amounts that can be used to estimate the degree of pronation/supination of the foot appear in different parts of the walking cycle (also referred to as walking phases) depending on attributes such as gender and age.
- in the case of males, features that can be applied to the estimation of the degree of foot pronation/supination appear in the latter half of the stance phase (immediately before toe-off) and the latter half of the swing phase (immediately before heel strike).
- in the case of females, a feature quantity that can be applied to the estimation of the degree of pronation/supination of the foot appears in the first half of the stance phase (immediately after heel contact and during single-leg support). Therefore, when gender is used as an attribute, features extracted from the latter half of the stance phase and the latter half of the swing phase (males) and from the first half of the stance phase (females) may be used.
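The attribute-dependent extraction described above can be sketched as selecting sections of a normalized 101-point walking waveform by attribute. The section boundaries in percent of the gait cycle below are assumptions chosen for the example (roughly "latter half of stance", "latter half of swing", "first half of stance"), not values from the patent:

```python
# Illustrative sketch: pick the sections of a normalized gait-cycle
# waveform (101 points, 0-100%) from which features are extracted,
# depending on the user's attribute. Boundaries are assumed for the demo.

SECTIONS_BY_ATTRIBUTE = {
    # males: latter half of stance (~30-60%) and latter half of swing (~80-100%)
    "male": [(30, 60), (80, 100)],
    # females: first half of stance (~0-30%)
    "female": [(0, 30)],
}

def extract_sections(waveform, attribute):
    """Concatenate the waveform samples falling in the attribute's sections."""
    out = []
    for start, end in SECTIONS_BY_ATTRIBUTE[attribute]:
        out.extend(waveform[start:end + 1])
    return out

wave = list(range(101))  # stand-in for a normalized pitch-angle waveform
male_feats = extract_sections(wave, "male")
```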
- the estimation device 12 is implemented in a server (not shown) or the like.
- the estimating device 12 may be implemented by an application server.
- the estimation device 12 may be implemented by application software or the like installed in a mobile terminal (not shown).
- FIG. 10 is a block diagram showing an example of the detailed configuration of the data acquisition device 11. The data acquisition device 11 has an acceleration sensor 111, an angular velocity sensor 112, a control unit 113, and a data transmission unit 115. Note that the data acquisition device 11 also includes a power supply (not shown).
- the acceleration sensor 111 is a sensor that measures acceleration in three axial directions (also called spatial acceleration).
- the acceleration sensor 111 outputs the measured acceleration to the controller 113 .
- the acceleration sensor 111 can be a sensor of a piezoelectric type, a piezoresistive type, a capacitive type, or the like. It should be noted that the sensor used for the acceleration sensor 111 is not limited in its measurement method as long as it can measure acceleration.
- the angular velocity sensor 112 is a sensor that measures angular velocities in three axial directions (also called spatial angular velocities).
- the angular velocity sensor 112 outputs the measured angular velocity to the controller 113 .
- the angular velocity sensor 112 can be, for example, a vibration-type or capacitance-type sensor. Note that the measurement method of the angular velocity sensor 112 is not limited as long as it can measure angular velocity.
- the control unit 113 acquires accelerations in three-axis directions and angular velocities around three axes from each of the acceleration sensor 111 and the angular velocity sensor 112 .
- Control unit 113 converts the acquired acceleration and angular velocity into digital data, and outputs the converted digital data (also referred to as sensor data) to data transmission unit 115 .
- the sensor data includes at least acceleration data converted into digital data and angular velocity data converted into digital data.
- the acceleration data includes acceleration vectors in three axial directions.
- the angular velocity data includes angular velocity vectors around three axes. Acceleration data and angular velocity data are associated with acquisition times of the data.
- the control unit 113 may be configured to output sensor data obtained by applying corrections for mounting error, temperature, linearity, and the like to the acquired acceleration data and angular velocity data. The control unit 113 may also generate angle data around the three axes using the acquired acceleration data and angular velocity data.
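As one illustration of how angle data might be generated from acceleration and angular velocity, a complementary filter is a common sensor-fusion technique. The document does not specify the actual method, so this sketch is an assumption; the function name and parameters are illustrative.

```python
import math

def complementary_filter(accel, gyro, dt, alpha=0.98):
    """Estimate a tilt-angle (degrees) time series from acceleration samples
    (ax, ay, az) and angular-velocity samples (deg/s). A complementary filter
    is one common way to fuse the two signals; the patent does not specify
    the exact method used by the control unit 113."""
    angle = 0.0
    angles = []
    for (ax, ay, az), g in zip(accel, gyro):
        # Angle implied by the gravity direction (reliable when motion is slow).
        acc_angle = math.degrees(math.atan2(ay, az))
        # Blend gyro integration (short-term) with the accelerometer (long-term).
        angle = alpha * (angle + g * dt) + (1 - alpha) * acc_angle
        angles.append(angle)
    return angles
```

The blending coefficient `alpha` trades gyro drift against accelerometer noise; 0.98 is a conventional default, not a value from the document.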
- control unit 113 is a microcomputer or microcontroller that performs overall control of the data acquisition device 11 and data processing.
- the control unit 113 has a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), flash memory, and the like.
- Control unit 113 controls acceleration sensor 111 and angular velocity sensor 112 to measure angular velocity and acceleration.
- the control unit 113 performs AD conversion (Analog-to-Digital Conversion) on physical quantities (analog data) such as measured angular velocity and acceleration, and stores the converted digital data in a flash memory.
- Physical quantities (analog data) measured by acceleration sensor 111 and angular velocity sensor 112 may be converted into digital data by acceleration sensor 111 and angular velocity sensor 112, respectively.
- Digital data stored in the flash memory is output to the data transmission unit 115 at a predetermined timing.
- the data transmission unit 115 acquires sensor data from the control unit 113.
- the data transmission unit 115 transmits the acquired sensor data to the estimation device 12 .
- the data transmission unit 115 may transmit the sensor data to the estimation device 12 via a wire such as a cable, or via wireless communication.
- the data transmission unit 115 is configured to transmit the sensor data to the estimation device 12 via a wireless communication function (not shown) conforming to a standard such as Bluetooth (registered trademark) or WiFi (registered trademark).
- the communication function of the data transmission unit 115 may conform to standards other than Bluetooth (registered trademark) and WiFi (registered trademark).
- FIG. 11 is a block diagram showing an example of the configuration of the estimation device 12.
- the estimation device 12 has a detection unit 121 , a feature quantity extraction unit 125 , a storage unit 123 and an estimation unit 127 .
- the estimation device 12 also includes communication interfaces, such as a receiving unit for receiving sensor data from the data acquisition device 11 and an output unit for outputting the result of estimation by the estimation unit 127; these communication interfaces are omitted from the description of this embodiment.
- the detection unit 121 acquires sensor data from the data acquisition device 11 .
- the detection unit 121 transforms the coordinate system of the acquired sensor data from the local coordinate system to the world coordinate system.
- in the initial state, the local coordinate system (x-axis, y-axis, z-axis) and the world coordinate system (X-axis, Y-axis, Z-axis) coincide. Since the spatial posture of the data acquisition device 11 changes while the user is walking, the local coordinate system (x-axis, y-axis, z-axis) and the world coordinate system (X-axis, Y-axis, Z-axis) no longer coincide during walking.
- therefore, the detection unit 121 converts the sensor data acquired by the data acquisition device 11 from the local coordinate system (x-axis, y-axis, z-axis) of the data acquisition device 11 to the world coordinate system (X-axis, Y-axis, Z-axis).
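The local-to-world conversion can be sketched as a standard rotation-matrix transform. The patent does not give the actual transform, so the Z-Y-X Euler-angle convention and the function below are assumptions for illustration.

```python
import math

def rotate_to_world(v, roll, pitch, yaw):
    """Rotate a 3-vector measured in the sensor's local frame into the world
    frame, given the sensor's posture as Z-Y-X Euler angles (radians). The
    patent does not specify the convention; this is a standard sketch."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # Rows of R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    r = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    return [sum(r[i][j] * v[j] for j in range(3)) for i in range(3)]
```

In practice the posture angles themselves would be tracked continuously (e.g. by sensor fusion) as the device moves with the foot.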
- the detection unit 121 uses the sensor data to generate time-series data of physical quantities related to the movement of the foot, which are measured as the pedestrian wearing the footwear on which the data acquisition device 11 is installed walks. For example, the detection unit 121 generates time-series data such as spatial acceleration and spatial angular velocity. The detection unit 121 also integrates the spatial acceleration and the spatial angular velocity to generate time-series data such as the spatial velocity, the spatial angle (sole angle), and the spatial trajectory. These time-series data correspond to walking waveforms. The detection unit 121 generates time-series data at predetermined timings and time intervals set in accordance with a general walking cycle or a user-specific walking cycle.
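The integration of spatial acceleration and spatial angular velocity into velocity, angle, and trajectory can be sketched as cumulative trapezoidal integration. Drift correction, which any practical implementation would need, is omitted here, and the function name is illustrative.

```python
def integrate(samples, dt):
    """Cumulative trapezoidal integration of a uniformly sampled signal,
    e.g. acceleration -> velocity or angular velocity -> angle. Real
    gait-analysis pipelines add drift correction, which is omitted here."""
    out = [0.0]
    for a, b in zip(samples, samples[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out
```

Applying `integrate` twice to acceleration yields a position trajectory; applying it once to the pitch-rate signal yields the pitch-angle (sole-angle) waveform discussed below.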
- the timing at which the detection unit 121 generates the time-series data can be set arbitrarily.
- the detection unit 121 is configured to continue generating time-series data while the user continues walking.
- the detection unit 121 may be configured to generate time-series data at a specific timing.
- the detection unit 121 extracts time-series data (also referred to as a walking waveform) for one step cycle from the generated time-series data.
- the detection unit 121 extracts a walking waveform of acceleration in the direction of travel for one step cycle from time-series data of acceleration in the direction of travel (Y direction).
- the detection unit 121 detects the timing of the toe-off in the walking waveform of the acceleration in the traveling direction for one step cycle.
- the timing of toe-off is the timing at which a trough is detected between the two peaks that constitute the maximum peak in the walking waveform of the acceleration in the traveling direction for one step cycle.
- the detection unit 121 detects the heel contact timing in the traveling direction acceleration walking waveform for one step period.
- the timing of heel contact is the midpoint between the timing at which the minimum peak is detected and the timing at which the maximum peak appearing after the minimum peak is detected, in the traveling-direction acceleration walking waveform for one step cycle.
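The heel-contact rule above (midpoint between the minimum peak and the following maximum peak) can be sketched as follows. The function name is illustrative, and a production implementation would use proper peak detection (e.g. with prominence thresholds) rather than the global minimum/maximum used here; the toe-off rule would likewise need trough detection between two peaks.

```python
def heel_contact_index(accel_y):
    """Heel-contact timing per the rule in the text: the midpoint between
    the index of the minimum peak and the index of the maximum peak that
    appears after it, in one step cycle of travel-direction acceleration.
    Simplified: uses global min/max instead of true peak detection."""
    i_min = min(range(len(accel_y)), key=lambda i: accel_y[i])
    tail = accel_y[i_min:]
    i_max = i_min + max(range(len(tail)), key=lambda i: tail[i])
    return (i_min + i_max) // 2
```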
- the storage unit 123 stores an inference model generated in advance for each attribute.
- the storage unit 123 stores an estimation model for estimating the physical condition for each attribute such as gender and age.
- FIG. 12 is a conceptual diagram showing an example of the inference model 130 stored in the storage unit 123.
- the storage unit 123 stores an attribute-by-attribute estimation model 130 for estimating a physical state in response to an input of a feature amount extracted from a walking phase in which differences due to attributes such as gender and age appear.
- Male model 130M (also referred to as male inference model) is a model for estimating male physical condition.
- the female model 130F (also referred to as an inference model for female) is a model for estimating the physical condition of a female.
- the young person model 130Y (also referred to as the young person's estimation model) is a model for estimating the physical condition of a young person of about 20 to 39 years old.
- the elderly model 130S (also referred to as an inference model for the elderly) is a model for estimating the physical condition of an elderly person aged 60 or over. Note that if the attributes of the user are known in advance, the inference model 130 corresponding to the attributes of the user may be stored in the storage unit 123 .
- in this embodiment, the physical condition to be estimated is the degree of pronation/supination of the foot.
- a male-specific feature amount is extracted from a period (also referred to as a male feature amount extraction period) that includes the latter half of the stance phase (immediately before toe-off) and the latter half of the swing phase (immediately before heel strike) in the pitch angle walking waveform (also called the angular waveform in the coronal plane).
- a female-specific feature amount is extracted from a period (also referred to as a female feature amount extraction period) including the first half of the stance phase (immediately after heel contact and a period of one-leg support).
- a feature amount peculiar to young people aged 20 to 39 is extracted from a period (also referred to as a young person feature amount extraction period) that includes the latter half of the stance phase (immediately before toe-off) and the latter half of the swing phase (immediately before heel contact). A feature amount specific to elderly people aged 60 and over is extracted from a period (also referred to as an elderly feature amount extraction period) that includes the first half of the stance phase (immediately after heel contact) and the latter half of the swing phase (immediately before heel contact).
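The attribute-specific extraction periods can be sketched as windows over a normalized step cycle. The percentage boundaries below are hypothetical placeholders (the text names only the gait phases, not percentages), and the mean within each window is just one of the feature amounts the text mentions (means, integral values).

```python
# Gait-phase windows as fractions of one step cycle. Boundaries are
# hypothetical placeholders; the patent only names the phases.
EXTRACTION_WINDOWS = {
    "male":    [(0.50, 0.60), (0.90, 1.00)],  # late stance, late swing
    "female":  [(0.00, 0.30)],                # early stance
    "young":   [(0.50, 0.60), (0.90, 1.00)],  # late stance, late swing
    "elderly": [(0.00, 0.10), (0.90, 1.00)],  # early stance, late swing
}

def extract_features(pitch_waveform, attribute):
    """Mean pitch angle inside each attribute-specific window; one common
    choice of feature amount (the text also mentions integral values)."""
    n = len(pitch_waveform)
    feats = []
    for lo, hi in EXTRACTION_WINDOWS[attribute]:
        start = int(lo * n)
        seg = pitch_waveform[start:max(start + 1, int(hi * n))]
        feats.append(sum(seg) / len(seg))
    return feats
```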
- the inference model 130 for each attribute outputs an estimation result regarding the physical state in response to the input of the feature amount of the pitch angle in the walking cycle (walking phase) according to the attribute.
- the estimation model 130 for each attribute outputs an estimation result regarding the degree of pronation/supination of the foot according to the input of the pitch angle feature amount in the walking cycle (walking phase) according to the attribute.
- for example, the inference model 130 outputs a pronation/supination/normal determination result of the foot as the estimation result regarding the degree of pronation/supination of the foot, in response to the input of the feature amount of the pitch angle in the walking cycle (walking phase) according to the attribute.
- the male model 130M outputs an estimation result regarding the degree of pronation/supination of the foot according to the input of the feature amount extracted from the latter half of the stance phase and the latter half of the swing phase in the walking waveform of the pitch angle.
- the male model 130M outputs an estimation result according to the input of the feature amount extracted from the walking waveform of the pitch angle during the male feature amount extraction period including immediately before the toe-off and immediately before the heel strike.
- the female model 130F outputs an estimation result regarding the degree of pronation/supination of the foot according to the input of the feature quantity extracted from the first half of the stance phase in the walking waveform of the pitch angle.
- the female model 130F outputs an estimation result according to the input of the feature amount extracted from the walking waveform of the pitch angle in the female feature amount extraction period including the period immediately after heel contact and the period of one-leg support.
- the young person model 130Y outputs an estimation result regarding the degree of pronation/supination of the foot according to the input of the feature amount extracted from the latter half of the stance phase and the latter half of the swing phase in the walking waveform of the pitch angle.
- in other words, the young person model 130Y outputs an estimation result regarding the degree of pronation/supination of the foot according to the input of the feature amount extracted from the walking waveform of the pitch angle during the young person feature amount extraction period, which includes immediately before toe-off and immediately before heel strike.
- the elderly model 130S outputs an estimation result regarding the degree of pronation/supination of the foot according to the input of the feature amount extracted from the first half of the stance phase and the latter half of the swing phase in the walking waveform of the pitch angle. In other words, the elderly model 130S outputs the estimation result according to the input of the feature amount extracted from the walking waveform of the pitch angle during the elderly feature amount extraction period, which includes immediately after heel contact and immediately before heel contact.
- when age is used as an attribute, features extracted from the latter half of the stance phase and the latter half of the swing phase (young people) and from the first half of the stance phase and the latter half of the swing phase (elderly people) may be used; both extraction periods include the latter half of the swing phase (immediately before heel contact).
- an estimation model is created in advance by learning a data set of the feature amount extracted from the pitch angle time-series data measured by the data acquisition device 11 and the CPEI obtained from the foot pressure distribution measured by the pressure sensor. For example, an inference model is generated in advance that outputs the degree of pronation/supination of the foot according to the input of feature amounts extracted from the pitch angle time-series data during the walking cycle (walking phase) in which unique features appear for each attribute.
- for example, an arithmetic mean, a weighted mean or other mean value, an integral value, or the like of the pitch angle extracted from the time-series data of the pitch angle is used as a feature quantity.
- for example, a large amount of data with pitch angles as explanatory variables and CPEI as objective variables is measured for a plurality of subjects, and an inference model is generated by learning these data as teacher data. For example, an inference model may be generated that classifies the foot condition as pronation, supination, or normal according to the CPEI estimate and outputs the degree of pronation/supination of the foot.
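A minimal sketch of this learning step, assuming ordinary least squares with a single explanatory variable and hypothetical CPEI cutoffs for the pronation/supination/normal classification (the patent specifies neither the learning algorithm nor threshold values, and the cutoff direction here is an assumption):

```python
def fit_linear(xs, ys):
    """Ordinary least squares for one explanatory variable: a sketch of
    learning CPEI (objective) from a pitch-angle feature (explanatory).
    A real system would use richer features and a proper ML library."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def classify_cpei(cpei, low=-5.0, high=5.0):
    """Map an estimated CPEI to pronation/supination/normal. The cutoff
    values and their direction are hypothetical; the patent gives none."""
    if cpei < low:
        return "pronation"
    if cpei > high:
        return "supination"
    return "normal"
```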
- FIG. 13 is a conceptual diagram showing an example of learning by the learning device 13 using a data set of pitch angle feature amounts (explanatory variables) and CPEI (objective variables) as teacher data.
- in this way, an inference model 130 that outputs an estimation result regarding the degree of pronation/supination of the foot according to the input of the feature amount of the pitch angle is generated in advance.
- the pre-generated inference model 130 is stored in the storage unit 123.
- the estimation model 130 may be stored in the estimating device 12 at times such as when the product is shipped from the factory or during calibration before the user uses the estimating device 12 .
- the estimating device 12 estimates the degree of pronation/supination of the foot by inputting to the estimation model 130 the feature quantity extracted from the pitch angle time-series data generated using the sensor data measured by the data acquisition device 11. For example, the estimating device 12 outputs an estimation result classified into one of the three categories of supination, normal, and pronation as the degree of pronation/supination of the foot.
- the estimating device 12 may output an estimated CPEI value or a pitch angle feature amount as the degree of pronation/supination of the foot.
- the feature amount extraction unit 125 extracts the feature amount used for estimating the physical state from the walking waveform for one step cycle, according to the attributes of the user. Specifically, the feature amount extraction unit 125 extracts the feature amount used for estimating the physical state from the walking waveform for one step cycle, using the estimation model for each attribute stored in the storage unit 123. For example, the feature amount extraction unit 125 uses an inference model according to the attributes of the user to extract, from the walking waveform of the pitch angle (angular waveform in the coronal plane), the feature amount used for estimating the degree of pronation/supination. For example, the feature amount extraction unit 125 extracts the feature amount from the attribute-specific feature amount extraction period of the walking waveform of the pitch angle as the feature amount used for estimating the degree of pronation/supination.
- the estimation unit 127 inputs the feature amount of the walking waveform of the pitch angle (angular waveform in the coronal plane) extracted by the feature amount extraction unit 125 to the estimation model 130 for each attribute, and obtains the estimation result regarding the physical condition.
- for example, the estimation unit 127 inputs the feature amount of the pitch angle gait waveform (angular waveform in the coronal plane) extracted by the feature amount extraction unit 125 to the estimation model 130 for each attribute, and obtains the estimation result regarding the degree of pronation/supination of the foot.
- the estimation unit 127 outputs estimation results.
- the estimation result by the estimation unit 127 is output to a host system, a server in which a database is constructed, a user's mobile terminal from which the walking waveform is acquired, or the like.
- the output destination of the estimation result by the estimation unit 127 is not particularly limited.
- FIG. 14 is a conceptual diagram showing an example in which estimation results regarding the degree of pronation/supination of the foot are output by inputting the feature amount of the pitch angle for each attribute into the estimation model 130 generated in advance.
- the estimation result regarding the degree of pronation/supination of the foot includes the determination result of pronation/supination/normality of the foot.
- the estimation result regarding the degree of pronation/supination of the foot includes recommendation information to refer to a suitable hospital for examination according to the determination result of pronation/supination/normality of the foot.
- the estimation result regarding the degree of pronation/supination of the foot may be a pitch angle value or a CPEI. Note that the above estimation results are examples and do not limit the estimation results output from the estimation model 130 in response to the input of the pitch angle feature amount for each attribute.
- in other words, the inference model 130, in response to the input of the feature amount of the pitch angle for each attribute, outputs, as the estimation result regarding the degree of pronation/supination of the foot, the pronation/supination/normal determination result of the foot and recommendation information to visit an appropriate hospital.
- the inference model 130 outputs a pitch angle value or CPEI as an estimation result regarding the degree of pronation/supination of the foot in response to the input of the pitch angle feature amount for each attribute.
- the estimation results of the estimation model 130 described above are examples and do not limit the estimation results output from the estimation model 130 in response to the input of the feature amount of the pitch angle for each attribute.
- FIG. 15 is a flowchart for explaining the outline of the operation of the estimating device 12. The details of the operation of the estimating device 12 are as described for the configuration above.
- the estimating device 12 acquires sensor data relating to physical quantities related to foot movement from the data acquiring device 11 (step S11).
- the estimation device 12 converts the coordinate system of the acquired sensor data from the local coordinate system set in the data acquisition device 11 to the world coordinate system (step S12).
- the estimation device 12 generates a walking waveform using the time-series data of the sensor data converted into the world coordinate system (step S13).
- the estimating device 12 extracts a feature quantity from the angle waveform of the coronal plane (pitch angle walking waveform) during a period (walking phase) in which features according to the user's attributes appear (step S14).
- the estimation device 12 inputs the extracted feature amount to the estimation model 130 corresponding to the user's attribute, and estimates the physical state of the user (step S15). For example, the estimating device 12 inputs a feature amount into the estimation model 130 according to the user's attribute, and estimates the degree of pronation/supination of the user's foot.
- the estimation device 12 outputs an estimation result regarding the user's physical condition (step S16). For example, the estimating device 12 outputs an estimation result regarding the degree of pronation/supination of the foot (step S16).
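Steps S11 to S16 can be wired together as in the sketch below. Every callable and name here is illustrative and supplied by the caller; none of them come from the patent itself.

```python
def estimate_physical_state(sensor_data, attribute, to_world, to_waveform,
                            extract, models):
    """Sketch of steps S11-S16: convert coordinates, build the walking
    waveform, extract the attribute-specific feature amount, and run the
    attribute's estimation model. All callables are caller-supplied
    stand-ins for the components described in the text."""
    world = [to_world(sample) for sample in sensor_data]   # step S12
    waveform = to_waveform(world)                          # step S13
    features = extract(waveform, attribute)                # step S14
    return models[attribute](features)                     # steps S15-S16
```

For instance, `to_world` could be a rotation to the world frame, `extract` a windowed mean over the attribute-specific extraction period, and `models` a dictionary mapping "male", "female", "young", and "elderly" to trained estimation models.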
- FIG. 16 is a conceptual diagram showing an arrangement example of the pressure sensor 110 and the data acquisition device 11 used in this verification example.
- a pressure sensor 110 capable of measuring foot pressure distribution was inserted as an insole of the shoe 100, and the data acquisition device 11 was mounted on the back side of the arch.
- the correlation between the CPEI value estimated based on the feature amount extracted from the pitch angle time-series data and the actual CPEI value (true value) was verified.
- the average age of the subjects in this verification example was 44.3 years old, the youngest was 20 years old, and the oldest was 71 years old.
- the subjects had an average weight of 62.9 kg, a maximum weight of 115 kg, and a minimum weight of 40 kg.
- the average height of the subjects in this verification example was 165.0 cm, the maximum height was 192 cm, and the minimum height was 143 cm.
- the test subject's foot size in this verification example was 25.3 cm on average, 22.5 cm at minimum, and 29.0 cm at maximum.
- the BMI (Body Mass Index) of the subjects in this verification example was 22.9 on average, 36.3 at maximum, and 17.3 at minimum.
- each subject was asked to walk straight from the starting point to a turning point 15 meters ahead, turn back at the turning point, and walk straight to the starting point, three times.
- Each subject was allowed to walk at different walking speeds in three trials.
- the walking speed in the three trials includes three patterns of normal walking, slow walking, and fast walking. To eliminate physical and psychological biases, we changed the order of walking speeds for each subject.
- the walking waveforms of all steps in one trial were averaged, and one set of average waveforms was treated as one point of data.
- the estimating device 12 generates nine types of walking waveforms of acceleration in three-axis directions, angular velocities around three axes, and angles around three axes (plantar angles).
- correlation with CPEI was evaluated for each walking cycle (walking phase) from nine types of walking waveforms.
- a leave-one-subject-out method was used to eliminate biases due to data distribution, and finally the average correlation coefficient was calculated.
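The leave-one-subject-out evaluation can be sketched generically as follows; `fit` and `score` are caller-supplied stand-ins, not names from the document.

```python
def leave_one_subject_out(data_by_subject, fit, score):
    """Leave-one-subject-out evaluation as used in the verification: train
    on all subjects but one, score on the held-out subject, and average
    the per-subject scores (e.g. correlation coefficients)."""
    results = []
    subjects = list(data_by_subject)
    for held_out in subjects:
        train = [d for s in subjects if s != held_out
                 for d in data_by_subject[s]]
        model = fit(train)
        results.append(score(model, data_by_subject[held_out]))
    return sum(results) / len(results)
```

Holding out whole subjects, rather than individual steps, prevents a subject's own gait pattern from leaking into the model that evaluates them.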
- a threshold was set in advance for the correlation coefficient, and walking phases in which the correlation coefficient exceeded the threshold were selected as correlated phases.
- the feature amount for each walking phase cluster was calculated.
- the integral average value of the measurement values in the multiple walking phases that make up the walking phase cluster was used as the feature quantity.
- FIG. 17 shows correlation coefficients for each sex between the CPEI (estimated value) estimated based on the feature amount extracted from the pitch angle time-series data and the CPEI (true value) measured by the pressure sensor 110.
- the upper/lower boundary lines of the hatched range correspond to the threshold.
- the correlation between the CPEI (estimated value) and the CPEI (true value) is high in the section (the walking phase section in which the correlation coefficient exceeds the threshold) that protrudes outside the hatched range. That is, it is possible to extract the feature amount of CPEI for each attribute (sex) from the section of the walking phase in which the correlation coefficient exceeds the threshold.
- for females, the correlation coefficient exceeds the threshold in the first half of the stance phase (period F1 immediately after heel strike and period F2 with single-leg support).
- for males, the correlation coefficient exceeds the threshold in the latter half of the stance phase (period M1 immediately before toe-off) and the latter half of the swing phase (period M2 immediately before heel strike).
- in this way, males and females differ in the walking phase intervals that correlate with CPEI.
- for females, an inference model was generated using features extracted from the first half of the stance phase (the period F1 immediately after heel contact and the period F2 during single-leg support).
- for males (dotted line), an inference model was generated using features extracted from the latter half of the stance phase (period M1 immediately before toe-off) and the latter half of the swing phase (period M2 immediately before heel strike).
- FIG. 18 is a graph showing the correlation between the CPEI (estimated value), estimated by inputting the feature amount extracted from the walking phase interval where gender-specific features appear into the gender-specific inference model, and the CPEI (true value) based on the measurement results of the pressure sensor 110.
- FIG. 18 shows the results of verifying the correlation between CPEI (estimated value) and CPEI (true value) using leave-one-subject-out cross-validation.
- the Intraclass Correlation Coefficient (ICC) was 0.560.
- the ICC was 0.553.
- FIG. 19 is a graph showing the results of Z-score verification of the correlation between CPEI (estimated value) and CPEI (true value).
- the results of FIGS. 18 and 19 indicate moderate agreement between the CPEI (estimated value), estimated using the feature amount extracted from the walking phase interval in which gender-specific features appear, and the CPEI (true value) measured using the pressure sensor 110. That is, the degree of pronation/supination (CPEI) can be estimated using the feature amount extracted from the walking phase interval in which the characteristics of each gender appear.
- FIG. 20 shows the correlation coefficient for each age between the CPEI (estimated value) estimated based on the feature amount extracted from the pitch angle time-series data and the CPEI (true value) measured by the pressure sensor 110.
- upper/lower boundary lines of the hatched range are set as threshold values.
- the correlation between the CPEI (estimated value) and the CPEI (true value) is high in the section (the walking phase section in which the correlation coefficient exceeds the threshold) that protrudes outside the hatched range. That is, the feature amount of CPEI can be extracted from the section of the walking phase in which the correlation coefficient exceeds the threshold.
- for the elderly, the correlation coefficient exceeds the threshold in the first half of the stance phase (period S1 immediately after heel strike) and the latter half of the swing phase (period S2 immediately before heel strike).
- for young people, the correlation coefficient exceeds the threshold in the latter half of the stance phase (period Y1 immediately before toe-off) and the latter half of the swing phase (period Y2 immediately before heel strike). In this way, the elderly (solid line) and the young (dotted line) differ in the walking phase intervals that correlate with CPEI.
- for the elderly, an inference model should be generated using features extracted from the first half of the stance phase (period S1 immediately after heel contact) and the latter half of the swing phase (period S2 immediately before heel contact).
- for young people, an inference model should be generated using features extracted from the latter half of the stance phase (period Y1 immediately before toe-off) and the latter half of the swing phase (period Y2 immediately before heel contact).
- the degree of pronation/supination can also be estimated using the feature values extracted from the walking phase sections in which age-specific features appear.
- This application example is an example in which the estimation result regarding the degree of pronation/supination of the foot output by the estimation device 12 is displayed on a display device or utilized as big data.
- in this application example, the data acquisition device 11 is installed in the shoes of the pedestrian, and the sensor data based on the physical quantities related to the movement of the foot measured by the data acquisition device 11 is transmitted to the mobile terminal possessed by the pedestrian. It is assumed that the sensor data transmitted to the mobile terminal is processed by application software or the like installed in the mobile terminal.
- FIG. 21 is an example of displaying an estimation result regarding the degree of pronation/supination of the foot of a walker on the screen of the mobile terminal 160 of the walker wearing the shoes 100 on which a data acquisition device (not shown) is installed.
- a pedestrian who has browsed the estimation result regarding the degree of pronation/supination of the foot displayed on the screen of the mobile terminal 160 can take action according to the estimation result.
- a pedestrian who browses the estimation results regarding the degree of pronation/supination of the foot displayed on the screen of the mobile terminal 160 can contact a medical institution or the like about his/her situation according to the estimation results.
- a pedestrian who browses the estimation results regarding the degree of pronation/supination of the foot displayed on the screen of the mobile terminal 160 can exercise and walk in a manner suitable for him/herself according to the estimation results.
- FIG. 22 is an example of displaying, on the screen of the mobile terminal 160 of a walker wearing the shoes 100 on which a data acquisition device (not shown) is installed, information corresponding to the estimation result regarding the degree of pronation/supination of the foot.
- the screen of the mobile terminal 160 displays information recommending that the pedestrian undergo a medical examination at a hospital, depending on the progress of pronation/supination of the foot.
- a link to a site or a telephone number of a hospital that can be visited may be displayed on the screen of the mobile terminal 160 .
- a pedestrian who has browsed the information according to the estimation result regarding the degree of pronation/supination of the foot displayed on the screen of the mobile terminal 160 can act according to the information.
- FIG. 23 shows an example in which information based on sensor data measured by a data acquisition device (not shown) is transmitted to a data center 170 from mobile terminals 160 of a plurality of pedestrians wearing shoes 100 equipped with a data acquisition device.
- the mobile terminal 160 transmits the sensor data measured by the data acquisition device, the CPEI estimated value, and the estimation results regarding the degree of pronation/supination of the foot of the walker to the data center 170 .
- data sent to data center 170 is stored in a database.
- data accumulated in a database is utilized as big data.
- in this embodiment, an example in which the degree of pronation/supination of the foot is estimated as the physical condition has been described.
- the method of the present embodiment can also be applied to estimation of body conditions other than the degree of pronation/supination.
- for example, the degree of hallux valgus tends to be primarily influenced by footwear in women and by injury in men. It is therefore presumed that, for the degree of hallux valgus, the intervals in which characteristic features appear differ depending on gender. For example, the tendencies toward bow legs and cross legs differ depending on whether a person mainly works standing or sitting. Under such an assumption, it is presumed that, for the degree of bow legs and cross legs, the intervals in which characteristic features appear differ depending on social attributes such as occupation.
- the estimation system of this embodiment includes a data acquisition device and an estimation device.
- the data acquisition device is installed on the user's foot and measures spatial acceleration and spatial angular velocity.
- the data acquisition device generates sensor data based on the measured spatial acceleration and spatial angular velocity, and transmits the generated sensor data to the estimation device.
- the estimation device includes a detection unit, a feature amount extraction unit, a storage unit, and an estimation unit.
- the detection unit detects a walking event from time-series data of sensor data based on the movement of the user's foot.
- the detection unit extracts a walking waveform for one step based on the detected walking event.
- the feature quantity extraction unit extracts, from the walking waveform extracted by the detection unit, a feature quantity corresponding to the user's attribute in the section where the characteristics of the physical state associated with that attribute appear.
- the storage unit stores an inference model that outputs the user's physical state according to the input of the feature amount extracted according to the user's attribute.
- the estimation unit inputs the feature amount extracted according to the attribute of the user to the estimation model stored in the storage unit to estimate the physical state of the user.
- the estimation system of the present embodiment uses sensor data based on foot movements measured by a data acquisition device installed on the user's foot to estimate the user's physical condition according to attributes. That is, the estimation system of this embodiment can estimate the physical condition according to the attribute based on the sensor data measured while walking.
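The pipeline in the bullets above (detect walking events from time-series sensor data, cut out a one-step walking waveform, then extract features) can be sketched as follows. This is a minimal illustration on a synthetic signal; the peak-detection rule, the sampling rate of 100 Hz, and the resampling to 100 points per gait cycle are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def detect_heel_strikes(gyro, fs=100.0, min_step_s=0.4):
    """Detect heel-strike indices as prominent local maxima of an
    angular-velocity signal, enforcing a minimum interval between steps."""
    threshold = gyro.mean() + gyro.std()
    min_gap = int(min_step_s * fs)
    strikes, last = [], -min_gap
    for i in range(1, len(gyro) - 1):
        if (gyro[i] > threshold
                and gyro[i] >= gyro[i - 1]
                and gyro[i] > gyro[i + 1]
                and i - last >= min_gap):
            strikes.append(i)
            last = i
    return strikes

def segment_steps(signal, strikes, n_points=100):
    """Cut the signal into one-step waveforms between consecutive heel
    strikes and resample each to a fixed length (percent of gait cycle)."""
    steps = []
    for s, e in zip(strikes[:-1], strikes[1:]):
        x_old = np.linspace(0.0, 1.0, e - s)
        x_new = np.linspace(0.0, 1.0, n_points)
        steps.append(np.interp(x_new, x_old, signal[s:e]))
    return np.array(steps)

# Synthetic gait-like signal: one sharp peak per ~1 s step at fs = 100 Hz.
fs = 100.0
t = np.arange(0, 5, 1 / fs)
gyro = np.sin(2 * np.pi * 1.0 * t) ** 5
strikes = detect_heel_strikes(gyro, fs)
steps = segment_steps(gyro, strikes)
print(len(strikes), steps.shape)
```

Each row of `steps` is then a one-step waveform from which attribute-specific feature amounts can be extracted.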
- the estimation unit inputs the feature amount extracted from the user's walking waveform into an estimation model that, in response to the input of a feature amount extracted according to an attribute, outputs an estimation result for the physical state corresponding to that attribute.
- the estimation unit then estimates the user's physical state based on the estimation result output from the estimation model. According to this aspect, inputting the feature amount extracted from the walking waveform according to the user's attribute into an estimation model generated for each attribute makes it possible to estimate a physical state that reflects the user's attribute.
- the feature amount extraction unit extracts feature amounts from a walking waveform related to angles in the coronal plane.
- the estimating unit estimates the degree of pronation/supination of the foot using the feature amount extracted from the walking waveform regarding the angle in the coronal plane.
- the degree of pronation/supination of the foot reflecting the user's attributes can be estimated as the physical condition.
- the estimation unit inputs the feature amount extracted from the user's walking waveform into an estimation model trained on data sets relating to a plurality of subjects, and estimates the degree of pronation/supination of the user's foot.
- each data set uses, as explanatory variables, the feature amounts extracted from the walking waveform of the angle in the coronal plane in the sections where the characteristics of the physical state corresponding to the attribute appear, and uses, as the objective variable, the center-of-pressure trajectory index (CPEI) obtained from the foot pressure distribution measured by a pressure sensor.
- according to this aspect, the degree of foot pronation/supination reflecting the user's attributes can be estimated using an inference model trained on data sets relating to a plurality of subjects.
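The training scheme just described (coronal-plane waveform features as explanatory variables, pressure-sensor CPEI as the objective variable) can be sketched as an ordinary least-squares regression fit. The linear model family, the feature count, and all data below are illustrative assumptions; the patent does not specify a model type.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: per subject, feature amounts extracted from
# the coronal-plane gait waveform in the attribute-specific sections
# (explanatory variables) and the CPEI measured from the foot pressure
# distribution with a pressure sensor (objective variable).
n_subjects, n_features = 50, 4
X = rng.normal(size=(n_subjects, n_features))
true_w = np.array([2.0, -1.0, 0.5, 3.0])
y = X @ true_w + 10.0 + rng.normal(scale=0.1, size=n_subjects)  # CPEI values

# Fit a linear estimation model by least squares: CPEI ~ X w + b.
A = np.hstack([X, np.ones((n_subjects, 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
w, b = coef[:-1], coef[-1]

# Estimate CPEI for a new user's walking-waveform features.
user_features = rng.normal(size=n_features)
cpei_estimate = user_features @ w + b
print(float(cpei_estimate))
```

Training one such model per attribute group (gender, age) yields the attribute-specific inference models described below.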
- the estimation unit estimates the user's degree of pronation/supination using a gender-specific inference model. If the user is female, the feature quantity extraction unit extracts the feature quantity from the walking waveform of the angle in the coronal plane in a female feature extraction period that includes the period immediately after heel strike and the single-leg-support period. The estimation unit then inputs the extracted feature quantity into a female inference model trained on the feature quantities extracted in the female feature extraction period for a plurality of female subjects, and estimates the user's degree of pronation/supination.
- if the user is male, the feature quantity extraction unit extracts the feature quantity in a male feature extraction period that includes the period immediately before toe-off and the period immediately before heel strike. The estimation unit then inputs the extracted feature quantity into a male inference model trained on the feature quantities extracted in the male feature extraction period for a plurality of male subjects, and estimates the user's degree of pronation/supination.
- in this aspect, features are extracted from the gait waveform of the angle in the coronal plane during the periods in which gender-specific characteristics appear, and the user's physical state is estimated using a gender-specific inference model. Therefore, according to this aspect, the degree of foot pronation/supination that better reflects the user's attribute (gender) can be estimated.
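The attribute-specific extraction periods can be expressed as different slices of the normalized gait cycle. In the sketch below, the percentage boundaries of each window and the summary statistics per window are illustrative placeholders, not values disclosed in the patent.

```python
# Hypothetical extraction windows as fractions of the gait cycle
# (0.0 = heel strike, 1.0 = next heel strike). The exact boundaries
# are assumptions for illustration only.
EXTRACTION_WINDOWS = {
    "female": [(0.00, 0.10), (0.10, 0.40)],  # just after heel strike; single-leg support
    "male":   [(0.50, 0.60), (0.90, 1.00)],  # just before toe-off; just before heel strike
}

def extract_features(waveform, attribute):
    """Slice a one-step coronal-plane angle waveform (resampled to a fixed
    length) at the attribute-specific windows and summarize each slice by
    its mean angle and its angular excursion."""
    n = len(waveform)
    features = []
    for lo, hi in EXTRACTION_WINDOWS[attribute]:
        seg = waveform[int(lo * n):int(hi * n)]
        features.append(sum(seg) / len(seg))  # mean angle in the window
        features.append(max(seg) - min(seg))  # angular excursion in the window
    return features

waveform = [i / 99 for i in range(100)]  # dummy monotone one-step waveform
f_female = extract_features(waveform, "female")
f_male = extract_features(waveform, "male")
print(len(f_female), len(f_male))
```

The same mechanism extends to age groups by adding, for example, "elderly" and "young" entries to the window table.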
- the estimation unit estimates the degree of pronation/supination of the user using an estimation model for each age.
- if the user is an elderly person, the feature quantity extraction unit extracts the feature quantity from the walking waveform of the angle in the coronal plane in an elderly feature extraction period that includes the period immediately after heel strike and the period immediately before heel strike.
- the estimation unit inputs the feature quantity extracted by the feature quantity extraction unit into an elderly inference model trained on the feature quantities extracted in the elderly feature extraction period for a plurality of elderly subjects, and estimates the user's degree of pronation/supination. If the user is a young person, the feature quantity extraction unit extracts the feature quantity from the walking waveform of the angle in the coronal plane in a young-person feature extraction period that includes the period immediately before toe-off and the period immediately before heel strike.
- the estimation unit inputs the feature quantity extracted by the feature quantity extraction unit into a young-person inference model trained on the feature quantities extracted in the young-person feature extraction period for a plurality of young subjects, and estimates the user's degree of pronation/supination.
- in this aspect, features are extracted from the gait waveform of the angle in the coronal plane during the periods in which age-specific characteristics appear, and the user's physical state is estimated using an age-specific estimation model. Therefore, according to this aspect, the degree of foot pronation/supination that better reflects the user's attribute (age) can be estimated.
- the estimation unit outputs a determination result indicating whether the foot is pronated, supinated, or normal, according to the estimated value of the center-of-pressure trajectory index (CPEI). According to this aspect, whether the foot is pronated, supinated, or normal can be determined according to the estimated value of that index.
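The three-way determination from the estimated index can be sketched as a simple threshold rule. Both the direction of the mapping (low index toward pronation, high toward supination) and the cutoff values below are hypothetical; the patent does not disclose concrete thresholds.

```python
def classify_foot(cpei_estimate, normal_range=(10.0, 25.0)):
    """Map an estimated center-of-pressure trajectory index (CPEI) to a
    pronation / normal / supination determination. The normal range and
    the mapping direction are illustrative assumptions."""
    lo, hi = normal_range
    if cpei_estimate < lo:
        return "pronation"
    if cpei_estimate > hi:
        return "supination"
    return "normal"

print(classify_foot(5.0), classify_foot(18.0), classify_foot(30.0))
```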
- the detection system of this embodiment can be applied to custom-made shoes.
- the detection system of the present embodiment can be applied to verifying the degree of pronation/supination of a user's foot by having the user walk in trial shoes equipped with the data acquisition device. If the verification results for the degree of pronation/supination of the user's foot are provided to a manufacturer that designs shoes, shoes can be designed according to the degree of pronation/supination of the user's foot.
- the detection system of this embodiment can also be applied to monitoring the user's daily life. For example, if walking habits can be identified, or a change of shoes can be recommended, according to the progression of pronation/supination of the user's foot while walking, it may be possible to suppress that progression. Likewise, if a user wears a foot pronation/supination orthotic, providing the user with information according to the degree of pronation/supination of the foot may slow the progression of symptoms or help prevent injury.
- with the detection system of the present embodiment, collecting the estimation results of a large number of users and compiling the estimation results regarding the degree of foot pronation/supination into a database may allow the related information to be utilized as big data. For example, if the degrees of pronation/supination and the CPEI values of many users' feet are stored in a database in association with their shoes, data useful for shoe design, maintenance, and so on can be accumulated.
- FIG. 25 is a block diagram showing an example of the configuration of the estimation device 22 of this embodiment.
- the estimation device 22 includes a feature quantity extraction unit 225 and an estimation unit 227.
- the feature quantity extraction unit 225 extracts, from the walking waveform extracted from the time-series data of sensor data based on the movement of the user's feet, a feature quantity corresponding to the user's attribute in the section where the characteristics of the physical state associated with that attribute appear.
- the estimation unit 227 estimates the user's physical condition using the feature amount extracted according to the user's attribute.
- the estimation device of the present embodiment uses sensor data based on foot movements measured by a data acquisition device installed on the user's foot to estimate the user's physical state according to the user's attributes. That is, the estimation device of the present embodiment can estimate the attribute-dependent physical state from sensor data measured while walking.
- the hardware configuration for executing the processing of the estimation device according to each embodiment of the present invention will be described by taking the information processing device 90 of FIG. 25 as an example.
- the information processing device 90 of FIG. 25 is a configuration example for executing the processing of the estimation device of each embodiment, and does not limit the scope of the present invention.
- the information processing device 90 includes a processor 91, a main storage device 92, an auxiliary storage device 93, an input/output interface 95, and a communication interface 96.
- the interface is abbreviated as I/F (Interface).
- Processor 91 , main storage device 92 , auxiliary storage device 93 , input/output interface 95 , and communication interface 96 are connected to each other via bus 98 so as to enable data communication.
- the processor 91 , the main storage device 92 , the auxiliary storage device 93 and the input/output interface 95 are connected to a network such as the Internet or an intranet via a communication interface 96 .
- the processor 91 expands the program stored in the auxiliary storage device 93 or the like into the main storage device 92 and executes the expanded program.
- alternatively, a configuration in which a software program installed in the information processing device 90 is used may be adopted.
- the processor 91 executes processing by the estimation device according to this embodiment.
- the main storage device 92 has an area in which programs are expanded.
- the main storage device 92 may be, for example, a volatile memory such as a DRAM (Dynamic Random Access Memory). A non-volatile memory such as an MRAM (Magnetoresistive Random Access Memory) may also be configured and added as the main storage device 92.
- the auxiliary storage device 93 stores various data.
- the auxiliary storage device 93 is configured by a local disk such as a hard disk or flash memory. It should be noted that it is possible to store various data in the main storage device 92 and omit the auxiliary storage device 93 .
- the input/output interface 95 is an interface for connecting the information processing device 90 and peripheral devices.
- a communication interface 96 is an interface for connecting to an external system or device through a network such as the Internet or an intranet based on standards and specifications.
- the input/output interface 95 and the communication interface 96 may be shared as an interface for connecting with external devices.
- the information processing device 90 may be configured to connect input devices such as a keyboard, mouse, and touch panel as necessary. These input devices are used to enter information and settings. Note that when a touch panel is used as an input device, the display screen of the display device may also serve as an interface of the input device. Data communication between the processor 91 and the input device may be mediated by the input/output interface 95 .
- the information processing device 90 may be equipped with a display device for displaying information.
- the information processing device 90 is preferably provided with a display control device (not shown) for controlling the display of the display device.
- the display device may be connected to the information processing device 90 via the input/output interface 95 .
- the above is an example of the hardware configuration for enabling the estimation device according to each embodiment of the present invention.
- the hardware configuration of FIG. 25 is an example of a hardware configuration for executing arithmetic processing of the estimation device according to each embodiment, and does not limit the scope of the present invention.
- the scope of the present invention also includes a program that causes a computer to execute processing related to the estimation device according to each embodiment.
- a recording medium recording the program according to each embodiment is also included in the scope of the present invention.
- the recording medium can be implemented as an optical recording medium such as a CD (Compact Disc) or a DVD (Digital Versatile Disc).
- the recording medium may be a semiconductor recording medium such as a USB (Universal Serial Bus) memory or an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or other recording medium.
- if a program executed by a processor is recorded on a recording medium, that recording medium corresponds to a program recording medium.
- the components of the estimation device of each embodiment can be combined arbitrarily. Also, the constituent elements of the estimation device of each embodiment may be realized by software or by a circuit.
- 1 estimation system
- 11 data acquisition device
- 12, 22 estimation device
- 111 acceleration sensor
- 112 angular velocity sensor
- 113 control unit
- 115 data transmission unit
- 121 detection unit
- 123 storage unit
- 125, 225 feature quantity extraction unit
- 127, 227 estimation unit
Description
The estimation system according to the first embodiment of the present disclosure will be described with reference to the drawings. The estimation system of the present embodiment measures features included in the user's walking pattern (also called gait) and estimates the user's physical state by analyzing the measured gait. In the present embodiment, an example is described in which the degree of foot pronation/supination is estimated according to attributes such as the user's gender and age, based on sensor data relating to foot movement. The physical state estimated by the method of the present embodiment is not limited to the degree of foot pronation/supination; the method can also be used to estimate physical states in which the influence of differences in attributes such as gender and age is reflected, such as hallux valgus, bow legs/knock knees, and degree of obesity. In the present embodiment, a system in which the right foot is the reference foot and the left foot is the opposite foot will be described. The method of the present embodiment can also be applied to a system in which the left foot is the reference foot and the right foot is the opposite foot.
FIG. 1 is a block diagram showing the configuration of the estimation system 1 of the present embodiment. The estimation system 1 includes a data acquisition device 11 and an estimation device 12. The data acquisition device 11 and the estimation device 12 may be connected by wire or wirelessly. The data acquisition device 11 and the estimation device 12 may also be configured as a single device. Alternatively, the data acquisition device 11 may be omitted from the configuration of the estimation system 1, and the estimation system 1 may be configured with the estimation device 12 alone.
CPEI = CPE/foot width × 100 ... (1)
However, the start and end points described above, the way the trapezoid is drawn around the trajectory, the way the trapezoid is cut, and so on, which are used to derive the CPEI, are merely examples, and the definition is not limited to the above.
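As a minimal numerical illustration of equation (1), the sketch below evaluates CPE as the largest perpendicular deviation of the center-of-pressure (COP) trajectory from the straight line joining its start and end points. The trapezoid-based construction mentioned above is simplified to this straight reference line, so the code is an approximation for illustration only.

```python
import math

# CPEI per equation (1): CPEI = CPE / foot width × 100. Here CPE is taken
# as the largest perpendicular distance of the center-of-pressure (COP)
# trajectory from the straight line joining its first and last samples
# (a simplification of the trapezoid construction in the text).
def cpei(cop_xy, foot_width):
    (x0, y0), (x1, y1) = cop_xy[0], cop_xy[-1]
    length = math.hypot(x1 - x0, y1 - y0)
    cpe = max(
        abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0) / length
        for x, y in cop_xy
    )
    return cpe / foot_width * 100.0

# A heel-to-toe COP path (in cm) bowed 1 cm sideways at midfoot,
# for a hypothetical foot width of 10 cm.
trajectory = [(0.0, 0.0), (1.0, 12.0), (0.0, 24.0)]
print(cpei(trajectory, foot_width=10.0))  # → 10.0
```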
Next, the detailed configuration of the data acquisition device 11 will be described with reference to the drawings. FIG. 10 is a block diagram showing an example of the detailed configuration of the data acquisition device 11. The data acquisition device 11 includes an acceleration sensor 111, an angular velocity sensor 112, a control unit 113, and a data transmission unit 115. The data acquisition device 11 also includes a power supply (not shown).
Next, the detailed configuration of the estimation device 12 included in the estimation system 1 will be described with reference to the drawings. FIG. 11 is a block diagram showing an example of the configuration of the estimation device 12. The estimation device 12 includes a detection unit 121, a feature quantity extraction unit 125, a storage unit 123, and an estimation unit 127. In practice, communication interfaces such as a receiving unit that receives sensor data from the data acquisition device 11 and an output unit that outputs the estimation results of the estimation unit 127 are also provided. In the present embodiment, descriptions of the communication interfaces are omitted.
Next, the operation of the estimation device 12 of the estimation system 1 of the present embodiment will be described with reference to the drawings. FIG. 15 is a flowchart for explaining the outline of the operation of the estimation device 12. The details of the operation of the estimation device 12 are as described above in the explanation of the configuration.
Next, a verification example of the relationship between the CPEI (estimated value) inferred from the feature quantities extracted from the sensor data measured by the data acquisition device 11 and the CPEI (true value) measured by a pressure-sensitive sensor will be described.
Next, an application example of the estimation system 1 of the present embodiment will be described with reference to the drawings. In this application example, the estimation results regarding the degree of foot pronation/supination output by the estimation device 12 are displayed on a display device or utilized as big data. In the following example, the data acquisition device 11 is installed in a pedestrian's shoe, and sensor data based on physical quantities relating to foot movement measured by the data acquisition device 11 is transmitted to a mobile terminal carried by the pedestrian. The sensor data transmitted to the mobile terminal is processed by application software or the like installed on the mobile terminal.
Next, the estimation device according to the second embodiment will be described with reference to the drawings. The estimation device of the present embodiment has a simplified configuration of the estimation device of the first embodiment. FIG. 25 is a block diagram showing an example of the configuration of the estimation device 22 of the present embodiment. The estimation device 22 includes a feature quantity extraction unit 225 and an estimation unit 227.
Here, the hardware configuration for executing the processing of the estimation device according to each embodiment of the present invention will be described, taking the information processing device 90 of FIG. 25 as an example. Note that the information processing device 90 of FIG. 25 is a configuration example for executing the processing of the estimation device of each embodiment, and does not limit the scope of the present invention.
11 data acquisition device
12, 22 estimation device
111 acceleration sensor
112 angular velocity sensor
113 control unit
115 data transmission unit
121 detection unit
123 storage unit
125, 225 feature quantity extraction unit
127, 227 estimation unit
Claims (10)
- An estimation device comprising: a feature quantity extraction means for extracting, from a walking waveform extracted from time-series data of sensor data based on a movement of a user's foot, a feature quantity according to an attribute of the user in a section in which a characteristic of a physical state according to the attribute of the user appears; and an estimation means for estimating the physical state of the user using the feature quantity extracted according to the attribute of the user.
- The estimation device according to claim 1, wherein the estimation means inputs the feature quantity extracted from the walking waveform of the user into an estimation model that outputs, in response to input of the feature quantity extracted according to the attribute, an estimation result concerning the physical state according to the attribute, and estimates the physical state of the user based on the estimation result output from the estimation model.
- The estimation device according to claim 1, wherein the feature quantity extraction means extracts the feature quantity from a walking waveform of an angle in the coronal plane, and the estimation means estimates a degree of pronation/supination of the foot using the feature quantity extracted from the walking waveform of the angle in the coronal plane.
- The estimation device according to claim 3, wherein the estimation means inputs the feature quantity extracted from the walking waveform of the user into an estimation model trained on a data set in which, for a plurality of subjects, the feature quantities extracted from the walking waveform of the angle in the coronal plane in the sections in which the characteristics of the physical state according to the attribute appear are explanatory variables and a center-of-pressure trajectory index (CPEI) obtained from a foot pressure distribution measured by a pressure sensor is an objective variable, and estimates the degree of pronation/supination of the foot of the user.
- The estimation device according to claim 4, wherein, when the user is female, the feature quantity extraction means extracts the feature quantity from the walking waveform of the angle in the coronal plane in a female feature extraction period including a period immediately after heel strike and a single-leg-support period, and the estimation means inputs the feature quantity extracted by the feature quantity extraction means into a female estimation model trained on the feature quantities extracted in the female feature extraction period for a plurality of female subjects, and estimates the degree of pronation/supination of the user; and, when the user is male, the feature quantity extraction means extracts the feature quantity in a male feature extraction period including a period immediately before toe-off and a period immediately before the heel strike, and the estimation means inputs the feature quantity extracted by the feature quantity extraction means into a male estimation model trained on the feature quantities extracted in the male feature extraction period for a plurality of male subjects, and estimates the degree of pronation/supination of the user.
- The estimation device according to claim 4, wherein, when the user is an elderly person, the feature quantity extraction means extracts the feature quantity from the walking waveform of the angle in the coronal plane in an elderly feature extraction period including a period immediately after heel strike and a period immediately before the heel strike, and the estimation means inputs the feature quantity extracted by the feature quantity extraction means into an elderly estimation model trained on the feature quantities extracted in the elderly feature extraction period for a plurality of elderly subjects, and estimates the degree of pronation/supination of the user; and, when the user is a young person, the feature quantity extraction means extracts the feature quantity from the walking waveform of the angle in the coronal plane in a young-person feature extraction period including a period immediately before toe-off and a period immediately before the heel strike, and the estimation means inputs the feature quantity extracted by the feature quantity extraction means into a young-person estimation model trained on the feature quantities extracted in the young-person feature extraction period for a plurality of young subjects, and estimates the degree of pronation/supination of the user.
- The estimation device according to any one of claims 4 to 6, wherein the estimation means outputs a determination result indicating whether the foot is pronated, supinated, or normal, according to the estimated value of the center-of-pressure trajectory index.
- An estimation system comprising: the estimation device according to any one of claims 1 to 7; and a data acquisition device that is installed on a foot of the user, measures a spatial acceleration and a spatial angular velocity, generates sensor data based on the measured spatial acceleration and spatial angular velocity, and transmits the generated sensor data to the estimation device.
- An estimation method in which a computer extracts, from a walking waveform extracted from time-series data of sensor data based on a movement of a user's foot, a feature quantity according to an attribute of the user in a section in which a characteristic of a physical state according to the attribute of the user appears, and estimates the physical state of the user using the feature quantity extracted according to the attribute of the user.
- A non-transitory program recording medium on which is recorded a program that causes a computer to execute: a process of extracting, from a walking waveform extracted from time-series data of sensor data based on a movement of a user's foot, a feature quantity according to an attribute of the user in a section in which a characteristic of a physical state according to the attribute of the user appears; and a process of estimating the physical state of the user using the feature quantity extracted according to the attribute of the user.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/019305 WO2022244222A1 (ja) | 2021-05-21 | 2021-05-21 | Estimation device, estimation system, estimation method, and recording medium |
JP2023522150A JPWO2022244222A5 (ja) | 2021-05-21 | Estimation device, estimation system, estimation method, and program
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022244222A1 true WO2022244222A1 (ja) | 2022-11-24 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008161228A (ja) * | 2006-12-27 | 2008-07-17 | Yokogawa Electric Corp | Gait analysis system |
WO2015190042A1 (ja) * | 2014-06-13 | 2015-12-17 | パナソニックIpマネジメント株式会社 | Activity evaluation device, evaluation processing device, and program |
WO2017065241A1 (ja) * | 2015-10-14 | 2017-04-20 | 国立大学法人東京工業大学 | Automatic diagnosis device |
JP2019005340A (ja) * | 2017-06-27 | 2019-01-17 | 株式会社東芝 | Determination device, determination system, and determination program |
WO2019181483A1 (ja) * | 2018-03-23 | 2019-09-26 | パナソニックIpマネジメント株式会社 | Cognitive function evaluation device, cognitive function evaluation system, cognitive function evaluation method, and program |
WO2020202543A1 (ja) * | 2019-04-05 | 2020-10-08 | 日本電気株式会社 | Gait cycle determination system, gait cycle determination method, and program recording medium |
WO2020230282A1 (ja) * | 2019-05-15 | 2020-11-19 | 日本電気株式会社 | Determination device, determination method, and program recording medium |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022244222A1 (ja) | 2022-11-24 |
Legal Events
- 121 (Ep): the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 21940836; Country of ref document: EP; Kind code of ref document: A1
- WWE (Wipo information: entry into national phase). Ref document number: 2023522150; Country of ref document: JP
- WWE (Wipo information: entry into national phase). Ref document number: 18560462; Country of ref document: US
- NENP (Non-entry into the national phase). Ref country code: DE
- 122 (Ep): PCT application non-entry in European phase. Ref document number: 21940836; Country of ref document: EP; Kind code of ref document: A1