WO2012014714A1 - Walking change determination device (歩行変化判定装置) - Google Patents
Walking change determination device (歩行変化判定装置)
- Publication number
- WO2012014714A1 (application PCT/JP2011/066339, JP2011066339W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- feature point
- feature
- walking
- degree
- change
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/112—Gait analysis
- A61B5/1118—Determining activity level
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
- A61B5/1122—Determining geometric values, e.g. centre of rotation or angular range of movement of movement trajectories
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6823—Trunk, e.g. chest, back, abdomen, hip
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/746—Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
Definitions
- the present invention relates to a walking change determination device, and more particularly to a walking change determination device suitable for determining a change in a walking state of a user who wears the device at a predetermined site.
- The conventional technology is easily influenced by the user simply changing movement speed or by individual differences. For this reason, the degree of change in walking cannot be accurately determined and notified to the user.
- the present invention has been made to solve the above-described problems, and one of its purposes is to provide a walking change determination device that can more accurately determine the degree of change in walking.
- A walking change determination device includes a main body, an acceleration sensor that detects acceleration of the main body, and a control unit, and determines a change in walking of a user who wears the main body at a predetermined site.
- The control unit includes a specifying unit that specifies, based on the acceleration detected by the acceleration sensor, a trajectory during walking of the predetermined site to which the main body is attached, a first calculation unit that calculates a temporal change of the trajectory specified by the specifying unit, and a determination unit that determines a change degree, which is the degree of temporal change, based on the temporal change calculated by the first calculation unit.
- the walking change determination device further includes a storage unit.
- the control unit further includes a reception unit that receives an input of the degree of fatigue when the user is walking.
- the determination unit calculates the degree of change when received by the reception unit.
- the control unit further includes a storage control unit that stores the degree of fatigue received by the reception unit and the degree of change calculated by the determination unit in the storage unit in association with each other.
- the determination unit determines a fatigue degree corresponding to the change degree calculated by the determination unit based on the fatigue degree and the change degree stored in the storage unit.
- the walking change determination device further includes a notification unit.
- the control unit further includes a notification control unit that causes the notification unit to notify the degree of fatigue determined by the determination unit.
- The trajectory is a three-dimensional trajectory, during walking, of the predetermined site to which the main body is attached, from which the movement component in the advancing direction has been removed, and the trajectory forms a pattern that includes a plurality of feature points defining the features of the pattern.
- Based on the acceleration detected by the acceleration sensor, the specifying unit removes the movement component in the traveling direction and specifies the positions of the feature points of the trajectory projected onto each of the planes perpendicular to the three orthogonal directions, namely the vertical direction, the traveling direction, and the left-right direction.
- The first calculation unit includes a second calculation unit that calculates the values of the feature factors of the trajectory based on the positions specified by the specifying unit, a third calculation unit that calculates the value of an index indicating walking from the feature factor values calculated by the second calculation unit, using a correlation obtained in advance between the feature factor values and the index value, and a fourth calculation unit that calculates a temporal change amount of the index value calculated by the third calculation unit.
- the determination unit calculates the degree of change based on the temporal change amount calculated by the fourth calculation unit.
- the temporal change amount includes a temporal change amount of the walking posture.
- the control unit further includes a determination unit that determines the walking posture based on the value of the index calculated by the third calculation unit.
- the fourth calculation unit further calculates a temporal change amount of the walking posture determined by the determination unit.
- The determination unit further calculates, based on the temporal change amount calculated by the fourth calculation unit, a change degree that includes a posture change degree, which is the degree of temporal change of the walking posture.
- the correlation is represented by a multiple regression equation that is a relational expression between the value of the characteristic factor as the objective variable and the value of the index as the explanatory variable, obtained by multiple regression analysis.
- The feature points include a first feature point at the time the first foot contacts the ground, a second feature point at the time the locus reaches its highest position while standing on the first foot, a third feature point at the time the second foot contacts the ground, and a fourth feature point at the time the locus reaches its highest position while standing on the second foot.
- The feature factors include a first feature factor, which is the vertical distance between the first feature point and the second feature point in the locus projected onto the plane perpendicular to the traveling direction, and a second feature factor, which is calculated from the distance between the first feature point and the second feature point and the distance between the third feature point and the fourth feature point in the locus projected onto the plane perpendicular to the left-right direction.
- the index includes stride.
- The multiple regression equation is a formula that calculates the sum of the product of a first partial regression coefficient obtained by the multiple regression analysis and the first feature factor, the product of a second partial regression coefficient obtained by the multiple regression analysis and the second feature factor, and a third partial regression coefficient.
- The feature points include a first feature point at the time the first foot contacts the ground, a second feature point at the time the locus reaches its highest position while standing on the first foot, a third feature point on the rightmost side of the locus, a fourth feature point on the leftmost side of the locus, a fifth feature point on the frontmost right side of the locus, a sixth feature point on the frontmost left side of the locus, a seventh feature point on the rearmost right side of the locus, and an eighth feature point on the rearmost left side of the locus.
- The feature factors include a first feature factor, which is the quotient of the vertical distance between the first feature point and the second feature point divided by the left-right distance between the third feature point and the fourth feature point in the locus projected onto the plane perpendicular to the traveling direction, and a second feature factor, which is the quotient of the left-right distance between the fifth feature point and the sixth feature point divided by the left-right distance between the seventh feature point and the eighth feature point in the locus projected onto the plane perpendicular to the vertical direction.
- the index includes the step interval.
- The multiple regression equation is a formula that calculates the sum of the product of a first partial regression coefficient obtained by the multiple regression analysis and the first feature factor, the product of a second partial regression coefficient obtained by the multiple regression analysis and the second feature factor, and a third partial regression coefficient.
- The walking change determination device specifies, based on the acceleration detected by the acceleration sensor, the locus during walking of the predetermined site of the user to which the main body is attached, calculates the temporal change of the specified locus, and, based on the calculated temporal change, calculates and determines the change degree, which is the degree of temporal change.
- In this embodiment, the walking posture determination device will be described as an activity meter capable of measuring not only the number of steps but also the amount of activity (also referred to as the amount of exercise) in exercise and daily activities (for example, vacuuming, carrying light luggage, cooking, etc.). However, the present invention is not limited to this, and the walking posture determination device may be a pedometer capable of measuring the number of steps.
- FIG. 1 is an external view of an activity meter 100 according to the embodiment of the present invention.
- the activity meter 100 is mainly composed of a main body portion 191 and a clip portion 192.
- the clip unit 192 is used to fix the activity meter 100 to a user's clothes or the like.
- The main body 191 includes a display change/decision switch 131, a left operation/memory switch 132, a right operation switch 133, and a display 141 that is a part of a display unit 140 described later.
- The display 141 is configured by a liquid crystal display (LCD), but is not limited thereto, and may be another type of display such as an EL (ElectroLuminescence) display.
- FIG. 2 is a diagram showing a usage state of the activity meter 100 in this embodiment.
- activity meter 100 is attached to a belt on a user's waist using clip portion 192, for example. In this embodiment, it is desirable that the activity meter 100 is fixedly mounted near the user's waist.
- In the following, a coordinate system is used in which the user's traveling direction during walking is the Z axis (forward as the positive direction), the user's left-right direction during walking is the X axis (rightward as the positive direction), and the vertical direction is the Y axis (vertically upward as the positive direction).
- FIG. 3 is a diagram showing a first example in which the hip locus during walking of the user is viewed from the direction of walking.
- FIG. 4 is a diagram illustrating a second example in which the hip locus during walking of the user is viewed from the walking direction.
- FIG. 3A and FIG. 4A are diagrams in which the hip locus during walking is superimposed on the user's image.
- FIG. 3B and FIG. 4B are graphs showing the locus of the waist when the user walks.
- this locus is a locus projected on the XY plane, which is a plane perpendicular to the Z-axis, during walking.
- During walking, after the right foot leaves the ground, the waist usually reaches its highest position and then the right foot touches the ground; similarly, after the left foot leaves the ground, the waist again reaches its highest position and then the left foot touches the ground.
- Accordingly, the locus of the user's waist follows a specific pattern: it first moves from the lower right toward the upper left and reaches the highest position on the upper left, then moves toward the lower left and reaches the lowest position on the lower left, then moves toward the upper right and reaches the highest position on the upper right, and finally heads toward the lower right and reaches the lowest position on the lower right.
- FIG. 5 is a diagram showing a plurality of examples in which the hip locus during walking of the user is viewed from the direction of walking.
- FIG. 5 (A) is a diagram similar to the diagram shown in FIG. 3 (B).
- FIG. 5A shows the locus of the waist when the user walks in the normal walking posture.
- FIG. 5B is a diagram similar to the diagram illustrated in FIG.
- FIG. 5B shows the locus of the waist when the user walks in a case where the step is wider than in the case of FIG.
- FIG. 5C shows the locus of the waist when the user walks when the step is narrower than in the case of FIG.
- FIG. 5 (D) shows the locus of the waist when the user walks when walking with a skate rather than the case of FIG. 5 (A).
- FIG. 5 (E) shows the locus of the waist when the user walks when walking with the back of his back, rather than in the case of FIG. 5 (A).
- FIG. 5 (F) shows the locus of the user's waist when walking on a forehead than in the case of FIG. 5 (A).
- Although the loci in FIG. 5(A) to FIG. 5(F) look different from one another, they all have the specific pattern described with reference to FIG. 3 and FIG. 4.
- FIG. 6 is a diagram showing, in this embodiment, the correlation between the user's waist trajectory calculated from acceleration data and the actually measured waist trajectory of the user.
- FIG. 6 (A) is a diagram of an actually measured waist trajectory of the user when viewed from the direction of walking.
- FIG. 6A is a view similar to FIG. 3B, FIG. 4B, and FIG. 5A to FIG. 5F.
- The locus shown in FIG. 6(A) is obtained by, for example, photographing the walking user from the traveling direction with a camera and connecting the movements of a certain point near the waist by image processing.
- FIG. 6 (B) is a view of the hip locus during walking of the user, calculated from the acceleration data, as seen from the direction of walking.
- a method of calculating the hip locus during walking of the user based on the triaxial acceleration data detected by the acceleration sensor of the activity meter 100 will be described. This locus is calculated by the control unit of the activity meter 100.
- First, the accelerations Ax(t), Ay(t), and Az(t) in the X-axis, Y-axis, and Z-axis directions described with reference to FIG. 2 are specified.
- The detected values obtained by the acceleration sensor may be used directly as the accelerations Ax(t), Ay(t), and Az(t) in the X-axis, Y-axis, and Z-axis directions, or the accelerations Ax(t), Ay(t), and Az(t) in these directions may be calculated by coordinate conversion of the detected values obtained by the acceleration sensor.
- Next, the speeds excluding the average speed component over a short time (here, the time of ±1 step), that is, the relative speeds Vx′(t), Vy′(t), and Vz′(t), are calculated.
- the time for one step is T seconds, and T is calculated, for example, by calculating the time between acceleration peaks for each step.
- The relative positions X(t), Y(t), and Z(t) with respect to the average position over a short time are then calculated by integrating the relative speeds, and the points (X(t), Y(t)) having the calculated positions X(t) and Y(t) as their X and Y coordinate values are plotted on the XY plane while changing t.
- In this way, a trajectory obtained by projecting the trajectory of the user's waist onto the XY plane is obtained.
- An example of this locus is the locus shown in FIG. 6(B).
- These trajectories are traces of patterns as shown in FIGS. 7A to 9A, which will be described later.
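- The integration just described can be sketched as follows in Python; this is a minimal illustration assuming uniformly sampled acceleration arrays, and the moving-average window of ±1 step, the function names, and the parameters are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def relative_trajectory(ax, ay, az, fs, step_period):
    """Sketch of the trajectory calculation described above.

    ax, ay, az  : acceleration arrays in the X, Y, Z directions [m/s^2]
    fs          : sampling frequency [Hz] (assumed constant)
    step_period : time T of one step [s], e.g. the interval between
                  acceleration peaks of successive steps

    Returns the relative positions X(t), Y(t), Z(t) with the average
    component over roughly +/- one step removed, i.e. the waist locus
    about its short-time mean position.
    """
    dt = 1.0 / fs
    win = max(1, int(round(2 * step_period * fs)))  # window of about +/- 1 step

    def moving_average(x):
        return np.convolve(x, np.ones(win) / win, mode="same")

    def relative_integral(signal):
        cum = np.cumsum(signal) * dt          # integrate over time
        return cum - moving_average(cum)      # remove the short-time average

    # acceleration -> relative speed (average speed over +/- 1 step removed)
    vx, vy, vz = (relative_integral(a) for a in (ax, ay, az))
    # relative speed -> relative position about the short-time mean position
    X, Y, Z = (relative_integral(v) for v in (vx, vy, vz))
    return X, Y, Z

# Plotting (X(t), Y(t)) gives the locus projected onto the XY plane,
# (X(t), Z(t)) the XZ-plane locus, and (Z(t), Y(t)) the YZ-plane locus.
```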
- FIG. 7 is a diagram for explaining the feature points included in the locus pattern projected on the XY plane in this embodiment.
- FIG. 8 is a diagram for explaining the feature points included in the locus pattern projected on the XZ plane in this embodiment.
- FIG. 9 is a diagram for explaining the feature points included in the locus pattern projected on the YZ plane in this embodiment.
- feature point (1) is a point when the right foot touches down in the walking cycle.
- The condition for specifying the feature point (1) is that it is on the right side in the left-right direction and at the lowest position in the vertical direction.
- Feature point (2) is a point when the right foot is standing in the walking cycle (particularly a point when the user's waist is at the highest position in the vertical direction).
- the condition for specifying the feature point (2) is a condition that it is after the feature point (1) and is the highest in the vertical direction.
- Feature point (3) is the point when the left foot touches down in the walking cycle.
- the condition for specifying the feature point (3) is a condition that it is after the feature point (2) and is the lowest in the vertical direction.
- Feature point (4) is a point when the left foot is standing in the walking cycle (particularly a point when the user's waist is at the highest position in the vertical direction).
- the condition for specifying the feature point (4) is a condition that it is after the feature point (3) and is the highest in the vertical direction.
- Feature point (5) is the point when the right foot touches down in the walking cycle.
- the condition for specifying the feature point (5) is the condition that it is after the feature point (4) and is the lowest in the vertical direction.
- This feature point (5) is the feature point (1) of the next one cycle.
- Feature point (6) is a point when the user's waist is on the rightmost side in the walking cycle.
- The condition for specifying the feature point (6) is that the value of X(t) calculated by Expression 7 is the maximum among the values with X(t) ≥ 0 in one cycle.
- Feature point (7) is a point when the user's waist is on the leftmost side in the walking cycle.
- The condition for specifying the feature point (7) is that the value of X(t) calculated by Expression 7 is the minimum among the values with X(t) ≤ 0 in one cycle.
- Feature point (8) is the intersection of the loci of the waist in one walking cycle in the walking cycle.
- The condition for specifying the feature point (8) is that it is the intersection, on the XY plane, of the waist locus from the feature point (2) to the feature point (3) and the waist locus from the feature point (4) to the feature point (5).
- the feature point (9) is a point when the right foot touches down in the walking cycle.
- The condition for specifying the feature point (9) is that it is on the right side in the left-right direction and at the rearmost position in the front-rear direction.
- the feature point (10) is a point when the right foot is standing in the walking cycle (particularly a point when the user's waist is at the foremost position relative to the average position in a short time in the traveling direction).
- the condition for specifying the feature point (10) is a condition that it is after the feature point (9) and is the foremost in terms of front and rear.
- Feature point (11) is the point when the left foot touches down in the walking cycle.
- the condition for specifying the feature point (11) is a condition that it is after the feature point (10) and is at the back of the front and rear.
- the feature point (12) is a point when the left foot is standing in the walking cycle (particularly a point when the user's waist is at the forefront relative position to the average position in a short time in the traveling direction).
- the condition for specifying the feature point (12) is a condition that it is after the feature point (11) and is the foremost in terms of front and rear.
- Feature point (13) is the point when the right foot touches down in the walking cycle.
- The condition for specifying the feature point (13) is that it is after the feature point (12) and at the rearmost position in the front-rear direction.
- This feature point (13) is the feature point (9) of the next one cycle.
- Feature point (14) is the intersection of the loci of the waist in one walking cycle in the walking cycle.
- The condition for specifying the feature point (14) is that it is the intersection, on the XZ plane, of the waist locus from the feature point (10) to the feature point (11) and the waist locus from the feature point (12) to the feature point (13).
- feature points (1), (3), and (5) described in FIG. 7 are the lowest points in the locus pattern projected on the YZ plane.
- the feature points (2) and (4) are the uppermost points in the locus pattern projected on the YZ plane.
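- As an illustration of how the conditions above might be applied to a computed locus, the following Python sketch locates feature points (1) to (4), (6), and (7) within one walking cycle; the quarter-cycle search windows and the assumption that the cycle starts at right-foot contact are simplifications for illustration, not requirements of the patent.

```python
import numpy as np

def xy_feature_points(X, Y):
    """Sketch: feature points (1)-(4), (6), (7) of one gait cycle of the
    XY-plane locus, following the conditions described above. X and Y are
    assumed to hold exactly one walking cycle starting at right-foot contact."""
    n = len(Y)
    q = n // 4
    p1 = int(np.argmin(Y[0:q]))                  # lowest, right side: right foot contacts the ground
    p2 = q + int(np.argmax(Y[q:2 * q]))          # highest after (1): standing on the right foot
    p3 = 2 * q + int(np.argmin(Y[2 * q:3 * q]))  # lowest after (2): left foot contacts the ground
    p4 = 3 * q + int(np.argmax(Y[3 * q:]))       # highest after (3): standing on the left foot
    p6 = int(np.argmax(X))                       # rightmost point of the cycle
    p7 = int(np.argmin(X))                       # leftmost point of the cycle
    return {1: p1, 2: p2, 3: p3, 4: p4, 6: p6, 7: p7}
```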
- FIG. 10 is a diagram for explaining the feature factor calculated based on the position of the feature point included in the locus pattern projected on the XY plane in this embodiment.
- FIG. 11 is a diagram for explaining the feature factor calculated based on the position of the feature point included in the locus pattern projected on the XZ plane in this embodiment.
- FIG. 12 is a diagram for explaining the feature factor calculated based on the position of the feature point included in the locus pattern projected on the YZ plane in this embodiment.
- The feature factor Wu is the distance in the X-axis direction between the feature point (2) and the feature point (4) on the XY plane (referred to as the "upper left-right width"), and is calculated by subtracting the X coordinate value of the feature point (2) from the X coordinate value of the feature point (4).
- The feature factor Wd is the distance in the X-axis direction between the feature point (1) and the feature point (3) on the XY plane (referred to as the "lower left-right width"), and is calculated by subtracting the X coordinate value of the feature point (3) from the X coordinate value of the feature point (1).
- The feature factor W is the distance in the X-axis direction between the feature point (6) and the feature point (7) on the XY plane (referred to as the "left-right width"), and is calculated by subtracting the X coordinate value of the feature point (7) from the X coordinate value of the feature point (6).
- The feature factor Hl is the distance in the Y-axis direction between the feature point (4) and the feature point (3) on the XY plane (referred to as the "left vertical width"), and is calculated by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (4).
- The feature factor Hr is the distance in the Y-axis direction between the feature point (2) and the feature point (1) on the XY plane (referred to as the "right vertical width"), and is calculated by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (2).
- The feature factor H is the average of the feature factor Hl and the feature factor Hr on the XY plane (referred to as the "vertical width"), and is calculated by adding Hl and Hr and dividing by two.
- The feature factor Hcl is the height of the feature point (8) relative to the feature point (3) on the XY plane (referred to as the "left cross point height"), and is calculated by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (8).
- The feature factor Hcr is the height of the feature point (8) relative to the feature point (1) on the XY plane (referred to as the "right cross point height"), and is calculated by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (8).
- The feature factor ISO is the height of the feature point (8) relative to the vertical width of the trajectory on the XY plane (referred to as the "phase"), and is calculated by dividing the feature factor Hcl by the feature factor Hl, dividing the feature factor Hcr by the feature factor Hr, adding the two quotients, and dividing by two.
- The feature factor Vlev is the degree to which the trajectory on the XY plane is wider at the top or at the bottom, and is calculated by dividing the feature factor Wu by the feature factor Wd.
- The feature factor Ilev is a factor for specifying whether the shape of the locus on the XY plane is vertically long or horizontally long (referred to as "shape I"), and is calculated by dividing the feature factor H by the feature factor W.
- The feature factor Hb is the ratio of the left and right vertical widths on the XY plane (referred to as the "left/right vertical width ratio"), and is calculated by dividing the feature factor Hr by the feature factor Hl.
- The feature factor Yb is the ratio of the left and right heights on the XY plane (referred to as the "left/right height ratio"), and is calculated by dividing the difference between the Y coordinate value of the feature point (4) and the Y coordinate value of the feature point (1) by the difference between the Y coordinate value of the feature point (2) and the Y coordinate value of the feature point (3).
- The feature factor Wb is the ratio of the left and right widths on the XY plane (referred to as the "left/right width ratio"), and is calculated by dividing the difference between the X coordinate value of the feature point (6) and the X coordinate value of the feature point (8) by the difference between the X coordinate value of the feature point (8) and the X coordinate value of the feature point (7).
- The feature factor Stl is the sum of the vertical amplitudes from the right foot contacting the ground until the left foot contacts the ground on the XY plane (referred to as the "vertical amplitude from right-foot contact to left-foot contact"), and is calculated by adding the value obtained by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (2) and the value obtained by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (2).
- The feature factor Str is the sum of the vertical amplitudes from the left foot contacting the ground until the right foot contacts the ground on the XY plane (referred to as the "vertical amplitude from left-foot contact to right-foot contact"), and is calculated by adding the value obtained by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (4) and the value obtained by subtracting the Y coordinate value of the feature point (5) from the Y coordinate value of the feature point (4).
- The feature factor "jun" is a factor indicating whether the locus is drawn clockwise or counterclockwise (referred to as the "writing order"), and is determined from the signs (positive or negative) of the X coordinate values of the feature point (2) and the feature point (4).
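- A few of these XY-plane feature factors can be computed directly from the feature-point positions, as in the following sketch; it reuses the hypothetical pts mapping from the previous sketch and covers only a subset of the factors listed above.

```python
def xy_feature_factors(pts, X, Y):
    """Sketch: some XY-plane feature factors from the definitions above.
    pts maps a feature-point number to a sample index in the arrays X, Y."""
    x = lambda k: X[pts[k]]
    y = lambda k: Y[pts[k]]
    Wu = x(4) - x(2)        # upper left-right width
    Wd = x(1) - x(3)        # lower left-right width
    W = x(6) - x(7)         # left-right width
    Hl = y(4) - y(3)        # left vertical width
    Hr = y(2) - y(1)        # right vertical width
    H = (Hl + Hr) / 2.0     # vertical width
    return {"Wu": Wu, "Wd": Wd, "W": W, "Hl": Hl, "Hr": Hr, "H": H,
            "Vlev": Wu / Wd, "Ilev": H / W, "Hb": Hr / Hl}
```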
- The feature factor WuSu is the distance in the X-axis direction between the feature point (10) and the feature point (12) on the XZ plane (referred to as the "upper left-right width"), and is calculated by subtracting the X coordinate value of the feature point (10) from the X coordinate value of the feature point (12).
- The feature factor WdSu is the distance in the X-axis direction between the feature point (9) and the feature point (11) on the XZ plane (referred to as the "lower left-right width"), and is calculated by subtracting the X coordinate value of the feature point (11) from the X coordinate value of the feature point (9).
- The feature factor Wsu is the distance in the X-axis direction between the feature point (6) and the feature point (7) on the XZ plane (referred to as the "left-right width"), and is calculated by subtracting the X coordinate value of the feature point (7) from the X coordinate value of the feature point (6).
- The feature factor HlSu is the distance in the Z-axis direction between the feature point (12) and the feature point (11) on the XZ plane (referred to as the "left vertical width"), and is calculated by subtracting the Z coordinate value of the feature point (11) from the Z coordinate value of the feature point (12).
- The feature factor HrSu is the distance in the Z-axis direction between the feature point (10) and the feature point (9) on the XZ plane (referred to as the "right vertical width"), and is calculated by subtracting the Z coordinate value of the feature point (9) from the Z coordinate value of the feature point (10).
- The feature factor Hsu is the average of the feature factor HlSu and the feature factor HrSu on the XZ plane (referred to as the "vertical width"), and is calculated by adding HlSu and HrSu and dividing by two.
- The feature factor HclSu is the height of the feature point (8) relative to the feature point (11) on the XZ plane (referred to as the "left cross point height"), and is calculated by subtracting the Z coordinate value of the feature point (11) from the Z coordinate value of the feature point (8).
- The feature factor HcrSu is the height of the feature point (8) relative to the feature point (9) on the XZ plane (referred to as the "right cross point height"), and is calculated by subtracting the Z coordinate value of the feature point (9) from the Z coordinate value of the feature point (8).
- The feature factor ISOSu is the height of the feature point (14) relative to the vertical width of the trajectory (referred to as the "phase"), and is the same value as ISO on the XY plane described with reference to FIG. 10.
- The feature factor VlevSu is the degree to which the trajectory on the XZ plane is wider at the top or at the bottom, and is calculated by dividing the feature factor WuSu by the feature factor WdSu.
- The feature factor IlevSu is a factor for specifying whether the shape of the trajectory on the XZ plane is vertically long or horizontally long (referred to as "shape I"), and is calculated by dividing the feature factor Hsu by the feature factor Wsu.
- The feature factor HbSu is the ratio of the left and right vertical widths on the XZ plane (referred to as the "left/right vertical width ratio"), and is calculated by dividing the feature factor HrSu by the feature factor HlSu.
- The feature factor YbSu is the ratio of the left and right heights on the XZ plane (referred to as the "left/right height ratio"), and is calculated by dividing the difference between the Z coordinate value of the feature point (13) and the Z coordinate value of the feature point (9) by the difference between the Z coordinate value of the feature point (10) and the Z coordinate value of the feature point (11).
- The feature factor WbSu is the ratio of the left and right widths on the XZ plane (referred to as the "left/right width ratio"), and is the same value as Wb on the XY plane described with reference to FIG. 10.
- The feature factor StlSu is the sum of the front-rear amplitudes from the right foot contacting the ground until the left foot contacts the ground on the XZ plane (referred to as the "front-rear amplitude from right-foot contact to left-foot contact"), and is calculated by adding the value obtained by subtracting the Z coordinate value of the feature point (9) from the Z coordinate value of the feature point (10) and the value obtained by subtracting the Z coordinate value of the feature point (11) from the Z coordinate value of the feature point (10).
- The feature factor StrSu is the sum of the front-rear amplitudes from the left foot contacting the ground until the right foot contacts the ground on the XZ plane (referred to as the "front-rear amplitude from left-foot contact to right-foot contact"), and is calculated by adding the value obtained by subtracting the Z coordinate value of the feature point (11) from the Z coordinate value of the feature point (12) and the value obtained by subtracting the Z coordinate value of the feature point (13) from the Z coordinate value of the feature point (12).
- The feature factor Zfl is the width by which the waist moves back and forth after reaching its highest point while standing on the left foot, on the XZ plane (referred to as the "back-and-forth movement of the waist from the highest point of the left stance"), and is calculated by subtracting the Z coordinate value of the feature point (4) from the Z coordinate value of the feature point (12).
- The feature factor Zfr is the width by which the waist moves back and forth after reaching its highest point while standing on the right foot, on the XZ plane (referred to as the "back-and-forth movement of the waist from the highest point of the right stance"), and is calculated by subtracting the Z coordinate value of the feature point (2) from the Z coordinate value of the feature point (10).
- The feature factor Zf is the width by which the waist moves back and forth after reaching its highest point during stance, on the XZ plane (referred to as the "back-and-forth movement of the waist from the highest point of the stance"), and is calculated by adding the feature factor Zfl and the feature factor Zfr and dividing by two.
- The feature factor Zbl is the width by which the waist moves back and forth after the left foot contacts the ground, on the XZ plane (referred to as the "back-and-forth movement of the waist from left-foot contact"), and is calculated by subtracting the Z coordinate value of the feature point (3) from the Z coordinate value of the feature point (11).
- The feature factor Zbr is the width by which the waist moves back and forth after the right foot contacts the ground, on the XZ plane (referred to as the "back-and-forth movement of the waist from right-foot contact"), and is calculated by subtracting the Z coordinate value of the feature point (5) from the Z coordinate value of the feature point (9).
- The feature factor Zb is the width by which the waist moves back and forth after the foot contacts the ground, on the XZ plane (referred to as the "back-and-forth movement of the waist from ground contact"), and is calculated by adding the feature factor Zbl and the feature factor Zbr and dividing by two.
- The feature factor dZ is the forward/backward tilt on the YZ plane (referred to as the "front-back tilt"), and is calculated by dividing the value obtained by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (2) by the value obtained by subtracting the Z coordinate value of the feature point (1) from the Z coordinate value of the feature point (2).
- The feature factor StlShi is the sum of the amplitudes in the left diagonal direction on the YZ plane (referred to as the "left front-rear amplitude"), and is calculated by adding the distance between the feature point (2) and the feature point (1) on the YZ plane and the distance between the feature point (2) and the feature point (3) on the YZ plane.
- The distance between the feature point (2) and the feature point (1) on the YZ plane is calculated as the square root of the sum of the square of the value obtained by subtracting the Z coordinate value of the feature point (1) from the Z coordinate value of the feature point (2) and the square of the value obtained by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (2).
- The distance between the feature point (2) and the feature point (3) on the YZ plane is calculated as the square root of the sum of the square of the value obtained by subtracting the Z coordinate value of the feature point (3) from the Z coordinate value of the feature point (2) and the square of the value obtained by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (2).
- The feature factor StrShi is the sum of the amplitudes in the right diagonal direction on the YZ plane (referred to as the "right front-rear amplitude"), and is calculated by adding the distance between the feature point (4) and the feature point (3) on the YZ plane and the distance between the feature point (4) and the feature point (1) on the YZ plane.
- The distance between the feature point (4) and the feature point (3) on the YZ plane is calculated as the square root of the sum of the square of the value obtained by subtracting the Z coordinate value of the feature point (3) from the Z coordinate value of the feature point (4) and the square of the value obtained by subtracting the Y coordinate value of the feature point (3) from the Y coordinate value of the feature point (4).
- The distance between the feature point (4) and the feature point (1) on the YZ plane is calculated as the square root of the sum of the square of the value obtained by subtracting the Z coordinate value of the feature point (1) from the Z coordinate value of the feature point (4) and the square of the value obtained by subtracting the Y coordinate value of the feature point (1) from the Y coordinate value of the feature point (4).
- The feature factor StShi is the sum of the amplitudes in the diagonal directions on the YZ plane (referred to as the "front-rear amplitude"), and is calculated by adding the feature factor StlShi and the feature factor StrShi and dividing by two.
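- The YZ-plane front-rear amplitudes can likewise be computed from the feature-point positions; the following sketch follows the distance definitions above, with the pts mapping again assumed to come from a separate feature-point identification step.

```python
import math

def yz_front_rear_amplitude(pts, Y, Z):
    """Sketch: StlShi, StrShi, and StShi from the YZ-plane definitions above.
    pts maps feature-point numbers (1)-(4) to sample indices; Y and Z are the
    relative vertical and front-rear positions over one gait cycle."""
    def dist(a, b):
        return math.hypot(Z[pts[a]] - Z[pts[b]], Y[pts[a]] - Y[pts[b]])
    StlShi = dist(2, 1) + dist(2, 3)   # left front-rear amplitude
    StrShi = dist(4, 3) + dist(4, 1)   # right front-rear amplitude
    return (StlShi + StrShi) / 2.0     # StShi: front-rear amplitude
```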
- FIG. 13 is a first diagram for explaining the correlation between the feature factor and the stride, which is one of the indices indicating the walking posture, in this embodiment.
- FIG. 14 is a second diagram for illustrating the correlation between the feature factor and the stride among the indices indicating the walking posture in this embodiment.
- In FIG. 13, the feature factor Hr of the locus pattern projected on the XY plane described with reference to FIG. 10 is taken as the vertical axis (y), and the actually measured value of the stride, which is one of the indices indicating the walking posture, is taken as the horizontal axis (x).
- In FIG. 14, the feature factor StShi of the locus pattern projected on the YZ plane described with reference to FIG. 12 is taken as the vertical axis (y), and the actually measured value of the stride, which is one of the indices indicating the walking posture, is taken as the horizontal axis (x).
- As shown in these figures, the stride, which is an index indicating the walking posture, has a high correlation with the feature factor Hr and the feature factor StShi. Therefore, by performing a multiple regression analysis on the feature factor Hr, the feature factor StShi, and the measured stride, a multiple regression equation Stride = α × Hr + β × StShi + γ for calculating the stride value can be obtained.
- Note that α, β, and γ are partial regression coefficients obtained by the multiple regression analysis.
- FIG. 15 is a first diagram for explaining the correlation between the feature factor and the step interval among the indices indicating the walking posture in this embodiment.
- FIG. 16 is a second diagram for explaining the correlation between the feature factor and the step interval among the indices indicating the walking posture in this embodiment.
- For the step interval, which is an index indicating the walking posture, a multiple regression analysis is similarly performed using the feature factor Hr/W and the feature factor WuSu/WdSu.
- As a result, a multiple regression equation for the step interval, Width = α × Hr/W + β × WuSu/WdSu + γ, can be used to calculate the step interval value. Note that α, β, and γ are partial regression coefficients obtained by the multiple regression analysis.
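- As a rough illustration of how such multiple regression equations can be fitted and then applied, the following Python sketch estimates the coefficients by least squares from paired feature-factor and measured-index values; the synthetic calibration data, function names, and numeric values are assumptions for illustration, not values from the patent.

```python
import numpy as np

def fit_regression(f1, f2, target):
    """Least-squares fit of target = alpha*f1 + beta*f2 + gamma.
    Returns the partial regression coefficients (alpha, beta, gamma)."""
    A = np.column_stack([f1, f2, np.ones_like(target)])
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    return coeffs

def estimate(coeffs, f1, f2):
    """Apply a fitted multiple regression equation to new feature-factor values."""
    alpha, beta, gamma = coeffs
    return alpha * f1 + beta * f2 + gamma

# Illustrative (synthetic) calibration data: feature factors per gait cycle and
# a corresponding "measured" stride [mm].
rng = np.random.default_rng(0)
Hr = rng.uniform(20, 60, 50)        # right vertical width per cycle
StShi = rng.uniform(80, 200, 50)    # front-rear amplitude per cycle
stride = 3.0 * Hr + 2.0 * StShi + 100 + rng.normal(0, 5, 50)

stride_coeffs = fit_regression(Hr, StShi, stride)
print(estimate(stride_coeffs, Hr[0], StShi[0]))   # stride estimated from the factors

# The step interval is handled in the same way, with Hr/W and WuSu/WdSu as the
# feature factors: Width = alpha*(Hr/W) + beta*(WuSu/WdSu) + gamma.
```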
- FIG. 17 is a block diagram showing an outline of the configuration of the activity meter 100 in this embodiment.
- activity meter 100 includes a control unit 110, a memory 120, an operation unit 130, a display unit 140, a sound report unit 150, an acceleration sensor 170, and a power source 190. Further, the activity meter 100 may include an interface for communicating with an external computer.
- the control unit 110, the memory 120, the operation unit 130, the display unit 140, the sound report unit 150, the acceleration sensor 170, and the power source 190 are built in the main body unit 191 described with reference to FIG.
- the operation unit 130 includes the display change / decision switch 131, the left operation / memory switch 132, and the right operation switch 133 described with reference to FIG. 1, and an operation signal indicating that these switches have been operated is sent to the control unit 110. Send.
- The acceleration sensor 170 is a semiconductor-type sensor using MEMS (Micro Electro Mechanical Systems) technology, but it is not limited to this and may be of another type such as a mechanical type or an optical type. In the present embodiment, acceleration sensor 170 outputs a detection signal indicating the acceleration in each of the three axial directions to control unit 110. However, the acceleration sensor 170 is not limited to the three-axis type and may be a one-axis or two-axis type.
- the memory 120 includes non-volatile memory such as ROM (Read Only Memory) (for example, flash memory) and volatile memory such as RAM (Random Access Memory) (for example, SDRAM (synchronous Dynamic Random Access Memory)).
- The memory 120 stores program data for controlling the activity meter 100, data used for controlling the activity meter 100, setting data for setting various functions of the activity meter 100, and measurement result data such as the number of steps and the amount of activity for each predetermined period (for example, for each day). The memory 120 is also used as a work memory when the program is executed.
- The control unit 110 includes a CPU (Central Processing Unit) and, in accordance with the program for controlling the activity meter 100 stored in the memory 120, controls the memory 120, the display unit 140, and the sound report unit 150 on the basis of operation signals from the operation unit 130 and detection signals from the acceleration sensor 170 and the atmospheric pressure sensor 180.
- the display unit 140 includes the display 141 described with reference to FIG. 1 and controls the display 141 to display predetermined information according to a control signal from the control unit 110.
- The sound report unit 150 outputs a predetermined sound from a speaker in accordance with a control signal from the control unit 110.
- the power source 190 includes a replaceable battery, and supplies power from the battery to each unit that requires power to operate, such as the control unit 110 of the activity meter 100.
- FIG. 18 is a functional block diagram showing an outline of the function of the activity meter 100 in this embodiment.
- The control unit 110 of the activity meter 100 includes an acceleration reading control unit 111, a feature point position specifying unit 112, a feature factor calculation unit 1131, an index calculation unit 1132, a walking posture determination unit 1133, a temporal change amount calculation unit 1134, a change degree determination unit 114, a fatigue degree input acceptance control unit 115, a fatigue degree determination unit 116, and a notification control unit 117.
- The memory 120 of the activity meter 100 includes an acceleration data storage unit 121, a feature point position storage unit 122, a feature factor storage unit 1231, a correlation storage unit 1232, an index storage unit 1233, a walking posture storage unit 1234, a temporal change amount storage unit 1235, a walking change fatigue correspondence storage unit 124, and a change degree storage unit 125.
- In this embodiment, these units included in the control unit 110 are configured in the control unit 110 by the control unit 110 executing software for performing the processing of FIG. 19, which will be described later.
- the present invention is not limited to this, and each of these units included in the control unit 110 may be configured inside the control unit 110 as a hardware circuit.
- each of these units included in the memory 120 is temporarily configured in the memory 120 by executing software for executing the processing of FIG. 19 described later by the control unit 110.
- the present invention is not limited to this, and each of these units included in the memory 120 may be configured as a dedicated storage device.
- each of these units included in the memory 120 may be temporarily configured in a built-in memory of the control unit 110 such as a register instead of being configured in the memory 120.
- The acceleration reading control unit 111 reads the detected values from the acceleration sensor 170 and obtains the accelerations Ax(t), Ay(t), and Az(t) in the three axial directions.
- The detected values obtained by the acceleration sensor may be used directly as the acceleration data Ax(t), Ay(t), and Az(t) in the X-axis, Y-axis, and Z-axis directions, or the acceleration data Ax(t), Ay(t), and Az(t) in these directions may be calculated by coordinate conversion of the detected values obtained by the acceleration sensor.
- the acceleration reading control unit 111 stores the acceleration data Ax (t), Ay (t), Az (t) calculated for each sampling period in the acceleration data storage unit 121 of the memory 120.
- Based on the acceleration data Ax(t), Ay(t), and Az(t) stored in the acceleration data storage unit 121, the feature point position specifying unit 112 calculates, in the manner described above, the relative positions X(t), Y(t), and Z(t) of the activity meter 100 in the X-axis, Y-axis, and Z-axis directions with respect to the average position over a short time (here, ±1 step time (±T seconds)).
- FIG. 19 is a first diagram showing a locus of a predetermined part of the user calculated by the activity meter 100 in this embodiment.
- FIG. 20 is a second diagram showing the locus of the predetermined part of the user calculated by the activity meter 100 in this embodiment.
- the trajectory shown in FIG. 19A to FIG. 19C is a waist trajectory that is a predetermined part when the user is fine.
- the trajectory shown in FIG. 20A to FIG. 20C is a trajectory of the waist that is a predetermined part when the user is tired.
- In the trajectories of FIG. 20, the amplitude from the center is larger than when the user is fine, that is, the sway of the body axis is larger.
- In addition, the variation of the locus from cycle to cycle is larger.
- The feature point position specifying unit 112 specifies the coordinate values of the positions of the feature points from the calculated positions X(t), Y(t), and Z(t) by the methods described with reference to FIGS. 7 to 9.
- That is, based on the acceleration detected by the acceleration sensor 170, the feature point position specifying unit 112 removes the movement component in the Z-axis direction and specifies the positions of the feature points of the trajectories projected onto the XZ plane, the XY plane, and the YZ plane, which are perpendicular to the Y-axis direction (vertical direction), the Z-axis direction (traveling direction), and the X-axis direction (left-right direction), respectively.
- the feature point position specifying unit 112 causes the feature point position storage unit 122 to store the calculated position of the feature point.
- the feature factor calculation unit 1131 calculates the value of the feature factor according to the calculation formula described with reference to FIGS. 10 to 12 based on the position of the feature point stored in the feature point position storage unit 122. Then, the feature factor calculation unit 1131 stores the calculated feature factor value in the feature factor storage unit 1231.
- The correlation storage unit 1232 stores in advance the multiple regression equations described with reference to FIGS. 13 to 16.
- The index calculation unit 1132 calculates the values of the indices indicating the walking posture (for example, the stride, the step interval, the rotation of the hips, the foot-lifting height, the back-muscle stretch, the center-of-gravity balance, etc.) based on the values of the feature factors stored in the feature factor storage unit 1231 and the multiple regression equations stored in the correlation storage unit 1232. Then, the index calculation unit 1132 causes the index storage unit 1233 to store the calculated index values.
- FIG. 21 is a graph showing the user's stride calculated by the activity meter 100 in this embodiment.
- FIG. 22 is a graph showing the user's step calculated by the activity meter 100 in this embodiment.
- 21 and 22 show the user's stride and step calculated based on the user's waist locus described with reference to FIGS. 19 and 20, respectively.
- In each graph, the value on the left is the value when the user is fine, and the value on the right is the value when the user is tired.
- the stride when the measurement target user is fine is about 675 mm, and the stride when fatigued is about 500 mm.
- the step interval when the measurement target user is healthy is about 84 mm, and the step interval when fatigued is about 130 mm.
- the walking posture determination unit 1133 determines the walking posture based on the index value stored in the index storage unit 1233. Then, the walking posture determination unit 1133 stores the determined walking posture in the walking posture storage unit 1234 of the memory 120.
- The temporal change amount calculation unit 1134 calculates the temporal change amount of the walking posture that was determined based on the index values and stored in the walking posture storage unit 1234, and stores it in the temporal change amount storage unit 1235 of the memory 120. For example, it calculates how much an index indicating the walking posture, such as the stride or the step interval, has changed since the start of exercise. Alternatively, it may calculate how much the time differential value of an index indicating the walking posture, such as the stride or the step interval, has changed.
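- As a minimal illustration of such a temporal change amount, the following Python sketch compares the latest value of a walking-posture index with a baseline taken near the start of exercise; the choice of baseline and the function name are assumptions for illustration, not from the patent.

```python
def temporal_change(index_history, baseline_count=10):
    """Sketch: temporal change amount of a walking-posture index such as the
    stride or the step interval. index_history holds the per-cycle index values
    since the start of exercise; the change amount is the latest value minus a
    baseline taken as the mean of the first baseline_count values."""
    count = min(baseline_count, len(index_history))
    baseline = sum(index_history[:count]) / count
    return index_history[-1] - baseline
```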
- The change degree determination unit 114 determines the change degree, which is the degree of temporal change, based on the temporal change amount calculated by the temporal change amount calculation unit 1134 and stored in the temporal change amount storage unit 1235, and stores the determined change degree in the change degree storage unit 125 of the memory 120.
- In this embodiment, the posture change degree, which is the degree of temporal change of the walking posture, is determined as the change degree.
- However, the present invention is not limited to this, and any other degree of change may be used as long as it represents how the trajectory changes with time.
- the degree of change of the index indicating the walking posture may be used.
- The fatigue degree input acceptance control unit 115 performs control to accept an input of the fatigue degree from the operation unit 130 when operated by the user.
- As data for determining the fatigue degree in the future, the fatigue degree whose input was accepted and the change degree determined by the change degree determination unit 114 are stored in association with each other in the walking change fatigue correspondence storage unit 124 of the memory 120.
- The fatigue degree determination unit 116 determines the fatigue degree corresponding to the change degree that was calculated by the change degree determination unit 114 and stored in the change degree storage unit 125, based on the fatigue degrees and change degrees stored in the walking change fatigue correspondence storage unit 124. The following methods can be considered for determining the fatigue degree.
- For example, the correspondence relationship between the fatigue degree and the change amounts ΔSw, ΔSp, and ΔSh is accumulated in the memory 120 (the walking change fatigue correspondence storage unit 124), and based on the accumulated data the threshold values ΔSwth, ΔSpth, and ΔShth are determined and set.
- When any one of the change amounts ΔSw, ΔSp, and ΔSh becomes equal to or less than its threshold value, the fatigue degree determination unit 116 determines that the fatigue degree is at the first stage; when any two become equal to or less than their threshold values, it determines that the fatigue degree is at the second stage; and when all of them become equal to or less than their threshold values, it determines that the fatigue degree is at the third stage.
- Conversely, in the case of the third-stage fatigue degree, when any one of the change amounts ΔSw, ΔSp, and ΔSh becomes equal to or greater than its threshold value, it may be determined that the user has recovered to the second-stage fatigue degree; in the case of the second-stage fatigue degree, when any two become equal to or greater than their threshold values, it may be determined that the user has recovered to the first-stage fatigue degree; and in the case of the first-stage fatigue degree, when all of them become equal to or greater than their threshold values, it may be determined that the user has recovered from fatigue.
- The threshold for determining that fatigue has recovered may be made stricter than the threshold for determining that the fatigue degree has increased (in the case of the stride, the walking pitch, and the foot-lifting height, the thresholds ΔSwth, ΔSpth, and ΔShth are increased). As a result, it is possible to prevent the user from being erroneously determined to have recovered even though the fatigue has not recovered, and thus from being harmed, thereby further ensuring the safety of the user.
- the initial value is the step width Sw0, the walking pitch Sp0, and the foot-lifting height Sh0 at the start of walking.
- the present invention is not limited to this, and an average value for a predetermined time (for example, 2 hours) from the start of exercise may be used as the initial value.
- the degree of fatigue is determined based on the relationship between the change amount, which is the difference between the current value and the initial value, and the threshold value.
- the present invention is not limited to this, and the degree of fatigue may be determined based on the relationship between the change rate of the current value from the initial value and the threshold value.
- In the above description, the fatigue level is raised as soon as the threshold is crossed.
- However, the present invention is not limited to this, and the fatigue level may be raised only when the state in which the threshold is crossed continues for a certain time (for example, 1 minute).
- Alternatively, let σw, σp, and σh be the variations in the stride, the walking pitch, and the foot-lifting height over a predetermined time (for example, 1 minute). The correspondence relationship between the fatigue degree and the variations σw, σp, and σh is accumulated in the memory 120 (the walking change fatigue correspondence storage unit 124), and based on the accumulated data the threshold values σwth, σpth, and σhth are defined.
- When any one of the variations σw, σp, and σh becomes equal to or less than its threshold value, the fatigue degree determination unit 116 determines that the fatigue degree is at the first stage; when any two become equal to or less than their threshold values, it determines that the fatigue degree is at the second stage; and when all of them become equal to or less than their threshold values, it determines that the fatigue degree is at the third stage.
- Alternatively, when any one of the time differential values Swt′, Spt′, and Sht′ becomes equal to or less than its threshold value, the fatigue degree determination unit 116 determines that the fatigue degree is at the first stage; when any two become equal to or less than their threshold values, it determines that the fatigue degree is at the second stage; and when all of them become equal to or less than their threshold values, it determines that the fatigue degree is at the third stage.
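- The staged determination described above can be illustrated with the following Python sketch; the function name, the dictionary layout, and the example threshold values are assumptions for illustration, and the stricter recovery thresholds (hysteresis) are omitted.

```python
def fatigue_stage(change, thresholds):
    """Sketch of the staged fatigue determination.

    change     : change amounts per index since the start of walking,
                 e.g. {"Sw": dSw, "Sp": dSp, "Sh": dSh}
    thresholds : the corresponding thresholds, e.g. {"Sw": dSwth, ...}

    Returns 0 (no fatigue) to 3 (third-stage fatigue): the stage equals the
    number of indices whose change amount is at or below its threshold."""
    crossed = sum(1 for key in change if change[key] <= thresholds[key])
    return min(crossed, 3)

# Usage sketch: the stride shortened by 180 mm while the walking pitch and the
# foot-lifting height changed little -> first-stage fatigue.
stage = fatigue_stage({"Sw": -180.0, "Sp": -2.0, "Sh": -1.0},
                      {"Sw": -150.0, "Sp": -10.0, "Sh": -5.0})
print(stage)  # -> 1
```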
- The notification control unit 117 performs control so that the fatigue level determined by the fatigue degree determination unit 116 is notified to the user.
- The notification method may be a display controlled on the display unit 140, a sound output controlled on the sound reporting unit 150, or a combination of display and sound output.
- The degree of fatigue may be displayed as an icon or as text.
- The sound output may be a buzzer sounded at a volume corresponding to the fatigue level, a buzzer sounded at intervals corresponding to the fatigue level, a buzzer sounded with a tone quality corresponding to the fatigue level, or spoken words indicating the fatigue level. Since the way a sound is perceived varies with the degree of fatigue, it is desirable to output the sound at a volume and tone quality that are easy to hear for the current fatigue level.
- FIG. 23 is a flowchart showing the flow of the walking posture determination process executed by the control unit 110 of the activity meter 100 in this embodiment.
- In step S101, the control unit 110 reads the detected value from the acceleration sensor 170 and, as described for the acceleration reading control unit 111 in FIG. 18, stores the acceleration data Ax(t), Ay(t), and Az(t) in the memory 120 for each sampling period.
- In step S102, the control unit 110 determines whether one step of walking has been detected. Here, one step is determined to have been detected when the feature point (1) (feature point (5)) described above is detected. If it is determined that one step has not been detected (NO in step S102), the control unit 110 advances the process to step S111.
- In step S103, the control unit 110 reads the acceleration data Ax(t), Ay(t), and Az(t) for one step stored in the memory 120 in step S101, and calculates the coordinate values of the feature point positions as described for the feature point position specifying unit 112 in FIG. 18.
- In step S104, the control unit 110 calculates the values of the feature factors, as described for the feature factor calculation unit 1131 in FIG. 18, based on the coordinate values of the feature point positions calculated in step S103.
- In step S105, the control unit 110 calculates the index values according to the correlation between the feature factors and the walking posture indices, as described for the index calculation unit 1132 in FIG. 18, based on the feature factor values calculated in step S104, and stores them in the memory 120. The control unit 110 then advances the process to step S111.
- In step S111, the control unit 110 reads the indices indicating the walking posture stored in the memory 120 in step S105, determines the walking posture based on those indices as described for the walking posture determination unit 1133 in FIG. 18, and stores the result in the memory 120.
- In step S112, the control unit 110 calculates the temporal change amount of the walking posture determined in step S111, as described for the temporal change amount calculation unit 1134 in FIG. 18.
- In step S113, the control unit 110 determines the degree of change, which is the degree of the temporal change, as described for the change degree determination unit 114 in FIG. 18.
- In step S114, the control unit 110 determines whether an input of the fatigue level has been received from the operation unit 130. When it is determined that an input has been received (YES in step S114), then in step S115, as described for the fatigue level input reception control unit 115 in FIG. 18, the control unit 110 stores the received fatigue level in the memory 120 in association with the degree of change calculated in step S113.
- In step S121, the control unit 110 determines whether it is the timing to determine the fatigue level (for example, a timing that occurs every minute). If it is determined that it is not (NO in step S121), the control unit 110 returns the process to step S101.
- When it is determined that it is the timing to determine the fatigue level (YES in step S121), then in step S122 the control unit 110, as described for the fatigue degree determination unit 116 in FIG. 18, determines the fatigue level corresponding to the degree of change calculated in step S113, based on the stored correspondence between the fatigue level and the degree of change.
- In step S123, the control unit 110 performs control so that the fatigue level determined in step S122 is notified to the user, as described for the notification control unit 117 in FIG. 18. The control unit 110 then returns the process to step S101.
- As described above, the activity meter 100 includes the main body 191, the acceleration sensor 170, and the control unit 110, and is a device for determining the walking posture of a user who wears the main body 191 on the waist.
- The control unit 110 includes: a feature point position specifying unit 112 that, based on the acceleration detected by the acceleration sensor 170, specifies the trajectory during walking of the user's waist on which the main body 191 is worn; a temporal change calculation unit (including the feature factor calculation unit 1131, the index calculation unit 1132, the walking posture determination unit 1133, and the temporal change amount calculation unit 1134) that calculates the temporal change of the trajectory specified by the feature point position specifying unit 112; and a change degree determination unit 114 (including the fatigue level input reception control unit 115 and the fatigue degree determination unit 116) that determines the degree of change, which is the degree of the temporal change, based on the temporal change calculated by the temporal change calculation unit.
- the activity meter 100 further includes a memory 120.
- Control unit 110 further includes a fatigue level input reception control unit 115 that receives an input of the fatigue level when the user is walking.
- the degree-of-change determination unit 114 calculates the degree of change when an input of the degree of fatigue is received by the fatigue degree input reception control unit 115.
- The fatigue level input reception control unit 115 causes the walking change fatigue correspondence storage unit 124 of the memory 120 to store the accepted fatigue level in association with the degree of change determined by the change degree determination unit 114.
- The control unit 110 further includes a fatigue degree determination unit 116 that determines the fatigue level corresponding to the degree of change determined by the change degree determination unit 114, based on the fatigue levels and degrees of change stored in the walking change fatigue correspondence storage unit 124.
- the notification control unit 117 causes the display unit 140 or the reporting unit 150 to notify the fatigue level determined by the fatigue level determination unit 116.
- the activity meter 100 further includes a display unit 140 or a sound report unit 150.
- the control unit 110 further includes a notification control unit 117 that causes the display unit 140 or the sound reporting unit 150 to notify the fatigue level determined by the fatigue level determination unit 116.
- the trajectory is a three-dimensional trajectory from which a moving component in the traveling direction (Z-axis direction) during walking of the waist where the main body 191 is mounted is removed.
- the trajectory has the pattern described with reference to FIGS.
- the pattern includes a plurality of feature points that define the features of the pattern.
- Based on the acceleration detected by the acceleration sensor 170, the feature point position specifying unit 112 specifies the positions of the feature points of the trajectory from which the moving component in the traveling direction (Z-axis direction) has been removed, as projected onto the XZ plane perpendicular to the vertical direction (Y-axis direction), the XY plane perpendicular to the traveling direction (Z-axis direction), and the YZ plane perpendicular to the left-right direction (X-axis direction).
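- The kind of processing described here could be sketched as follows. This is not the patent's algorithm verbatim; the function name and the simple double integration are assumptions made for illustration.

```python
# Integrate the waist acceleration twice over one step, subtract the mean Z
# (traveling-direction) velocity so the forward drift is removed, and project the
# trajectory onto the XZ, XY and YZ planes where the feature points are located.

import numpy as np

def trajectory_projections(acc_xyz, dt):
    """acc_xyz: (N, 3) array of X, Y, Z accelerations for one step [m/s^2]; dt: sampling period [s]."""
    vel = np.cumsum(acc_xyz, axis=0) * dt
    vel[:, 2] -= vel[:, 2].mean()        # remove the average forward-speed component
    pos = np.cumsum(vel, axis=0) * dt    # waist trajectory without the travel drift
    return pos[:, [0, 2]], pos[:, [0, 1]], pos[:, [1, 2]]  # XZ, XY, YZ projections
```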
- the feature factor calculation unit 1131 calculates the value of the trajectory feature factor based on the position specified by the feature point position specifying unit 112.
- The index calculation unit 1132 calculates the index value based on the feature factor value calculated by the feature factor calculation unit 1131, according to the correlation obtained in advance between the feature factor value and the value of the index indicating the walking posture.
- the temporal change amount calculation unit 1134 calculates the temporal change amount of the index based on the index value calculated by the index calculation unit 1132.
- the change degree determination unit 114 determines the change degree based on the temporal change amount calculated by the temporal change amount calculation unit 1134.
- the temporal change amount includes the temporal change amount of the walking posture.
- the control unit 110 further includes a walking posture determination unit 1133 that determines a walking posture based on the index value calculated by the index calculation unit 1132.
- the temporal change amount calculation unit 1134 calculates the temporal change amount of the walking posture determined by the walking posture determination unit 1133.
- the degree-of-change determination unit 114 determines the degree of change including the posture change degree of the temporal change of the walking posture based on the temporal change amount calculated by the temporal change amount calculation unit 1134.
- Since the index indicating the walking posture is calculated accurately based on an accurate correlation, and since indices indicating various aspects of the walking posture are calculated, the degree of change can be determined in detail and with higher accuracy.
- The correlation is a relational expression obtained by multiple regression analysis, that is, a regression equation with the value of the index as the objective variable and the values of the feature factors as the explanatory variables.
- The feature points include the feature point (1) at which the right foot touches down, the feature point (2) at which the trajectory reaches its highest position while standing on the right foot, the feature point (3) at which the left foot touches down, and the feature point (4) at which the trajectory reaches its highest position while standing on the left foot.
- The feature factors include the feature factor Hr, which is the vertical (Y-axis direction) distance between the feature point (1) and the feature point (2) in the trajectory projected onto the XY plane perpendicular to the traveling direction (Z-axis direction), and the feature factor StShi, which is calculated from the distance between the feature point (1) and the feature point (2) and the distance between the feature point (3) and the feature point (4) in the trajectory projected onto the YZ plane perpendicular to the left-right direction (X-axis direction).
- The indices include the stride.
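- Applying such a pre-obtained regression equation could look like the following sketch. The coefficients and the intercept are made-up placeholders, not the patent's calibration data.

```python
# Index value = intercept + sum of coefficient * feature-factor value.

def apply_regression(factors, coefficients, intercept):
    return intercept + sum(coefficients[name] * value for name, value in factors.items())

# Hypothetical stride [m] from the feature factors Hr [m] and StShi:
stride = apply_regression({"Hr": 0.035, "StShi": 0.62},
                          coefficients={"Hr": 4.0, "StShi": 0.8},
                          intercept=0.10)
print(stride)  # -> 0.736 with these made-up numbers
```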
- For the step interval, the feature points include the feature point (1) at which the right foot touches down, the feature point (2) at which the trajectory reaches its highest position while standing on the right foot, the rightmost feature point (6) of the trajectory, the leftmost feature point (7) of the trajectory, the foremost feature point (10) on the right side of the trajectory, the foremost feature point (12) on the left side of the trajectory, the rearmost feature point (9) on the right side of the trajectory, and the rearmost feature point (11) on the left side of the trajectory.
- The feature factors include the feature factor Hr/W, which is the quotient of the vertical (Y-axis direction) distance Hr between the feature point (1) and the feature point (2) in the trajectory projected onto the XY plane perpendicular to the traveling direction (Z-axis direction), divided by the left-right (X-axis direction) distance W between the feature point (6) and the feature point (7), and the feature factor WuSu/WdSu, which is the quotient of the left-right (X-axis direction) distance WuSu between the feature point (10) and the feature point (12) in the trajectory projected onto the XZ plane perpendicular to the vertical direction (Y-axis direction), divided by the left-right (X-axis direction) distance WdSu between the feature point (9) and the feature point (11).
- The indices include the step interval.
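- A hypothetical sketch of computing the two step-interval feature factors from feature point coordinates (each point given as an (x, y, z) tuple), following the geometric definitions above:

```python
def step_interval_factors(p1, p2, p6, p7, p9, p10, p11, p12):
    hr = abs(p2[1] - p1[1])        # vertical rise between touchdown and highest point (right foot)
    w = abs(p6[0] - p7[0])         # lateral width between rightmost and leftmost points
    wu_su = abs(p10[0] - p12[0])   # lateral distance between the foremost right/left points
    wd_su = abs(p9[0] - p11[0])    # lateral distance between the rearmost right/left points
    return hr / w, wu_su / wd_su   # the feature factors Hr/W and WuSu/WdSu
```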
- In the above description, the posture change degree, which is the degree of temporal change of the walking posture, is determined as the degree of change.
- However, the degree of change, which is the degree of temporal change in walking, may instead be calculated based on the amount of temporal change of an index indicating walking, or based on the temporal change of the trajectory of the user's waist.
- the correspondence between the degree of fatigue and the degree of change is determined by the user's input of the degree of fatigue.
- the present invention is not limited to this.
- Data on the correspondence between the fatigue level and the degree of change may be obtained from a plurality of persons in the same manner, and the obtained correspondence may be stored in advance in the memory 120 (the walking change fatigue correspondence storage unit 124) of the activity meter 100.
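- Looking the fatigue level up from such a pre-stored correspondence could be sketched as follows; the table values are invented for illustration only.

```python
# Correspondence table prepared in advance from many subjects' data:
# (minimum change degree, fatigue level), in ascending order of change degree.
FATIGUE_TABLE = [(0.00, 0), (0.10, 1), (0.20, 2), (0.30, 3)]

def fatigue_from_table(change_degree, table=FATIGUE_TABLE):
    level = 0
    for min_degree, lvl in table:
        if change_degree >= min_degree:
            level = lvl
    return level

print(fatigue_from_table(0.23))  # -> 2 with this invented table
```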
- the walking posture is determined based on the relationship between the index value and the threshold value.
- the present invention is not limited to this, and the walking posture may be determined on the basis of the similarity between the combination of the index whose relationship with the walking posture is obtained in advance and the calculated combination of the index.
- the threshold value of the index indicating the walking posture may be determined based on actually measured data when a person with a good walking posture actually walks.
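- The similarity-based determination mentioned above could be sketched as follows; the reference index combinations are invented, and could in practice be built from measured data such as that of people with a good walking posture.

```python
# Pick the walking posture whose reference index combination is closest
# (here, by Euclidean distance) to the calculated one.

import math

def nearest_posture(indices, references):
    def distance(ref):
        return math.sqrt(sum((indices[k] - ref[k]) ** 2 for k in indices))
    return min(references, key=lambda name: distance(references[name]))

print(nearest_posture({"stride": 0.62, "step_interval": 0.14},
                      {"good posture": {"stride": 0.75, "step_interval": 0.10},
                       "tired posture": {"stride": 0.60, "step_interval": 0.15}}))
# -> "tired posture" with these invented references
```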
- the target walking posture and the user's walking posture are displayed separately as described in FIG.
- the present invention is not limited to this, and the target walking posture and the user's walking posture may be displayed in an overlapping manner.
- In the above description, the average velocity component removed is the average velocity component over the time of ±1 step.
- However, the present invention is not limited to this: it may be the average velocity component over ±n steps (n being a predetermined number), over the preceding n steps (the n steps before the calculation target time), over ±s seconds (s being a predetermined number), or over the preceding s seconds (the s seconds before the calculation target time).
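- A hypothetical sketch of one of these variants: removing a centred moving-average velocity over ±s seconds instead of the ±1-step average.

```python
import numpy as np

def remove_moving_average(vel_z, dt, s=1.0):
    """Subtract a centred moving average of half-width s seconds from the Z-axis velocity."""
    half = max(1, int(round(s / dt)))
    kernel = np.ones(2 * half + 1) / (2 * half + 1)
    return vel_z - np.convolve(vel_z, kernel, mode="same")
```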
- In the above, the invention has been described as the invention of an apparatus, the activity meter 100.
- the present invention is not limited to this, and can be understood as an invention of a control method for controlling the activity meter 100.
- 100 activity meter, 110 control unit, 111 acceleration reading control unit, 112 feature point position specifying unit, 1131 feature factor calculation unit, 1132 index calculation unit, 1133 walking posture determination unit, 1134 temporal change amount calculation unit, 114 change degree determination unit, 115 fatigue level input reception control unit, 116 fatigue degree determination unit, 117 notification control unit, 120 memory, 121 acceleration data storage unit, 122 feature point position storage unit, 1231 feature factor storage unit, 1232 correlation storage unit, 1233 index storage unit, 1234 walking posture storage unit, 1235 temporal change storage unit, 124 walking change fatigue correspondence storage unit, 125 change degree storage unit, 130 operation unit, 131 display switching/decision switch, 132 left operation/memory switch, 133 right operation switch, 140 display unit, 141 display, 170 acceleration sensor, 190 power supply, 191 main body, 192 clip unit.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Physiology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Geometry (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Psychiatry (AREA)
- Signal Processing (AREA)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE112011102491T DE112011102491T5 (de) | 2010-07-27 | 2011-07-19 | Gangartänderung-Bestimmungseinheit |
| CN201180036840.8A CN103025241B (zh) | 2010-07-27 | 2011-07-19 | 步行变化判断装置 |
| US13/812,362 US8608671B2 (en) | 2010-07-27 | 2011-07-19 | Gait change determination device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2010-167932 | 2010-07-27 | ||
| JP2010167932A JP5724237B2 (ja) | 2010-07-27 | 2010-07-27 | 歩行変化判定装置 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012014714A1 true WO2012014714A1 (ja) | 2012-02-02 |
Family
ID=45529930
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2011/066339 Ceased WO2012014714A1 (ja) | 2010-07-27 | 2011-07-19 | 歩行変化判定装置 |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US8608671B2 |
| JP (1) | JP5724237B2 |
| CN (1) | CN103025241B |
| DE (1) | DE112011102491T5 |
| WO (1) | WO2012014714A1 |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014135187A1 (en) * | 2013-03-04 | 2014-09-12 | Polar Electro Oy | Computing user's physiological state related to physical exercises |
| WO2016194330A1 (ja) * | 2015-06-01 | 2016-12-08 | パナソニックIpマネジメント株式会社 | 動作表示システム及びプログラム |
| US9836118B2 (en) | 2015-06-16 | 2017-12-05 | Wilson Steele | Method and system for analyzing a movement of a person |
| CN112764546A (zh) * | 2021-01-29 | 2021-05-07 | 重庆子元科技有限公司 | 一种虚拟人物位移控制方法、装置及终端设备 |
Families Citing this family (52)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011163367A1 (en) * | 2010-06-22 | 2011-12-29 | Mcgregor Stephen J | Method of monitoring human body movement |
| US20150342513A1 (en) * | 2010-07-01 | 2015-12-03 | Industrial Technology Research Institute | System and Method for Analyzing Biomechanics |
| US9524424B2 (en) | 2011-09-01 | 2016-12-20 | Care Innovations, Llc | Calculation of minimum ground clearance using body worn sensors |
| US9165113B2 (en) * | 2011-10-27 | 2015-10-20 | Intel-Ge Care Innovations Llc | System and method for quantitative assessment of frailty |
| US9877667B2 (en) | 2012-09-12 | 2018-01-30 | Care Innovations, Llc | Method for quantifying the risk of falling of an elderly adult using an instrumented version of the FTSS test |
| US9999376B2 (en) * | 2012-11-02 | 2018-06-19 | Vital Connect, Inc. | Determining body postures and activities |
| EP3498332B1 (en) | 2013-01-21 | 2021-07-14 | Cala Health, Inc. | Devices for controlling tremor |
| US12453853B2 (en) | 2013-01-21 | 2025-10-28 | Cala Health, Inc. | Multi-modal stimulation for treating tremor |
| JP6131706B2 (ja) * | 2013-05-10 | 2017-05-24 | オムロンヘルスケア株式会社 | 歩行姿勢計およびプログラム |
| US9848828B2 (en) * | 2013-10-24 | 2017-12-26 | Logitech Europe, S.A. | System and method for identifying fatigue sources |
| CN103584841A (zh) * | 2013-11-27 | 2014-02-19 | 中山大学 | 一种基于居家养老中的运动信息的肌肉疲劳检测系统 |
| JP6243731B2 (ja) * | 2013-12-26 | 2017-12-06 | ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー | 移動運動状態表示装置、方法及びシステム並びにプログラム |
| JP6243730B2 (ja) * | 2013-12-26 | 2017-12-06 | ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー | 移動運動状態表示装置、方法及びシステム並びにプログラム |
| WO2015129883A1 (ja) * | 2014-02-28 | 2015-09-03 | マイクロストーン株式会社 | 歩行状態検出方法及び歩行状態検出装置 |
| JP6606105B2 (ja) | 2014-06-02 | 2019-11-13 | カラ ヘルス,インコーポレイテッド | 振戦を治療するための抹消神経刺激用のシステム及び方法 |
| JP2016002109A (ja) * | 2014-06-13 | 2016-01-12 | パナソニックIpマネジメント株式会社 | 活動評価装置、評価処理装置、プログラム |
| JP6080078B2 (ja) * | 2014-08-18 | 2017-02-15 | 高知県公立大学法人 | 姿勢および歩行状態推定装置 |
| US9387588B1 (en) | 2014-08-25 | 2016-07-12 | Google Inc. | Handling gait disturbances with asynchronous timing |
| US9618937B1 (en) | 2014-08-25 | 2017-04-11 | Google Inc. | Slip detection using robotic limbs |
| JP6369811B2 (ja) * | 2014-11-27 | 2018-08-08 | パナソニックIpマネジメント株式会社 | 歩行解析システムおよび歩行解析プログラム |
| CN104605859B (zh) * | 2014-12-29 | 2017-02-22 | 北京工业大学 | 一种基于移动终端传感器的室内导航步态检测方法 |
| US9499218B1 (en) | 2014-12-30 | 2016-11-22 | Google Inc. | Mechanically-timed footsteps for a robotic device |
| CN104548564B (zh) * | 2015-01-27 | 2017-03-01 | 北京智谷睿拓技术服务有限公司 | 信息获取方法、信息获取装置及用户设备 |
| WO2016201366A1 (en) | 2015-06-10 | 2016-12-15 | Cala Health, Inc. | Systems and methods for peripheral nerve stimulation to treat tremor with detachable therapy and monitoring units |
| KR101785284B1 (ko) * | 2015-06-16 | 2017-10-16 | (주)위비젼 | 보행정보를 이용한 피로도 측정장치 및 측정방법 |
| CN104954761A (zh) * | 2015-07-07 | 2015-09-30 | 满欣然 | 一种智能安全看护方法 |
| JP6660110B2 (ja) * | 2015-07-23 | 2020-03-04 | 原田電子工業株式会社 | 歩行解析方法および歩行解析システム |
| KR102482436B1 (ko) * | 2015-09-02 | 2022-12-28 | 삼성전자주식회사 | 보행 보조 장치 및 그 동작 방법 |
| CN108348746B (zh) | 2015-09-23 | 2021-10-12 | 卡拉健康公司 | 用于手指或手中的周围神经刺激以治疗手震颤的系统和方法 |
| JP6599473B2 (ja) * | 2015-10-13 | 2019-10-30 | アルプスアルパイン株式会社 | 歩行計測装置、歩行計測方法及びプログラム |
| US10959647B2 (en) * | 2015-12-30 | 2021-03-30 | Seismic Holdings, Inc. | System and method for sensing and responding to fatigue during a physical activity |
| WO2017132067A2 (en) | 2016-01-21 | 2017-08-03 | Cala Health, Inc. | Systems, methods and devices for peripheral neuromodulation for treating diseases related to overactive bladder |
| US9925667B1 (en) | 2016-01-25 | 2018-03-27 | Boston Dynamics, Inc. | Continuous slip recovery |
| TWI592664B (zh) * | 2016-04-25 | 2017-07-21 | 泰金寶電通股份有限公司 | 步姿評估方法以及其電子裝置 |
| CA3030029A1 (en) | 2016-07-08 | 2018-01-11 | Cala Health, Inc. | Systems and methods for stimulating n nerves with exactly n electrodes and improved dry electrodes |
| CN117959601A (zh) | 2016-08-25 | 2024-05-03 | 卡拉健康公司 | 通过周围神经刺激治疗心脏机能障碍的系统和方法 |
| JP2018093378A (ja) * | 2016-12-05 | 2018-06-14 | 株式会社Screenホールディングス | 歩行判定方法および歩行判定プログラム |
| CN110809486B (zh) | 2017-04-03 | 2024-10-11 | 卡拉健康公司 | 用于治疗与膀胱过度活动症相关的疾病的周围神经调节系统、方法和装置 |
| KR20180129140A (ko) * | 2017-05-25 | 2018-12-05 | 주식회사 누믹스미디어웍스 | 가상 현실 연동을 위한 보행 분석 방법, 가상 현실 연동 방법 및 장치 |
| EP3740274A4 (en) | 2018-01-17 | 2021-10-27 | Cala Health, Inc. | SYSTEMS AND METHODS FOR TREATING INFLAMMATORY INTESTINAL DISEASE USING PERIPHERAL NERVE STIMULATION |
| JP1632303S * | 2018-09-27 | | |
| GB2597378B (en) * | 2019-02-18 | 2023-03-01 | Mitsubishi Electric Corp | Fatigue determination device, fatigue determination method, and fatigue determination program |
| JP6711433B2 (ja) * | 2019-03-20 | 2020-06-17 | カシオ計算機株式会社 | ランニング解析装置、ランニング解析方法及びランニング解析プログラム |
| CN109924985B (zh) * | 2019-03-29 | 2022-08-09 | 上海电气集团股份有限公司 | 下肢康复设备及基于其的评估装置、方法 |
| GB2585861B (en) * | 2019-07-17 | 2023-09-13 | Waymap Ltd | Apparatus and associated methods for step length estimation |
| US12251560B1 (en) | 2019-08-13 | 2025-03-18 | Cala Health, Inc. | Connection quality determination for wearable neurostimulation systems |
| JP7279798B2 (ja) * | 2019-08-28 | 2023-05-23 | 日本電気株式会社 | 推定装置、推定方法、プログラム |
| US11890468B1 (en) | 2019-10-03 | 2024-02-06 | Cala Health, Inc. | Neurostimulation systems with event pattern detection and classification |
| JP7218820B2 (ja) | 2019-12-25 | 2023-02-07 | 日本電気株式会社 | 推定装置、推定システム、推定方法、およびプログラム |
| CN113133761B (zh) * | 2020-01-17 | 2024-05-28 | 宝成工业股份有限公司 | 左右步态的判断方法及其分析装置 |
| US20250114019A1 (en) * | 2022-02-09 | 2025-04-10 | University Of Pittsburgh-Of The Commonwealth System Of Higher Education | Systems and methods for performance fatigability index |
| JP2023178713A (ja) * | 2022-06-06 | 2023-12-18 | 本田技研工業株式会社 | 運転可否判定装置 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002165799A (ja) * | 2000-12-01 | 2002-06-11 | Sharp Corp | 健康状態診断方法およびその方法を用いた健康状態診断装置 |
| JP2006177749A (ja) * | 2004-12-22 | 2006-07-06 | Ritsumeikan | 周期運動体の移動軌跡算出方法及び装置 |
| JP2009106386A (ja) * | 2007-10-26 | 2009-05-21 | Panasonic Electric Works Co Ltd | 歩容改善支援システム |
| JP2009125508A (ja) * | 2007-11-27 | 2009-06-11 | Panasonic Electric Works Co Ltd | 体動検出装置及びそれを用いた運動システム |
| WO2011040259A1 (ja) * | 2009-09-30 | 2011-04-07 | 三菱化学株式会社 | 体動信号についての情報処理方法、情報処理システムおよび情報処理装置 |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7016540B1 (en) * | 1999-11-24 | 2006-03-21 | Nec Corporation | Method and system for segmentation, classification, and summarization of video images |
| JP3700635B2 (ja) | 2001-09-28 | 2005-09-28 | オムロンヘルスケア株式会社 | 電子血圧計 |
| CN1635337A (zh) * | 2003-12-30 | 2005-07-06 | 皇家飞利浦电子股份有限公司 | 一种移动定位方法及系统 |
| JP4591918B2 (ja) * | 2004-11-01 | 2010-12-01 | 株式会社タニタ | 体動測定装置 |
| JP4352018B2 (ja) | 2005-03-30 | 2009-10-28 | 株式会社東芝 | 運動計測装置、運動計測方法および運動計測プログラム |
| JP4819887B2 (ja) * | 2006-05-29 | 2011-11-24 | シャープ株式会社 | 疲労推定装置及びそれを搭載した電子機器 |
| CN100595520C (zh) * | 2008-02-21 | 2010-03-24 | 上海交通大学 | 适用于步行者的定位方法 |
| JP4456181B1 (ja) * | 2008-10-27 | 2010-04-28 | パナソニック株式会社 | 移動体検出方法及び移動体検出装置 |
-
2010
- 2010-07-27 JP JP2010167932A patent/JP5724237B2/ja active Active
-
2011
- 2011-07-19 WO PCT/JP2011/066339 patent/WO2012014714A1/ja not_active Ceased
- 2011-07-19 CN CN201180036840.8A patent/CN103025241B/zh active Active
- 2011-07-19 US US13/812,362 patent/US8608671B2/en active Active
- 2011-07-19 DE DE112011102491T patent/DE112011102491T5/de active Pending
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014135187A1 (en) * | 2013-03-04 | 2014-09-12 | Polar Electro Oy | Computing user's physiological state related to physical exercises |
| CN105210067A (zh) * | 2013-03-04 | 2015-12-30 | 博能电子公司 | 计算用户的与体育锻炼有关的生理状态 |
| US10709382B2 (en) | 2013-03-04 | 2020-07-14 | Polar Electro Oy | Computing user's physiological state related to physical exercises |
| CN105210067B (zh) * | 2013-03-04 | 2021-04-30 | 博能电子公司 | 计算用户的与体育锻炼有关的生理状态 |
| WO2016194330A1 (ja) * | 2015-06-01 | 2016-12-08 | パナソニックIpマネジメント株式会社 | 動作表示システム及びプログラム |
| JPWO2016194330A1 (ja) * | 2015-06-01 | 2018-01-25 | パナソニックIpマネジメント株式会社 | 動作表示システム及びプログラム |
| US9836118B2 (en) | 2015-06-16 | 2017-12-05 | Wilson Steele | Method and system for analyzing a movement of a person |
| CN112764546A (zh) * | 2021-01-29 | 2021-05-07 | 重庆子元科技有限公司 | 一种虚拟人物位移控制方法、装置及终端设备 |
| CN112764546B (zh) * | 2021-01-29 | 2022-08-09 | 重庆子元科技有限公司 | 一种虚拟人物位移控制方法、装置及终端设备 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2012024449A (ja) | 2012-02-09 |
| JP5724237B2 (ja) | 2015-05-27 |
| CN103025241A (zh) | 2013-04-03 |
| US8608671B2 (en) | 2013-12-17 |
| CN103025241B (zh) | 2015-05-13 |
| DE112011102491T5 (de) | 2013-05-29 |
| US20130123669A1 (en) | 2013-05-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5724237B2 (ja) | 歩行変化判定装置 | |
| JP2012024275A (ja) | 歩行姿勢判定装置 | |
| JP6131706B2 (ja) | 歩行姿勢計およびプログラム | |
| JP5504810B2 (ja) | 歩行姿勢判定装置、制御プログラム、および制御方法 | |
| US20140330171A1 (en) | Method and device for monitoring postural and movement balance for fall prevention | |
| JP6319446B2 (ja) | モーションキャプチャ装置、モーションキャプチャ方法、運動性能診断方法およびモーションキャプチャ用身体装着具 | |
| JP2012205816A (ja) | 歩行姿勢判定装置 | |
| JP5463702B2 (ja) | 歩数検出方法及び歩数検出装置 | |
| EP3305193A1 (en) | Posture detection device, spectacle-type electronic device, posture detection method, and program | |
| KR20180085916A (ko) | 웨어러블 장치의 위치를 추정하는 방법 및 이를 이용하는 장치 | |
| JP2020120807A (ja) | 転倒リスク評価装置、転倒リスク評価方法及び転倒リスク評価プログラム | |
| WO2014181603A1 (ja) | 歩行姿勢計およびプログラム | |
| WO2014181605A1 (ja) | 歩行姿勢計およびプログラム | |
| KR20210040671A (ko) | 동적으로 변화하는 인체 무게 중심 궤적 추정 장치 및 그 방법 | |
| WO2014181602A1 (ja) | 歩行姿勢計およびプログラム | |
| CN113712536B (zh) | 基于步态分析的不平衡预警方法及穿戴装置 | |
| CN112839569B (zh) | 评估人类移动的方法和系统 | |
| JP2015078959A (ja) | 歩行距離計測システム、慣性計測装置および計測用履物 | |
| JP7782663B2 (ja) | 検出装置、検出システム、歩容計測システム、検出方法、およびプログラム | |
| US20240138757A1 (en) | Pelvic inclination estimation device, estimation system, pelvic inclination estimation method, and recording medium | |
| JP7459965B2 (ja) | 判別装置、判別システム、判別方法、およびプログラム | |
| JP2019165874A (ja) | 回復力評価装置及びプログラム | |
| US20240023835A1 (en) | Feature-amount generation device, gait measurement system, feature-amount generation method, and recording medium | |
| WO2022101971A1 (ja) | 検出装置、検出システム、検出方法、およびプログラム記録媒体 | |
| HK40052123A (en) | Method and system for assessing human movements |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 201180036840.8 Country of ref document: CN |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11812306 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 13812362 Country of ref document: US |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 1120111024916 Country of ref document: DE Ref document number: 112011102491 Country of ref document: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 11812306 Country of ref document: EP Kind code of ref document: A1 |