WO2022230164A1 - Information processing device, information processing method, and computer-readable medium - Google Patents
Information processing device, information processing method, and computer-readable medium
- Publication number
- WO2022230164A1 (PCT/JP2021/017158)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/112—Gait analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
- A61B5/6804—Garments; Clothes
- A61B5/6807—Footwear
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6829—Foot or ankle
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7221—Determining signal validity, reliability or quality
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7285—Specific aspects of physiological measurement analysis for synchronising or triggering a physiological measurement or image acquisition with a physiological event or waveform, e.g. an ECG signal
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0219—Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
Definitions
- the present disclosure relates to an information processing device, an information processing method, and a non-transitory computer-readable medium storing a program.
- A technique is known for detecting a user's gait (walking state) based on data measured by a sensor (see Patent Documents 1 and 2, for example).
- However, the techniques of Patent Documents 1 and 2 have a problem in that, for example, information about the user's gait may not be detected appropriately.
- Accordingly, an object of the present disclosure is to provide an information processing device, an information processing method, and a non-transitory computer-readable medium storing a program that can appropriately detect information about a user's gait.
- According to one aspect, an information processing apparatus includes: acquisition means for acquiring information based on a sensor worn on a user's foot; detection means for detecting, when the user's walking cannot be detected based on information indicating the angle, in the walking direction, between the sole of the foot and the ground and a first threshold, the user's walking based on the information indicating the angle and a second threshold lower than the first threshold; and output means for outputting information based on the detection result of the detection means.
- According to another aspect, there is provided an information processing method in which information based on a sensor worn on the user's foot is acquired; when the user's walking cannot be detected based on information indicating the angle, in the walking direction, between the sole of the foot and the ground and a first threshold, the user's walking is detected based on the information indicating the angle and a second threshold lower than the first threshold; and information based on the detection result is output.
- According to another aspect, there is provided a non-transitory computer-readable medium storing a program that causes an information processing apparatus to execute: a process of acquiring information based on a sensor worn on the user's foot; a process of detecting, when the user's walking cannot be detected based on information indicating the angle, in the walking direction, between the sole of the foot and the ground and a first threshold, the user's walking based on the information indicating the angle and a second threshold lower than the first threshold; and a process of outputting information based on the detection result.
- FIG. 1 is a diagram showing a configuration example of an information processing device according to an embodiment.
- FIG. 2 is a diagram showing a configuration example of an information processing system according to the embodiment.
- FIG. 3 is a diagram showing a hardware configuration example of the information processing device according to the embodiment.
- FIG. 4 is a diagram showing an example of a position where a sensor according to the embodiment is worn.
- FIG. 5 is a diagram showing an example of data measured by the sensor according to the embodiment.
- FIG. 6 is a diagram showing a configuration example of a measuring device according to the embodiment.
- FIG. 7 is a flowchart illustrating an example of detection processing according to the embodiment.
- FIG. 8 is a diagram showing an example of the transition of pitch and roll at each point in time when a healthy person walks, measured by the sensor according to the embodiment.
- FIG. 9 is a diagram showing an example of the transition of pitch and roll at each point in time when a patient walks, measured by the sensor according to the embodiment.
- FIG. 10 is a diagram showing an example of the transition of acceleration at each point in time when a healthy person walks, measured by the sensor according to the embodiment.
- FIG. 11 is a diagram showing an example of the transition of acceleration at each point in time when a patient walks, measured by the sensor according to the embodiment.
- FIG. 12 is a diagram showing an example of the transition of angular velocity at each point in time when a healthy person walks, measured by the sensor according to the embodiment.
- FIG. 13 is a diagram showing an example of the transition of angular velocity at each point in time when a patient walks, measured by the sensor according to the embodiment.
- FIG. 1 is a diagram showing an example of the configuration of an information processing device 10 according to an embodiment.
- the information processing device 10 has an acquisition unit 11 , a detection unit 12 and an output unit 13 .
- Each of these units may be implemented by cooperation of one or more programs installed in the information processing device 10 and hardware such as the processor 101 and the memory 102 of the information processing device 10 .
- the acquisition unit 11 acquires various types of information from a storage unit inside the information processing device 10 or from an external device.
- the acquisition unit 11 acquires, for example, information indicating the angle, in the walking direction, between the sole of the user's foot and the ground (hereinafter also referred to as the "sole angle" as appropriate), measured by a sensor worn on the user's foot.
- when the user's walking cannot be detected based on the information acquired by the acquisition unit 11 and a first threshold, the detection unit 12 detects the user's walking based on that information and a second threshold lower than the first threshold.
- the output unit 13 outputs (transmits or records) various types of information to the storage unit inside the information processing device 10 or to an external device.
- the output unit 13 outputs information based on the detection result by the detection unit 12, for example.
- FIG. 2 is a diagram showing a configuration example of the information processing system 1 according to the embodiment.
- the information processing system 1 has a measuring device 20A and a measuring device 20B (hereinafter also simply referred to as "measuring device 20" when there is no need to distinguish between them).
- the information processing system 1 also has a user terminal 30 and a server 40 .
- the number of measuring devices 20, user terminals 30, and servers 40 is not limited to the example in FIG. 2 .
- the measuring device 20 , the user terminal 30 , and the server 40 are each an example of the information processing device 10 .
- the measuring device 20 and the user terminal 30 may be communicatively connected by, for example, short-range wireless communication such as BLE (Bluetooth (registered trademark) Low Energy) or a cable.
- the user terminal 30 and the server 40 are connected by the network N so that they can communicate.
- the network N include, for example, the Internet, a mobile communication system, a wireless LAN (Local Area Network), short-range wireless communication such as BLE, a LAN, and a bus.
- mobile communication systems include, for example, fifth generation mobile communication systems (5G), fourth generation mobile communication systems (4G), third generation mobile communication systems (3G), and the like.
- the measuring device 20 has a sensor 21 worn on the user's foot.
- the measuring device 20 outputs data measured using the sensor 21 to an external device such as the user terminal 30 or the server 40 . Note that the measuring device 20 may transmit data to the server 40 without going through the user terminal 30 .
- the user terminal 30 may be, for example, a device such as a smart phone, tablet, personal computer, IoT (Internet of Things) communication device, or mobile phone.
- the user terminal 30 transmits data acquired from the measuring device 20 to the server 40, for example.
- the user terminal 30 displays information about the user's gait on the screen, for example, based on data measured by the sensor 21 .
- the server 40 is, for example, a device such as a server, a cloud, a personal computer, or a smart phone.
- the server 40 for example, records data measured by the sensor 21, and causes the user terminal 30 to display information about the user's gait based on the recorded data.
- FIG. 3 is a diagram showing a hardware configuration example of the information processing device 10 according to the embodiment.
- the information processing device 10 (computer 100) includes a processor 101, a memory 102, and a communication interface 103. These units may be connected by a bus or the like.
- Memory 102 stores at least a portion of program 104 .
- Communication interface 103 includes interfaces necessary for communication with other network elements.
- Memory 102 may be of any type suitable for a local technology network. Memory 102 may be, as a non-limiting example, a non-transitory computer-readable storage medium. Also, memory 102 may be implemented using any suitable data storage technology, such as semiconductor-based memory devices, magnetic memory devices and systems, optical memory devices and systems, fixed and removable memory, and the like. Although only one memory 102 is shown in computer 100, there may be several physically different memory modules in computer 100.
- Processor 101 may be of any type.
- Processor 101 may include one or more of a general purpose computer, a special purpose computer, a microprocessor, a Digital Signal Processor (DSP), and a processor based on a multi-core processor architecture as non-limiting examples.
- Computer 100 may have multiple processors, such as an application-specific integrated circuit chip that is temporally dependent on a clock synchronized with the main processor.
- Embodiments of the present disclosure may be implemented in hardware or dedicated circuitry, software, logic, or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software, which may be executed by a controller, microprocessor or other computing device.
- the present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer-readable storage medium.
- a computer program product comprises computer-executable instructions, such as those contained in program modules, to be executed on a device on a target real or virtual processor to perform the processes or methods of the present disclosure.
- Program modules include routines, programs, libraries, objects, classes, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- the functionality of the program modules may be combined or split between program modules as desired in various embodiments.
- Machine-executable instructions for program modules may be executed within local or distributed devices. In a distributed device, program modules can be located in both local and remote storage media.
- Program code for executing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes are provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus. When the program code is executed by the processor or controller, the functions/acts in the flowchart illustrations and/or block diagrams are performed. Program code may run entirely on a machine, partly on a machine as a stand-alone software package, partly on a machine and partly on a remote machine, or entirely on a remote machine or server.
- Non-transitory computer-readable media include various types of tangible storage media.
- Examples of non-transitory computer-readable media include magnetic recording media, magneto-optical recording media, optical disc media, semiconductor memories, and the like.
- Magnetic recording media include, for example, flexible disks, magnetic tapes, hard disk drives, and the like.
- Magneto-optical recording media include, for example, magneto-optical disks.
- Optical disc media include, for example, Blu-ray discs, CD (Compact Disc)-ROM (Read Only Memory), CD-R (Recordable), CD-RW (ReWritable), and the like.
- Semiconductor memories include, for example, solid state drives, mask ROMs, PROMs (Programmable ROMs), EPROMs (Erasable PROMs), flash ROMs, and RAMs (random access memories).
- the program may also be delivered to the computer by various types of transitory computer readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves. Transitory computer-readable media can deliver the program to the computer via wired channels, such as wires and optical fibers, or wireless channels.
- FIG. 4 is a diagram showing an example of a position where the sensor 21 according to the embodiment is mounted.
- FIG. 5 is a diagram showing an example of data measured by the sensor 21 according to the embodiment.
- FIG. 6 is a diagram showing an example of the configuration of the measuring device 20 according to the embodiment.
- the measuring device 20 is accommodated (installed) in a recess 502 of an insole 501 of the shoe worn by the user.
- the sensor 21 of the measuring device 20 may be worn anywhere on the sole side of the user's foot, from the arch of the foot to the heel.
- the sensor 21 may, for example, measure (calculate) the acceleration in the user's walking direction (Y direction), the vertically upward direction (Z direction), and the direction toward the other foot perpendicular to both the walking direction and the vertical direction (X direction).
- the sensor 21 may measure, for example, the sole angle ⁇ .
- the measuring device 20 has a sensor 21, a control device 22, and a communication device 23.
- Sensor 21 measures, for example, acceleration and angular velocity.
- the sensor 21 may be, for example, an inertial measurement unit (IMU) having a triaxial acceleration sensor and a triaxial gyro sensor.
- the control device 22 outputs data measured using the sensor 21 to an external device using the communication device 23 .
- the control device 22 may have the same configuration as the computer 100 shown in FIG. 3 . In this case, the control device 22 may be, for example, a microcontroller or the like.
- FIG. 7 is a flowchart illustrating an example of detection processing according to the embodiment.
- FIG. 8 is a diagram showing an example of changes in pitch and roll at each point in time when a healthy person walks, measured by the sensor 21 according to the embodiment.
- FIG. 9 is a diagram showing an example of changes in pitch and roll at each point in time when the patient walks, measured by the sensor 21 according to the embodiment.
- FIG. 10 is a diagram showing an example of transition of acceleration at each point in time when a healthy person walks, measured by the sensor 21 according to the embodiment.
- FIG. 11 is a diagram showing an example of transition of acceleration at each point in time when the patient walks, measured by the sensor 21 according to the embodiment.
- FIG. 12 is a diagram showing an example of changes in angular velocity at each point in time when a healthy person walks, measured by the sensor 21 according to the embodiment.
- FIG. 13 is a diagram showing an example of transition of angular velocity at each point in time when the patient walks, which is measured by the sensor 21 according to the embodiment.
- In step S1, the detection unit 12 detects that the user wearing the sensor 21 has started walking.
- the detection unit 12 may determine that the user has started walking, for example, when a predetermined command is received from an external device.
- the detection unit 12 of the measurement device 20 may receive the command from the user terminal 30 that has received the operation from the user. Thereby, for example, measurement of the user's gait can be started in response to an operation from a doctor or the like in a hospital or the like.
- the detection unit 12 may determine that the user has started walking when at least one of the acceleration and angular velocity measured by the sensor 21 is equal to or greater than a threshold. As a result, for example, it is possible to reduce the burden of operations and the like on the user.
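- As an illustrative sketch of this start-detection rule, walking onset could be declared once either the acceleration or the angular-velocity magnitude reaches a threshold. The function name and threshold values below are hypothetical, not taken from this publication:

```python
import math

# Hypothetical start thresholds (the publication does not specify values);
# units assumed: m/s^2 for acceleration, deg/s for angular velocity.
ACCEL_START_THRESHOLD = 12.0
GYRO_START_THRESHOLD = 50.0

def walking_started(accel_xyz, gyro_xyz):
    """Return True when at least one of the measured acceleration or
    angular-velocity magnitudes is equal to or greater than its threshold."""
    accel_mag = math.sqrt(sum(a * a for a in accel_xyz))
    gyro_mag = math.sqrt(sum(g * g for g in gyro_xyz))
    return accel_mag >= ACCEL_START_THRESHOLD or gyro_mag >= GYRO_START_THRESHOLD
```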
- the detection unit 12 determines whether or not the user's walking can be detected based on the predetermined threshold and the information measured by the sensor 21 (step S2).
- For example, the detection unit 12 may determine whether or not the length of time during which the user walks one step (for example, from when one foot is lifted off the ground to when it is put down again) can be measured.
- For example, when the maximum value of the sole angle is equal to or greater than a first maximum-value threshold (an example of the "first threshold"), the detection unit 12 determines that the length of time during which the user walks one step can be measured. In this case, the detection unit 12 may calculate (determine) that length of time based on the interval between successive points in time at which the sole angle reaches its maximum value.
- Similarly, when the minimum value of the sole angle is equal to or less than a first minimum-value threshold (an example of the "first threshold"; for example, −20°), the detection unit 12 determines that the length of time during which the user walks one step can be measured. In this case, the detection unit 12 may calculate (determine) that length of time based on the interval between successive points in time at which the sole angle reaches its minimum value.
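- The maximum-value variant above could be sketched as follows: find the points where the sole angle has a local maximum at or above the first maximum-value threshold, and take the intervals between successive maxima as one-step time lengths. The function name is hypothetical, and the text gives a numeric example only for the first minimum-value threshold (−20°), so any maximum-value threshold passed in is an assumption:

```python
def step_times_from_maxima(angles, timestamps, max_threshold):
    """Return candidate one-step time lengths: the intervals between
    successive local maxima of the sole angle that reach max_threshold."""
    peak_times = []
    for i in range(1, len(angles) - 1):
        # a local maximum at or above the threshold counts as a step peak
        if angles[i] >= max_threshold and angles[i - 1] < angles[i] >= angles[i + 1]:
            peak_times.append(timestamps[i])
    return [t2 - t1 for t1, t2 in zip(peak_times, peak_times[1:])]
```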
- FIG. 8 shows an example of transition 801 of the sole angle (pitch) and transition 802 of the rotation angle (roll) with respect to the walking direction at each point in time when a healthy person walks.
- the sole angle has a maximum value (approximately 65°) at time 811 and a minimum value (approximately ⁇ 30°) at time 812 .
- FIG. 9 shows an example of transition 901 of the sole angle (pitch) and transition 902 of the rotation angle (roll) with respect to the walking direction at each point in time when a patient with a leg disorder or the like walks.
- the sole angle has a maximum value (approximately 10°) at time 911 and a minimum value (approximately −5°) at time 912 . Therefore, in the case of the transition of the sole angle shown in FIG. 9, the detection unit 12 determines that the user's walking cannot be detected.
- If the user's walking can be detected (YES in step S2), the process proceeds to step S5. On the other hand, if the user's walking cannot be detected (NO in step S2), the detection unit 12 adjusts the threshold and the like (step S3). As a result, walking can be detected appropriately even when the user's sole angle changes less than that of a healthy person due to, for example, a leg injury, unsteadiness due to illness, or weakening of the legs due to aging.
- For example, the detection unit 12 updates the threshold to a more lenient value than the current one. In this case, the detection unit 12 may use, as the threshold in subsequent processing, a value obtained by multiplying the current threshold by a predetermined coefficient (for example, 0.8).
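- The relaxation step can be sketched as multiplying the current threshold by the coefficient mentioned above (0.8); applied to the example first minimum-value threshold of −20°, this yields −16°. The function name is hypothetical:

```python
RELAX_COEFF = 0.8  # the predetermined coefficient given as an example in the text

def relax_threshold(threshold, coeff=RELAX_COEFF):
    """Return a more lenient threshold. Multiplying by a coefficient in (0, 1)
    moves a positive maximum-value threshold down and a negative
    minimum-value threshold up toward zero (e.g. -20 deg -> -16 deg)."""
    return threshold * coeff
```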
- the detection unit 12 may determine the second threshold value based on the attributes of the user wearing the sensor 21 . Thereby, for example, the threshold value for detecting walking can be adjusted faster.
- For example, the detection unit 12 of the measurement device 20 may receive information indicating user attributes specified by the user from the user terminal 30 . Then, the detection unit 12 may set, as the initial value of the second threshold, a value preset (registered) in association with the user's attributes, such as gender, age, degree of leg injury, or disease.
- the detection unit 12 may determine the second threshold based on at least one of the acceleration and angular velocity measured by the sensor 21 . Thereby, for example, the threshold value for detecting walking can be adjusted faster.
- For example, the detection unit 12 may set, as the initial value of the second threshold, a value preset in association with the maximum and minimum values of the acceleration in the walking direction and the maximum and minimum values of the sole angle.
- the detection unit 12 may determine the second threshold value based on at least one of the heart rate and skin temperature of the user wearing the sensor 21 . Thereby, for example, walking can be detected more appropriately according to the user's state such as emotion.
- the acquisition unit 11 may acquire, for example, information indicating the user's emotion estimated based on the heartbeat and skin temperature measured by a wearable device or the like worn by the user. Then, the detection unit 12 may set a value preset (registered) in association with the user's emotion as the initial value of the second threshold. Further, the detection unit 12 may record, for each user's emotion, the value of the second threshold adjusted so that the user's walking can be detected. Then, the detection unit 12 may measure the length of time during which the current user walks one step, using the second threshold value according to the user's current emotion.
- the detection unit 12 determines whether or not the walking of the user can be detected based on the adjusted threshold value (second threshold value) and the information measured by the sensor 21 (step S4).
- For example, the detection unit 12 changes the sampling frequency (sampling rate) of the analog signal indicating the sole angle measured by the sensor 21 from a first sampling frequency for healthy subjects (for example, 100 Hz) to a higher second sampling frequency (for example, 150 Hz).
- the detection unit 12 may then measure the length of time during which the user walks one step based on the sole angle sampled at the second sampling frequency and the second threshold. As a result, even when the user's steps are relatively short in both length and duration because of a leg disorder or the like, the length of time during which the user walks one step can be measured.
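- If the higher rate were realized in software rather than by reconfiguring the sensor, resampling an already-captured signal by linear interpolation would be one simple stand-in. This sketch and its function name are illustrative assumptions, not the publication's method:

```python
def resample_linear(samples, src_hz, dst_hz):
    """Resample a uniformly sampled signal from src_hz to dst_hz using
    linear interpolation between neighboring samples."""
    duration = (len(samples) - 1) / src_hz
    n_out = int(duration * dst_hz) + 1
    out = []
    for k in range(n_out):
        pos = (k / dst_hz) * src_hz          # fractional index in the source
        i = min(int(pos), len(samples) - 2)  # left neighbor
        frac = pos - i
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
    return out
```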
- the detection unit 12 may measure the length of time during which the user walks one step based on the sole angle, the second threshold, and the acceleration in the walking direction of the user measured by the sensor 21 .
- For example, the detection unit 12 may first calculate candidate lengths of time for one step based on each of the sole angle and the acceleration, and then calculate a representative value (for example, the average value, median value, or mode) of those candidates. The detection unit 12 may then use that representative value as the length of time during which the user walks one step.
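- The representative-value step could be sketched with Python's statistics module; the function name, and the idea of feeding it one candidate per measurement method, are illustrative assumptions:

```python
import statistics

def representative_step_time(candidates, method="median"):
    """Combine candidate one-step time lengths (seconds) into one value using
    a representative statistic; the text names the average, median, and mode."""
    if method == "mean":
        return statistics.mean(candidates)
    if method == "median":
        return statistics.median(candidates)
    if method == "mode":
        return statistics.mode(candidates)
    raise ValueError(f"unknown method: {method}")
```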
- FIG. 10 shows changes in acceleration 1001 in the direction opposite to the walking direction ( ⁇ Y direction) and changes in acceleration 1002 in the vertically upward direction (Z direction) at the same times as in FIG. 8 when a healthy person walks.
- In FIG. 10, the time point when the acceleration in the Y direction reaches its maximum value approximately coincides with the time point 811 when the sole angle reaches its maximum value, and the time point when the acceleration in the Y direction reaches its minimum value approximately coincides with the time point 812 when the sole angle reaches its minimum value.
- FIG. 11 shows, at the same time points as in FIG. 9, an example of a transition 1101 of acceleration in the direction opposite to the walking direction (−Y direction), a transition 1102 of acceleration in the vertically upward direction (Z direction), and a transition 1103 of acceleration in the X direction when the patient walks.
- the measured axis is the −Y direction (the sign of the Y direction is reversed); therefore, the maximum value of the acceleration in the walking direction (Y direction) is about 2 G, i.e., the value 1111 with its sign reversed, and the minimum value is about −3.5 G, i.e., the value 1112 with its sign reversed.
- the sign of the acceleration in the Y direction changes from positive to negative at time 1121 .
- the detection unit 12 can more appropriately measure the length of time during which the user walks one step by using information such as the point in time when the sign of acceleration in the Y direction changes from positive to negative.
- the detection unit 12 may also measure the length of time during which the user walks one step based on the sole angle, the second threshold, and the angular velocity of the sole angle. As a result, for example, even if the peaks of the sole angle are gentle and the points of the extreme values vary from step to step, the length of time during which the user walks one step can be measured more appropriately. In this case, the detection unit 12 may determine, for example, the point in time when the sign of the angular velocity of the user's sole angle measured by the sensor 21 changes from positive to negative as the point in time when the sole angle reaches its maximum value, and the point in time when the sign changes from negative to positive as the point in time when the sole angle reaches its minimum value.
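- Locating the extrema through sign changes of the pitch angular velocity, as described above, could be sketched as below; the function name and test data are hypothetical:

```python
def extrema_times_from_gyro(gyro_pitch, timestamps):
    """Return (maxima_times, minima_times) for the sole angle, estimated from
    sign changes of its angular velocity: positive-to-negative marks a
    maximum, negative-to-positive marks a minimum."""
    maxima, minima = [], []
    for i in range(1, len(gyro_pitch)):
        if gyro_pitch[i - 1] > 0 >= gyro_pitch[i]:
            maxima.append(timestamps[i])
        elif gyro_pitch[i - 1] < 0 <= gyro_pitch[i]:
            minima.append(timestamps[i])
    return maxima, minima
```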
- FIG. 12 shows an example of a transition 1201 of the angular velocity of the sole angle (pitch), a transition 1202 of the angular velocity of roll, and a transition 1203 of the angular velocity of yaw at the same time points as in FIG. 8 when a healthy person walks.
- the point in time when the sign of the angular velocity of the user's sole angle changes from positive to negative roughly coincides with the point in time 811 when the sole angle reaches its maximum value, and the point in time when the sign changes from negative to positive roughly coincides with the point in time 812 when the sole angle reaches its minimum value.
- FIG. 13 shows an example of a transition 1301 of the angular velocity of the sole angle (pitch), a transition 1302 of the roll angular velocity, and a transition 1303 of the yaw angular velocity at the same time points as in FIG. 9 when a patient walks.
- the point in time when the sign of the angular velocity of the user's sole angle changes from positive to negative coincides with the point 911 when the sole angle reaches its maximum value.
- the point in time when the sign of the angular velocity of the user's sole angle changes from negative to positive and the point in time 912 when the sole angle reaches its minimum value are also roughly the same.
- from step S4, the process proceeds to step S3.
- the detection unit 12 calculates information about the user's gait (step S5).
- the detection unit 12 may measure the length of time during which the user walks one step (for example, from when one foot is lifted off the ground until it is put down again). The detection unit 12 may then calculate the walking speed, stride length, leg-lift height, outer turning distance, and the like, based on the acceleration measured by the sensor 21 during that length of time.
- the detection unit 12 may calculate the contact angle and the take-off angle based on the angular velocity measured by the sensor 21 during the length of time during which the user walks one step.
- the ground contact angle may be the angle of the walking direction between the sole of the foot and the ground when the foot touches the ground.
- the take-off angle may be the angle in the walking direction between the sole and the ground when the foot leaves the ground.
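Purely as a hedged illustration of the two definitions above (the trace, indices, sign convention, and helper name are hypothetical, not taken from the embodiment): once the ground-contact and lift-off sample indices are known, the contact angle and take-off angle are simply the walking-direction sole angle read out at those instants.

```python
import numpy as np

def contact_and_takeoff_angles(sole_angle, contact_idx, takeoff_idx):
    """Read the walking-direction sole angle (degrees) at the
    ground-contact and lift-off instants. Detecting those instants
    (e.g. from acceleration features) is outside this sketch."""
    angle = np.asarray(sole_angle, dtype=float)
    return angle[contact_idx], angle[takeoff_idx]

# Hypothetical one-step trace: heel-down at index 1 (toe raised, so
# the angle is negative here), toe-off at index 5 (heel raised,
# angle positive).
trace = [0.0, -20.0, -5.0, 0.0, 10.0, 35.0, 0.0]
contact_angle, takeoff_angle = contact_and_takeoff_angles(trace, 1, 5)
```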
- the output unit 13 outputs the information regarding the user's gait calculated by the detection unit 12 (step S6), and ends the process.
- the server 40 can transmit advice for improving gait, training videos, and the like to the user terminal 30 .
- the information processing device 10 may be a device included in one housing, but the information processing device 10 of the present disclosure is not limited to this.
- Each unit of the information processing apparatus 10 may be implemented by cloud computing configured by one or more computers, for example.
- each part of the information processing device 10 may be realized by a plurality of devices out of the measurement device 20, the user terminal 30, and the server 40, for example.
- information processing devices 10 such as these are also included in examples of the "information processing device" of the present disclosure.
- (Appendix 1)
An information processing device comprising:
acquisition means for acquiring information based on a sensor worn on a user's foot;
detection means for detecting, when the walking of the user cannot be detected based on the information indicating the walking-direction angle between the sole and the ground acquired by the acquisition means and a first threshold, the walking of the user based on the information indicating the angle and a second threshold lower than the first threshold; and
output means for outputting information based on the detection result by the detection means.
(Appendix 2)
The information processing device according to Appendix 1, wherein the acquisition means acquires information based on the sensor worn at any position between the arch of the user's foot and the heel.
- the detection means determines the second threshold based on the attributes of the user.
- the information processing device according to any one of appendices 1 to 6.
- the detection means determines the second threshold value based on at least one of acceleration and angular velocity measured by the sensor.
- the information processing device according to any one of appendices 1 to 7.
- the detection means determines the second threshold based on at least one of heart rate and skin temperature of the user.
- the information processing apparatus according to any one of appendices 1 to 8.
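Tying the appendices together, a minimal sketch of the first-threshold/second-threshold fallback might look like the following; the peak criterion and the threshold values (30 and 15 degrees) are illustrative assumptions, not values given in the disclosure:

```python
import numpy as np

def count_step_peaks(sole_angle, threshold):
    """Count local maxima of the sole angle that exceed the threshold
    (a stand-in for whatever step criterion the detection means uses)."""
    a = np.asarray(sole_angle, dtype=float)
    is_peak = (a[1:-1] > a[:-2]) & (a[1:-1] >= a[2:]) & (a[1:-1] > threshold)
    return int(np.count_nonzero(is_peak))

def detect_walking(sole_angle, first_threshold=30.0, second_threshold=15.0):
    """Try the first threshold; if no step is found, fall back to the
    lower second threshold, as described in Appendix 1."""
    if count_step_peaks(sole_angle, first_threshold) > 0:
        return True
    return count_step_peaks(sole_angle, second_threshold) > 0

# A shuffling gait whose sole-angle peaks stay around 20 degrees is
# missed by the first threshold but caught by the second.
shallow_gait = [0.0, 20.0, 0.0, 19.0, 0.0, 21.0, 0.0]
walking = detect_walking(shallow_gait)
```

The second threshold itself could be tuned per user, as in Appendices 7 to 9, by passing a different `second_threshold` value derived from user attributes or sensor statistics.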
- 1 information processing system, 10 information processing device, 11 acquisition unit, 12 detection unit, 13 output unit, 20 measurement device, 21 sensor, 22 control device, 23 communication device, 30 user terminal, 40 server
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Physiology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Dentistry (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Psychiatry (AREA)
- Signal Processing (AREA)
- Geometry (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
(Embodiment 1)
<Configuration>
The configuration of the information processing device 10 according to the embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram showing an example of the configuration of the information processing device 10 according to the embodiment. The information processing device 10 includes an acquisition unit 11, a detection unit 12, and an output unit 13. Each of these units may be realized by cooperation between one or more programs installed in the information processing device 10 and hardware such as the processor 101 and the memory 102 of the information processing device 10.
Next, the configuration of the information processing system 1 according to the embodiment will be described with reference to FIG. 2.
<System Configuration>
FIG. 2 is a diagram showing a configuration example of the information processing system 1 according to the embodiment. In the example of FIG. 2, the information processing system 1 includes a measurement device 20A and a measurement device 20B (hereinafter also simply referred to as the "measurement device 20" when they need not be distinguished). The information processing system 1 also includes a user terminal 30 and a server 40. Note that the numbers of measurement devices 20, user terminals 30, and servers 40 are not limited to the example of FIG. 2. Each of the measurement device 20, the user terminal 30, and the server 40 is an example of the information processing device 10.
<Hardware Configuration>
<Regarding the Measurement Device 20>
Next, an example of the detection process according to the embodiment will be described with reference to FIGS. 7 to 13. FIG. 7 is a flowchart showing an example of the detection process according to the embodiment. FIG. 8 is a diagram showing an example of transitions of pitch and roll at each time point, measured by the sensor 21 according to the embodiment, when a healthy person walks. FIG. 9 is a diagram showing an example of transitions of pitch and roll at each time point, measured by the sensor 21, when a patient walks. FIG. 10 is a diagram showing an example of transitions of acceleration at each time point, measured by the sensor 21, when a healthy person walks. FIG. 11 is a diagram showing an example of transitions of acceleration at each time point, measured by the sensor 21, when a patient walks. FIG. 12 is a diagram showing an example of transitions of angular velocity at each time point, measured by the sensor 21, when a healthy person walks. FIG. 13 is a diagram showing an example of transitions of angular velocity at each time point, measured by the sensor 21, when a patient walks.
The information processing device 10 may be a device contained in a single housing, but the information processing device 10 of the present disclosure is not limited to this. Each unit of the information processing device 10 may be realized, for example, by cloud computing constituted by one or more computers. Each unit of the information processing device 10 may also be realized by a plurality of devices among, for example, the measurement device 20, the user terminal 30, and the server 40. Such information processing devices 10 are also included in examples of the "information processing device" of the present disclosure.
(Appendix 1)
An information processing device comprising:
acquisition means for acquiring information based on a sensor worn on a user's foot;
detection means for detecting, when the user's walking cannot be detected based on information indicating the walking-direction angle between the sole of the foot and the ground acquired by the acquisition means and a first threshold, the user's walking based on the information indicating the angle and a second threshold lower than the first threshold; and
output means for outputting information based on a detection result of the detection means.
(Appendix 2)
The information processing device according to Appendix 1, wherein the acquisition means acquires information based on the sensor worn at any position between the arch and the heel of the user's foot.
(Appendix 3)
The information processing device according to Appendix 1 or 2, wherein, when the user's walking cannot be detected based on the first threshold upon receiving a predetermined command from an external device, the detection means detects the user's walking based on the second threshold.
(Appendix 4)
The information processing device according to any one of Appendices 1 to 3, wherein, when the user's walking cannot be detected based on the first threshold while at least one of the acceleration and the angular velocity measured by the sensor is equal to or greater than a threshold, the detection means detects the user's walking based on the second threshold.
(Appendix 5)
The information processing device according to any one of Appendices 1 to 4, wherein, when the user's walking cannot be detected based on the information indicating the angle sampled at a first sampling frequency and the first threshold, the detection means detects the user's walking based on the information indicating the angle sampled at a second sampling frequency higher than the first sampling frequency and the second threshold.
(Appendix 6)
The information processing device according to any one of Appendices 1 to 5, wherein, when the user's walking cannot be detected based on the first threshold, the detection means detects the user's walking based on the information indicating the angle, the second threshold, and the acceleration in the user's walking direction measured by the sensor.
(Appendix 7)
The information processing device according to any one of Appendices 1 to 6, wherein the detection means determines the second threshold based on an attribute of the user.
(Appendix 8)
The information processing device according to any one of Appendices 1 to 7, wherein the detection means determines the second threshold based on at least one of the acceleration and the angular velocity measured by the sensor.
(Appendix 9)
The information processing device according to any one of Appendices 1 to 8, wherein the detection means determines the second threshold based on at least one of the user's heart rate and skin temperature.
(Appendix 10)
An information processing method comprising:
acquiring information based on a sensor worn on a user's foot;
detecting, when the user's walking cannot be detected based on acquired information indicating the walking-direction angle between the sole of the foot and the ground and a first threshold, the user's walking based on the information indicating the angle and a second threshold lower than the first threshold; and
outputting information based on a detection result.
(Appendix 11)
A non-transitory computer-readable medium storing a program that causes an information processing device to execute:
a process of acquiring information based on a sensor worn on a user's foot;
a process of detecting, when the user's walking cannot be detected based on acquired information indicating the walking-direction angle between the sole of the foot and the ground and a first threshold, the user's walking based on the information indicating the angle and a second threshold lower than the first threshold; and
a process of outputting information based on a detection result.
10 information processing device
11 acquisition unit
12 detection unit
13 output unit
20 measurement device
21 sensor
22 control device
23 communication device
30 user terminal
40 server
Claims (11)
- An information processing device comprising:
acquisition means for acquiring information based on a sensor worn on a user's foot;
detection means for detecting, when the user's walking cannot be detected based on information indicating the walking-direction angle between the sole of the foot and the ground acquired by the acquisition means and a first threshold, the user's walking based on the information indicating the angle and a second threshold lower than the first threshold; and
output means for outputting information based on a detection result of the detection means.
- The information processing device according to claim 1, wherein the acquisition means acquires information based on the sensor worn at any position between the arch and the heel of the user's foot.
- The information processing device according to claim 1 or 2, wherein, when the user's walking cannot be detected based on the first threshold upon receiving a predetermined command from an external device, the detection means detects the user's walking based on the second threshold.
- The information processing device according to any one of claims 1 to 3, wherein, when the user's walking cannot be detected based on the first threshold while at least one of the acceleration and the angular velocity measured by the sensor is equal to or greater than a threshold, the detection means detects the user's walking based on the second threshold.
- The information processing device according to any one of claims 1 to 4, wherein, when the user's walking cannot be detected based on the information indicating the angle sampled at a first sampling frequency and the first threshold, the detection means detects the user's walking based on the information indicating the angle sampled at a second sampling frequency higher than the first sampling frequency and the second threshold.
- The information processing device according to any one of claims 1 to 5, wherein, when the user's walking cannot be detected based on the first threshold, the detection means detects the user's walking based on the information indicating the angle, the second threshold, and the acceleration in the user's walking direction measured by the sensor.
- The information processing device according to any one of claims 1 to 6, wherein the detection means determines the second threshold based on an attribute of the user.
- The information processing device according to any one of claims 1 to 7, wherein the detection means determines the second threshold based on at least one of the acceleration and the angular velocity measured by the sensor.
- The information processing device according to any one of claims 1 to 8, wherein the detection means determines the second threshold based on at least one of the user's heart rate and skin temperature.
- An information processing method comprising: acquiring information based on a sensor worn on a user's foot; detecting, when the user's walking cannot be detected based on acquired information indicating the walking-direction angle between the sole of the foot and the ground and a first threshold, the user's walking based on the information indicating the angle and a second threshold lower than the first threshold; and outputting information based on a detection result.
- A non-transitory computer-readable medium storing a program that causes an information processing device to execute: a process of acquiring information based on a sensor worn on a user's foot; a process of detecting, when the user's walking cannot be detected based on acquired information indicating the walking-direction angle between the sole of the foot and the ground and a first threshold, the user's walking based on the information indicating the angle and a second threshold lower than the first threshold; and a process of outputting information based on a detection result.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023516992A JPWO2022230164A5 (ja) | 2021-04-30 | Information processing device, information processing method, and program | |
PCT/JP2021/017158 WO2022230164A1 (ja) | 2021-04-30 | 2021-04-30 | Information processing device, information processing method, and computer-readable medium |
US18/287,421 US20240197206A1 (en) | 2021-04-30 | 2021-04-30 | Information processing apparatus, information processing method, and computer-readable medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/017158 WO2022230164A1 (ja) | 2021-04-30 | 2021-04-30 | Information processing device, information processing method, and computer-readable medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022230164A1 true WO2022230164A1 (ja) | 2022-11-03 |
Family
ID=83848165
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/017158 WO2022230164A1 (ja) | 2021-04-30 | 2021-04-30 | 情報処理装置、情報処理方法、及びコンピュータ可読媒体 |
Country Status (2)
Country | Link |
---|---|
US (1) | US20240197206A1 (ja) |
WO (1) | WO2022230164A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002360549A (ja) * | 2001-06-13 | 2002-12-17 | Hirose Electric Co Ltd | Exercise amount measuring device |
CN105698815A (zh) * | 2016-03-24 | 2016-06-22 | 广东欧珀移动通信有限公司 | Method and device for adjusting step-counting data |
CN111189469A (zh) * | 2019-12-31 | 2020-05-22 | 歌尔科技有限公司 | Step-counting method, terminal device, and storage medium |
WO2020194598A1 (ja) * | 2019-03-27 | 2020-10-01 | 日本電気株式会社 | Gait determination device, gait determination method, and program recording medium |
CN111757232A (zh) * | 2019-03-29 | 2020-10-09 | 索诺瓦公司 | Accelerometer-based walking detection parameter optimization for hearing device users |
WO2020230282A1 (ja) * | 2019-05-15 | 2020-11-19 | 日本電気株式会社 | Determination device, determination method, and program recording medium |
-
2021
- 2021-04-30 WO PCT/JP2021/017158 patent/WO2022230164A1/ja active Application Filing
- 2021-04-30 US US18/287,421 patent/US20240197206A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20240197206A1 (en) | 2024-06-20 |
JPWO2022230164A1 (ja) | 2022-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11678811B2 (en) | Contextual heart rate monitoring | |
Muller et al. | Experimental evaluation of a novel inertial sensor based realtime gait phase detection algorithm | |
KR102102931B1 (ko) | 치매 예측 시스템 | |
EP3355783A1 (en) | Wearable and connected gait analytics system | |
US20180092572A1 (en) | Gathering and Analyzing Kinetic and Kinematic Movement Data | |
Kanzler et al. | Inertial sensor based and shoe size independent gait analysis including heel and toe clearance estimation | |
US20200300884A1 (en) | Analysing movement of a subject | |
WO2021140658A1 (ja) | 異常検出装置、判定システム、異常検出方法、およびプログラム記録媒体 | |
US20220240814A1 (en) | Method and system for determining a value of an advanced biomechanical gait parameter | |
CN112617807A (zh) | 一种预防和解除帕金森病患者冻结步态的装置和方法 | |
US20220183588A1 (en) | Gait cycle determination system, gait cycle determination method, and program storage medium | |
US20200126446A1 (en) | Gait Teaching System and Gait Teaching Method | |
US20220260609A1 (en) | Determination device, determination method, and program recording medium | |
WO2022230164A1 (ja) | 情報処理装置、情報処理方法、及びコンピュータ可読媒体 | |
US20230040492A1 (en) | Detection device, detection system, detection method, and program recording medium | |
Boutaayamou et al. | Validated extraction of gait events from 3D accelerometer recordings | |
Selvaraj et al. | Stair fall risk detection using wearable sensors | |
EP3991157B1 (en) | Evaluating movement of a subject | |
US11564439B2 (en) | System and method for determining foot strike pattern | |
US20230009480A1 (en) | Estimation device, estimation system, estimation method, and program recording medium | |
US20240148335A1 (en) | Estimation apparatus, estimation method, and non-transitory computer-readable recording medium | |
KR20200016655A (ko) | 개인별 특성을 감안한 맞춤형 보행패턴 측정장치 및 측정방법 | |
JP2023092150A (ja) | Information processing device, information processing method, program, information processing system, and generation method | |
US20220000430A1 (en) | Determination apparatus, sensor apparatus, determination method, and non-transitory computer-readable recording medium | |
US20240115162A1 (en) | Calculation device, calculation method, and program recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21939322 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18287421 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023516992 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21939322 Country of ref document: EP Kind code of ref document: A1 |