WO2010089676A1 - Determining a sensor alignment - Google Patents

Determining a sensor alignment

Info

Publication number
WO2010089676A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sensor
body part
person
posture
Prior art date
Application number
PCT/IB2010/050307
Other languages
French (fr)
Inventor
Victor M. G. Van Acht
Edwin G. J. M. Bongers
Kai Eck
Gerd Lanfermann
Original Assignee
Philips Intellectual Property & Standards Gmbh
Koninklijke Philips Electronics N.V.
Lambert, Nicolaas
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Philips Intellectual Property & Standards Gmbh, Koninklijke Philips Electronics N.V., Lambert, Nicolaas
Publication of WO2010089676A1 publication Critical patent/WO2010089676A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1116 Determining posture transitions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B 2560/02 Operational features
    • A61B 2560/0223 Operational features of calibration, e.g. protocols for calibrating sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/1036 Measuring load distribution, e.g. podologic studies
    • A61B 5/1038 Measuring plantar pressure during gait
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6823 Trunk, e.g. chest, back, abdomen, hip
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6828 Leg

Abstract

The invention relates to a method of determining a sensor alignment with respect to a body part of a person, and a device for interpreting movement of a body part of a person. A repetitive motion of a body part is detected based on recorded data (Tx; Ty; Hx; Hy) from at least one sensor (10; 12) attached to the body part. Based on the recorded data, at least one posture is recognized, of which it is assumed that the person was in it before the repetitive motion, and an alignment of at least one sensor (10; 12) is determined based on the recorded data associated with said at least one recognized posture.

Description

DETERMINING A SENSOR ALIGNMENT
FIELD OF THE INVENTION
The invention relates to the field of determining a sensor alignment with respect to a body part of a person.
BACKGROUND OF THE INVENTION
From US 2005/0126026 Al, a posture and body movement measuring system is known that includes a feedback mechanism to provide information or instruction in response to time dependent output from a sensing device attached to a person for evaluating the motion of the person. The posture of the person, such as standing, sitting, and lying down, is detected and distinguished. For postural control applications, the user may tare or initialize the device at a desired position. For example, the user stands in front of a mirror to better view his or her own posture, and, once a desirable physical appearance or a comfortable posture, or both, is achieved, the user initializes or tares the unit.
SUMMARY OF THE INVENTION
Presently, physiotherapy is very labor intensive, as a physiotherapist typically can treat only one patient at a time. This makes physiotherapy very expensive, resulting in only a limited number of reimbursed treatments for the patient. The outcome for the patient would be improved if he could do more or longer physiotherapy exercises, but the cost of the physiotherapist being present all the time prevents this. It would be desirable to provide a method, a system or a device that increases the efficiency of physiotherapists and thereby reduces the costs of the healthcare system to society. It would be desirable to provide a method, system or device that enables the patient to execute physiotherapy exercises with less guidance from the therapist. For example, the patient may wear one or more position and/or orientation sensors that measure his movements. The movements may be evaluated, for example by a computer. The method, system or device may give real-time feedback to the patient to motivate the patient to perform the exercise in the correct way. Furthermore, progress reports may be generated for the therapist off-line. Thus, for example, the therapist may guide 5 to 10 patients simultaneously working with such a system.
In guiding physiotherapy patients on how to do their exercises in the right way, position and/or orientation sensors may be attached to body parts of the patient. In order to better evaluate what movements the patient is making, it is often desirable to know the alignment of the motion sensor(s) to the body part(s) of the patient. However, an explicit alignment procedure in which the patient is asked to stand in a particular predefined posture is very obtrusive and not patient-friendly. Thus, it would be desirable to dispense with an explicit alignment procedure. It would be desirable to determine the sensor alignment automatically.
To better address one or more of these concerns, in a first aspect of the invention, a method of determining a sensor alignment with respect to a body part of a person is provided, which comprises the steps of: recording data from at least one sensor attached to a body part of the person, the data comprising position data and/or orientation data; detecting a repetitive motion of said body part based on the recorded data; recognizing at least one posture of which it is assumed that the person was in it before the repetitive motion, based on the recorded data; determining an alignment of at least one sensor attached to a body part of the person, based on the recorded data associated with said at least one recognized posture.
The recorded data comprises position data and/or orientation data, that is, the recorded data comprises at least one of a group consisting of position data and orientation data.
For example, detecting a repetitive motion may comprise evaluating at least a section of the recorded data in order to find an approximately periodic data section.
For example, recognizing said at least one posture may comprise calculating an average of at least a section of the recorded data.
For example, determining said alignment may comprise calculating an average of at least a section of the recorded data. Said calculating and/or evaluating may be performed sensor-wise, that is, individually for respective recorded data from a respective sensor. Furthermore, said calculating and/or evaluating may be performed, for each sensor, for orientation data of respective individual coordinate axes of a local sensor coordinate frame of the sensor. In said evaluating and/or said recognizing at least one posture, data from more than one sensor may be taken into account.
Because, based on the recorded data, at least one posture is recognized of which posture it is assumed that the person was in it before the detected repetitive motion, and because the sensor alignment is determined based on the recorded data associated with the recognized posture, no interaction of the person with a system or device performing the method is required. In particular, explicitly triggering an alignment procedure, such as pressing a button, may be dispensed with. Thus, in particular, the method allows automatically determining a sensor alignment with respect to a body part of a person. The method may be used with an implicit alignment procedure requiring no interaction with the person. For example, the movements the person is assumed to make prior to an actual exercise, for example going from a standing posture (a first posture) to the posture in which the actual exercise is done (a second posture), are analyzed, and a sensor alignment is determined from recorded sensor data associated with at least one of said postures. By detecting a repetitive motion, a repetitive exercise is recognized. Thus, the system may automatically distinguish an exercising phase from a previous phase in which the person was in a specific posture.
The position data and/or orientation data comprises information about a sensor position and a sensor orientation, respectively. For example, the data comprises orientation data. This is particularly useful for evaluating physiotherapeutic exercises.
The method allows a sensor alignment to be determined automatically. For example, the person starts performing an exercise, and it is automatically recognized that he or she has begun exercising. Thus, the person does not have to press a button or the like, and he or she is free to prepare for exercising according to his or her own timing. In contrast to that, when a sensor alignment procedure has to be executed before a physiotherapy patient can start doing his exercises, the patient is typically asked during the procedure to assume a reference posture, and at that moment the sensor readings are stored. The reference posture is, for example, standing upright with the arms straight down and the palms of the hands towards the legs. The alignment is then determined from these readings. When the patient does not accurately stand in the required reference posture, and this is not detected, all calculations that are based on the accuracy of the determined alignment fail completely. Furthermore, it would be desirable to improve the user-friendliness of such a sensor alignment procedure. In one embodiment of the invention, in the recording step, the recorded data comprises orientation data, which comprises information about a sensor inclination, that is, a sensor orientation relative to true vertical or horizontal orientation. For example, the data comprises information about a tilt or an angle with respect to a horizontal plane. For example, the sensor may comprise an accelerometer for measuring a sensor inclination. For example, the accelerometer is arranged to supply orientation data. For example, the sensor readings may indicate an apparent gravity vector.
For example, said orientation data comprise information about an inclination of at least one axis of a local coordinate frame of the sensor. For example, the data may comprise information about a tilt or an angle of said axis with respect to a horizontal plane.
For example, in the posture recognizing step, the recognizing comprises: determining a section of recorded data indicating substantially stable positions and/or substantially stable orientations; and comparing said section of recorded data to predetermined posture data. That is, said section of recorded data is the recorded data associated with said at least one recognized posture mentioned above. When the person remains in a posture for some time, this may be indicated by substantially stable position and/or orientation sensor readings.
For example, said section of recorded data may be compared to predetermined posture data by calculating average position and/or orientation data of said section and comparing the average data to predetermined posture data.
In one embodiment, determining an alignment of at least one sensor comprises: calculating a difference between predetermined posture data and an average of recorded data associated with said at least one recognized posture. For example, during later procedures such as evaluating an exercise, position and/or orientation data may be corrected taking into account the determined alignment. For example, the calculated difference may be subtracted from the sensor readings in order to calculate the orientation of the body part from the orientation of the sensor.
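As a minimal sketch of this calculation, assuming inclination angles in degrees and taking the offset as the averaged reading minus the predetermined posture value, so that it can be subtracted from later readings; the function names and numeric values below are illustrative assumptions, not taken from the patent.

    import numpy as np

    def determine_alignment(posture_section_deg, expected_posture_deg):
        # Alignment offset per channel: average sensor reading in the recognized posture
        # minus the inclination the body part is assumed to have in that posture.
        return np.mean(posture_section_deg, axis=0) - np.asarray(expected_posture_deg, dtype=float)

    def correct_readings(raw_deg, alignment_offset_deg):
        # Subtract the offset so that corrected readings approximate the body-part inclination.
        return np.asarray(raw_deg, dtype=float) - alignment_offset_deg

    # Example: torso-sensor x/y inclinations recorded while the person is on hands and knees,
    # a posture in which the torso is assumed horizontal (expected inclination 0 degrees on both axes).
    section = np.array([[4.1, -2.9], [3.8, -3.2], [4.3, -3.0]])
    offset = determine_alignment(section, [0.0, 0.0])    # roughly [+4.1, -3.0] degrees
    corrected = correct_readings([52.0, 10.0], offset)   # later readings, corrected for the offset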
In one embodiment, the recording step comprises: recording data from at least a first sensor attached to a first body part of the person and a second sensor attached to a second body part of the person, the data from each of the respective sensors comprising position data and/or orientation data; and, in the step of recognizing a posture, the recognizing is based on the recorded data from at least the first and the second sensor. Thus, a posture may be recognized from a relative position and/or orientation of at least a first and a second body part. Thus, the automatic recognizing of one or more postures allows using data from more than one sensor for determining the alignment of at least two of them. Thus, the alignment is determined reliably and is less prone to errors. In one embodiment, in the recognizing step, at least two postures are recognized. For example, the alignment may be determined from a first or a second posture, or from a combined evaluation of postures. For example, data from different postures and/or transitions between postures may be taken into account in determining the alignment. Thus, knowing the past and the future of postures and/or transitions between postures, a more reliable alignment is possible.
In one embodiment, detecting the repetitive motion comprises: calculating an auto correlation of recorded data, that is, in particular, an auto correlation function. For example, an auto correlation is calculated for at least one section of the recorded data. For example, the first maximum at non-zero lag of the auto correlation function is evaluated. For example, said first maximum is compared to the maximum at zero lag. For example, an auto correlation of recorded data of at least one sensor may be calculated independently from the recorded data of other sensors. For example, the respective sensor may be selected depending on a predetermined type of exercise.
In one embodiment, the step of determining an alignment is also based on predefined alignment information. For example, the predefined alignment information may comprise information about an approximate alignment of a sensor. For example, an approximate alignment may be known from the configuration of attachment means for attaching a sensor to the body part.
In one embodiment, the sensor is attached to the body part such that a rotation about a first axis of a local coordinate frame of the sensor with respect to the body part is prevented. Thus, determining the alignment is simplified.
In one embodiment, in the determining step, the determining is also based on recorded data associated with said repetitive motion. For example, maximum or minimum values of position and/or orientation data may be taken into account. In a second aspect of the invention, a device for interpreting movement of a body part of a person is provided, comprising:
- an interpreter unit adapted for receiving data from at least one sensor attached to a body part of the person, the data comprising position data and/or orientation data; wherein the interpreter unit is adapted for: - detecting a repetitive motion of said body part based on the recorded data;
- recognizing at least one posture of which it is assumed that the person was in it before the repetitive motion, based on the recorded data; and
- determining an alignment of at least one sensor attached to a body part of the person, based on the recorded data associated with said at least one recognized posture.
For example, the interpreter unit is adapted for executing a method of determining a sensor alignment with respect to a body part of a person, as described above.
In one embodiment, the interpreter unit is adapted for receiving data from said at least one sensor, the data comprising orientation data, which comprises information about a sensor inclination. For example, the orientation data may comprise information about an inclination of at least one axis of a local coordinate frame of the sensor.
For example, said device further comprises at least one sensor (10; 12) for being attached to a body part of the person.
In one embodiment, the device comprises at least a first sensor for being attached to a first body part of the person and a second sensor for being attached to a second body part of the person, the data from each of the respective sensors comprising position data and/or orientation data; and the interpreter unit is adapted for recognizing at least one posture of which it is assumed that the person was in it before the repetitive motion, based on the recorded data from at least the first and the second sensor. In one embodiment, the device is a physiotherapy monitoring device. For example, the device is an automatic physiotherapy exerciser device or tool.
These and other aspects of the invention will be apparent from and illustrated with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 and Fig. 2 schematically show the definitions of coordinate frames in one embodiment of the invention;
Fig. 3 schematically shows a sensor-attached coordinate frame that is different from a coordinate frame attached to a body part;
Fig. 4 schematically shows a patient doing a physiotherapy exercise;
Fig. 5 schematically shows the patient beginning with the physiotherapy exercise;
Fig. 6 schematically shows recorded output data of orientation sensors; and
Fig. 7 is a diagram illustrating a method of determining a sensor alignment in accordance with an embodiment of the invention.
DETAILED DESCRIPTION OF EMBODIMENTS
Fig. 1 schematically shows a human person and a device for interpreting movement of body parts of the person, such as a physical monitoring device, e.g. an automatic physiotherapy exerciser device. The person is, for example, a physiotherapy patient. Exemplarily shown are a first orientation sensor 10 and a second orientation sensor 12, which are, for example, attached to the chest and the left upper leg, respectively. For example, the sensors are attached to the body using textile elastic straps 14. Thus, the person can easily attach the sensors 10, 12 to his own chest, lower leg or other body part, the movement of which is to be interpreted.
The sensors 10, 12 each comprise, for example, a 3D-accelerometer for measuring its inclination, that is, a tilt with respect to gravity. For example, orientation data expressing the tilt with respect to the horizontal plane is output to an interpreter unit 16, for example, through a radio signal. The interpreter unit 16, which, for example, includes a computer, receives the signal or orientation data and interprets the orientation data as will be described below with reference to Figs. 6 and 7, in order to determine the alignment of the sensors 10, 12 with respect to the body parts of the patient.
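To make the inclination measurement concrete, here is a minimal sketch, assuming the quasi-static 3D accelerometer output can be treated as an apparent gravity vector in the sensor frame, of how the tilt of each local sensor axis with respect to the horizontal plane could be computed. The function name and example values are illustrative assumptions, and the sign of the result depends on the device's sign convention.

    import numpy as np

    def axis_inclinations_deg(accel_xyz):
        # Inclination of each local sensor axis with respect to the horizontal plane,
        # estimated from one quasi-static accelerometer sample (apparent gravity vector).
        # An axis parallel to gravity yields about +/-90 degrees, a horizontal axis about 0 degrees.
        g = np.asarray(accel_xyz, dtype=float)
        return np.degrees(np.arcsin(g / np.linalg.norm(g)))

    # Example: sensor x-axis pointing roughly straight down, as for a chest sensor on an upright person.
    print(axis_inclinations_deg([9.6, 0.4, 1.2]))  # approximately [82, 2, 7] degrees

The graphs Tx, Ty, Hx and Hy in Fig. 6 can be read as such per-axis inclinations plotted over time.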
When determining the alignment of the sensors 10 and 12, the involved coordinate systems or local coordinate frames may be defined as follows. A torso coordinate system (Tx, Ty, Tz) is attached to the torso. A torso sensor coordinate system (TSx, TSy, TSz) is attached to the first sensor 10. An upper leg coordinate system (ULx, ULy, ULz) is attached to the left upper leg, and an upper leg sensor coordinate system (USx, USy, USz) is attached to the second sensor 12. For example, the coordinate systems are chosen such that, when the sensors 10, 12 are perfectly aligned with their respective body parts and the person is standing up perfectly straight, the x-axes point down, the z-axes point in the forward direction, and the y-axes are chosen such that right-handed coordinate systems are obtained. Thus, they point to the left side of the patient. Fig. 2 exemplarily shows the first sensor 10 and its local coordinate frame, denoted as (x, y, z). Rx, Ry, and Rz denote rotations about the x-, y-, and z-axis, respectively.
In order to better evaluate motions of the patient, it is desirable to determine the sensor alignment, that is, the alignment between the local coordinate frame of the respective sensor and the coordinate frame attached to the respective body part.
In practice, the alignment of the sensor coordinate system with the body part coordinate system is not perfect. For example, Fig. 3 exemplarily shows a side view of the left upper leg and the sensor 12 having respective local coordinate frames, which are differently orientated. According to the invention, the sensor alignment may be determined automatically, as will be described in the following.
Fig. 4 schematically shows the patient doing a typical physiotherapy exercise such as the "bird dog". In Fig. 4, the patient stands on his left hand and right knee, while stretching his right arm and left leg in a horizontal position. Fig. 5 shows the patient on hands and knees.
Fig. 6 schematically illustrates recorded output signals of the sensors 10, 12 when the patient begins exercising the "bird dog" exercise. The signals are shown for illustration purposes and do not correspond to actual measurements. The graph Tx shows the inclination of the TSx-axis of the torso sensor 10 over time with respect to a horizontal plane. The graph Ty shows the inclination of the axis TSy with respect to a horizontal plane. The graphs Hx and Hy show, in an analogous manner, the inclinations of the axes USx and USy of the upper leg sensor coordinate system with respect to a horizontal plane.
The orientation data shown in Fig. 6 is divided into sections, as is indicated by vertical dashed lines. Average data of specific data sections are indicated by horizontal, dashed lines.
During the first few seconds in Fig. 6, indicated as section 20, the patient is standing upright. When the patient is upright, the torso and upper leg are vertical, so that one would expect Tx and Hx to both be 90°, that is, rotated downwards by 90° with respect to the horizontal. In practice, Tx is not exactly 90° when the torso is vertical, e.g. due to misalignment between the sensor 10 and the chest. The average orientation data Tx during section 20 is indicated as A in Fig. 6.
Also, Hx is in practice not exactly 90°, e.g. due to misalignment and to the patient's thighs having fat and muscle tissue. The average orientation data Hx during section 20 is indicated as D in Fig. 6. In the next few seconds, indicated as a section 22, the patient is getting on hands and knees, preparing for the exercise. In this section 22, the orientation signals strongly depend on how a person gets down on the floor on hands and knees.
When the patient finally is on hands and knees in section 24, this corresponds to the fact that Tx and Ty are close to zero and substantially stable, meaning that the torso is more or less horizontal, and Hx is close to 90°, meaning that the left upper leg is more or less vertical. During this section 24 of stable orientation data, the average orientation Tx is indicated as B, the average orientation Ty is indicated as C and the average orientation Hx is indicated as E. After a while, the patient starts exercising in section 26, which is clearly observable from the repetitive orientation signal Hx, as is indicated in an area F in Fig. 6. Here, Hx shows that the left upper leg is repeatedly changed from a vertical to a horizontal position and back again.
When the patient is exercising, also the signals Tx and Ty show a wobbly behavior, meaning that the patient is not holding his torso stable.
To summarize, the recorded orientation data shown in Fig. 6 correspond to the activity of the patient first standing in front of, e.g. a physiotherapy exerciser device, then getting on hands and knees, and, afterwards, beginning the "bird-dog" exercise.
The interpreter unit 16 determines the alignment of the sensors 10, 12, for example, according to a method executed on the computer, as illustrated in Fig. 7.
In step S10, the recording of movements, that is, the recording of orientation data received from sensors 10 and 12, is started. For example, the orientation data shown in Fig. 6 are recorded throughout the following procedure.
In step S12, the interpreter unit 16 waits until an exercising of the patient has been detected. For example, an auto correlation of the recorded Hx data is calculated on a section of, for example, a few seconds' length. For example, said section may correspond to the latest few seconds of orientation data received from the respective sensor. For example, an auto correlation is calculated repeatedly. For example, at regular time intervals, the latest data section may be taken into account, and a new auto correlation function is calculated for said latest data section, e.g. having a fixed length.
A repetitive motion may be detected, based on the auto correlation function, as follows. If the amplitude of the first maximum of the auto correlation function for non-zero lag is only slightly smaller than the value at zero lag, then the movements of the respective sensor coordinate frame axis are highly repetitive, and, thus, it is likely that the person is exercising. When, however, the amplitude of this first maximum is small compared to the value at zero lag, there is only very little repetition or similarity in the data, and, thus, it is unlikely that the patient is exercising. For example, the auto correlation function of Hx during sections 20, 22 and 24 will indicate that there is only very little repetition, whereas the auto correlation function of the data of section 26 will indicate a highly repetitive signal.
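A minimal sketch of such a detector, assuming evenly sampled inclination data and a single channel (e.g. the latest few seconds of Hx): it removes the mean, computes the autocorrelation, finds the first local maximum at non-zero lag and compares it to the zero-lag value. The 0.7 ratio threshold and all names are illustrative assumptions, not values from the patent.

    import numpy as np

    def is_repetitive(signal, ratio_threshold=0.7):
        # Detect an approximately periodic section by comparing the first local maximum
        # of the autocorrelation at non-zero lag with the value at zero lag.
        x = np.asarray(signal, dtype=float)
        x = x - x.mean()                                   # remove the posture offset (DC component)
        ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # autocorrelation for lags 0 .. len(x)-1
        if ac[0] <= 0:
            return False                                   # constant signal: no motion at all
        for lag in range(2, len(ac) - 1):                  # first local maximum after the zero-lag peak
            if ac[lag] >= ac[lag - 1] and ac[lag] >= ac[lag + 1]:
                return ac[lag] / ac[0] >= ratio_threshold
        return False

Under these assumptions, applied to a sliding window of Hx, the function would return False during sections 20, 22 and 24 and True once the window lies within section 26.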
For example, depending on the type of exercise, the autocorrelation of specific sensor signals corresponding to orientations of specific sensor coordinate frame axes may be taken into account. For example, for the "bird dog" exercise shown in Fig. 6, the autocorrelation function of the signal Hx may be calculated. However, further sensor signals may be taken into account, and the autocorrelation function may be calculated in more than one dimension. For example, if the autocorrelation function of orientation data from two sensors, or of an orientation difference between two sensors, is calculated, a repetitive motion may be detected independently of the kind of exercise. For example, orientations may be represented as quaternions, and the auto correlation may be the quaternion auto correlation.
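The quaternion case could, for example, be realized as sketched below, where the similarity at each lag is the mean absolute dot product between the unit quaternion at time t and at time t + lag (the absolute value accounts for q and -q representing the same orientation). This particular construction is an assumption for illustration; the patent does not prescribe a specific quaternion autocorrelation formula.

    import numpy as np

    def quaternion_autocorrelation(quats, max_lag):
        # quats: (n_samples, 4) array of orientation quaternions, one row per time step.
        # Returns, for each lag, the mean absolute dot product between q(t) and q(t + lag);
        # the value at lag 0 is 1, and a peak at non-zero lag indicates a repeated orientation pattern.
        q = np.asarray(quats, dtype=float)
        q = q / np.linalg.norm(q, axis=1, keepdims=True)   # make sure the quaternions are unit length
        ac = np.empty(max_lag + 1)
        for lag in range(max_lag + 1):
            a = q[: len(q) - lag] if lag > 0 else q
            b = q[lag:]
            ac[lag] = np.abs(np.sum(a * b, axis=1)).mean()
        return ac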
The length of the data section of which the auto correlation is calculated is preferably long enough to include at least one repetition of a movement.
When the data of section 26 has been recorded and analyzed in step S12, it is determined from the repetitive orientation data that the patient is currently exercising.
Then, in step S14, the previously recorded movements or orientation data are analyzed as follows.
In step S14, data sections of substantially stable orientations are identified in those data corresponding to the time before the repetitive motion. For example, in Fig. 6, sections 20 and 24 may be identified as data sections of substantially stable orientation data.
The sections of substantially stable orientation data correspond to respective postures of the patient. For example, for the "bird dog" exercise, it is assumed that the patient is on his hands and knees before actually starting the exercising. Thus, the patient has been on hands and knees in section 24. At the beginning of the data recording, the patient is, for example, assumed to be standing upright in section 20.
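A minimal sketch of this step, under the assumption that the pre-exercise orientation data are available as an (n_samples, n_channels) array of inclination angles in degrees: a section is treated as substantially stable when the per-channel standard deviation over a sliding window stays below a tolerance, and the per-section averages (such as A, D or B, C, E) are returned for comparison with the predetermined posture data. The window length, the 3° tolerance and all names are illustrative assumptions.

    import numpy as np

    def stable_sections(data, window, std_tol_deg=3.0):
        # Identify sections of substantially stable orientation data in the samples
        # recorded before the repetitive motion; returns (start, end, mean_per_channel).
        data = np.asarray(data, dtype=float)
        stable = np.array([np.all(data[i:i + window].std(axis=0) < std_tol_deg)
                           for i in range(len(data) - window + 1)])
        sections, start = [], None
        for i, ok in enumerate(stable):
            if ok and start is None:
                start = i                                  # a stable section begins here
            elif not ok and start is not None:
                end = i - 1 + window                       # last sample covered by a stable window
                sections.append((start, end, data[start:end].mean(axis=0)))
                start = None
        if start is not None:
            sections.append((start, len(data), data[start:].mean(axis=0)))
        return sections

Applied to the Fig. 6 data recorded before section 26, such a scan would, under these assumptions, return two sections corresponding to 20 and 24, whose per-channel averages include the values A, D and B, C, E.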
The average orientation data B, C and E during section 24 are then used for determining the alignment in step S16. That is, the sensor alignment is determined based on the differences between the values B, C, E and the assumed orientations of the respective body parts in this posture. If the sensor 10 were perfectly aligned with the torso coordinate system and the torso were horizontal while the patient is on hands and knees, B and C would both be equal to zero; and if, for example, the sensor 12 were perfectly aligned with the upper leg coordinate system and the upper leg were vertical while the patient is on his hands and knees, the value of E would be equal to 90°.
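As a purely illustrative numerical example (the values are invented, not measurements): if the averages in section 24 were B = 4°, C = -3° and E = 86°, while the assumed posture implies 0°, 0° and 90°, the alignment offsets would be +4°, -3° and -4°, and these offsets would subsequently be subtracted from the Tx, Ty and Hx readings when evaluating the exercise.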
Furthermore, it may be assumed that the alignment for the z-axis of a sensor is ideal and equal to zero when the sensor is mounted to the body such that a rotation of the sensor around its local z-axis is prevented. This is, for example, achieved by mounting the sensors 10, 12 with the textile straps 14. The alignment determined in step S16 may further be used by the interpreter unit 16 for monitoring and/or evaluating the exercising of the patient. The necessary calculations for correcting the orientation data based on the determined alignment are within the knowledge of a person skilled in the art.
Depending on the specific exercises and the exercise order, it may also be possible to use the values A and D for alignment.
While the automatic rehabilitation exerciser may tell the patient to get down on the floor on hands and knees and begin exercising, alternatively a physiotherapist could tell a group of patients to first get down on the floor on hands and knees and then start exercising. The physiotherapy monitoring device may be configured to receive orientation data from more than one person and determine a sensor alignment for each person individually.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The invention is not limited to the disclosed embodiments.
Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. For example, whereas in the examples given above, orientation data is recorded and interpreted, additionally or alternatively, position data may be recorded and interpreted in order to determine a sensor alignment in a similar manner. For example, relative horizontal and/or vertical positions of sensors mounted to respective body parts of a person, and, thus, a positional alignment of the sensors with respect to the body parts, may be determined. Furthermore, the invention may also be applied in sports and recreation. For example, the device may be a training monitoring device or an automatic exerciser device.
Furthermore, all the disclosed elements and features of each disclosed embodiment of the method or the system can be combined with, or substituted for, the disclosed elements and features of every other disclosed embodiment of the method or the system, except where such elements or features are mutually exclusive. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. Any reference signs in the claims should not be construed as limiting the scope.

Claims

CLAIMS:
1. Method of determining a sensor alignment with respect to a body part of a person, comprising the steps of: recording data (Tx; Ty; Hx; Hy) from at least one sensor (10; 12) attached to a body part of the person, the data comprising position data and/or orientation data; detecting a repetitive motion of said body part based on the recorded data (Tx; Ty; Hx; Hy); recognizing at least one posture of which it is assumed that the person was in it before the repetitive motion, based on the recorded data (Tx; Ty; Hx; Hy); - determining an alignment of at least one sensor (10; 12) attached to a body part of the person, based on the recorded data associated with said at least one recognized posture.
2. Method as claimed in claim 1, wherein in the recording step, the recorded data comprises orientation data (Tx; Ty; Hx; Hy), which comprises information about a sensor inclination.
3. Method as claimed in claim 2, wherein said orientation data comprise information about an inclination of at least one axis (TSx; TSy) of a local coordinate frame (TSx, TSy, TSz) of the sensor (10).
4. Method as claimed in one of the claims 1 to 3, wherein in the posture recognizing step, the recognizing comprises: determining a section (24) of recorded data (Tx; Ty; Hx; Hy) indicating substantially stable positions and/or substantially stable orientations; and comparing said section (24) of recorded data to predetermined posture data.
5. Method as claimed in one of the claims 1 to 4, wherein determining an alignment of at least one sensor (10; 12) comprises: calculating a difference between predetermined posture data and an average (B; C; E) of recorded data associated with said at least one recognized posture.
6. Method as claimed in one of the claims 1 to 5, wherein the recording step comprises: recording data (Tx; Ty; Hx; Hy) from at least a first sensor (10) attached to a first body part of the person and a second sensor (12) attached to a second body part of the person, the data from each of the respective sensors comprising position data and/or orientation data; and wherein in the step of recognizing a posture, the recognizing is based on the recorded data (Tx; Ty; Hx; Hy) from at least the first and the second sensor (10; 12).
7. Method as claimed in one of the claims 1 to 6, wherein detecting the repetitive motion comprises: calculating an auto correlation of recorded data (Tx; Ty; Hx; Hy).
8. Method as claimed in one of the claims 1 to 7, wherein the sensor (10; 12) is attached to the body part such that rotation about a first axis (TSz) of a local coordinate frame (TSx, TSy, TSz; HSx, HSy, HSz) of the sensor (10; 12) with respect to the body part is prevented.
9. Device for interpreting movement of a body part of a person, comprising:
- an interpreter unit (16) adapted for receiving data (Tx; Ty; Hx; Hy) from at least one sensor (10; 12) attached to a body part of the person, the data comprising position data and/or orientation data;
wherein the interpreter unit (16) is adapted for:
- detecting a repetitive motion of said body part based on the recorded data (Tx; Ty; Hx; Hy);
- recognizing at least one posture of which it is assumed that the person was in it before the repetitive motion, based on the recorded data (Tx; Ty; Hx; Hy); and
- determining an alignment of at least one sensor (10; 12) attached to a body part of the person, based on the recorded data associated with said at least one recognized posture.
10. Device as claimed in claim 9, wherein the device is a physiotherapy monitoring device.
11. Device as claimed in claim 9 or 10, further comprising at least one sensor (10; 12) for being attached to a body part of the person.
12. Computer program or computer program product for performing the method as claimed in one of the claims 1 to 8 when executed on a computer.
13. Data carrier including a computer program for performing the steps of the method as claimed in any one of claims 1 to 8.
14. Computer for executing a computer program as claimed in claim 12.
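The following sketches are illustrative only and form no part of the claims or of the disclosed embodiments; they assume Python with NumPy, a 50 Hz sampling rate, and invented thresholds, templates and names. Claims 2 and 3 refer to orientation data carrying information about the inclination of axes of the sensor's local coordinate frame; one common way to obtain such inclinations, assumed here, is from the gravity vector measured by a 3-axis accelerometer while the sensor is quasi-static.

```python
# Illustrative only: inclination of a sensor's local x and y axes derived from
# the gravity vector measured by a 3-axis accelerometer (an assumption, not
# necessarily how the sensors of the embodiment obtain orientation data).
import numpy as np

def axis_inclinations(acc):
    """acc: (ax, ay, az) accelerometer reading in any consistent unit.
    Returns the inclination of the local x and y axes with respect to the
    horizontal plane, in degrees."""
    a = np.asarray(acc, dtype=float)
    a = a / np.linalg.norm(a)              # unit gravity vector in sensor coordinates
    incl_x = np.degrees(np.arcsin(a[0]))   # tilt of the local x axis
    incl_y = np.degrees(np.arcsin(a[1]))   # tilt of the local y axis
    return incl_x, incl_y

# Sensor lying flat (gravity along its local z axis): both inclinations are ~0 deg.
print(axis_inclinations((0.0, 0.0, 9.81)))
```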
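Claim 4 recognizes a posture by finding a section of substantially stable recorded data and comparing it to predetermined posture data, and claim 6 bases the recognition on data from two sensors. A minimal sketch, assuming inclination angles in degrees and made-up posture templates; the assignment of T to a trunk sensor and H to a second sensor (e.g. on the upper leg) is likewise an assumption:

```python
# Sketch of the posture-recognition step: find a "substantially stable" section
# (spread below a few degrees over a short window) and match its average against
# predetermined posture templates. Template values, the T/H sensor assignment
# and all thresholds are invented for illustration.
import numpy as np

POSTURE_TEMPLATES = {                     # expected (Tx, Ty, Hx, Hy) in degrees
    "standing":        (0.0, 0.0, 0.0, 0.0),
    "sitting":         (0.0, 0.0, 90.0, 0.0),
    "hands_and_knees": (90.0, 0.0, 90.0, 0.0),
}

def recognize_posture(incl, fs=50.0, window_s=2.0, stable_deg=3.0):
    """incl: (N, 4) array of inclination samples (Tx, Ty, Hx, Hy) recorded
    before the detected repetitive motion. Returns (posture, stable_section),
    or (None, None) if no stable section is found."""
    win = int(window_s * fs)
    for start in range(len(incl) - win, -1, -win):        # search backwards in time
        section = np.asarray(incl[start:start + win], dtype=float)
        if np.ptp(section, axis=0).max() < stable_deg:    # substantially stable
            mean = section.mean(axis=0)
            posture = min(POSTURE_TEMPLATES,
                          key=lambda p: np.linalg.norm(mean - POSTURE_TEMPLATES[p]))
            return posture, section
    return None, None

# Example: two seconds of nearly constant data close to the hands-and-knees template.
rng = np.random.default_rng(0)
stable = np.array([88.0, 2.0, 91.0, -1.0]) + rng.normal(scale=0.2, size=(100, 4))
print(recognize_posture(stable)[0])       # recognized as "hands_and_knees"
```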
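Claim 7 detects the repetitive motion by calculating an autocorrelation of the recorded data. The sketch below assumes that a pronounced secondary peak in the normalized autocorrelation indicates a periodic exercise; the minimum lag and the threshold are arbitrary illustrative choices:

```python
# Sketch of repetitive-motion detection via autocorrelation: after removing the
# mean, a periodic exercise produces a pronounced secondary peak in the
# normalized autocorrelation of the recorded signal.
import numpy as np

def is_repetitive(signal, fs=50.0, min_period_s=0.5, peak_threshold=0.6):
    """signal: 1-D array of recorded samples for one axis (e.g. Tx)."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]     # one-sided autocorrelation
    if ac[0] == 0:
        return False                                      # constant signal: no motion
    ac = ac / ac[0]                                       # normalize so ac[0] == 1
    min_lag = int(min_period_s * fs)                      # ignore lags near zero
    return bool(ac[min_lag:].max() > peak_threshold)

# A 0.8 Hz "exercise" is detected as repetitive; white noise is not.
t = np.arange(0, 10, 1 / 50.0)
print(is_repetitive(20 * np.sin(2 * np.pi * 0.8 * t)))           # True
print(is_repetitive(np.random.default_rng(1).normal(size=500)))  # False
```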
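Finally, according to claims 1 and 5, once the posture assumed before the repetitive motion has been recognized, the sensor alignment follows from the difference between the predetermined posture data and the average of the recorded data associated with that posture. A worked example with invented numbers for one sensor:

```python
# Worked example (invented numbers): the person is assumed to stand upright,
# so the predetermined posture data is (0, 0) deg, but the averaged recorded
# data for the recognized posture reads (12, -3.5) deg; the difference is the
# mounting offset, which can be applied to later exercise samples.
import numpy as np

expected = np.array([0.0, 0.0])          # predetermined (Tx, Ty) for "standing upright"
recorded_mean = np.array([12.0, -3.5])   # average of the recognized-posture section

alignment_offset = expected - recorded_mean
print(alignment_offset)                  # offset of about -12 deg and +3.5 deg

# Later exercise samples can then be re-expressed relative to the body part:
sample = np.array([47.0, 5.0])
print(sample + alignment_offset)         # corrected to about 35 deg and 8.5 deg
```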
PCT/IB2010/050307 2009-02-03 2010-01-25 Determining a sensor alignment WO2010089676A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP09151909 2009-02-03
EP09151909.0 2009-02-03

Publications (1)

Publication Number Publication Date
WO2010089676A1 true WO2010089676A1 (en) 2010-08-12

Family

ID=42122896

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/050307 WO2010089676A1 (en) 2009-02-03 2010-01-25 Determining a sensor alignment

Country Status (1)

Country Link
WO (1) WO2010089676A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1996029007A1 (en) * 1995-03-21 1996-09-26 David John Walker Activity recording device
US20050126026A1 (en) 2001-02-23 2005-06-16 Townsend Christopher P. Posture and body movement measuring system
WO2009112981A1 (en) * 2008-03-14 2009-09-17 Koninklijke Philips Electronics N.V. An activity monitoring system insensitive to accelerations induced by external motion factors

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BACHMANN E R ET AL: "Sourceless tracking of human posture using small inertial/magnetic sensors", COMPUTATIONAL INTELLIGENCE IN ROBOTICS AND AUTOMATION, 2003. PROCEEDINGS. 2003 IEEE INTERNATIONAL SYMPOSIUM ON JULY 16 - 20, 2003, PISCATAWAY, NJ, USA, IEEE, vol. 2, 16 July 2003 (2003-07-16), pages 822 - 829, XP010651873, ISBN: 978-0-7803-7866-7 *
RICHARD D WILLMANN ET AL: "Home Stroke Rehabilitation for the Upper Limbs", ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, 2007. EMBS 2007. 29TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE, IEEE, PISCATAWAY, NJ, USA, 1 August 2007 (2007-08-01), pages 4015 - 4018, XP031150371, ISBN: 978-1-4244-0787-3 *
VICTOR VAN ACHT ET AL: "Miniature Wireless Inertial Sensor for Measuring Human Motions", ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, 2007. EMBS 2007. 29TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE, IEEE, PISCATAWAY, NJ, USA, 1 August 2007 (2007-08-01), pages 6278 - 6281, XP031150947, ISBN: 978-1-4244-0787-3 *

Similar Documents

Publication Publication Date Title
EP3627514B1 (en) System and method for optimised monitoring of joints in physiotherapy
CA2934354C (en) Instrumented total body recumbent cross trainer system
KR101738678B1 (en) System for evaluating the ability of physical activity
KR102503910B1 (en) Method and apparatus of standing assistance
US20140350703A1 (en) Method and device for assessing muscular capacities of athletes using short tests
JP6772276B2 (en) Motion recognition device and motion recognition method
JP2011516915A (en) Motion content-based learning apparatus and method
US20170354843A1 (en) Method and system for measuring, monitoring, controlling and correcting a movement or a posture of a user
JP5034012B2 (en) Motor ability detection device
JP5742423B2 (en) Method for obtaining margin of lower limb muscle strength, and lower limb muscle strength evaluation apparatus used therefor
JP2008229266A (en) System for proposing exercise function improvement menu from walking ability and method for proposing exercise function improvement menu from walking ability
EP2220997A1 (en) Device, system and method for monitoring motion sequences
JP2010536449A (en) Accelerometer and method for controlling accelerometer
JP6516283B2 (en) Motion analysis device
Gauthier et al. Human movement quantification using Kinect for in-home physical exercise monitoring
KR102172585B1 (en) Mulit-exercise apparatus with smart mirror
JP6678492B2 (en) Dynamic balance evaluation device
CN110621264B (en) Method for detecting sensor data
KR101578609B1 (en) System for measuring muscular strength of leg
JP2020146103A (en) Mounting posture estimation method of inertial sensor
WO2021186709A1 (en) Exercise assistance apparatus, exercise assistance system, exercise assistance method, and exercise assistance program
US20060134583A1 (en) Simulation and training sphere for receiving persons
JP2013533999A (en) Method and apparatus for presenting options
WO2010089676A1 (en) Determining a sensor alignment
CN113303765B (en) System for detecting specific kind muscle problem of interested object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10703513

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10703513

Country of ref document: EP

Kind code of ref document: A1