WO2017005591A1 - Apparatus and method for motion tracking of at least a portion of a limb - Google Patents

Apparatus and method for motion tracking of at least a portion of a limb

Info

Publication number
WO2017005591A1
WO2017005591A1 (PCT/EP2016/065266)
Authority
WO
WIPO (PCT)
Prior art keywords
limb
orientation
space
motion
camera
Application number
PCT/EP2016/065266
Other languages
French (fr)
Inventor
Cheng Chen
Jim Wang
Original Assignee
Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Priority to CN201680039901.9A priority Critical patent/CN107847187B/en
Publication of WO2017005591A1 publication Critical patent/WO2017005591A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107 - Measuring physical dimensions, e.g. size of the entire body or parts thereof


Abstract

The invention proposes an apparatus for motion tracking of at least a portion of a limb, the limb comprising two jointed straight portions. The apparatus comprises a motion sensor (110c) to be attached to a portion of the limb for measuring the orientation of the portion of the limb; a camera (120) for capturing images of the portion of the limb in motion with a marker attached at a predefined position (110a, 110b, 110d) of the portion of the limb, a location of the marker in each image captured by the camera being a reference location; and a processor for estimating movement of the portion of the limb based on the reference locations, the orientation corresponding to each reference location, a predetermined length of the portion of the limb and the predefined position. Instead of tracking the locations of joints in the images directly, locations are derived from the location of one observable marker and the corresponding orientation information. In this way, the occlusion problem is solved and the movement of the limb can be estimated based on an accurate and instantaneous measurement of the locations of interest.

Description

APPARATUS AND METHOD FOR MOTION TRACKING OF AT
LEAST A PORTION OF A LIMB
FIELD OF THE INVENTION
The present invention generally relates to motion tracking of at least a portion of a limb, especially to the motion tracking of at least a portion of a limb in rehabilitation.
BACKGROUND OF THE INVENTION
Stroke is one of the leading causes of death in the world, and a vast majority of stroke survivors suffer from a number of dysfunctions, the most common being motor dysfunction. Long-term rehabilitation is needed to help patients improve their motor abilities. Usually, patients are hospitalized for about one month and then sent back home due to the limited number of therapists and rehabilitation centers. Therefore, unsupervised rehabilitation systems have been developed to help patients at home reach their maximum possible level of recovery.
WO2014108824A1 disclosed a vision-based motion tracking system for rehabilitation. The system has a camera to capture images of the limb in motion with markers attached to at least two joints of the upper limb, so that the range of motion of the upper limb can be evaluated based on the locations of the markers in the images. Orientation information for movement estimation is derived from the location change of the at least two markers in the captured images, based on the fact that the projected length of the limb in the images changes with the orientation of the limb. However, due to the 2D nature of the image, an occlusion of a marker will happen during the motion of the limb whenever one marker is covered by another marker or by a portion of the limb, which makes the location of the overlapped marker undetectable in the image. Hence the consistency of the motion tracking is interrupted. Though interpolation may help to get over the problem, it provides a low accuracy of instantaneous joint location estimation.
SUMMARY OF THE INVENTION
It would be desirable to have a more robust system to facilitate continuous motion tracking of at least a portion of a limb and provide a better accuracy of the motion tracking for further evaluations.
An apparatus for motion tracking of at least a portion of a limb is proposed, the limb comprising two jointed straight portions, the apparatus comprising:
a motion sensor 110c to be attached to the portion of the limb for measuring the orientation of the portion of the limb, the orientation being derived from the outputs of the motion sensor 110c;
a camera 120 for capturing images of the portion of the limb in motion with a marker attached to a predefined position 110a, 110b, 110d, a location of the marker in each image captured by the camera 120 being a reference location;
a processor for estimating the movement of the portion of the limb based on the reference locations, the orientation corresponding to each reference location, a predetermined length of the portion of the limb and the predefined position, by determining the spatial locations of both ends of the portion of the limb.
It is proposed in the present invention to introduce a motion sensor to provide orientation information of at least a portion of a limb (or the whole limb) in motion, the derivation of which is not feasible via images captured by the camera when occlusion happens. With the orientation information provided, the estimation of the spatial locations of the joints is facilitated with only one marker attached. The position where the marker is attached is predefined according to the motion track of the particular rehabilitation exercise, to ensure the marker is detectable during the exercise. Instead of tracking the locations of joints in the images directly, spatial locations are derived from the location of the observable marker and the corresponding orientation information. In this way, the occlusion problem is solved and the movement of the portion of the limb (or the whole limb) can be estimated based on an accurate and instantaneous measurement of the spatial locations of interest. When the whole limb is of interest, the limb stays straight without bending. The straight portion of a limb may be an upper arm, a forearm, a shin or a thigh.
In one embodiment, the marker is one of multiple distinguishable markers attached to respective predefined positions 110a, 110b, 110d of the portion of the limb to ensure that at least one marker is observable in each image captured by the camera 120 during the motion.
Due to the content of the rehabilitation exercise or the limitation of the patient's condition, it may happen that there is no single position at which an attached marker remains detectable throughout the whole procedure of the exercise. In that case multiple markers, which are distinguishable from each other in the image, are adopted to ensure that at any moment at least one marker is observable. Each marker is attached to a respective predefined position on the portion of the limb (or the whole limb). The spatial locations of the joints of interest can then be estimated based on the location of an identified marker in the image, its corresponding predefined position and the orientation information measured by the motion sensor.
In one embodiment, the apparatus further comprises a calibration unit to align the orientation space and the image space by determining the angular difference between the coordinate systems of the orientation space and the coordinate systems of the image space, wherein the orientation space corresponds to the space of the derived orientation.
The orientation information derived from the outputs of the motion sensor to depict the orientation space is expressed in a coordinate system, which may be different from the one used to depict the image space. The image space refers to the coordinate system used to depict the location of the marker in each image captured by the camera. A calibration unit is further provided to align the orientation space and the image space by determining the angular difference between the two coordinate systems, hence preparing the orientation measured by the motion sensor to be utilized in the image space to determine the spatial location of the joint of interest.
In one embodiment, the calibration unit comprises a compass to be attached to the camera.
The derived orientation information of the limb is presented in a world fixed coordinate frame, whose axes are the direction to the Geographic North Pole, the direction of gravitational force and the direction perpendicular to the aforementioned two directions. A compass attached to the camera helps to calibrate the camera so that it is fixed in a position where one of the two edges of the camera surface circumscribing the shutter is aligned with the direction to the Geographic North Pole. The other edge may then easily be aligned with the direction of gravitational force, for example, by natural sagging. The orientation space and the image space are then aligned such that the orientation information of the portion of the limb (or the whole limb) generated by the motion sensor gives the orientation angles of the portion of the limb (or the whole limb) in the image space.
In one embodiment, the calibration unit comprises a motion sensor to be attached to the camera.
The calibration unit may be embodied as a motion sensor to be attached to the camera. For example, the motion sensor is placed along the gravitational edge of the camera. The output of the motion sensor attached to the camera indicates the orientation difference between the image space and the orientation space. The camera can be further adjusted until the gravitational component of the output equals 0 and the other two components equal 90 degrees, which means the gravitational axis is aligned. The remaining axes can then be aligned in the same way, after which the two spaces are aligned. Alternatively, instead of adjusting the camera, the output of the motion sensor attached along each edge of the camera, indicating the angular difference between the two spaces, may be recorded for further compensation.
In one embodiment, the processor further compensates the estimation of the movement of the portion of the limb by the angular difference measured by the motion sensor 110c.
Based on the orientation information generated, the compensation information is collected to mitigate the difference between the orientation space and the image space during movement measurement.
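As a minimal sketch of this compensation, assuming (hypothetically) that the recorded angular difference between the two spaces is decomposed into z-y-x Euler offsets, a limb direction vector measured in the orientation space can be rotated into the image space before the joint locations are computed:

```python
import numpy as np

def compensate(u_sensor, yaw_deg, pitch_deg, roll_deg):
    """Rotate a limb direction vector from the orientation space into the
    image space using recorded angular offsets between the two spaces.
    The z-y-x Euler decomposition of the offsets is an assumption, not a
    detail given in the patent."""
    cz, sz = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    cy, sy = np.cos(np.radians(pitch_deg)), np.sin(np.radians(pitch_deg))
    cx, sx = np.cos(np.radians(roll_deg)), np.sin(np.radians(roll_deg))
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx @ u_sensor
```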
In one embodiment, the spatial locations are the locations in the image space and the movement is estimated in the image space.
Movement in the image space is measured, which may be sufficient for the purpose of recovery evaluation.
In one embodiment, the processor further estimates the movement in the actual space by mapping the spatial locations in the image space to the actual space based on the predetermined mapping rules.
Instead of the movement in the image space, the movement in the actual space of interest is further derived by mapping. The mapping rules are predetermined according to the original settings of the relative spatial relationship between the camera and the subject, for example, the distance.
In one embodiment, the movement is one of rotation angle, angular velocity, angular acceleration, linear velocity and motion trajectory.
In one embodiment, the motion sensor comprises two or more of an accelerometer, a magnetometer and a gyroscope.
A motion sensor comprises at least an accelerometer and a magnetometer. With an accelerometer, a 2D inclination can be determined. Combined with a magnetometer, the rotation around the vertical axis is further determined, from which a full 3D orientation in a world fixed coordinate frame is derived. Orientation reconstruction with accelerometers and magnetometers has the disadvantage that it only works well in situations where the accelerations caused by movements are small in comparison with the gravitational acceleration. To increase the accuracy, gyroscopes can be included in the orientation reconstruction by integrating the gyroscope data. In another embodiment, the combination of an accelerometer and a gyroscope can be implemented to measure the full 3D orientation.
Detailed description of orientation reconstruction with accelerometers, magnetometers and gyroscopes can be found in previous literature, such as Miniature Wireless Inertial Sensor for Measuring Human Motions by Victor Van Acht, Edwin Bongers, Niek Lambert and Rene Verberne, EMBC2007, Lyon, France.
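As a minimal sketch of the accelerometer-plus-magnetometer reconstruction described above, assuming the accelerometer output has been sign-adjusted to point along gravity and that motion accelerations are negligible, the axes of the world fixed frame (x toward the Geographic North Pole, y along gravity, z perpendicular to both) can be built from the two sensor readings:

```python
import numpy as np

def orientation_from_accel_mag(accel, mag):
    """Estimate a full 3D orientation from one accelerometer and one
    magnetometer reading (both in the sensor frame). Valid only while
    accelerations caused by movement are small compared with gravity."""
    down = np.asarray(accel, dtype=float)
    down /= np.linalg.norm(down)     # gravity direction (y axis)
    east = np.cross(mag, down)       # field component orthogonal to gravity
    east /= np.linalg.norm(east)     # perpendicular direction (z axis)
    north = np.cross(down, east)     # completes the right-handed frame
    # rows are the world axes expressed in the sensor frame, so
    # R @ v_sensor gives the coordinates of v in the world fixed frame
    return np.vstack([north, down, east])
```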
The invention further comprises a method comprising the steps of:
aligning S100 the orientation space and the image space by determining the angular difference between the coordinate systems of the orientation space and the coordinate systems of the image space, wherein the orientation space corresponds to the space of the derived orientation;
measuring S101 the orientation of the portion of the limb, the orientation being derived from the outputs of the motion sensor 110c to be attached to the portion of the limb;
capturing S102 images of the portion of the limb in motion with a marker attached to a predefined position 110a, 110b, 110d, a location of the marker in each image captured by the camera 120 being a reference location;
estimating S103 the movement of the portion of the limb based on the reference locations, the orientation corresponding to each reference location, a predetermined length of the portion of the limb and the predefined position, by determining the spatial locations of both ends of the portion of the limb.
Various aspects and features of the disclosure are described in further detail below. And other objects and advantages of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings.
DESCRIPTION OF THE DRAWINGS
The present invention will be described and explained hereinafter in more detail in combination with embodiments and with reference to the drawings, wherein:
Fig. 1 shows a schematic diagram of an apparatus for motion tracking of a limb in accordance with an embodiment of the present invention;
Fig. 2(a) - Fig. 2(d) illustrate gesture examples designed for upper limb rehabilitation recovery.
Fig. 3 shows a schematic diagram showing the measurement of the limb movement based on the motion sensor output and images captured in an embodiment of the invention.
Fig. 4(a) shows a schematic diagram showing the status of the upper limb of a patient with stroke at the beginning of shoulder flexion;
Fig. 4(b) shows a schematic diagram showing the status of the upper limb of a patient with stroke in the middle of shoulder flexion.
Fig. 5 shows a method of motion tracking of a limb.
The same reference signs in the drawings indicate similar or corresponding features and/or functionalities.
DETAILED DESCRIPTION
The present invention will be described with respect to particular embodiments and with reference to certain drawings, but the invention is not limited thereto but only by the claims. The drawings described are only schematic and are non-limiting. In the drawings, the size of some of the elements may be exaggerated and not drawn to scale for illustrative purposes.
Fig. 1 is a schematic diagram of an apparatus for motion tracking of an upper limb in accordance with an embodiment of the present invention, which includes a marker attached to a hand 110b of a subject, a motion sensor 110c attached to the upper limb, a camera 120 located in front of the subject and a processor (not illustrated) to estimate the movement of the upper limb. The upper limb comprises two straight portions, an upper arm and a forearm. The embodiment that estimates the movement of only an upper arm or a forearm will be discussed later. Though the camera is illustrated as positioned in front of the subject in Fig. 1, the camera can also be positioned at the side of the subject, which should not be considered as a limitation.
The motion sensor 110c is placed along the limb to measure the orientation of the limb in motion, for example, during a rehabilitation exercise, while the camera 120 simultaneously captures images of the limb with the marker attached. The marker is attached to the hand 110b, a position predefined before the exercise.
As for rehabilitation exercises, gestures of moving the forearm or upper arm are designed to evaluate the recovery of the upper limb after stroke. Each gesture may emphasize the rotation of a particular joint in different directions. Fig. 2(a) - Fig. 2(d) give some examples of the designed gestures, where the shoulder or the elbow serves as the joint of interest and keeps still during the gesture. Correspondingly, the degrees of freedom and range of movement of the upper arm or forearm are measured during the exercises.
Fig 2(a) illustrates a shoulder flexion exercise.
Start position: Parallel to midaxillary line of the trunk;
Moving position: Parallel to longitudinal axis of the humerus pointing toward the lateral epicondyle;
Movement: Shoulder Flexion;
Range: 30-180°.
Fig 2(b) illustrates a shoulder abduction exercise.
Start position: Parallel to midaxillary line of the body;
Moving position: Anterior aspect of the upper arm parallel to longitudinal axis of the humerus;
Movement: Shoulder elevation in scapular or frontal plane
Range: 30-180°.
Fig 2(c) illustrates a shoulder internal/external rotation exercise.
Start position: Parallel to the supporting surface or perpendicular to the floor;
Moving position: Parallel to the longitudinal axis of the ulna pointing toward the styloid process;
Movement: Internal and External Rotation;
Range: internal 0 ~ 80°, external 0 ~ 60°.
Fig. 2(d) illustrates an elbow flexion exercise.
Start Position: Parallel to the longitudinal axis of the humerus pointing towards the tip of the Acromion;
Moving position: Parallel to longitudinal axis of the radius pointing toward the styloid process of the radius;
Movement: Elbow flexion;
Range: 30-150°.
Fig. 3 is a schematic diagram showing the measurement of the limb movement in shoulder flexion based on the motion sensor 110c output and an image captured in an embodiment of the invention. The image space and the orientation space are assumed to be perfectly aligned, and the world fixed coordinate frame of the motion sensor 110c will be used to depict the image, where x indicates the direction to the Geographic North Pole, y indicates the direction of gravitational force and z indicates the direction perpendicular to the aforementioned two directions. An image 210 is captured by the camera 120 for a limb A0B0 at a certain moment during the shoulder flexion. In the image 210, the reference location of the hand B0'(x'_b0, y'_b0, 0) indicates the coordinate of the marker attached to the hand 110b observed in the image, which is the projection of the location B0(x_b0, y_b0, z_b0) onto the xy plane. With the orientation information Θ0(θ_x0, θ_y0, θ_z0) measured by the motion sensor (not illustrated in the figure; θ_x0 indicates the angle between the limb A0B0 and the x axis, θ_y0 the angle between the limb A0B0 and the y axis, and θ_z0 the angle between the limb A0B0 and the z axis), indicating the orientation of the limb A0B0 in the image space, and the predefined length L of the limb A0B0, the coordinate of the hand B0(x_b0, y_b0, z_b0) and the coordinate of the shoulder A0(x_a0, y_a0, 0) can be determined, where the shift of the shoulder in the z direction is assumed to be negligible. Hence the location of the shoulder A0 can be estimated even when the limb is perpendicular to the shutter of the camera, which would result in an overlap of the shoulder with the hand in the captured image if the conventional method were adopted. Based on the derived coordinates of the shoulder and the hand corresponding to each image captured by the camera 120, the movement of the limb can be estimated continuously and instantaneously. The predefined position to which the marker is attached could be any position on the limb, such as the other end of the limb (shoulder 110a), the middle point of the limb or a quarter point of the limb, under the criterion that the marker is observable in the images captured by the camera 120 throughout the motion.
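The geometry above reduces to direction cosines: the unit vector along A0B0 is (cos θ_x0, cos θ_y0, cos θ_z0), the projection onto the image preserves x and y, and the shoulder stays in the z = 0 plane. A minimal sketch under those assumptions (the function name is hypothetical, not from the patent):

```python
import numpy as np

def limb_endpoints(marker_xy, angles_deg, L):
    """Recover both 3D endpoints of the limb A0B0 from one observed marker.

    marker_xy  -- (x'_b0, y'_b0): hand-marker location in the image,
                  assumed already expressed in the aligned world frame
    angles_deg -- (theta_x0, theta_y0, theta_z0): angles between the limb
                  and the x, y, z axes, from the motion sensor
    L          -- predetermined limb length
    """
    u = np.cos(np.radians(np.asarray(angles_deg)))  # unit vector A0 -> B0
    # the projection keeps x and y; with the shoulder held in the z = 0
    # plane, the hand lies at z = L * cos(theta_z0)
    B0 = np.array([marker_xy[0], marker_xy[1], L * u[2]])
    A0 = B0 - L * u                                 # shoulder = hand - L*u
    return A0, B0
```

For instance, when the limb points straight at the camera (θ_x0 = θ_y0 = 90°, θ_z0 = 0°), u = (0, 0, 1) and A0 and B0 share the same image coordinates; the two points nevertheless remain distinguishable in 3D, which is exactly the overlap case the conventional two-marker method cannot resolve.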
However, one marker may not be detectable all the time in the rehabilitation exercises due to, for example, the constraints of the subject's condition. Fig. 4(a) - Fig. 4(b) illustrate one case which might happen. For a patient who had a stroke, the upper limb may not be able to be stretched out straight, the forearm and upper arm keeping an angle with each other. During the motion, for example, shoulder flexion, the upper arm is considered as the target (a portion of a limb) whose motion is to be estimated, and a motion sensor 110c is attached along the upper arm. But neither the shoulder 110a nor the elbow 110d is observable throughout the exercise: at the beginning of the exercise, as Fig. 4(a) illustrates, the elbow 110d is occluded by the hand 110b and forearm; in the middle of the exercise, as Fig. 4(b) illustrates, the shoulder 110a is occluded by the hand 110b and forearm. Therefore, two distinguishable markers are attached to the shoulder 110a and the elbow 110d to ensure that at least one marker is observable in the captured images. Thus, based on the one identified marker (either the one attached to the elbow 110d or the one attached to the shoulder 110a) in each image and the corresponding output of the motion sensor 110c, the coordinates of the shoulder 110a and the elbow 110d can still be determined. More markers can be applied accordingly if two markers are not sufficient. The distinguishing feature between the multiple markers can be a visual property, such as color and reflective intensity, or a geometric property, such as shape and pattern.
The alignment between the orientation space and the image space is realized by a calibration unit by determining the angular difference between the coordinate systems of the orientation space and those of the image space. The orientation space is the world fixed coordinate frame into which the orientation derived from the outputs of the motion sensor 110c fits. The image space relates to the positioning of the camera, whose x and y axes are parallel to the edges of the camera. The calibration unit is optional if the image space and the orientation space have already been aligned.
One embodiment of the calibration unit is a compass to be attached to the camera. A compass attached to one edge of the top surface of the camera helps to align one edge of the surface comprising the shutter with the direction to the Geographic North Pole. The other edge of the surface comprising the shutter can be aligned with the direction of gravitational force by natural sagging. For a delicate calibration, a gradienter can be adopted to align the other edge with the direction of gravitational force.
One embodiment of the calibration unit is a motion sensor to be attached along one edge of the surface comprising the shutter. In the calibration, the position of the camera is adjusted until one component of the output equals 0 and the other two components equal 90 degrees, which means one axis is aligned. The other edge of the surface can then be aligned in the same way. When the two axes are aligned, the image space and the orientation space are aligned.
For another embodiment, the outputs of the motion sensor attached along the two axes of the surface comprising the shutter are recorded for further compensation in the estimation of the movement of the limb, where the outputs indicate the angular difference between the image space and the orientation space.
The motion sensor can also be placed along the edge perpendicular to the surface comprising the shutter to calibrate the two spaces in the same way as mentioned above.
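A minimal sketch of the alignment check used in this calibration, treating each sensor output as the three angles between the instrumented camera edge and the world axes (the function name and the tolerance are assumptions):

```python
import numpy as np

def edge_aligned(angles_deg, tol=1.0):
    """True when the camera edge carrying the sensor is aligned with one
    world axis: its angle to that axis is ~0 degrees while its angles to
    the two remaining axes are ~90 degrees."""
    a = np.asarray(angles_deg, dtype=float)
    return any(
        abs(a[i]) < tol
        and all(abs(a[j] - 90.0) < tol for j in range(3) if j != i)
        for i in range(3)
    )
```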
The length of the portion of the limb is derived from the limb length based on a statistical ratio. For example, the length of an upper arm is calculated based on a statistical ratio between the length of the whole upper limb and the length of the upper arm. Once the length of the naturally sagging upper limb in the image is determined, the predetermined limb length L is determined accordingly based on the statistical ratio.
The estimation of the movement can first be completed in the image space and then converted to the absolute movement in the actual space. A mapping rule for this conversion, giving the ratio between lengths in the actual space and in the image space, is predetermined based on the distance between the camera and the subject.
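As a small illustration of the last two paragraphs, with the ratio and scale as hypothetical placeholders rather than figures from the patent:

```python
# predetermined length L from the whole-limb length via a statistical ratio
upper_limb_px = 420.0                 # upper-limb length measured in the image
UPPER_ARM_RATIO = 0.55                # hypothetical upper-arm/upper-limb ratio
L = upper_limb_px * UPPER_ARM_RATIO   # predetermined upper-arm length, image units

# image space -> actual space, with a scale fixed in advance from the
# camera-subject distance
mm_per_px = 1.8                       # hypothetical predetermined mapping ratio

def to_actual(point_img):
    """Map an image-space coordinate triple into actual-space millimetres."""
    return tuple(c * mm_per_px for c in point_img)
```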
The movement is one of rotation angle, angular velocity, angular acceleration, linear velocity and motion trajectory. The apparatus proposed by the invention can also be applied to measure the movement of a lower limb. The estimation of the movement of a thigh, a shin or a whole leg can be facilitated similarly to the embodiments mentioned above, where the thigh is the counterpart to the upper arm, the shin is the counterpart to the forearm, the hip is the counterpart to the shoulder, the knee is the counterpart to the elbow and the foot is the counterpart to the hand.
Fig. 5 shows a method of continuous motion tracking of a limb, the method 100 comprising the following steps:
Step S100, aligning the orientation space and the image space by determining the angular difference between the coordinate systems of the orientation space and the coordinate systems of the image space, wherein the orientation space corresponds to the space of the derived orientation;
Step S101, measuring the orientation of the portion of the limb, the orientation being derived from the outputs of the motion sensor 110c attached to the portion of the limb;
Step S102, capturing images of the portion of the limb in motion with a marker attached to a predefined position 110a, 110b, 110d of the portion of the limb, a location of the marker in each image captured by the camera 120 being a reference location;
Step S103, estimating the movement of the portion of the limb based on the reference locations, the orientation corresponding to each reference location, a predetermined length of the portion of the limb and the predefined position, by determining the spatial locations of both ends of the portion of the limb.
Step S100 is optional if the image space and the orientation space have already been aligned.
Further steps may be involved to facilitate the compensation of the angular difference between the image space and the orientation space:
measuring the angular difference between the orientation space and the image space;
compensating the estimation of the movement of the portion of the limb by the angular difference measured.
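Putting steps S101-S103 together per frame, a sketch that reuses limb_endpoints from the Fig. 3 example and reports two of the movement quantities listed above, rotation angle and angular velocity (it assumes at least two frames):

```python
import numpy as np

def track_movement(observations, L, dt):
    """observations -- per-frame (marker_xy, angles_deg) pairs, i.e. the
    reference location (S102) and the measured orientation (S101);
    dt -- time between frames in seconds."""
    joints, rotation = [], [0.0]
    prev_u = None
    for marker_xy, angles_deg in observations:
        A, B = limb_endpoints(marker_xy, angles_deg, L)   # S103
        joints.append((A, B))
        u = (B - A) / L                                   # limb direction
        if prev_u is not None:
            step = np.degrees(np.arccos(np.clip(np.dot(prev_u, u), -1.0, 1.0)))
            rotation.append(rotation[-1] + step)
        prev_u = u
    angular_velocity = np.gradient(rotation, dt)          # degrees per second
    return joints, rotation, angular_velocity
```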
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims

CLAIMS:
1. An apparatus for motion tracking of at least a portion of a limb, the limb comprising two jointed straight portions, comprising:
a motion sensor (110c) to be attached to the portion of the limb for measuring orientation of the portion of the limb, the orientation being derived from the outputs of the motion sensor (110c);
a camera (120) for capturing images of the portion of the limb in motion with a marker attached to a predefined position (110a, 110b, 110d), a location of the marker in each image captured by the camera (120) being a reference location;
a processor for estimating the movement of the portion of the limb based on reference locations, the orientation corresponding to each reference location, a predetermined length of the portion of the limb and the predefined position by determining the spatial locations of both ends of the portion of the limb in the image space;
wherein the limb is an arm or a leg.
2. An apparatus according to claim 1, wherein the marker is one of multiple distinguishable markers attached to respective predefined positions (110a, 110b, 110d) of the portion of the limb to ensure that at least one marker is observable in each image captured by the camera (120) during the motion.
3. An apparatus according to any one of claim 1 or claim 2, further comprising a calibration unit to align the orientation space and the image space by determining angular difference between coordinate systems of the orientation space and coordinate systems of the image space, wherein the orientation space corresponds to the space of orientation derived.
4. An apparatus according to claim 3, wherein the calibration unit comprises a compass to be attached to the camera.
5. An apparatus according to claim 3, wherein the calibration unit comprises a motion sensor to be attached to the camera.
6. An apparatus according to claim 5, wherein the processor further compensates the estimation of the movement of the portion of the limb by the angular difference measured by the motion sensor (110c).
7. An apparatus according to any one of claims 1 to 6, wherein the length of the portion of the limb is derived from the limb length based on a statistical ratio.
8. An apparatus according to any one of claims 1 to 7, wherein the spatial locations are the locations in the image space.
9. An apparatus according to claim 8, wherein the processor further estimates the movement in actual space by mapping the spatial locations in the image space to the actual space based on the predetermined mapping rules.
10. An apparatus according to any one of claims 1 to 9, wherein the movement is one of rotation angle, angular velocity, angular acceleration, linear velocity and motion trajectory.
11. An apparatus according to any one of claims 1 to 9, wherein the motion sensor comprises two or more of an accelerometer, a magnetometer and a gyroscope.
12. A method of motion tracking of at least a portion of a limb, the limb comprising two jointed straight portions, the method comprising the steps of:
measuring (S101) the orientation of the portion of the limb, the orientation being derived from outputs of the motion sensor (110c) to be attached to the portion of the limb;
capturing (S102) images of the portion of the limb in motion with a marker attached to a predefined position (110a, 110b, 110d), a location of the marker in each image captured by the camera (120) being a reference location;
estimating (S103) the movement of the portion of the limb based on reference locations, the orientation corresponding to each reference location, a predetermined length of the portion of the limb and the predefined position by determining the spatial locations of both ends of the portion of the limb in the image space;
wherein the limb is an arm or a leg.
13. A method according to claim 12, further comprising a step of
aligning (S100) the orientation space and the image space by determining angular difference between coordinate systems of the orientation space and coordinate systems of the image space, wherein the orientation space corresponds to the space of orientation derived.
14. A method according to claim 12, further comprising the steps of
measuring the angular difference between the orientation space and the image space;
compensating the estimation of the movement of the portion of the limb by the angular difference measured.
15. A computer program product comprising computer program code means for causing a computer to perform the steps of the method as claimed in claim 12 when said computer program code means is run on the computer.
PCT/EP2016/065266 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least a portion of a limb WO2017005591A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680039901.9A CN107847187B (en) 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least part of a limb

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2015083491 2015-07-07
CNPCT/CN2015/083491 2015-07-07
EP15189716 2015-10-14
EP15189716.2 2015-10-14

Publications (1)

Publication Number Publication Date
WO2017005591A1 (en) 2017-01-12

Family

ID=56321940

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/065266 WO2017005591A1 (en) 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least a portion of a limb

Country Status (2)

Country Link
CN (1) CN107847187B (en)
WO (1) WO2017005591A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113111678A (en) * 2019-12-25 2021-07-13 华为技术有限公司 Method, device, medium and system for determining position of limb node of user


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4148281B2 (en) * 2006-06-19 2008-09-10 ソニー株式会社 Motion capture device, motion capture method, and motion capture program
CN103099623B (en) * 2013-01-25 2014-11-05 中国科学院自动化研究所 Extraction method of kinesiology parameters

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6377401B1 (en) * 1999-07-28 2002-04-23 Bae Systems Electronics Limited Head tracker system
US20080319349A1 (en) * 2005-04-18 2008-12-25 Yitzhak Zilberman System and Related Method For Determining a Measurement Between Locations on a Body
US20060284979A1 (en) * 2005-06-09 2006-12-21 Sony Corporation Activity recognition apparatus, method and program
US20100060570A1 (en) * 2006-02-08 2010-03-11 Oblong Industries, Inc. Control System for Navigating a Principal Dimension of a Data Space
US20090259148A1 (en) * 2006-07-19 2009-10-15 Koninklijke Philips Electronics N.V. Health management device
US20100239121A1 (en) * 2007-07-18 2010-09-23 Metaio Gmbh Method and system for ascertaining the position and orientation of a camera relative to a real object
US20090209343A1 (en) * 2008-02-15 2009-08-20 Eric Foxlin Motion-tracking game controller
US20110202306A1 (en) * 2008-08-25 2011-08-18 Universitat Zurich Prorektorat Mnw Adjustable Virtual Reality System
US20100144414A1 (en) * 2008-12-04 2010-06-10 Home Box Office, Inc. System and method for gathering and analyzing objective motion data
US20110102568A1 (en) * 2009-10-30 2011-05-05 Medical Motion, Llc Systems and methods for comprehensive human movement analysis
US20140303524A1 (en) * 2011-11-08 2014-10-09 Nanyang Technological University Method and apparatus for calibrating a motion tracking system
US20130201291A1 (en) * 2012-02-08 2013-08-08 Microsoft Corporation Head pose tracking using a depth camera
WO2014108824A1 (en) 2013-01-11 2014-07-17 Koninklijke Philips N.V. A system and method for evaluating range of motion of a subject

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3621083A1 (en) * 2018-09-10 2020-03-11 Koninklijke Philips N.V. Rehabilitation device and method
CN109394232A (en) * 2018-12-11 2019-03-01 上海金矢机器人科技有限公司 A kind of locomitivity monitoring system and method based on wolf scale
CN109394232B (en) * 2018-12-11 2023-06-23 上海金矢机器人科技有限公司 Exercise capacity monitoring system and method based on wolf scale

Also Published As

Publication number Publication date
CN107847187B (en) 2021-08-17
CN107847187A (en) 2018-03-27


Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16734330; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 16734330; Country of ref document: EP; Kind code of ref document: A1)