CN107847187B - Apparatus and method for motion tracking of at least part of a limb - Google Patents


Info

Publication number
CN107847187B
Authority
CN
China
Prior art keywords
limb
orientation
space
motion
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201680039901.9A
Other languages
Chinese (zh)
Other versions
CN107847187A (en)
Inventor
陈城
王进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Publication of CN107847187A
Application granted
Publication of CN107847187B

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof


Abstract

The invention proposes an apparatus for motion tracking of at least a portion of a limb comprising two articulated straight portions. The apparatus comprises: a motion sensor (110c) attached to the portion of the limb to measure its orientation; a camera (120) for capturing images of the portion of the limb in motion together with a marker attached at a predefined location (110a, 110b, 110d), the position of the marker in each image captured by the camera serving as a reference position; and a processor for estimating movement of the portion of the limb based on the reference positions, the orientation corresponding to each reference position, a predetermined length of the portion of the limb, and the predefined location. Instead of tracking the position of the joint directly in the image, the position is derived from the position of one marker that can be observed and from the orientation information. In this way, the occlusion problem is solved and the movement of the limb can be estimated based on an accurate and instantaneous measurement of the location of interest.

Description

Apparatus and method for motion tracking of at least part of a limb
Technical Field
The present invention relates generally to motion tracking of at least a portion of a limb, and more particularly to motion tracking of at least a portion of a limb in rehabilitation.
Background
Stroke is one of the leading causes of death worldwide, and most stroke survivors suffer from a number of dysfunctions, the most common being motor dysfunction. Long-term rehabilitation is needed to help patients regain motor ability. Typically, patients are hospitalized for about one month and are then discharged home due to the limited number of therapists and rehabilitation centers. Unsupervised rehabilitation systems have therefore been developed to help patients continue rehabilitation at home to the maximum possible level.
WO2014108824A1 discloses a vision-based motion tracking system for rehabilitation. The system has a camera that captures images of a limb in motion, with markers attached to at least two points of the upper limb, so that the range of motion of the upper limb can be assessed based on the positions of the markers in the images. Orientation information for motion estimation is derived from changes in the image positions of the at least two markers, based on the fact that the projected length of the limb in the image changes with the orientation of the limb. However, due to the 2D nature of the image, occlusion of the markers occurs during movement of the limb whenever one marker is covered by another marker or by part of the limb, making the positions of the overlapping markers undetectable in the image. The continuity of motion tracking is thus interrupted. Although interpolation may help to overcome this problem, it provides only a low-accuracy estimate of the instantaneous joint position.
Disclosure of Invention
It would be desirable to have a more robust system to facilitate continuous motion tracking of at least a portion of a limb and to provide motion tracking of greater accuracy for further evaluation.
An apparatus for motion tracking of at least a portion of a limb, the limb comprising two articulated straight portions, the apparatus comprising:
a motion sensor 110c attached to the portion of the limb for measuring an orientation of the portion of the limb, the orientation being derived from an output of the motion sensor 110c;
a camera 120 for capturing images of the part of the limb in motion and markers attached to predefined positions 110a, 110b, 110d, the position of the markers in each image captured by the camera 120 being a reference position;
a processor for estimating movement of the portion of the limb by determining spatial positions of both ends of the portion of the limb based on reference positions, the orientation corresponding to each reference position, the predetermined length of the portion of the limb, and the predefined position.
It is proposed in the present invention to introduce a motion sensor to provide orientation information of at least part of (or the entire) limb in motion, information which cannot be derived from the images captured by the camera when occlusion occurs. With the provided orientation information, the spatial position of the joint can be estimated even with only one attached marker. The position where the marker is attached is predefined according to the motion tracking for a particular rehabilitation exercise, to ensure that the marker is detectable during the exercise. Instead of tracking the position of the joint directly in the image, the spatial position is derived from the position of a marker that can be observed and the corresponding orientation information. In this way, the occlusion problem is solved and the movement of a part of the limb (or the whole limb) can be estimated based on an accurate and instantaneous measurement of the spatial position of interest. When the entire limb is of interest, the limb remains straight without bending. The straight portion of the limb may be the upper arm, the forearm, the lower leg or the thigh.
In one embodiment, the marker is one of a plurality of distinguishable markers attached to respective predefined locations 110a, 110b, 110d of the portion of the limb to ensure that at least one marker is observable in each image captured by the camera 120 during motion.
Due to limitations of the rehabilitation exercise or of the patient's condition, there may be no single position at which an attached marker is detectable throughout the entire exercise. In that case, multiple markers that are distinguishable from each other in the image are employed to ensure that at least one marker is observable at any time instant. Each marker is attached to a respective predefined location on the part of the limb (or the whole limb). The spatial position of the joint of interest can then be estimated based on the position of the identified marker in the image, the respective predefined location of that marker, and the orientation information measured by the motion sensor.
In an embodiment, the apparatus further comprises a calibration unit for aligning the orientation space and the image space by determining an angular difference between a coordinate system of the orientation space and a coordinate system of the image space, wherein the orientation space corresponds to the derived space of orientations.
The orientation information derived from the output of the motion sensor to depict the orientation space is expressed by a coordinate system that may be different from the coordinate system depicting the image space. The image space refers to a coordinate system used to depict the position of the markers in each image captured by the camera. A calibration unit is also provided to align the orientation space and the image space by determining the angular difference between the two coordinate systems, thereby preparing the orientation measured by the motion sensor to be used in the image space for determining the spatial position of the joint of interest.
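As an illustrative sketch (not part of the patent text), the angular difference determined by the calibration unit can be applied as a rotation when sensor-derived orientations are used in image space. A single yaw offset about the gravity axis is assumed here for simplicity, and the function name is hypothetical:

```python
import math

def align_orientation_to_image(v, yaw_deg):
    """Rotate a direction vector from the world-fixed orientation frame
    into the image frame, given the calibrated angular difference about
    the vertical (gravity) axis.  A single-axis offset is an assumption
    made for illustration; the patent records one difference per edge."""
    a = math.radians(yaw_deg)
    x, y, z = v
    # Rotation about the y (gravity) axis; the gravity component is unchanged.
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))
```

For example, with the camera rotated 90 degrees about the vertical axis, a vector pointing towards geographic north in the orientation space maps onto the depth axis of the image space, while any gravity-aligned component is left untouched.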
In one embodiment, the calibration unit comprises a compass attached to the camera.
The derived orientation information of the limb is expressed in a world-fixed coordinate system whose axes point towards geographic north, along the direction of gravity, and along the direction perpendicular to these two axes. A compass attached to the camera helps fix the camera in a position such that one of the two edges of the surface containing the shutter is aligned with the direction towards geographic north. The other edge can then be aligned with the direction of gravity, for example by letting it hang naturally, like a plumb line. The orientation space and the image space are then aligned, such that the orientation information of the part of the limb (or the whole limb) generated by the motion sensor directly gives the orientation angles of the part of the limb (or the whole limb) in image space.
In one embodiment, the calibration unit comprises a motion sensor attached to the camera.
The calibration unit may be implemented as a motion sensor attached to the camera, for example placed along the gravity-aligned edge of the camera. The output of the motion sensor attached to the camera indicates the orientation difference between the image space and the orientation space. The camera can be adjusted until the gravity component of the output equals 0 and the other two components equal 90 degrees, which means that the gravity axes are aligned. The remaining axes can then be aligned in the same manner, after which the two spaces are aligned. Alternatively, instead of adjusting the camera, the output of a motion sensor attached along each edge of the camera, indicating the angular difference between the two spaces, may be recorded for later compensation.
In one embodiment, the processor also compensates the estimate of the movement of the portion of the limb with the angular difference measured by the motion sensor 110c.
Based on the recorded angular difference, compensation information is collected to remove the offset between the orientation space and the image space from the movement measurements.
In one embodiment, the spatial location is a location in image space and the movement is estimated in the image space.
Movement is measured in image space, which may be sufficient for the purpose of rehabilitation evaluation.
In one embodiment, the processor further estimates the movement in real space by mapping the spatial positions in image space to real space based on a predetermined mapping rule.
Instead of the movement in image space, the movement in the real space of interest is further derived by mapping. The mapping rule is predetermined according to the original setup of the relative spatial relationship (e.g., the distance) between the camera and the subject.
In one embodiment, the movement is one of: rotation angle, angular velocity, angular acceleration, linear velocity, and motion trajectory.
In one embodiment, the motion sensor comprises two or more of: an accelerometer, a magnetometer, and a gyroscope.
The motion sensor comprises at least an accelerometer and a magnetometer. With the accelerometer, the 2D tilt can be determined. In combination with the magnetometer, the rotation around the vertical axis is further determined, thereby deriving a full 3D orientation in the world-fixed coordinate system. The disadvantage of orientation reconstruction with accelerometers and magnetometers is that it works well only when the accelerations caused by movement are small compared to the acceleration of gravity. To improve accuracy, a gyroscope can be incorporated into the orientation reconstruction by integrating the gyroscope data. In another embodiment, a combination of accelerometers and gyroscopes can be implemented to measure the full 3D orientation. A detailed description of orientation reconstruction using accelerometers, magnetometers and gyroscopes can be found in the literature, for example in "Miniature Wireless Inertial Sensor for Measuring Human Motions" by Victor Van Acht, Edwin Bogers, Niek Lambert and Rene Verberge, EMBC 2007, Lyon, France.
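The accelerometer/magnetometer reconstruction described above can be sketched as follows. This is an illustration only, not the algorithm of the cited paper, and it is valid only for quasi-static motion, as noted in the text:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Pitch and roll (radians) from a static accelerometer reading.
    Valid only while motion acceleration is small compared to gravity."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def heading_from_mag(mx, my, mz, pitch, roll):
    """Tilt-compensated heading (rotation about the vertical axis)
    from a magnetometer reading and the tilt found above."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh)

# Sensor lying flat (gravity on z), magnetic field along x:
p, r = tilt_from_accel(0.0, 0.0, 9.81)
h = heading_from_mag(1.0, 0.0, 0.0, p, r)
```

Together the two functions recover the full 3D orientation in the world-fixed frame; a gyroscope would be fused on top of this to handle dynamic motion.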
The invention also proposes a method comprising the following steps:
aligning S100 an orientation space with an image space by determining an angular difference between a coordinate system of the orientation space and a coordinate system of the image space, wherein the orientation space corresponds to the space of the derived orientations;
measuring S101 an orientation of a portion of a limb, the orientation being derived from an output of a motion sensor 110c attached to the portion of the limb;
capturing S102 images of the portion of the limb in motion together with markers attached to predefined locations 110a, 110b, 110d, the positions of the markers in each image captured by the camera 120 being reference positions;
estimating S103 a movement of the portion of the limb by determining spatial positions of both ends of the portion of the limb based on the reference positions, the orientation corresponding to each reference position, a predetermined length of the portion of the limb and the predefined locations.
Various aspects and features of the disclosure are described in further detail below. Other objects and advantages of the present invention will become more apparent from the following description, which is to be read in connection with the accompanying drawings.
Drawings
The invention will be described and explained in more detail hereinafter with reference to embodiments and with reference to the accompanying drawings, in which:
FIG. 1 shows a schematic view of an apparatus for motion tracking of a limb according to an embodiment of the invention;
fig. 2(a)-2(d) illustrate examples of postures designed for upper limb rehabilitation;
fig. 3 shows a schematic diagram of a measurement of limb movement based on motion sensor output and captured images in an embodiment of the invention;
fig. 4(a) shows a schematic diagram illustrating the state of the upper limb of a stroke patient at the beginning of shoulder flexion;
fig. 4(b) shows a schematic diagram illustrating the state of the upper limb of a stroke patient in the middle of shoulder flexion;
fig. 5 illustrates a method of motion tracking of a limb.
Similar or corresponding features and/or functions are denoted by the same reference numerals in the figures.
Detailed Description
The present invention will be described with respect to particular embodiments and with reference to certain drawings, but the invention is not limited thereto, being limited only by the claims. The drawings described are only schematic and are non-limiting. In the drawings, the size of some elements may be exaggerated and not drawn to scale, for illustrative purposes.
Fig. 1 is a schematic diagram of an apparatus for motion tracking of an upper limb according to an embodiment of the present invention, which includes a marker attached to a hand 110b of a subject, a motion sensor 110c attached to the upper limb, a camera 120 located in front of the subject, and a processor (not shown) that estimates the movement of the upper limb. The upper limb comprises two straight portions: the upper arm and the forearm. Embodiments will be discussed later regarding estimating movement of only the upper arm or forearm. Although the camera is illustrated in fig. 1 as being positioned in front of the subject, the camera can also be positioned beside the subject, and should not be considered limiting in this regard.
The motion sensor 110c is placed along the limb to measure the orientation of the limb in motion, e.g. during rehabilitation, while the camera 120 simultaneously captures images of the limb in motion together with the attached marker. The marker is attached to the hand 110b, a position predefined before the exercise.
For rehabilitation exercises, postures moving the forearm or upper arm are designed to assess the recovery of the upper limb after a stroke. Each posture may emphasize rotation of a particular joint in a different direction. Fig. 2(a)-2(d) give some examples of planned postures, where the shoulder or elbow acts as the joint of interest and remains stationary in the posture. Accordingly, the degrees of freedom and range of movement of the upper arm or forearm are measured during the exercise.
Fig. 2(a) illustrates a shoulder flexion exercise.
Starting position: parallel to the trunk axillary midline;
Moving position: parallel to the longitudinal axis of the humerus pointing towards the lateral epicondyle;
Movement: shoulder flexion;
Range: 30 to 180 degrees.
Fig. 2(b) illustrates shoulder abduction exercise.
Starting position: parallel to the trunk axillary midline;
Moving position: the anterior side of the upper arm parallel to the longitudinal axis of the humerus;
Movement: shoulder elevation in the scapular or frontal plane;
Range: 30 to 180 degrees.
Fig. 2(c) illustrates the shoulder internal/external rotation exercise.
Starting position: parallel to the support surface or perpendicular to the floor;
Moving position: parallel to the longitudinal axis of the ulna pointing towards the styloid process;
Movement: internal rotation and external rotation;
Range: 0 to 80 degrees internal, 0 to 60 degrees external.
Fig. 2(d) illustrates an elbow flexion exercise.
Starting position: parallel to the longitudinal axis of the humerus pointing towards the apical end of the acromion;
Moving position: parallel to the longitudinal axis of the radius pointing towards the radial styloid process;
Movement: elbow flexion;
Range: 30 to 150 degrees.
Figure 3 is a schematic diagram illustrating measurement of limb movement in shoulder flexion based on the output of the motion sensor 110c and the captured image in an embodiment of the present invention. It is assumed that the image space and the orientation space are perfectly aligned and that the image is depicted using the world-fixed coordinate system of the motion sensor 110c, where x indicates the direction towards geographic north, y indicates the direction of gravity, and z indicates the direction perpendicular to the two aforementioned directions. Image 210 of the limb A0B0 is captured by the camera 120 at a particular moment during shoulder flexion. The reference position B'0(x'b0, y'b0, 0) of the hand in image 210 indicates the coordinates of the marker attached to the hand 110b as observed in the image, which is the projection of the position B0(xb0, yb0, zb0) onto the xy plane. Using the orientation information θ0(θx0, θy0, θz0) measured by the motion sensor, which indicates the orientation of the limb A0B0 in image space (not shown in the figure, where θx0 denotes the angle of the limb A0B0 with the x-axis, θy0 the angle with the y-axis, and θz0 the angle with the z-axis), together with the predetermined length L of the limb A0B0, the coordinates B0(xb0, yb0, zb0) of the hand and the coordinates A0(xa0, ya0, 0) of the shoulder can be determined, wherein the displacement of the shoulder in the z-direction is assumed to be negligible. Thus, even when the limb is perpendicular to the shutter of the camera, so that the shoulder and hand overlap in the captured image and conventional methods fail, the position A0 of the shoulder can be estimated. Based on the derived coordinates of the shoulder and hand corresponding to each image captured by the camera 120, the movement of the limb can be estimated continuously and instantaneously.
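The geometry of Fig. 3 can be sketched in a few lines (an illustration only; the function and variable names are not from the patent). Given the observed marker position, the angles of the limb with the axes from the motion sensor, and the predetermined length L, both end points follow directly, with the shoulder assumed to stay in the z = 0 plane:

```python
import math

def joint_positions(marker_xy, theta, length):
    """Estimate both ends of a straight limb segment A0 -> B0.
    marker_xy: observed image-plane position (x'b0, y'b0) of the hand marker.
    theta:     angles (theta_x, theta_y, theta_z) of the segment with the axes.
    length:    predetermined segment length L.
    Returns (A0, B0), with the shoulder A0 assumed to lie at z = 0."""
    cx, cy, cz = (math.cos(t) for t in theta)  # direction cosines of A0 -> B0
    xb, yb = marker_xy
    b0 = (xb, yb, length * cz)                  # hand: depth recovered from theta_z
    a0 = (xb - length * cx, yb - length * cy, 0.0)
    return a0, b0
```

For a limb of length 2 lying along the x-axis (theta = (0, 90°, 90°)) with the hand marker observed at (3, 1), this places the shoulder at (1, 1, 0), which remains computable even when the projection would degenerate in a purely image-based method.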
The predefined location to which the marker is attached may be any location on the limb, such as the other end of the limb (the shoulder 110a), the midpoint of the limb, or a quarter-point of the limb, chosen on the criterion that the marker remains observable in the images captured by the camera 120 throughout the motion.
However, one marker may not be consistently detectable during a rehabilitation exercise, for example due to constraints imposed by the subject's condition. Figs. 4(a)-4(b) illustrate one situation that may occur. For a stroke patient, the upper limb may not straighten fully, with the forearm and upper arm held at an angle to each other. During motion, e.g. during shoulder flexion, the upper arm is the target (part of the limb) whose movement is to be estimated, and the motion sensor 110c is attached along the upper arm. However, neither the shoulder 110a nor the elbow 110d is observable throughout the entire exercise. At the beginning of the exercise, the elbow 110d is occluded by the hand 110b and forearm, as illustrated in fig. 4(a). In the middle of the exercise, as illustrated in fig. 4(b), the shoulder 110a is occluded by the hand 110b and forearm. Thus, two distinguishable markers are attached, one to the shoulder 110a and one to the elbow 110d, to ensure that at least one marker is observable in each captured image. Based on the marker identified in each image (either the one attached to the elbow 110d or the one attached to the shoulder 110a) and on the output of the motion sensor 110c, the coordinates of the shoulder 110a and the elbow 110d can still be determined. If two markers are not sufficient, more markers can be applied accordingly. The distinguishing features between the markers can be visual properties such as color and reflection intensity, as well as geometric properties such as shape and pattern.
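Selecting which of the distinguishable markers to use in a given frame is straightforward; a minimal sketch follows (the marker ids and the detection format are hypothetical, not from the patent):

```python
def pick_marker(detections):
    """Return the id and image position of the first marker detected in a
    frame.  `detections` maps marker id -> (x, y) position, or None when
    that marker is occluded in this frame."""
    for marker_id, pos in detections.items():
        if pos is not None:
            return marker_id, pos
    raise ValueError("no marker observable in this frame")
```

At the start of shoulder flexion (fig. 4(a)) the elbow entry would be None and the shoulder marker is used; mid-exercise (fig. 4(b)) the roles swap.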
The alignment between the orientation space and the image space is achieved by a calibration unit by determining an angular difference between the coordinate system of the orientation space and the coordinate system of the image space. The orientation space is a world fixed coordinate system to which the orientation derived from the output of the motion sensor 110c is fitted. The image space relates to the positioning of the camera, wherein the x-axis and the y-axis are parallel to the edges of the camera. The calibration unit is optional if the image space and the orientation space have been aligned.
One embodiment of the calibration unit is a compass attached to the camera. A compass attached to one edge of the upper surface of the camera helps align one edge of the surface containing the shutter with the direction towards geographic north. The other edge of the surface containing the shutter can be aligned with the direction of gravity by letting it hang naturally, like a plumb line. For a fine calibration, an inclinometer can be employed to align the other edge with the direction of gravity.
One embodiment of the calibration unit is a motion sensor attached along one edge of the surface containing the shutter. During calibration, the position of the camera is adjusted until one component of the output equals 0 and the other two components equal 90 degrees, which means that this axis is aligned. The other edge of the surface can be aligned in the same way. When both axes are aligned, the image space and the orientation space are aligned.
In another embodiment, the outputs of motion sensors attached along two edges of the surface containing the shutter are recorded for later compensation of the estimate of the movement of the limb, the outputs indicating the angular difference between the image space and the orientation space.
The motion sensor can also be placed along an edge perpendicular to the surface containing the shutter, and both spaces are then calibrated in the same way as mentioned above.
The predetermined length of the portion of the limb is derived from the limb length based on a statistical ratio. For example, the length of the upper arm is calculated based on a statistical ratio between the length of the entire upper limb and the length of the upper arm. Once the length of the naturally hanging upper limb in the image is determined, the predetermined limb length L is obtained from the statistical ratio accordingly.
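As a numeric illustration of the statistical-ratio step (the 0.36 segment-to-limb ratio below is a placeholder for the example, not a value given in the patent):

```python
def segment_length(limb_length_px, ratio=0.36):
    """Predetermined segment length L from the measured whole-limb length
    in the image and a statistical segment-to-limb ratio.  The default
    ratio is a placeholder chosen for illustration only."""
    return limb_length_px * ratio
```

So an upper limb measuring 100 pixels when hanging naturally would yield a predetermined upper-arm length of 36 pixels under this placeholder ratio.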
The estimation of the movement can be done first in image space and then further translated into absolute movement in real space. The mapping rule for this translation is the predetermined ratio of a length in real space to the corresponding length in image space, based on the distance between the camera and the subject.
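The fixed-ratio mapping rule can be sketched as follows (an illustration only; it is valid only while the camera-subject distance from the original setup is unchanged, as the text assumes):

```python
def image_to_real(length_img, real_ref, img_ref):
    """Translate an image-space length into real space using a ratio
    predetermined at setup: a known real-world reference length
    `real_ref` (e.g. in meters) that measured `img_ref` pixels in the
    calibration image."""
    return length_img * (real_ref / img_ref)
```

For example, if a 0.3 m reference spans 100 pixels at setup, a segment measuring 50 pixels in a later frame corresponds to 0.15 m in real space.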
The movement is one of: rotation angle, angular velocity, angular acceleration, linear velocity, and motion trajectory.
The device proposed by the invention can also be applied to measure movements of the lower limbs. The estimation of the movement of the thigh, the lower leg or the entire leg can be facilitated in a way similar to that described above, wherein the thigh corresponds to the upper arm, the lower leg to the forearm, the hip to the shoulder, the knee to the elbow and the foot to the hand.
Fig. 5 illustrates a method of continuous motion tracking of a limb, the method 100 comprising the steps of:
step S100 of aligning the orientation space and the image space by determining an angular difference between a coordinate system of the orientation space and a coordinate system of the image space, wherein the orientation space corresponds to the space of the derived orientation.
A step S101 of deriving an orientation of the portion of the limb from an output of a motion sensor 110c attached to the portion of the limb;
step S102 of capturing images of the part of the limb in motion together with markers attached to predefined positions 110a, 110b, 110d of the part of the limb, the positions of the markers in each image captured by the camera 120 being a reference position;
step S103, estimating a movement of the portion of the limb by determining spatial positions of both ends of the portion of the limb based on reference positions, orientations corresponding to each reference position, a predetermined length of the portion of the limb and the predefined position.
Step S100 is optional if the image space and the orientation space are already aligned.
Further steps may be involved to compensate for the angular difference between the image space and the orientation space, said steps being:
measuring an angular difference between the orientation space and the image space;
compensating the estimate of movement of the portion of the limb with the measured angular difference.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the singular articles "a" or "an" do not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of certain measures cannot be used to advantage. The computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the internet or other wired or wireless telecommunication systems. Reference signs in the claims shall not be construed as limiting the scope.

Claims (15)

1. An apparatus for motion tracking of at least a portion of a limb, the limb comprising two articulated straight portions, the apparatus comprising:
a motion sensor (110c) attached to the portion of the limb to measure an orientation of the portion of the limb, the orientation being derived from an output of the motion sensor (110c);
a camera (120) for capturing images of the part of the limb in motion and markers attached to predefined locations (110a, 110b, 110d) of the part of the limb, the predefined locations with the markers attached being predefined according to the motion tracking to ensure that the markers are observable in each image captured by the camera (120) during the motion, the locations of the markers in each image captured by the camera (120) being a reference location;
a processor for estimating movement of the portion of the limb by determining spatial positions of two ends of the portion of the limb in image space based on reference positions, an orientation corresponding to each reference position, a predetermined length of the portion of the limb and the predefined position;
wherein the limb is an arm or a leg.
2. The apparatus of claim 1, wherein the marker is one of a plurality of distinguishable markers attached to respective predefined locations (110a, 110b, 110d) of the portion of the limb to ensure that at least one marker is observable in each image captured by the camera (120) during the motion.
3. The apparatus of claim 1 or 2, further comprising a calibration unit to align an orientation space with the image space by determining an angular difference between a coordinate system of the orientation space and a coordinate system of the image space, wherein the orientation space corresponds to the derived orientation space.
4. The apparatus of claim 3, wherein the calibration unit comprises a compass to be attached to the camera.
5. The apparatus of claim 3, wherein the calibration unit comprises a motion sensor to be attached to the camera.
6. The apparatus of claim 5, wherein the processor further compensates the estimate of the movement of the portion of the limb with the angular difference measured by the motion sensor (110c).
7. The apparatus of claim 1 or 2, wherein the predetermined length of the portion of the limb is derived based on a statistical ratio from the length of the limb.
8. The apparatus of claim 1 or 2, wherein the spatial position is a position in the image space.
9. The apparatus of claim 8, wherein the processor further estimates movement in real space by mapping the spatial location in the image space to real space based on a predetermined mapping rule.
10. The apparatus of claim 9, wherein the movement in the real space is one of: rotation angle, angular velocity, angular acceleration, linear velocity, and motion trajectory.
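Claims 8 to 10 can be sketched together. The sketch assumes the predetermined mapping rule of claim 9 is a uniform scale factor (real-space units per pixel), which is only one possible rule, and derives one of the claim-10 quantities (angular velocity) from orientations in successive frames:

```python
import math

MM_PER_PIXEL = 2.0  # assumed predetermined mapping rule (uniform scale)

def to_real_space(pos_px):
    """Map an image-space position (pixels) to real space (mm)."""
    return (pos_px[0] * MM_PER_PIXEL, pos_px[1] * MM_PER_PIXEL)

def angular_velocity(angle_prev_rad, angle_curr_rad, dt_s):
    """Rotation rate between two frames, in rad/s, with wrap-around handling."""
    dtheta = math.atan2(math.sin(angle_curr_rad - angle_prev_rad),
                        math.cos(angle_curr_rad - angle_prev_rad))
    return dtheta / dt_s
```

Angular acceleration, linear velocity, and the motion trajectory follow the same pattern: successive real-space positions or angles differenced over the frame interval.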
11. The apparatus of claim 1 or 2, wherein the motion sensor comprises two or more of: accelerometers, magnetometers, and gyroscopes.
12. A method of motion tracking of at least a portion of a limb, the limb comprising two articulated straight portions, the method comprising the steps of:
measuring (S101) an orientation of the part of the limb, the orientation being derived from an output of a motion sensor (110c) attached to the part of the limb;
capturing (S102) images of the part of the limb in motion and markers attached to predefined positions (110a, 110b, 110d) of the part of the limb, the predefined positions to which the markers are attached being predefined according to the motion tracking to ensure that the markers are observable in each image captured by a camera (120) during the motion, the positions of the markers in each image captured by the camera (120) being used as reference positions;
estimating (S103) a movement of the part of the limb by determining spatial positions of both ends of the part of the limb in image space based on the reference positions, an orientation corresponding to each reference position, a predetermined length of the part of the limb and the predefined positions;
wherein the limb is an arm or a leg.
13. The method of claim 12, further comprising the steps of:
aligning (S100) an orientation space with the image space by determining an angular difference between a coordinate system of the orientation space and a coordinate system of the image space, wherein the orientation space corresponds to a space of the derived orientations.
14. The method of claim 13, further comprising the steps of:
measuring the angular difference between the orientation space and the image space;
compensating the estimate of the movement of the part of the limb with the measured angular difference.
15. A computer readable medium having stored thereon computer program instructions which, when run on a computer, cause the computer to perform the steps of the method according to claim 12.
CN201680039901.9A 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least part of a limb Expired - Fee Related CN107847187B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CNPCT/CN2015/083491 2015-07-07
CN2015083491 2015-07-07
EP15189716.2 2015-10-14
EP15189716 2015-10-14
PCT/EP2016/065266 WO2017005591A1 (en) 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least a portion of a limb

Publications (2)

Publication Number Publication Date
CN107847187A CN107847187A (en) 2018-03-27
CN107847187B true CN107847187B (en) 2021-08-17

Family

ID=56321940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201680039901.9A Expired - Fee Related CN107847187B (en) 2015-07-07 2016-06-30 Apparatus and method for motion tracking of at least part of a limb

Country Status (2)

Country Link
CN (1) CN107847187B (en)
WO (1) WO2017005591A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3621083A1 (en) * 2018-09-10 2020-03-11 Koninklijke Philips N.V. Rehabilitation device and method
CN109394232B (en) * 2018-12-11 2023-06-23 上海金矢机器人科技有限公司 Exercise capacity monitoring system and method based on wolf scale
CN113111678B (en) * 2019-12-25 2024-05-24 华为技术有限公司 Method, device, medium and system for determining position of limb node of user

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6377401B1 (en) * 1999-07-28 2002-04-23 Bae Systems Electronics Limited Head tracker system
CN1877340A (en) * 2005-06-09 2006-12-13 索尼株式会社 Activity recognition apparatus, method and program
US20100144414A1 (en) * 2008-12-04 2010-06-10 Home Box Office, Inc. System and method for gathering and analyzing objective motion data
CN102176888A (en) * 2008-08-25 2011-09-07 苏黎世大学数学和自然科学部 Adjustable virtual reality system
CN102781320A (en) * 2009-10-30 2012-11-14 医学运动有限公司 Systems and methods for comprehensive human movement analysis
CN103099623A (en) * 2013-01-25 2013-05-15 中国科学院自动化研究所 Extraction method of kinesiology parameters
CN104106262A (en) * 2012-02-08 2014-10-15 微软公司 Head pose tracking using a depth camera

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2006236428A1 (en) * 2005-04-18 2006-10-26 Bioness Inc. System and related method for determining a measurement between locations on a body
US8531396B2 (en) * 2006-02-08 2013-09-10 Oblong Industries, Inc. Control system for navigating a principal dimension of a data space
JP4148281B2 (en) * 2006-06-19 2008-09-10 ソニー株式会社 Motion capture device, motion capture method, and motion capture program
JP2009543649A (en) * 2006-07-19 2009-12-10 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Health management device
DE102007033486B4 (en) * 2007-07-18 2010-06-17 Metaio Gmbh Method and system for mixing a virtual data model with an image generated by a camera or a presentation device
US8696458B2 (en) * 2008-02-15 2014-04-15 Thales Visionix, Inc. Motion tracking system and method using camera and non-camera sensors
US9119569B2 (en) * 2011-11-08 2015-09-01 Nanyang Technological University Method and apparatus for calibrating a motion tracking system
KR20150107783A (en) 2013-01-11 2015-09-23 코닌클리케 필립스 엔.브이. A system and method for evaluating range of motion of a subject


Also Published As

Publication number Publication date
CN107847187A (en) 2018-03-27
WO2017005591A1 (en) 2017-01-12

Similar Documents

Publication Publication Date Title
US10679360B2 (en) Mixed motion capture system and method
KR101751760B1 (en) Method for estimating gait parameter form low limb joint angles
US8165844B2 (en) Motion tracking system
JP2017503225A (en) Motion capture system
CN103438904B (en) A kind of inertial positioning method and system using vision auxiliary corrective
US20100194879A1 (en) Object motion capturing system and method
US20140085429A1 (en) Sensor positioning for 3d scanning
CN107205783A (en) Acetabular bone edge digitalizer and method
JP2013500812A (en) Inertial measurement of kinematic coupling
CN107847187B (en) Apparatus and method for motion tracking of at least part of a limb
KR101080078B1 (en) Motion Capture System using Integrated Sensor System
JP5807290B2 (en) Autonomous system and method for determining information representing joint chain motion
WO2017004403A1 (en) Biomechanical information determination
CN110609621B (en) Gesture calibration method and human motion capture system based on microsensor
CN108309301B (en) Human body segment quality measuring method
Koda et al. 3D measurement of forearm and upper arm during throwing motion using body mounted sensor
Ambrósio et al. Spatial reconstruction of human motion by means of a single camera and a biomechanical model
CN106872990B (en) A kind of Three dimensional Targets precise positioning and method for tracing
JP6205387B2 (en) Method and apparatus for acquiring position information of virtual marker, and operation measurement method
JP3394979B2 (en) Method and apparatus for measuring joint angle
JP2014117409A (en) Method and apparatus for measuring body joint position
Daponte et al. A stereo vision method for IMU-based motion tracking systems calibration
JP5424224B2 (en) Relative angle estimation system
CN111307173B (en) Method for mutual calibration of positioning and inertial sensors based on alternating electromagnetic field
CN105575239B (en) A kind of reduction of the fracture training pattern angle detection device and its method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210817