US9759539B2 - Method of motion tracking - Google Patents
- Publication number
- US9759539B2 (U.S. application Ser. No. 14/119,960)
- Authority
- US
- United States
- Prior art keywords
- coordinate system
- sensor
- body part
- joint
- inertial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B5/00—Measuring arrangements characterised by the use of mechanical techniques
- G01B5/004—Measuring arrangements characterised by the use of mechanical techniques for measuring coordinates of points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114—Tracking parts of the body
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/10—Athletes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2505/00—Evaluating, monitoring or diagnosing in the context of a particular type of medical care
- A61B2505/09—Rehabilitation or training
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6822—Neck
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6823—Trunk, e.g., chest, back, abdomen, hip
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6824—Arm or wrist
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6825—Hand
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6828—Leg
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6829—Foot or ankle
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/683—Means for maintaining contact with the body
- A61B5/6831—Straps, bands or harnesses
Definitions
- The present invention relates to a motion tracking method, and more particularly, to a motion tracking method that can accurately track a motion even when an inertial measurement unit (IMU) is attached to a human body in any direction, because there is no need to accurately match the direction of the sensor coordinate system of each IMU with the direction of a body part coordinate system.
- Motion tracking or motion capture is a method of tracking a motion of a human body and is technology used for various fields such as film or animation production, sports motion analysis, medical rehabilitation, etc.
- As motion tracking methods, there are an optical method and a magnetic field method.
- In the optical method, a reflective marker is attached to the human body, and an infrared camera irradiates the marker with infrared light and receives the light reflected from it.
- In the magnetic field method, a magnetic member is attached to the body of a user, and when the user moves in a field where a magnetic field is formed, the motion of the magnetic member is determined from changes in the magnetic field.
- An IMU is an apparatus for measuring the inertial force acting on an object that is moved by an applied acceleration.
- It may provide various motion information about the object, such as acceleration, speed, direction, and distance.
- FIG. 1 illustrates a state in which a plurality of inertial sensors S are attached to a human body which has joints and body parts.
- FIG. 2 is a view schematically illustrating a kinematical model of the human body illustrated in FIG. 1 .
- FIG. 3 is a view for explaining coordinate systems used for the kinematical model of FIG. 2 .
- A human body may be represented as a kinematical model having joints J1, J2, and J3 and body parts 1, 2, 3, and 4 that are rotatable with respect to the joints.
- The inertial sensors S1, S2, S3, and S4 are not attached to the joints J1, J2, and J3, but to the body parts 1, 2, 3, and 4, such as the trunk, an upper arm, a lower arm, and the pelvis, respectively.
- Information regarding a motion of a human body may be obtained by using the inertial sensors S1, S2, S3, and S4 attached to the body parts 1, 2, 3, and 4.
- A motion of a human body may be tracked by processing this information.
- When the inertial sensors S1, S2, S3, and S4 are attached to the body parts 1, 2, 3, and 4, as illustrated in FIG. 3, the directions of the coordinate systems {A, B} fixed to each of the body parts need to be accurately matched with the directions of the coordinate systems {S_A, S_B} of each of the inertial sensors.
- However, because the surface of a human body is uneven, it is practically almost impossible to match the directions of the coordinate systems.
- A method of tracking a motion of a model, the model including a joint, at least one body part that rotates with respect to the joint, and a plurality of inertial sensors attached to each body part to measure a rotational motion of the body part, includes: a sensor posture measurement operation of obtaining a rotational matrix of a sensor coordinate system fixed to each of the plurality of inertial sensors with respect to an inertial coordinate system fixed to the ground, by using a signal measured by the inertial sensor attached to each body part; a rotational matrix conversion operation of obtaining a rotational matrix of the sensor coordinate systems with respect to a body part coordinate system fixed to each body part, by using the rotational matrix of the sensor coordinate system obtained in the sensor posture measurement operation; a body part posture calculation operation of obtaining a rotational matrix of each body part coordinate system with respect to the inertial coordinate system, by using the rotational matrix of the sensor coordinate systems calculated in the rotational matrix conversion operation; and a joint variable calculation operation of calculating a joint variable with respect to each joint, by using the rotational matrix of each body part coordinate system calculated in the body part posture calculation operation.
- Since the rotational matrix conversion operation obtains the rotational matrix of the sensor coordinate systems with respect to the body part coordinate systems fixed to the respective body parts, by using the rotational matrix of the sensor coordinate systems obtained in the sensor posture measurement operation, there is no need to accurately match the direction of each body part coordinate system with the direction of the sensor coordinate system of each inertial sensor.
- Accordingly, accurate motion tracking is possible even when the inertial sensors are attached to a human body in any direction.
- FIG. 1 is an image illustrating a state in which a plurality of inertial sensors are attached to a human body which has joints and body parts;
- FIG. 2 is a view schematically illustrating a kinematical model of the human body illustrated in FIG. 1 ;
- FIG. 3 is a view for explaining coordinate systems used for the kinematical model of FIG. 2 ;
- FIG. 4 is a flowchart for explaining a motion tracking method according to an embodiment of the present invention.
- FIG. 5 is a view for explaining the coordinate systems used in the motion tracking method of FIG. 4 and the rotational matrices among those coordinate systems;
- FIG. 6 is a view for explaining the rotational matrices necessary for the motion tracking method of FIG. 4.
- A method of tracking a motion of a model, the model including a joint, at least one body part that rotates with respect to the joint, and a plurality of inertial sensors attached to each body part to measure a rotational motion of the body part, includes: a sensor posture measurement operation of obtaining a rotational matrix of a sensor coordinate system fixed to each of the plurality of inertial sensors with respect to an inertial coordinate system fixed to the ground, by using a signal measured by the inertial sensor attached to each body part; a rotational matrix conversion operation of obtaining a rotational matrix of the sensor coordinate systems with respect to a body part coordinate system fixed to each body part, by using the rotational matrix of the sensor coordinate system obtained in the sensor posture measurement operation; a body part posture calculation operation of obtaining a rotational matrix of each body part coordinate system with respect to the inertial coordinate system, by using the rotational matrix of the sensor coordinate systems calculated in the rotational matrix conversion operation; and a joint variable calculation operation of calculating a joint variable with respect to each joint, by using the rotational matrix of each body part coordinate system calculated in the body part posture calculation operation.
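The four operations above can be illustrated with a minimal numerical sketch. The convention (a rotation such as R0_SB maps sensor-frame coordinates into the inertial frame), the axis choices, and the assumption that the sensor mounting rotations are already known from the calibration described later are our reading of the method, not code from the patent:

```python
import numpy as np

def rot_z(a):
    """Rotation about the z-axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Ground-truth postures of the trunk {B} and the upper arm {A} with
# respect to the inertial frame {0}; the upper arm posture is the trunk
# posture composed with a 0.5 rad joint rotation (illustrative values).
R0_B_true = rot_z(0.3)
R0_A_true = rot_z(0.3) @ rot_z(0.5)

# Arbitrary sensor mountings: the rotation of each sensor frame with
# respect to the body part it is strapped to. These need not be aligned.
C_B = rot_z(1.1)
C_A = rot_z(-0.7)

# S100, sensor posture measurement: each IMU reports its own orientation
# with respect to the inertial frame (simulated here from the ground truth).
R0_SB = R0_B_true @ C_B
R0_SA = R0_A_true @ C_A

# S200, rotational matrix conversion: C_B and C_A are assumed to have been
# identified beforehand (the patent uses three calibration motions).
# S300, body part posture calculation: strip the mounting off each sensor
# posture to recover the posture of the body part itself.
R0_B = R0_SB @ C_B.T
R0_A = R0_SA @ C_A.T

# S400, joint variable calculation: the relative rotation across the
# shoulder joint, independent of how the sensors were mounted.
R_B_A = R0_B.T @ R0_A
```

Because the mounting rotations cancel exactly, the recovered joint rotation `R_B_A` equals `rot_z(0.5)` no matter how the sensors were oriented at attachment time, which is the point of the conversion operation.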
- the model may be a human body including a trunk, a pelvis coupled to the trunk through a joint, an upper arm coupled to the trunk through a joint, and a lower arm coupled to the upper arm through a joint.
- the inertial sensor may be a sensor capable of measuring translational inertia, rotational inertia, and terrestrial magnetism.
- the inertial sensor may transmit a measured signal to the outside wirelessly.
- the inertial sensor may be manufactured according to micro-electromechanical systems (MEMS) technology to be small and light.
- the term “and/or” includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
- FIG. 4 is a flowchart for explaining a motion tracking method according to an embodiment of the present invention.
- FIG. 5 is a view for explaining the coordinate systems used in the motion tracking method of FIG. 4 and the rotational matrices among those coordinate systems.
- The motion tracking method according to the present embodiment is a method of tracking a motion by attaching a plurality of inertial sensors S to a dynamic model formed of joints and body parts, for example, a moving object such as a human body.
- The motion tracking method according to the present embodiment includes a sensor posture measurement operation S100, a rotational matrix conversion operation S200, a body part posture calculation operation S300, and a joint variable calculation operation S400.
- The human body model includes the joints J1, J2, and J3; at least one of the body parts 1, 2, 3, and 4 that rotates with respect to the joints; and the inertial sensors S1, S2, S3, and S4 attached to the body parts 1, 2, 3, and 4 to measure their rotational motions, respectively.
- The inertial sensors S1, S2, S3, and S4 are sensors manufactured according to MEMS technology and may each be an inertial measurement unit (IMU) capable of measuring translational inertia, rotational inertia, and terrestrial magnetism. A measured signal may be transmitted to the outside wirelessly.
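One standard way to obtain a sensor's orientation with respect to the inertial frame from such signals (gravity from the accelerometer at rest, magnetic north from the magnetometer) is the classic TRIAD construction. The patent does not specify its estimator, so the sketch below, including all reference values, is illustrative only:

```python
import numpy as np

def triad(v1_ref, v2_ref, v1_body, v2_body):
    """TRIAD attitude determination: find R with R @ v_body ~= v_ref,
    given two non-parallel reference vectors measured in both frames."""
    def frame(a, b):
        t1 = a / np.linalg.norm(a)
        t2 = np.cross(a, b)
        t2 = t2 / np.linalg.norm(t2)
        t3 = np.cross(t1, t2)
        return np.column_stack([t1, t2, t3])  # orthonormal triad
    return frame(v1_ref, v2_ref) @ frame(v1_body, v2_body).T

# Reference directions in the inertial frame {0}: gravity (down) and a
# magnetic field with some inclination (illustrative values).
g_ref = np.array([0.0, 0.0, -9.81])
m_ref = np.array([0.4, 0.0, -0.3])

# Simulate what a sensor rotated by R_true would measure in its own frame.
theta = 0.6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
g_body = R_true.T @ g_ref
m_body = R_true.T @ m_ref

# Sensor posture: orientation of the sensor frame w.r.t. the inertial frame.
R_est = triad(g_ref, m_ref, g_body, m_body)
```

With noise-free measurements the construction is exact; in practice a gyroscope-aided filter would refine it, but the accelerometer/magnetometer pair is what anchors the orientation to the inertial frame.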
- The human body model includes the trunk 1, the pelvis 4 coupled to the trunk 1 through the waist joint J3, the upper arm 2 coupled to the trunk 1 through the shoulder joint J1, and the lower arm 3 coupled to the upper arm 2 through the elbow joint J2.
- The trunk 1, the upper arm 2, the lower arm 3, and the pelvis 4 form the body parts 1, 2, 3, and 4 of the human body model, respectively.
- The waist joint J3 between the trunk 1 and the pelvis 4 has a joint variable Q3 of three degrees of freedom.
- The shoulder joint J1 between the upper arm 2 and the trunk 1 has a joint variable Q1 of three degrees of freedom.
- The elbow joint J2 between the lower arm 3 and the upper arm 2 has a joint variable Q2 of three degrees of freedom.
- The joint variables Q1, Q2, and Q3 denote the rotational motion of each of the joints J1, J2, and J3 and are well known to those skilled in the art, so detailed descriptions thereof are omitted herein.
- A variety of coordinate systems are used. As illustrated in FIG. 5, there are a global coordinate system {0} that is fixed to the ground as an inertial coordinate system, sensor coordinate systems {S_B, S_A, S_E, S_W} that are fixed to the inertial sensors S1, S2, S3, and S4, respectively, and body part coordinate systems {B, A, E, W} that are fixed to the body parts 1, 2, 3, and 4, respectively.
- In the rotational matrix conversion operation, the rotational matrices R^B_SB, R^A_SA, R^E_SE, and R^W_SW of the sensor coordinate systems {S_B, S_A, S_E, S_W} with respect to the body part coordinate systems {B, A, E, W} fixed to the body parts 1, 2, 3, and 4, respectively, are obtained by using the rotational matrices R^0_SB, R^0_SA, R^0_SE, and R^0_SW of the sensor coordinate systems obtained in the sensor posture measurement operation S100 (a superscript denotes the reference frame, so that, for example, R^0_SB relates {S_B} to {0}).
- Equation (1) is derived from an equation of motion between the trunk 1 and the upper arm 2.
- R^A_B denotes the rotational matrix between the body part coordinate system {B} fixed to the trunk 1 and the body part coordinate system {A} fixed to the upper arm 2.
- R^0_SA R^A_SA = R^0_SB R^B_SB R^A_B (1)
- Since the unknowns in Equation (1) are R^A_SA and R^B_SB, the other variables R^0_SA, R^0_SB, and R^A_B must be known constants in order to obtain the unknowns.
- The constants R^0_SA, R^0_SB, and R^A_B are determined after the human body model performs three predetermined motions.
- The three motions may be, for example, a motion of stretching an arm forward, a motion of raising an arm upwards, and a motion of stretching an arm laterally.
- When Equation (1) is applied to each of the three motions, the following Equation (2) is produced.
- R^0_SA(k), R^0_SB(k), and R^A_B(k) are the constant values determined for the three motions (k = 1, 2, and 3).
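The patent does not spell out how the system of Equation (2) is solved for the two unknown calibration rotations. One conventional approach rearranges each constraint into the robot-world/hand-eye form A_k X = Y C_k and solves the resulting homogeneous linear system with a Kronecker-product stacking and an SVD. The sketch below is that approach under our reading of the notation, not necessarily the inventors' solver:

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_rot(rng):
    """Random rotation matrix via QR of a Gaussian matrix (det fixed to +1)."""
    q, r = np.linalg.qr(rng.standard_normal((3, 3)))
    q *= np.sign(np.diag(r))
    if np.linalg.det(q) < 0:
        q[:, 2] *= -1.0
    return q

# Ground-truth calibration rotations, the unknowns of Equation (2):
X_true = rand_rot(rng)   # plays the role of R^A_SA
Y_true = rand_rot(rng)   # plays the role of R^B_SB

# Three calibration motions: R^0_SB(k) and R^A_B(k) are taken as known,
# and R^0_SA(k) is synthesized so that Equation (2) holds exactly.
R0_SA, R0_SB, RA_B = [], [], []
for _ in range(3):
    B, C = rand_rot(rng), rand_rot(rng)
    R0_SB.append(B)
    RA_B.append(C)
    R0_SA.append(B @ Y_true @ C @ X_true.T)

# Per motion, left-multiply by (R^0_SB)^T to get A_k X = Y C_k, which is
# linear in the entries of X and Y (row-major vec convention).
I3 = np.eye(3)
rows = []
for A0, B, C in zip(R0_SA, R0_SB, RA_B):
    Ak = B.T @ A0
    rows.append(np.hstack([np.kron(Ak, I3), -np.kron(I3, C.T)]))
M = np.vstack(rows)                      # 27 x 18 homogeneous system

# Generic motions leave a one-dimensional null space; take the smallest
# right singular vector and undo the unknown scale via the determinant.
v = np.linalg.svd(M)[2][-1]
X = v[:9].reshape(3, 3)
X /= np.cbrt(np.linalg.det(X))
Y = v[9:].reshape(3, 3)
Y /= np.cbrt(np.linalg.det(Y))
```

With noisy real measurements one would additionally project X and Y back onto the rotation group (for example via another SVD) and use more than three motions in a least-squares sense.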
- In the body part posture calculation operation S300, the rotational matrices R^0_B, R^0_A, R^0_E, and R^0_W of the body part coordinate systems {B, A, E, W} with respect to the inertial coordinate system are expressed by the rotational matrices {T_t, T_u, T_f, T_p}, as illustrated in FIG. 6.
- the following Equations (3) through (6) are used therefor.
- T_f = R^0_E = R^0_SE R^E_SE = R^0_SB R^B_SB R^A_B R^A_E (5)
- R^A_E denotes the rotational matrix between the body part coordinate system {A} fixed to the upper arm 2 and the body part coordinate system {E} fixed to the lower arm 3.
- R^W_B denotes the rotational matrix between the body part coordinate system {B} fixed to the trunk 1 and the body part coordinate system {W} fixed to the pelvis 4.
- Equation (7) is derived from the kinematics from the trunk 1 to the upper arm 2
- Equation (8) is derived from the kinematics from the upper arm 2 to the lower arm 3
- Equation (9) is derived from the kinematics from the trunk 1 to the pelvis 4 .
- T_t = T_u T_su e^{Q1} T_ts (7)
- T_u = T_f T_ef e^{Q2} T_ue (8)
- T_t = T_p T_wp e^{Q3} T_tw (9)
- T_ts denotes a matrix accounting for the distance from the trunk 1 to the shoulder joint J1.
- T_su denotes a matrix accounting for the distance from the shoulder joint J1 to the upper arm 2.
- T_ue denotes a matrix accounting for the distance from the upper arm 2 to the elbow joint J2.
- T_ef denotes a matrix accounting for the distance from the elbow joint J2 to the lower arm 3.
- T_tw denotes a matrix accounting for the distance from the trunk 1 to the waist joint J3.
- T_wp denotes a matrix accounting for the distance from the waist joint J3 to the pelvis 4.
- The joint variables Q1, Q2, and Q3 of the joints J1, J2, and J3 may be obtained from Equations (10) through (12) (joint variable calculation operation S400).
- Since the joint variable calculation operation S400 calculates the joint variables Q1, Q2, and Q3 of the joints J1, J2, and J3 by using the rotational matrices {T_t, T_u, T_f, T_p} of the body part coordinate systems {B, A, E, W} calculated in the body part posture calculation operation S300, the joint variables may be obtained directly from those rotational matrices.
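Equations (10) through (12) extract each joint variable with a matrix logarithm on the rotation group. A sketch of that step for Q1, with the exp/log maps written out by hand (Rodrigues' formula) and with the offset matrices T_u, T_su, and T_ts treated, purely for illustration, as rotations with made-up values:

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix of a 3-vector (element of so(3))."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def so3_exp(Q):
    """Matrix exponential of a skew-symmetric Q via Rodrigues' formula."""
    w = np.array([Q[2, 1], Q[0, 2], Q[1, 0]])
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3) + Q
    K = Q / theta
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def so3_log(R):
    """Inverse of so3_exp, valid for rotation angles below pi."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if theta < 1e-12:
        return np.zeros((3, 3))
    w = theta / (2.0 * np.sin(theta)) * np.array(
        [R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return hat(w)

# Joint variable of the shoulder joint J1: a 3-DOF rotation, here 0.4 rad
# about an arbitrary unit axis (illustrative value).
Q1_true = hat(0.4 * np.array([0.6, 0.0, 0.8]))

# Stand-ins for the known quantities in Equation (7).
T_u = so3_exp(hat(np.array([0.1, 0.2, -0.3])))
T_su = so3_exp(hat(np.array([0.0, 0.5, 0.1])))
T_ts = so3_exp(hat(np.array([-0.2, 0.1, 0.4])))

# Forward relation, Equation (7): T_t = T_u T_su e^{Q1} T_ts.
T_t = T_u @ T_su @ so3_exp(Q1_true) @ T_ts

# Equation (10): Q1 = log((T_u T_su)^(-1) T_t T_ts^(-1)).
Q1 = so3_log(np.linalg.inv(T_u @ T_su) @ T_t @ np.linalg.inv(T_ts))
```

The same pattern applied to Equations (8) and (9) recovers Q2 at the elbow and Q3 at the waist.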
- Since the inertial sensors S1, S2, S3, and S4, which measure translational inertia, rotational inertia, and terrestrial magnetism, are used, their response speeds in measuring the rotational motions of the body parts 1, 2, 3, and 4 are fast compared to a conventional optical method or magnetic field method. Accordingly, accurate motion tracking is possible even when the human body model moves fast.
- Since the inertial sensors S1, S2, S3, and S4 can transmit measured signals to the outside wirelessly, they are easy to attach to a human body and no complicated wiring is needed, so the motion of the human body being tracked may be followed without hindrance.
- Since the inertial sensors S1, S2, S3, and S4 are manufactured according to MEMS technology to be small and light, they are easy to carry and feel as though they are not attached to the human body at all.
- In the present embodiment, motion tracking is performed on a human body model, mainly the upper body, including the trunk 1, the pelvis 4 coupled to the trunk 1, the upper arm 2 coupled to the trunk 1, and the lower arm 3 coupled to the upper arm 2.
- However, the motion tracking may also be applied to a human body model including the lower body, such as the thighs, calves, and feet.
- Since the rotational matrix conversion operation obtains the rotational matrix of the sensor coordinate systems with respect to the body part coordinate systems fixed to the respective body parts, by using the rotational matrix of the sensor coordinate systems obtained in the sensor posture measurement operation, there is no need to accurately match the direction of each body part coordinate system with the direction of the sensor coordinate system of each inertial sensor.
- Accordingly, accurate motion tracking is possible even when the inertial sensors are attached to a human body in any direction.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Pathology (AREA)
- Surgery (AREA)
- Dentistry (AREA)
- Biophysics (AREA)
- Physiology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
R^0_SA R^A_SA = R^0_SB R^B_SB R^A_B (1)
R^0_SA(k) R^A_SA = R^0_SB(k) R^B_SB R^A_B(k), where k = 1, 2, and 3 (2)
T_t = R^0_B = R^0_SB R^B_SB (3)
T_u = R^0_A = R^0_SA R^A_SA = R^0_SB R^B_SB R^A_B (4)
T_f = R^0_E = R^0_SE R^E_SE = R^0_SB R^B_SB R^A_B R^A_E (5)
T_p = R^0_W = R^0_SW R^W_SW = R^0_SB R^B_SB R^W_B (6)
T_t = T_u T_su e^{Q1} T_ts (7)
T_u = T_f T_ef e^{Q2} T_ue (8)
T_t = T_p T_wp e^{Q3} T_tw (9)
Q1 = log((T_u T_su)^{-1} T_t T_ts^{-1}) (10)
Q2 = log((T_f T_ef)^{-1} T_u T_ue^{-1}) (11)
Q3 = log((T_p T_wp)^{-1} T_t T_tw^{-1}) (12)
Claims (4)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020110049804A KR101214227B1 (en) | 2011-05-25 | 2011-05-25 | method of motion tracking. |
| KR10-2011-0049804 | 2011-05-25 | ||
| PCT/KR2012/001119 WO2012161407A1 (en) | 2011-05-25 | 2012-02-15 | Method of motion tracking |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20140156218A1 US20140156218A1 (en) | 2014-06-05 |
| US9759539B2 true US9759539B2 (en) | 2017-09-12 |
Family
ID=47217448
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/119,960 Expired - Fee Related US9759539B2 (en) | 2011-05-25 | 2012-02-15 | Method of motion tracking |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US9759539B2 (en) |
| KR (1) | KR101214227B1 (en) |
| WO (1) | WO2012161407A1 (en) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| USD817791S1 (en) * | 2016-12-07 | 2018-05-15 | Harshavardhana Narayana Kikkeri | Sensor array |
| USD834967S1 (en) * | 2016-12-07 | 2018-12-04 | Harshavardhana Narayana Kikkeri | Sensor array |
| WO2020107113A1 (en) * | 2018-11-28 | 2020-06-04 | Magna International Inc. | Apparel for ergonomic evaluation |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9163921B2 (en) | 2013-12-18 | 2015-10-20 | Hexagon Metrology, Inc. | Ultra-portable articulated arm coordinate measurement machine |
| US9594250B2 (en) * | 2013-12-18 | 2017-03-14 | Hexagon Metrology, Inc. | Ultra-portable coordinate measurement machine |
| WO2016002318A1 (en) * | 2014-06-30 | 2016-01-07 | ソニー株式会社 | Information processing device, information processing method, computer program, and image processing system |
| US10973713B2 (en) * | 2014-06-30 | 2021-04-13 | Rehabilitation Institute Of Chicago | Body signal control device and related methods |
| EP3297520B1 (en) * | 2015-05-18 | 2022-11-02 | Vayu Technology Corp. | Devices for measuring human gait and related methods of use |
| WO2016208290A1 (en) * | 2015-06-26 | 2016-12-29 | Necソリューションイノベータ株式会社 | Measurement device and measurement method |
| CN110720922B (en) * | 2018-07-17 | 2022-07-15 | 西门子股份公司 | Subject body size measurement method, device and system |
| CN109781104B (en) * | 2019-01-31 | 2021-06-08 | 深圳创维数字技术有限公司 | Motion attitude determination and positioning method and device, computer equipment and medium |
| CN110215216B (en) * | 2019-06-11 | 2020-08-25 | 中国科学院自动化研究所 | Behavior identification method and system based on skeletal joint point regional and hierarchical level |
| CN117503120B (en) * | 2023-12-18 | 2024-04-16 | 北京铸正机器人有限公司 | Human body posture estimation method and system |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20050097181A (en) | 2004-03-31 | 2005-10-07 | 학교법인 대양학원 | Walking pattern analysis apparatus and method using inertial sensor |
| KR100601981B1 (en) | 2005-01-14 | 2006-07-18 | 삼성전자주식회사 | Activity pattern monitoring method and device |
| US7089148B1 (en) | 2000-10-30 | 2006-08-08 | The United States Of America As Represented By The Secretary Of The Navy | Method and apparatus for motion tracking of an articulated rigid body |
| US20080285805A1 (en) * | 2007-03-15 | 2008-11-20 | Xsens Technologies B.V. | Motion Tracking System |
| US20090322763A1 (en) * | 2008-06-30 | 2009-12-31 | Samsung Electronics Co., Ltd. | Motion Capture Apparatus and Method |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7089148B1 (en) | 2000-10-30 | 2006-08-08 | The United States Of America As Represented By The Secretary Of The Navy | Method and apparatus for motion tracking of an articulated rigid body |
| KR20050097181A (en) | 2004-03-31 | 2005-10-07 | 학교법인 대양학원 | Walking pattern analysis apparatus and method using inertial sensor |
| KR100601981B1 (en) | 2005-01-14 | 2006-07-18 | 삼성전자주식회사 | Activity pattern monitoring method and device |
| US20080285805A1 (en) * | 2007-03-15 | 2008-11-20 | Xsens Technologies B.V. | Motion Tracking System |
| US20090322763A1 (en) * | 2008-06-30 | 2009-12-31 | Samsung Electronics Co., Ltd. | Motion Capture Apparatus and Method |
| KR20100002803A (en) | 2008-06-30 | 2010-01-07 | 삼성전자주식회사 | Apparatus and method for capturing a motion of human |
Non-Patent Citations (2)
| Title |
|---|
| International Search Report dated Sep. 25, 2012 in corresponding International Patent Application No. PCT/KR2012/001119 (3 pages, in Korean). |
| Korean Office Action dated Nov. 19, 2012 on corresponding Korean Patent Application No. KR10-2011-0049804 (7 pages, in Korean including English Translation). |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2012161407A1 (en) | 2012-11-29 |
| US20140156218A1 (en) | 2014-06-05 |
| KR101214227B1 (en) | 2012-12-20 |
| KR20120131553A (en) | 2012-12-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9759539B2 (en) | Method of motion tracking | |
| US10679360B2 (en) | Mixed motion capture system and method | |
| Li et al. | Real-time human motion capture based on wearable inertial sensor networks | |
| Zhou et al. | Reducing drifts in the inertial measurements of wrist and elbow positions | |
| JP2010534316A (en) | System and method for capturing movement of an object | |
| EP3361948B1 (en) | Integration of inertial tracking and position aiding for motion capture | |
| KR101347838B1 (en) | Motion capture device and associated method | |
| JP6852673B2 (en) | Sensor device, sensor system and information processing device | |
| Meng et al. | Self-contained pedestrian tracking during normal walking using an inertial/magnetic sensor module | |
| JP2013500812A (en) | Inertial measurement of kinematic coupling | |
| JP5421571B2 (en) | Walking characteristic evaluation system and locus generation method | |
| CN104834917A (en) | Mixed motion capturing system and mixed motion capturing method | |
| KR101080078B1 (en) | Motion Capture System using Integrated Sensor System | |
| CN109284006A (en) | A kind of human motion capture device and method | |
| Bleser et al. | Using egocentric vision to achieve robust inertial body tracking under magnetic disturbances | |
| WO2018057301A1 (en) | User-specific learning for improved pedestrian motion modeling in a mobile device | |
| CN110609621A (en) | Posture calibration method and human motion capture system based on micro-sensor | |
| CN114332912B (en) | Human motion capture and joint force analysis method based on IMU | |
| Alshamaa et al. | RobCap: A mobile motion capture system mounted on a robotic arm | |
| JP6205387B2 (en) | Method and apparatus for acquiring position information of virtual marker, and operation measurement method | |
| Jatesiktat et al. | Recovery of forearm occluded trajectory in kinect using a wrist-mounted inertial measurement unit | |
| de Villa et al. | IMU-based Characterization of the Leg for the Implementation of Biomechanical Models | |
| Ricci et al. | Dynamic accuracy assessment of data-fusion techniques for wearable, inertial and magnetic based human motion capture | |
| CN111197983B (en) | Three-dimensional pose measurement method based on human body distribution inertia node vector distance measurement | |
| JP2022133212A (en) | Method for capturing and calibration of motion by inertia measurement sensor on the basis of position estimation sensor |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: KOREA INSTITUTE OF SCIENCE AND TECHNOLOGY, KOREA, Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KIM, JINWOOK;JUNG, YUJIN;KANG, DONGHOON;REEL/FRAME:031667/0906 Effective date: 20131122 |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
| MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 4 |
|
| FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
| LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
| STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
| FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20250912 |