CN110646014B - IMU installation error calibration method based on human joint position capturing equipment assistance - Google Patents
IMU installation error calibration method based on human joint position capturing equipment assistance
- Publication number: CN110646014B
- Application number: CN201910938329.5A
- Authority
- CN
- China
- Prior art keywords
- imu
- joint position
- limb
- human joint
- formula
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
Landscapes
- Engineering & Computer Science (AREA)
- Manufacturing & Machinery (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The invention discloses a method for calibrating the installation error of an inertial sensor with the assistance of a human joint position capturing device (such as a Kinect). During calibration the subject does not need to perform the strict calibration actions that are usually required; short natural movements of the body part to be captured are sufficient to calibrate the installation errors. In addition to the conventional installation attitude error, the method simultaneously calibrates the installation position error, thereby establishing the complete kinematic relationship between the inertial sensor and the joint and enabling effective estimation of the linear motion parameters of the joint. The invention largely solves the problem that hemiplegic patients cannot strictly execute calibration actions, reduces both calibration difficulty and calibration time, and has great application value in fields such as hemiplegia exercise rehabilitation and somatosensory games.
Description
Technical Field
The invention belongs to the technical field of human motion capture, relates to inertial-sensor installation error calibration for human motion capture, and in particular relates to an inertial-sensor installation error calibration method assisted by human joint position information.
Background
Human motion capture technology has wide application prospects in medical diagnosis, exercise rehabilitation, virtual reality, human-computer interaction, and other areas.
A human motion capture system based on miniature inertial measurement units (IMU for short; an IMU consists of a three-axis gyroscope and a three-axis accelerometer, and usually also includes a three-axis magnetic sensor) is not limited by lighting, venue, or occlusion, has good dynamic performance, and can measure outdoor human motion. It is particularly suitable for home exercise rehabilitation of hemiplegic patients, sports analysis, film animation, and the like.
Because the human body is not a standard geometric body, and the soft tissue between bone and skin deforms easily, an IMU cannot truly be worn along the limb direction; installation error calibration is therefore required to obtain the accurate attitude between the IMU coordinate system and the limb coordinate system. In the prior art, the subject is usually required to execute specific preset actions, such as swinging the limb, internal/external rotation, or standing in a T pose, so that a limb coordinate axis is held static along the gravity direction and the three-axis accelerometer senses the limb axial direction, or the limb is rotated about its own axis so that the three-axis gyroscope senses the limb axial direction. However, it is difficult for a subject to rotate strictly about the limb axis or to place the limb exactly along the gravity (or horizontal) direction, especially for hemiplegic patients whose mobility is already limited. In addition, the prior art calibrates only the installation attitude error and neglects the installation position error, so the mounting position of the IMU relative to the joint remains unknown and the linear motion parameters of the human joints are difficult to estimate effectively.
In 2010, Microsoft officially released the motion-sensing peripheral Kinect. A Kinect sensor consists of three lenses: an RGB color camera, an infrared emitter, and a structured-light depth sensor, with a measuring range of 0.8 to 4 meters. Using the Kinect and its interface software (such as Microsoft's software development kit), the 3D spatial positions of human joints can be obtained directly. The Kinect human joint capture algorithm is trained in advance on a large amount of data, and the 3D joint positions it produces can be used as a reference standard. Similar products include Intel's RealSense.
Disclosure of Invention
The invention aims to: overcome the defects of the prior art by providing a motion capture method that does not require strictly executed calibration actions. With the aid of human joint position information, IMU installation error calibration, including installation position error calibration, can be achieved while the body part to be captured performs only natural movements; IMU-based human motion capture, including estimation of the linear motion parameters of human joints, can then be carried out.
The technical scheme is as follows: an IMU installation error calibration scheme assisted by a human joint position capturing device (such as a Kinect) is adopted. The body part to be captured moves within the measuring range of the capturing device while occlusion is avoided. As the body part moves naturally, the human joint position information from the capturing device and the IMU output data are collected, and the installation error calibration algorithm is executed to obtain the installation attitude error and the installation position error of the IMU. The method comprises the following steps:
The limb coordinate system is denoted b, the IMU coordinate system s, and the local northeast geographic coordinate system g. The reference frame r is defined manually as needed; its attitude relative to the g frame is known and remains unchanged. The coordinate system of the human joint position capturing device is denoted k; its attitude relative to the g frame is likewise known and unchanged. C_m^n denotes the rotation matrix from frame m to frame n. The mapping l^k of the limb direction l in the k frame is obtained by subtracting the positions, measured by the capturing device, of the human joints at the two ends of the limb; its mapping in the r frame is l^r = C_k^r · l^k. The spin angle of the limb about its own axis is θ.
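The limb-direction step above can be sketched numerically. A minimal numpy illustration: the function names and joint coordinates are hypothetical, and C_k^r is assumed known from the initial setup (taken as the identity here).

```python
import numpy as np

def limb_direction_k(p_joint1_k, p_joint2_k):
    """Unit limb direction l^k: from joint 1 toward joint 2, in the k frame."""
    d = np.asarray(p_joint2_k, float) - np.asarray(p_joint1_k, float)
    return d / np.linalg.norm(d)

def map_to_r(C_k_r, l_k):
    """Map the limb direction into the r frame: l^r = C_k^r @ l^k."""
    return C_k_r @ l_k

# Hypothetical shoulder and elbow positions reported by the capture device:
l_k = limb_direction_k([0.1, 1.4, 2.0], [0.1, 1.1, 2.0])
l_r = map_to_r(np.eye(3), l_k)   # with C_k^r = I for this sketch
```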
The installation attitude of the IMU relative to the limb is described by C_b^s, which satisfies formula (1):

C_b^s = C_g^s · C_r^g · C_b^r(l^r, θ)   (1)

In formula (1), C_g^s is obtained from the 9-axis output of the IMU by the attitude determination algorithm of an attitude and heading reference system. l^r and θ together constitute an axis-angle description of the limb attitude C_b^r, with θ unknown. Since the IMU is fixed to the limb, its installation attitude C_b^s does not change with time; that is, formula (1) holds at every moment. Taking two moments t1 and t2:

C_g^s(t1) · C_r^g · C_b^r(l^r(t1), θ1) = C_g^s(t2) · C_r^g · C_b^r(l^r(t2), θ2)   (2)
Convert the rotation matrices in formula (1) into quaternions: the quaternion corresponding to C_g^s is P = [p0, p1, p2, p3]^T, the quaternion corresponding to C_r^g is Q = [q0, q1, q2, q3]^T, and the quaternion corresponding to C_b^r is R = [r0, r1, r2, r3]^T. These satisfy formula (3):

S = P ⊗ Q ⊗ R, where S is the quaternion corresponding to C_b^s   (3)

where ⊗ denotes quaternion multiplication, which can be written in matrix form:

P ⊗ Q = M+(P) · Q = M−(Q) · P   (4)

where

M+(P) = [[p0, −p1, −p2, −p3], [p1, p0, −p3, p2], [p2, p3, p0, −p1], [p3, −p2, p1, p0]],
M−(Q) = [[q0, −q1, −q2, −q3], [q1, q0, q3, −q2], [q2, −q3, q0, q1], [q3, q2, −q1, q0]]   (5)
Converting the rotation matrices on both sides of formula (2) into quaternions according to formulas (3)-(5) yields an equation containing the two unknowns θ1 and θ2. Equating the 4 quaternion elements on the two sides gives a system of equations from which θ1 and θ2 can be determined, and hence C_b^s.
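The matrix form of quaternion multiplication used in this step is a standard identity; a small sketch with assumed helper names:

```python
import numpy as np

def quat_mult(p, q):
    """Hamilton product P ⊗ Q for quaternions stored as [w, x, y, z]."""
    p0, p1, p2, p3 = p
    q0, q1, q2, q3 = q
    return np.array([
        p0*q0 - p1*q1 - p2*q2 - p3*q3,
        p0*q1 + p1*q0 + p2*q3 - p3*q2,
        p0*q2 - p1*q3 + p2*q0 + p3*q1,
        p0*q3 + p1*q2 - p2*q1 + p3*q0,
    ])

def left_matrix(p):
    """Matrix M+(P) with P ⊗ Q == M+(P) @ Q."""
    p0, p1, p2, p3 = p
    return np.array([
        [p0, -p1, -p2, -p3],
        [p1,  p0, -p3,  p2],
        [p2,  p3,  p0, -p1],
        [p3, -p2,  p1,  p0],
    ])

def right_matrix(q):
    """Matrix M-(Q) with P ⊗ Q == M-(Q) @ P."""
    q0, q1, q2, q3 = q
    return np.array([
        [q0, -q1, -q2, -q3],
        [q1,  q0,  q3, -q2],
        [q2, -q3,  q0,  q1],
        [q3,  q2, -q1,  q0],
    ])

p = np.array([np.cos(0.3), np.sin(0.3), 0.0, 0.0])   # rotation about x
q = np.array([np.cos(0.5), 0.0, np.sin(0.5), 0.0])   # rotation about y
```

Writing the product as a matrix acting on the unknown quaternion is what turns the attitude equality at two moments into a linear-looking system in the quaternion elements, from which the two unknown spin angles can be solved.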
The joints at the two ends of the limb on which the IMU is mounted are denoted joint 1 and joint 2. Let λ be the ratio of the distance between the IMU and joint 1 to the distance between joints 1 and 2; λ describes the installation position of the IMU. Neglecting the small displacement of the IMU relative to the limb surface it contacts, λ does not change with time. Let the spatial coordinates of joints 1 and 2 in the k frame, measured by the human joint position capturing device, be p1^k and p2^k, and let the spatial coordinate of the IMU in the k frame be p^k. Then formula (6) is satisfied:

p^k = p1^k + λ · (p2^k − p1^k)   (6)
The displacement obtained by inertial navigation from the IMU output, expressed in the g frame, is denoted s^g. The state equation of the Kalman filtering system is:

d/dt [v^g; s^g; λ] = [a^g; v^g; 0]   (7)
Here the acceleration a^g is computed from the IMU output. Let the initial position coordinate of the IMU in the k frame be p^k(0); then p^k(t) = p^k(0) + C_g^k · s^g(t). Combining this with formula (6) gives the observation equation:

p1^k(t) − p1^k(0) = C_g^k · s^g(t) − λ · [(p2^k(t) − p1^k(t)) − (p2^k(0) − p1^k(0))]   (8)
When the limb only translates, the displacement of the IMU coincides with that of the joints, and the installation position of the IMU on the limb cannot be determined; when the limb rotates, the displacement of the IMU differs from that of the joints, with the proportional relationship λ between them. Equivalently, in observation equation (8) the coefficient of λ is 0 during pure translation, so λ is unobservable. In practice, a naturally moving limb rarely undergoes pure translation and usually includes a rotational component, especially for body parts with higher freedom of movement; the subject can also be prompted to rotate the limb freely during the calibration stage. Observability is thereby ensured, and λ can be estimated by Kalman filtering with state equation (7) and observation equation (8), realizing the installation position error calibration. The Kalman filtering process is as follows:
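The observability argument can be checked numerically. Assuming the observation equation takes the differenced form implied by formula (6) — the coefficient of λ is the change of the joint-1-to-joint-2 vector between the initial and current time — a small sketch with hypothetical joint coordinates:

```python
import numpy as np

def lambda_coefficient(p1_0, p2_0, p1_t, p2_t):
    """Coefficient multiplying lambda in the (differenced) observation equation:
    the change of the joint-1-to-joint-2 vector between time 0 and time t.
    If it is zero (pure translation), lambda cannot be observed."""
    d0 = np.asarray(p2_0, float) - np.asarray(p1_0, float)
    dt = np.asarray(p2_t, float) - np.asarray(p1_t, float)
    return dt - d0

# Pure translation: both joints shift by the same offset -> coefficient is zero.
c_trans = lambda_coefficient([0, 0, 0], [0, 0.3, 0], [0.1, 0.2, 0], [0.1, 0.5, 0])

# Rotation about joint 1: the joint-to-joint vector changes -> coefficient nonzero.
c_rot = lambda_coefficient([0, 0, 0], [0, 0.3, 0], [0, 0, 0], [0.3, 0, 0])
```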
First, formula (7) is written in the discretized form shown in formula (9).
X(t n )=A·X(t n-1 )+B·U(t n-1 )+W(t n-1 ) (9)
Here x= [ v g s g λ] T ,U=a g ,T=t n -t n-1 W is system noise, the noise variance matrix is set according to the IMU output noise size, and the system noise variance matrix is set as Q (t n ). From the observation equation (8), an update time t is obtained n The observation equation is that
Z(t n )=H(t n )X(t n )+V(t n ) (10)
where V(t_n) is the observation noise; its variance matrix R(t_n) is set according to the accuracy of the human joint position capturing device. Based on formulas (9) and (10), the state quantity is optimally estimated with the basic discrete Kalman filter equations (11):
X̂(t_n/n−1) = A·X̂(t_{n−1}) + B·U(t_{n−1})   (11a)
X̂(t_n) = X̂(t_n/n−1) + K(t_n)·[Z(t_n) − H(t_n)·X̂(t_n/n−1)]   (11b)
K(t_n) = P(t_n/n−1)·H(t_n)^T·[H(t_n)·P(t_n/n−1)·H(t_n)^T + R(t_n)]^(−1)   (11c)
P(t_n/n−1) = A·P(t_{n−1})·A^T + Q(t_{n−1})   (11d)
P(t_n) = [I − K(t_n)·H(t_n)]·P(t_n/n−1)   (11e)
Given initial values X̂(t_0) and P(t_0), the state estimate X̂(t_n) can be computed recursively from the measurement Z(t_n) at each time t_n, yielding the estimate of the IMU installation position λ.
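The recursion above can be sketched end to end. The following is a minimal illustration, not the patented implementation: it assumes C_g^k = I, a simple Euler discretization of the state equation, the differenced observation form derived from formula (6), and a simulated limb rotating about a fixed joint 1; all function names, noise settings, and geometry values are assumptions.

```python
import numpy as np

def kalman_lambda(acc, z_obs, d_diff, T, q_var=1e-8, r_var=1e-6):
    """Discrete Kalman filter over X = [v^g (3), s^g (3), lambda].
    acc    : (N,3) IMU acceleration resolved in the g frame (input U)
    z_obs  : (N,3) observation p1^k(t_n) - p1^k(t_0)  (assumes C_g^k = I)
    d_diff : (N,3) change of the joint-1-to-joint-2 vector since t_0
    """
    I3 = np.eye(3)
    A = np.eye(7)
    A[3:6, 0:3] = T * I3                        # s <- s + T*v
    B = np.zeros((7, 3))
    B[0:3] = T * I3                             # v <- v + T*a
    B[3:6] = 0.5 * T**2 * I3                    # s <- s + T^2/2 * a
    X = np.zeros(7)                             # initial state estimate
    P = np.eye(7)                               # initial covariance
    Q = q_var * np.eye(7)
    R = r_var * np.eye(3)
    for a, z, d in zip(acc, z_obs, d_diff):
        Xp = A @ X + B @ a                      # state prediction
        Pp = A @ P @ A.T + Q                    # covariance prediction
        H = np.zeros((3, 7))
        H[:, 3:6] = I3                          # z ~ s^g - lambda * d
        H[:, 6] = -d
        K = Pp @ H.T @ np.linalg.inv(H @ Pp @ H.T + R)
        X = Xp + K @ (z - H @ Xp)               # state update
        P = (np.eye(7) - K @ H) @ Pp            # covariance update
    return X[6]                                 # estimated lambda

# Simulated limb rotating about a fixed joint 1 (all geometry values assumed):
T, N = 0.01, 300
t = np.arange(N) * T
lam_true, L = 0.4, 0.3
phi = 1.0 - np.cos(np.pi * t)                   # rotation angle, zero initial rate
dphi = np.pi * np.sin(np.pi * t)
ddphi = np.pi**2 * np.cos(np.pi * t)
c, s = np.cos(phi), np.sin(phi)
p2 = L * np.stack([c, s, np.zeros(N)], axis=1)  # joint-2 path in the k frame
acc = lam_true * L * np.stack([-ddphi*s - dphi**2*c,
                               ddphi*c - dphi**2*s,
                               np.zeros(N)], axis=1)
z_obs = np.zeros((N, 3))                        # joint 1 does not move
d_diff = p2 - p2[0]
lam_hat = kalman_lambda(acc, z_obs, d_diff, T)
```

Because the simulated motion contains a rotational component, the λ coefficient in the observation is nonzero and the filter's λ estimate moves from its initial value toward the true mounting ratio.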
With the installation errors of the IMU obtained, the Denavit-Hartenberg coordinate transformation (12) of the joint relative to the IMU can be formed, where d is the spatial position of the joint relative to the IMU, determined by λ and the limb length. Let the pose of the IMU relative to the reference frame be expressed as the coordinate transformation matrix M_s^r, and the coordinate transformation matrix of the joint relative to the reference frame as M_b^r; then M_b^r = M_s^r · M_b^s, where M_b^r contains the angular and linear motion parameters of the joint relative to the reference frame.
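The pose chain described here can be sketched with homogeneous 4x4 transforms; the function names are hypothetical, and the transforms below are toy values standing in for the calibrated quantities (rotation from the attitude calibration, translation from λ and the limb length):

```python
import numpy as np

def homogeneous(C, d):
    """Build a 4x4 coordinate transformation from rotation C and translation d."""
    M = np.eye(4)
    M[:3, :3] = np.asarray(C, float)
    M[:3, 3] = np.asarray(d, float)
    return M

def joint_pose(M_imu_ref, M_joint_imu):
    """Chain the IMU pose (relative to the reference frame) with the calibrated
    IMU-to-joint transform to get the joint pose relative to the reference frame."""
    return M_imu_ref @ M_joint_imu

# Toy example with identity rotations: translations simply accumulate.
M_imu_ref = homogeneous(np.eye(3), [1.0, 0.0, 0.0])
M_joint_imu = homogeneous(np.eye(3), [0.0, 2.0, 0.0])
M_joint_ref = joint_pose(M_imu_ref, M_joint_imu)
```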
The beneficial effects are that: compared with the prior art, the invention realizes IMU installation error calibration without strict calibration actions, with the assistance of a human joint position capturing device (such as a Kinect), and simultaneously estimates the IMU installation position error, thereby establishing the complete kinematic relationship between the IMU and the adjacent joints and enabling effective estimation of the linear motion parameters of the joints. The technical scheme is particularly suitable for hemiplegic patients undergoing rehabilitation training, who may be unable to complete specific calibration actions; even for ordinary users, the accuracy of calibration actions is difficult to guarantee.
Drawings
FIG. 1 is a schematic diagram of an IMU installation error calibration process for an upper limb example;
fig. 2 is a schematic diagram of visual analysis of observability of an IMU installation position, wherein (a) is a schematic diagram of non-observability of a translation process and (b) is a schematic diagram of observability of a rotation process.
Detailed Description
The invention is further illustrated below with reference to the accompanying drawings and motion capture embodiments for the upper extremities, wherein the human joint location capture device employs Kinect.
The subject's upper limb moves within the Kinect's line of sight, and the position of the Kinect sensor is kept unchanged. IMUs are mounted on the upper arm and lower arm of the subject, as shown in FIG. 1. The upper limb performs natural movement for a few seconds while the computer simultaneously collects the human joint position information from the Kinect and the 9-axis output data of the IMU sensors, then runs the installation error calibration algorithm. The algorithm executes as follows.
The limb coordinate system is denoted b, the IMU coordinate system s, the local northeast geographic coordinate system g, the Kinect coordinate system k, and the reference frame r is defined as needed. The attitudes of the r, g, and k frames remain unchanged. C_m^n denotes the rotation matrix from frame m to frame n; C_k^g and C_r^g are measured initially. The limb direction is denoted l and the spin angle of the limb is θ.
From the 9-axis output data of the upper-arm IMU, C_g^s is obtained by the attitude determination algorithm of the attitude and heading reference system. Subtracting the position vectors of the shoulder and elbow joints obtained from the Kinect gives the mapping l^k of the limb direction in the k frame, and l^r = C_k^r · l^k gives the mapping of l in the r frame.
In formula (1), θ is unknown, and the installation attitude C_b^s of the IMU is essentially constant over time; that is, formula (1) holds at different moments. Taking two moments t1 and t2:
Convert the rotation matrices in formula (1) into quaternions: the quaternion corresponding to C_g^s is P = [p0, p1, p2, p3]^T, the quaternion corresponding to C_r^g is Q = [q0, q1, q2, q3]^T, and the quaternion corresponding to C_b^r is R = [r0, r1, r2, r3]^T. These satisfy formula (3), where ⊗ denotes quaternion multiplication, written in matrix form as in formulas (4) and (5):

P ⊗ Q = M+(P) · Q = M−(Q) · P   (4)

where

M+(P) = [[p0, −p1, −p2, −p3], [p1, p0, −p3, p2], [p2, p3, p0, −p1], [p3, −p2, p1, p0]],
M−(Q) = [[q0, −q1, −q2, −q3], [q1, q0, q3, −q2], [q2, −q3, q0, q1], [q3, q2, −q1, q0]]   (5)
Converting the rotation matrices on both sides of formula (2) into quaternions according to formulas (3)-(5) yields an equation containing the two unknowns θ1 and θ2. Equating the 4 quaternion elements on the two sides gives a system of equations from which θ1 and θ2 are determined, and hence C_b^s.
Let λ be the ratio of the distance between the IMU and the shoulder joint to the distance between the shoulder and elbow joints; λ describes the installation position of the IMU. Neglecting the small displacement of the IMU relative to the limb surface it contacts, λ does not change with time. Let the spatial coordinates of the shoulder and elbow joints in the k frame, as measured by the Kinect, be p1^k and p2^k, and let the spatial coordinate of the IMU be p^k. The mappings of the displacement, velocity, and acceleration obtained by IMU inertial navigation in the g frame are denoted s^g, v^g, and a^g, respectively. A Kalman filter is used to estimate λ.
The state equation of the Kalman filtering system is shown in the formula (6).
X(t n )=A·X(t n-1 )+B·U(t n-1 )+W(t n-1 ) (6)
Here x= [ v g s g λ] T ,U=a g ,T=t n -t n-1 W is system noise, its noise variance matrix Q (t n ) And setting according to the output noise of the IMU. The observation equation is shown in formula (7).
Z(t n )=H(t n )X(t n )+V(t n ) (7)
where V(t_n) is the observation noise, whose variance matrix R(t_n) is set according to the human joint positioning accuracy of the Kinect. Based on formulas (6) and (7), the state quantity is optimally estimated with the basic discrete Kalman filter equations (8):
X̂(t_n/n−1) = A·X̂(t_{n−1}) + B·U(t_{n−1})   (8a)
X̂(t_n) = X̂(t_n/n−1) + K(t_n)·[Z(t_n) − H(t_n)·X̂(t_n/n−1)]   (8b)
K(t_n) = P(t_n/n−1)·H(t_n)^T·[H(t_n)·P(t_n/n−1)·H(t_n)^T + R(t_n)]^(−1)   (8c)
P(t_n/n−1) = A·P(t_{n−1})·A^T + Q(t_{n−1})   (8d)
P(t_n) = [I − K(t_n)·H(t_n)]·P(t_n/n−1)   (8e)
Given initial values X̂(t_0) and P(t_0), the state estimate X̂(t_n) is computed recursively from the measurement Z(t_n) at each time t_n, yielding the estimate of the IMU installation position λ. The installation position error of the lower-arm IMU is estimated in the same way as that of the upper-arm IMU.
When the limb undergoes only translation, as shown in FIG. 2(a), the displacement of the IMU coincides with that of the joints and the installation position of the IMU on the limb cannot be determined; when the limb rotates, as shown in FIG. 2(b), the displacement of the IMU differs from that of the joints, with the proportional relationship λ between them. Equivalently, in observation equation (7) the coefficient of λ is 0 during pure translation, so λ is unobservable. In practice, a naturally moving limb rarely undergoes pure translation and usually includes a rotational component, especially for body parts with higher freedom of movement; the subject can also be prompted to rotate the limb freely during the calibration stage, ensuring observability.
The subject in this embodiment is a hemiplegic patient. Throughout the test, the subject did not need to complete any specified calibration actions; on the premise of guaranteed accuracy, the calibration time was reduced and the test was completed smoothly.
Claims (4)
1. An IMU installation error calibration method based on the assistance of a human joint position capturing device, characterized by comprising the following steps: the body part of the user to be captured moves within the measuring range of the human joint position capturing device, with the IMU mounted on the user's limb; the human joint position information from the capturing device and the IMU output data are collected, and an installation error calibration algorithm is executed to obtain the installation attitude error and the installation position error of the IMU;
the limb coordinate system is denoted b, the IMU coordinate system s, the local northeast geographic coordinate system g, and the coordinate system of the human joint position capturing device k; the attitude of the reference frame r relative to g is known and remains unchanged, and the attitude of k relative to g is known and remains unchanged; C_m^n denotes the rotation matrix from frame m to frame n; the mapping l^k of the limb direction l in the k frame is obtained by subtracting the positions, measured by the capturing device, of the human joints at the two ends of the limb, and its mapping in the r frame is l^r = C_k^r · l^k; the spin angle of the limb is θ; C_b^s describes the installation attitude of the IMU relative to the limb and satisfies formula (1);

C_b^s = C_g^s · C_r^g · C_b^r(l^r, θ)   (1)

in formula (1), C_g^s is calculated by the attitude determination algorithm of the attitude and heading reference system; l^r and θ constitute an axis-angle description of the limb attitude; the installation attitude C_b^s of the IMU does not change with time, and formula (1) holds at different moments; taking two moments t1 and t2, formula (2) is satisfied; converting the rotation matrices on both sides of formula (2) into quaternions yields an equation containing the two unknowns θ1 and θ2; equating the 4 quaternion elements on the two sides gives a system of equations from which θ1 and θ2 are determined, and further C_b^s.
2. The IMU installation error calibration method based on the assistance of a human joint position capturing device according to claim 1, characterized in that: during IMU installation error calibration, the body part to be captured moves naturally in a manner that ensures the presence of a rotational component.
3. The IMU installation error calibration method based on the assistance of a human joint position capturing device according to claim 1, characterized in that: the joints at the two ends of the limb on which the IMU is mounted are joint 1 and joint 2; the ratio of the distance between the IMU and joint 1 to the distance between joints 1 and 2 is λ, which describes the installation position of the IMU; the spatial coordinates of joints 1 and 2 in the k frame, measured by the human joint position capturing device, are p1^k and p2^k, the spatial coordinate of the IMU in the k frame is p^k, and the initial spatial coordinate of the IMU in the k frame is p^k(0); inertial navigation from the IMU output gives the mapping s^g of the displacement in the g frame and the calculated acceleration a^g; the state equation and the observation equation of the Kalman filtering system are formulas (3) and (4), respectively;

formulas (3) and (4) are written in the discretized forms (5) and (6), respectively, where X(t_n) = [v^g(t_n) s^g(t_n) λ(t_n)]^T, U(t_n) = a^g(t_n), T = t_n − t_{n−1}, W is the system noise whose variance matrix Q(t_n) is set according to the IMU output noise level, and V(t_n) is the observation noise whose variance matrix R(t_n) is set according to the human joint positioning accuracy of the Kinect;
X(t n )=A·X(t n-1 )+B·U(t n-1 )+W(t n-1 ) (5)
Z(t n )=H(t n )X(t n )+V(t n ) (6)
according to formulas (5) and (6), the state quantity is optimally estimated with the basic discrete Kalman filter equations (7); given initial values X̂(t_0) and P(t_0), the state estimate X̂(t_n) is computed recursively from the measurement Z(t_n) at time t_n, yielding the estimate of the IMU installation position λ;
X̂(t_n/n−1) = A·X̂(t_{n−1}) + B·U(t_{n−1})   (7a)
P(t_n/n−1) = A·P(t_{n−1})·A^T + Q(t_{n−1})   (7b)
K(t_n) = P(t_n/n−1)·H(t_n)^T·[H(t_n)·P(t_n/n−1)·H(t_n)^T + R(t_n)]^(−1)   (7c)
X̂(t_n) = X̂(t_n/n−1) + K(t_n)·[Z(t_n) − H(t_n)·X̂(t_n/n−1)]   (7d)
P(t_n) = [I − K(t_n)·H(t_n)]·P(t_n/n−1)   (7e).
4. The IMU installation error calibration method based on the assistance of a human joint position capturing device according to any one of claims 1-3, wherein the human joint position capturing device is a Kinect or another optically based human joint position capturing device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910938329.5A CN110646014B (en) | 2019-09-30 | 2019-09-30 | IMU installation error calibration method based on human joint position capturing equipment assistance |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110646014A CN110646014A (en) | 2020-01-03 |
CN110646014B true CN110646014B (en) | 2023-04-25 |
Family
ID=68993290
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910938329.5A Active CN110646014B (en) | 2019-09-30 | 2019-09-30 | IMU installation error calibration method based on human joint position capturing equipment assistance |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110646014B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104197987A (en) * | 2014-09-01 | 2014-12-10 | 北京诺亦腾科技有限公司 | Combined-type motion capturing system |
CN104658012A (en) * | 2015-03-05 | 2015-05-27 | 第二炮兵工程设计研究院 | Motion capture method based on inertia and optical measurement fusion |
CN104834917A (en) * | 2015-05-20 | 2015-08-12 | 北京诺亦腾科技有限公司 | Mixed motion capturing system and mixed motion capturing method |
CN107349570A (en) * | 2017-06-02 | 2017-11-17 | 南京邮电大学 | Rehabilitation training of upper limbs and appraisal procedure based on Kinect |
CN108268129A (en) * | 2016-12-30 | 2018-07-10 | 北京诺亦腾科技有限公司 | The method and apparatus and motion capture gloves calibrated to multiple sensors on motion capture gloves |
CN109297507A (en) * | 2018-09-27 | 2019-02-01 | 南京邮电大学 | The human limb motion capture method for exempting from alignment actions based on inertial sensor |
EP3491334A2 (en) * | 2016-08-01 | 2019-06-05 | Infinity Augmented Reality Israel Ltd. | Method and system for calibrating components of an inertial measurement unit (imu) using scene-captured data |
Non-Patent Citations (1)
Title |
---|
A multi-position initial alignment method for strapdown inertial navigation systems rotating about an arbitrary axis; Tan Caiming et al.; Journal of Chinese Inertial Technology; 2015-06-30; Vol. 23, No. 3; pp. 293-297 *
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |