CN110646014B - IMU installation error calibration method based on human joint position capturing equipment assistance - Google Patents


Info

Publication number
CN110646014B
CN110646014B (application CN201910938329.5A)
Authority
CN
China
Prior art keywords
imu
joint position
limb
human joint
formula
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910938329.5A
Other languages
Chinese (zh)
Other versions
CN110646014A (en)
Inventor
Tan Caiming (谭彩铭)
Gao Xiang (高翔)
Xu Guozheng (徐国政)
Zhu Bo (朱博)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Posts and Telecommunications filed Critical Nanjing University of Posts and Telecommunications
Priority to CN201910938329.5A
Publication of CN110646014A
Application granted
Publication of CN110646014B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass: initial alignment, calibration or starting-up of inertial devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a method for calibrating the installation errors of an inertial sensor with the assistance of a human joint position capture device (such as Kinect). During calibration the subject does not need to perform the strict calibration actions that are usually required; a short period of natural movement of the body part to be captured is sufficient. Besides the conventional calibration of the installation attitude error, the invention simultaneously calibrates the installation position error, thereby establishing the complete kinematic relationship between the inertial sensor and the joint and enabling effective estimation of the joint's linear motion parameters. The invention largely solves the problem that hemiplegic patients cannot strictly execute calibration actions, reduces both the difficulty and the duration of calibration, and has considerable application value in hemiplegia motor rehabilitation, motion-sensing games, and similar fields.

Description

IMU installation error calibration method based on human joint position capturing equipment assistance
Technical Field
The invention belongs to the technical field of human motion capture, relates to inertial sensor installation error calibration for human motion capture, and in particular to an inertial sensor installation error calibration method assisted by human joint position information.
Background
The human motion capture technology has wide application prospect in the aspects of medical diagnosis, motion rehabilitation, virtual reality, human-computer interaction and the like.
A human motion capture system based on miniature inertial measurement units (IMU for short; an IMU consists of a three-axis gyroscope and a three-axis accelerometer, and usually also includes a three-axis magnetometer) is not constrained by lighting, venue, or occlusion, has good dynamic performance, and can measure outdoor human motion; it is particularly suitable for the home motor rehabilitation of hemiplegic patients, sports analysis, film animation, and the like.
Because the human body is not a regular geometric body and the soft tissue between bone and skin deforms easily, an IMU cannot truly be worn aligned with the limb axis, so installation error calibration is required to obtain the accurate attitude between the IMU coordinate system and the limb coordinate system. In the prior art, the subject is usually required to perform specific preset actions, such as limb swings, internal and external rotation, or standing in a T pose, so that a limb coordinate axis is held static along the gravity direction and sensed by the three-axis accelerometer, or the limb is rotated about its own axis so that the axis is sensed by the three-axis gyroscope. However, it is difficult for a subject to rotate strictly about the limb axis or to place the limb exactly along the gravity (or horizontal) direction, especially for hemiplegic patients whose movement is already limited. In addition, only the mounting attitude error is calibrated while the mounting position error is neglected, so the mounting position of the IMU relative to the joint remains unknown and the linear motion parameters of the human joints are difficult to estimate effectively.
In 2010, Microsoft officially released the motion-sensing peripheral Kinect. A Kinect sensor consists of three lenses: an RGB color camera, an infrared emitter, and a structured-light depth sensor, with a measuring range of 0.8 to 4 meters. Using a Kinect and interface software (such as Microsoft's software development kit), the 3D spatial positions of human joints can be obtained directly. The Kinect joint capture algorithm is trained in advance on a large amount of data, and the 3D joint positions it produces can serve as a reference standard. Intel's RealSense is a similar product.
Disclosure of Invention
The invention aims to: overcome the defects of the prior art by providing a motion capture method that does not require strict execution of specific calibration actions. With the aid of human joint position information, IMU installation error calibration, including installation position error calibration, is achieved through nothing more than natural movement of the body part to be captured, enabling IMU-based human motion capture, including estimation of the linear motion parameters of human joints.
The technical scheme is as follows: the application adopts an IMU installation error calibration scheme assisted by a human joint position capture device (such as Kinect). The body part of the user to be captured moves within the measuring range of the device, and occlusion is avoided. The part to be captured moves naturally; the human joint position information from the capture device and the IMU output data are collected, and the installation error calibration algorithm is executed to obtain the installation attitude error and the installation position error of the IMU. The method comprises the following steps:
the limb coordinate system is denoted the b frame, the IMU coordinate system the s frame, and the local east-north-up coordinate system the geographic coordinate system, or g frame. A reference frame r is defined as needed; its attitude relative to the g frame is known and remains unchanged. The coordinate system of the human joint position capture device is denoted the k frame; its attitude relative to the g frame is likewise known and unchanged. C_m^n denotes the rotation matrix from frame m to frame n. The mapping l^k of the limb direction l in the k frame is obtained by subtracting the human joint position vectors at the two ends of the limb measured by the capture device, and the mapping of l in the r frame is l^r = C_k^r · l^k. The spin angle of the limb about its own axis is denoted θ.

The mounting attitude of the IMU relative to the limb is described by C_s^b, which satisfies formula (1):

    C_s^b = C_r^b(l^r, θ) · C_g^r · C_s^g                          (1)

In formula (1), C_s^g is obtained from the 9-axis output of the IMU by the attitude determination algorithm of the attitude and heading reference system. l^r and θ together constitute an axis-angle description of the limb attitude; θ is unknown. Since the IMU is fixed to the limb, the mounting attitude C_s^b does not change with time, that is, formula (1) holds at every instant. Taking two instants t_1 and t_2:

    C_r^b(l^r(t_1), θ_1) · C_g^r · C_s^g(t_1) = C_r^b(l^r(t_2), θ_2) · C_g^r · C_s^g(t_2)     (2)

Convert the rotation matrices in formula (1) to quaternions: let C_r^b correspond to the quaternion P = [p_0, p_1, p_2, p_3]^T, C_g^r to Q = [q_0, q_1, q_2, q_3]^T, and C_s^g to R = [r_0, r_1, r_2, r_3]^T. They satisfy formula (3):

    P ⊗ Q ⊗ R = M(P) · M(Q) · R                                    (3)

where ⊗ is quaternion multiplication, written in matrix form as

    P ⊗ Q = M(P) · Q                                               (4)

with

    M(P) = [ p_0  -p_1  -p_2  -p_3
             p_1   p_0  -p_3   p_2
             p_2   p_3   p_0  -p_1
             p_3  -p_2   p_1   p_0 ]                               (5)

Converting the rotation matrices on both sides of the equal sign of formula (2) into quaternions according to formulas (3)-(5) yields an equation containing the two unknowns θ_1 and θ_2. Equating the 4 quaternion elements on the two sides gives a system of equations from which θ_1 and θ_2 are determined, and hence C_s^b.
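The quaternion relations (3)-(5) can be checked numerically. The sketch below is mine, not the patent's implementation (the helper names `quat_mult_matrix` and `quat_mult` are assumptions); it verifies that the matrix form M(P)·M(Q)·R reproduces the Hamilton product P ⊗ Q ⊗ R in the scalar-first convention.

```python
import numpy as np

def quat_mult_matrix(p):
    """Left-multiplication matrix M(p) such that p ⊗ q = M(p) @ q (scalar-first)."""
    p0, p1, p2, p3 = p
    return np.array([
        [p0, -p1, -p2, -p3],
        [p1,  p0, -p3,  p2],
        [p2,  p3,  p0, -p1],
        [p3, -p2,  p1,  p0],
    ])

def quat_mult(p, q):
    """Hamilton product p ⊗ q, scalar-first convention."""
    p0, pv = p[0], p[1:]
    q0, qv = q[0], q[1:]
    return np.concatenate((
        [p0 * q0 - pv @ qv],
        p0 * qv + q0 * pv + np.cross(pv, qv),
    ))

# Three random unit quaternions standing in for P, Q, R of formula (3).
rng = np.random.default_rng(0)
P, Q, R = (q / np.linalg.norm(q) for q in rng.standard_normal((3, 4)))
lhs = quat_mult(P, quat_mult(Q, R))          # P ⊗ Q ⊗ R
rhs = quat_mult_matrix(P) @ quat_mult_matrix(Q) @ R   # M(P) M(Q) R
assert np.allclose(lhs, rhs)
```

Equating the four components of such products at two instants is exactly the step that turns formula (2) into a solvable system for θ_1 and θ_2.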
The joints at the two ends of the limb carrying the IMU are joint 1 and joint 2, and λ is the ratio of the distance between the IMU and joint 1 to the distance between joints 1 and 2. The value of λ describes the mounting position of the IMU; neglecting the small displacement of the IMU relative to the limb surface it contacts, λ does not change with time. Denote the spatial coordinates of joints 1 and 2 in the k frame, measured by the human joint position capture device, as p_1^k and p_2^k, and the spatial coordinate of the IMU in the k frame as p^k. Formula (6) is satisfied:

    p^k = p_1^k + λ · (p_2^k − p_1^k)                              (6)

The displacement obtained by inertial navigation from the IMU output, mapped into the g frame, is denoted s^g. The state equation of the Kalman filter system is

    d/dt [ v^g  s^g  λ ]^T = [ a^g  v^g  0 ]^T                     (7)

where the acceleration a^g is computed from the IMU output. Let the initial position of the IMU in the k frame be p^k(0); then p^k(t) = p^k(0) + C_g^k · s^g(t). Combining this with formula (6), taken at time t and at time 0, gives the observation equation:

    p_1^k(t) − p_1^k(0) = C_g^k · s^g(t) − λ · [ (p_2^k(t) − p_1^k(t)) − (p_2^k(0) − p_1^k(0)) ]      (8)

When the limb only translates, the displacement of the IMU coincides with that of the joints and the mounting position of the IMU on the limb cannot be determined; when the limb rotates, the displacements of the IMU and of the joints differ and are related through the ratio λ. Equivalently, during pure translation the limb vector p_2^k − p_1^k is constant, so the coefficient of λ in formula (8) is 0 and λ is unobservable. In practice, natural human motion rarely consists of pure translation; limb segments, especially those with many degrees of freedom, almost always rotate, and the subject can be prompted to rotate the limb freely during the calibration stage, which guarantees observability. Kalman filtering with state equation (7) and observation equation (8) then estimates the IMU mounting position λ, realizing the mounting position error calibration. The Kalman filtering process is as follows:
first, formula (7) is written in discretized form, as shown in formula (9):

    X(t_n) = A · X(t_{n−1}) + B · U(t_{n−1}) + W(t_{n−1})          (9)

where X = [ v^g  s^g  λ ]^T, U = a^g, T = t_n − t_{n−1},

    A = [ I_3      0_3   0
          T·I_3    I_3   0
          0        0     1 ],        B = [ T·I_3
                                            0_3
                                            0   ]

and W is the system noise, whose variance matrix Q(t_n) is set according to the IMU output noise. From observation equation (8), the observation equation at update time t_n is

    Z(t_n) = H(t_n) · X(t_n) + V(t_n)                              (10)

where Z(t_n) = p_1^k(t_n) − p_1^k(0),

    H(t_n) = [ 0_{3×3}   C_g^k   −Δl^k(t_n) ],   Δl^k(t_n) = (p_2^k(t_n) − p_1^k(t_n)) − (p_2^k(0) − p_1^k(0))

and V(t_n) is the observation noise, whose variance matrix R(t_n) is set according to the accuracy of the human joint position capture device. According to formulas (9) and (10), the state quantity is optimally estimated by the basic equations of discrete Kalman filtering (11):

    X̂(t_{n/n−1}) = A · X̂(t_{n−1}) + B · U(t_{n−1})                                            (11a)
    X̂(t_n) = X̂(t_{n/n−1}) + K(t_n) · [ Z(t_n) − H(t_n) · X̂(t_{n/n−1}) ]                      (11b)
    K(t_n) = P(t_{n/n−1}) · H(t_n)^T · [ H(t_n) · P(t_{n/n−1}) · H(t_n)^T + R(t_n) ]^{−1}      (11c)
    P(t_{n/n−1}) = A · P(t_{n−1}) · A^T + Q(t_{n−1})                                           (11d)
    P(t_n) = [ I − K(t_n) · H(t_n) ] · P(t_{n/n−1})                                            (11e)

Given the initial values X̂(t_0) and P(t_0), the state estimate X̂(t_n) at time t_n is computed recursively from the measurement Z(t_n) at each time t_n, thereby estimating the mounting position λ of the IMU.
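The recursion (9)-(11) can be sketched as a small self-contained simulation. This is not the patent's implementation: the circular limb trajectory, the noise levels, and the frame alignment (g and k frames taken as coincident, so C_g^k = I) are my assumptions for illustration. Joint 1 is held fixed, joint 2 sweeps an arc, and the filter recovers the mounting ratio λ from a deliberately wrong initial guess.

```python
import numpy as np

rng = np.random.default_rng(1)
T, n_steps = 0.005, 800
omega, L, lam_true = 2.0, 0.3, 0.4   # spin rate (rad/s), limb length (m), true mounting ratio

# Joint 1 fixed at the origin; joint 2 sweeps a circle; the IMU sits at lam_true along the limb.
t = np.arange(n_steps + 1) * T
p2 = L * np.stack([np.cos(omega * t), np.sin(omega * t), np.zeros_like(t)], axis=1)
p1 = np.zeros_like(p2)
p_imu = p1 + lam_true * (p2 - p1)            # formula (6)
acc = -omega**2 * p_imu                      # a^g of circular motion (g and k frames aligned)

I3, Z3 = np.eye(3), np.zeros((3, 3))
A = np.block([[I3, Z3, np.zeros((3, 1))],
              [T * I3, I3, np.zeros((3, 1))],
              [np.zeros((1, 6)), np.ones((1, 1))]])
B = np.vstack([T * I3, Z3, np.zeros((1, 3))])
Q = np.diag([1e-6] * 6 + [1e-9])             # system noise variance (tuning assumption)
R = 1e-6 * I3                                # observation noise variance (tuning assumption)

x = np.zeros(7); x[6] = 0.8                  # deliberately wrong initial lambda
P = np.diag([1.0] * 3 + [1e-2] * 3 + [1.0])
for n in range(1, n_steps + 1):
    x = A @ x + B @ acc[n - 1]               # (11a) predict with the previous acceleration
    P = A @ P @ A.T + Q                      # (11d)
    dl = (p2[n] - p1[n]) - (p2[0] - p1[0])   # change of the limb vector in the k frame
    H = np.hstack([Z3, I3, -dl[:, None]])    # (10): z = s^g - lambda * dl
    z = (p1[n] - p1[0]) + rng.normal(0, 1e-3, 3)
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # (11c)
    x = x + K @ (z - H @ x)                  # (11b)
    P = (np.eye(7) - K @ H) @ P              # (11e)

print("estimated lambda:", x[6])
```

Because the limb rotates throughout, the λ column of H is nonzero and the estimate converges toward the true ratio; with a purely translating trajectory the same loop would leave λ at its prior.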
With the mounting errors of the IMU obtained, the Denavit-Hartenberg (homogeneous transformation) equation (12) of the joint relative to the IMU can be formed:

    T_b^s = [ C_b^s     d
              0_{1×3}   1 ]                                        (12)

where C_b^s = (C_s^b)^T and d is the spatial position of the joint relative to the IMU, given λ and the limb length. Let the pose of the IMU relative to the reference frame be expressed as the coordinate transformation matrix T_s^r, and the coordinate transformation matrix of the joint relative to the reference frame as T_b^r. They satisfy

    T_b^r = T_s^r · T_b^s

where T_b^r contains the angular and linear motion parameters of the joint relative to the reference frame.
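The pose composition above can be sketched with 4×4 homogeneous transforms. The numeric values below are illustrative assumptions, not calibration results; `make_T` and `rot_z` are helper names of my own.

```python
import numpy as np

def make_T(C, d):
    """Homogeneous transform from rotation matrix C and translation d."""
    T = np.eye(4)
    T[:3, :3] = C
    T[:3, 3] = d
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Illustrative mounting attitude and joint position relative to the IMU:
# d would come from lambda times the limb length, C from the attitude calibration.
lam, limb_len = 0.4, 0.30
T_b_s = make_T(rot_z(0.1), np.array([0.0, -lam * limb_len, 0.0]))  # joint pose in the s frame
T_s_r = make_T(rot_z(0.7), np.array([0.2, 0.1, 1.0]))              # IMU pose in the r frame

T_b_r = T_s_r @ T_b_s   # joint pose relative to the reference frame
print(T_b_r[:3, 3])     # linear position of the joint in the r frame
```

The rotation block of T_b^r carries the angular motion parameters of the joint and the last column its linear position, which is exactly what the mounting position calibration makes available.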
The beneficial effects are that: compared with the prior art, the invention realizes IMU installation error calibration without strict calibration actions, with the assistance of a human joint position capture device (such as Kinect), and simultaneously estimates the IMU installation position error. This establishes the complete kinematic relationship between the IMU and the adjacent joints and enables effective estimation of the joints' linear motion parameters. The technical scheme is particularly suitable for hemiplegic patients undergoing rehabilitation training, who may be unable to complete specific calibration actions; even for ordinary users, the accuracy of calibration actions is hard to guarantee.
Drawings
FIG. 1 is a schematic diagram of the IMU installation error calibration process, taking the upper limb as an example;
FIG. 2 is a schematic observability analysis of the IMU mounting position, where (a) illustrates non-observability during translation and (b) observability during rotation.
Detailed Description
The invention is further illustrated below with reference to the accompanying drawings and an upper-limb motion capture embodiment in which the human joint position capture device is a Kinect.
The subject's upper limb moves within the viewing range of the Kinect, and the position of the Kinect sensor is kept fixed. IMUs are mounted on the subject's upper and lower arms, as shown in FIG. 1. The upper limb performs natural movement for a few seconds while a computer simultaneously records the human joint position information from the Kinect and the 9-axis output data of the IMU sensors, and the installation error calibration algorithm is then run. The algorithm proceeds as follows.
The limb coordinate system is denoted the b frame, the IMU coordinate system the s frame, the local east-north-up coordinate system the geographic g frame, and the Kinect coordinate system the k frame; the reference frame r is defined as needed. The attitudes of the r, g, and k frames relative to one another remain unchanged. C_m^n denotes the rotation matrix from frame m to frame n; C_r^g and C_k^g are measured at the outset. The limb direction is denoted l and the spin angle of the limb is θ.

The attitude C_s^g of the upper-arm IMU is obtained from its 9-axis output data by the attitude determination algorithm of the attitude and heading reference system. Subtracting the position vectors of the shoulder and elbow joints of the human body measured by Kinect gives the mapping l^k of the limb direction in the k frame, and the mapping of l in the r frame follows from

    l^r = C_k^r · l^k

C_s^b describes the mounting attitude of the upper-arm IMU and satisfies formula (1):

    C_s^b = C_r^b(l^r, θ) · C_g^r · C_s^g                          (1)

In formula (1), θ is unknown, and the mounting attitude C_s^b is essentially constant in time; that is, formula (1) holds at different instants. Taking two instants t_1 and t_2:

    C_r^b(l^r(t_1), θ_1) · C_g^r · C_s^g(t_1) = C_r^b(l^r(t_2), θ_2) · C_g^r · C_s^g(t_2)     (2)

Convert the rotation matrices in formula (1) to quaternions: let C_r^b correspond to the quaternion P = [p_0, p_1, p_2, p_3]^T, C_g^r to Q = [q_0, q_1, q_2, q_3]^T, and C_s^g to R = [r_0, r_1, r_2, r_3]^T. They satisfy formula (3):

    P ⊗ Q ⊗ R = M(P) · M(Q) · R                                    (3)

where ⊗ is quaternion multiplication, written in matrix form as P ⊗ Q = M(P) · Q (formula (4)) with

    M(P) = [ p_0  -p_1  -p_2  -p_3
             p_1   p_0  -p_3   p_2
             p_2   p_3   p_0  -p_1
             p_3  -p_2   p_1   p_0 ]                               (5)

Converting the rotation matrices on both sides of the equal sign of formula (2) into quaternions according to formulas (3)-(5) yields an equation containing the two unknowns θ_1 and θ_2. Equating the 4 quaternion elements on the two sides gives a system of equations that determines θ_1 and θ_2, and hence C_s^b.
The ratio of the distance between the IMU and the shoulder joint to the distance between the shoulder and elbow joints is λ. The value of λ describes the mounting position of the IMU; neglecting the small displacement of the IMU relative to the limb surface it contacts, λ does not change with time. Denote the spatial coordinates of the shoulder and elbow joints in the k frame, measured by Kinect, as p_1^k and p_2^k, and the spatial coordinate of the IMU as p^k. The mappings of the displacement, velocity, and acceleration obtained by IMU inertial navigation in the g frame are denoted s^g, v^g, and a^g, respectively. A Kalman filter is used to estimate λ.

The state equation of the Kalman filter system is shown in formula (6):

    X(t_n) = A · X(t_{n−1}) + B · U(t_{n−1}) + W(t_{n−1})          (6)

where X = [ v^g  s^g  λ ]^T, U = a^g, T = t_n − t_{n−1},

    A = [ I_3      0_3   0
          T·I_3    I_3   0
          0        0     1 ],        B = [ T·I_3
                                            0_3
                                            0   ]

and W is the system noise, whose variance matrix Q(t_n) is set according to the output noise of the IMU. The observation equation is shown in formula (7):

    Z(t_n) = H(t_n) · X(t_n) + V(t_n)                              (7)

where Z(t_n) = p_1^k(t_n) − p_1^k(0),

    H(t_n) = [ 0_{3×3}   C_g^k   −Δl^k(t_n) ],   Δl^k(t_n) = (p_2^k(t_n) − p_1^k(t_n)) − (p_2^k(0) − p_1^k(0))

and V(t_n) is the observation noise, whose variance matrix R(t_n) is set according to the accuracy of Kinect's joint positioning. According to formulas (6) and (7), the state quantity is optimally estimated by the basic equations of discrete Kalman filtering (8):

    X̂(t_{n/n−1}) = A · X̂(t_{n−1}) + B · U(t_{n−1})                                            (8a)
    X̂(t_n) = X̂(t_{n/n−1}) + K(t_n) · [ Z(t_n) − H(t_n) · X̂(t_{n/n−1}) ]                      (8b)
    K(t_n) = P(t_{n/n−1}) · H(t_n)^T · [ H(t_n) · P(t_{n/n−1}) · H(t_n)^T + R(t_n) ]^{−1}      (8c)
    P(t_{n/n−1}) = A · P(t_{n−1}) · A^T + Q(t_{n−1})                                           (8d)
    P(t_n) = [ I − K(t_n) · H(t_n) ] · P(t_{n/n−1})                                            (8e)

Given the initial values X̂(t_0) and P(t_0), the state estimate X̂(t_n) is computed recursively from the measurement Z(t_n), thereby estimating the mounting position λ of the IMU. The mounting position error of the lower-arm IMU is estimated in the same way as for the upper-arm IMU.

When the limb only translates, as shown in FIG. 2(a), the displacement of the IMU coincides with that of the joints and the mounting position of the IMU on the limb cannot be determined; when the limb rotates, as shown in FIG. 2(b), the displacements of the IMU and of the joints differ and are related through the ratio λ. Equivalently, during pure translation the coefficient of λ in observation equation (7) is 0 and λ is unobservable. In practice, natural human motion rarely consists of pure translation; limb segments, especially those with many degrees of freedom, usually rotate, and the subject can be prompted to rotate the limb freely during the calibration stage, which guarantees observability.
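The observability argument of FIG. 2 can be verified in a few lines: under pure translation the change of the limb vector, which is the coefficient of λ in the observation equation, is exactly zero, while under rotation it is not. The vectors and rotation angle below are arbitrary illustrative values.

```python
import numpy as np

# Pure translation: both joints shift by the same vector, so the limb vector
# p2 - p1 is unchanged and the coefficient of lambda in the observation vanishes.
p1_0, p2_0 = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.3, 0.0])
shift = np.array([0.05, 0.02, 0.01])
dl_translation = (p2_0 + shift - (p1_0 + shift)) - (p2_0 - p1_0)
print(dl_translation)   # zero vector -> lambda unobservable

# Rotation: joint 2 swings about joint 1, the limb vector changes, and the
# lambda coefficient is nonzero.
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
dl_rotation = (Rz @ (p2_0 - p1_0)) - (p2_0 - p1_0)
print(np.linalg.norm(dl_rotation) > 0)   # True -> lambda observable
```

This is why the calibration stage only needs the subject to rotate the limb freely rather than perform any prescribed pose.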
The subject in this embodiment was a hemiplegic patient. Throughout the test the subject did not need to complete any prescribed calibration action; on the premise of guaranteed accuracy, the calibration time was reduced and the test was completed smoothly.

Claims (4)

1. An IMU installation error calibration method assisted by a human joint position capture device, characterized by comprising the following steps: the body part of the user to be captured moves within the measuring range of the human joint position capture device, with the IMU mounted on the user's limb; human joint position information from the capture device and IMU output data are acquired, and an installation error calibration algorithm is executed to obtain the installation attitude error and installation position error of the IMU;

the limb coordinate system is denoted the b frame, the IMU coordinate system the s frame, the local east-north-up coordinate system the geographic g frame, and the coordinate system of the human joint position capture device the k frame; the attitude of the reference frame r relative to the g frame is known and remains unchanged, as is the attitude of the k frame relative to the g frame; C_m^n denotes the rotation matrix from frame m to frame n; the mapping l^k of the limb direction l in the k frame is obtained by subtracting the human joint position vectors at the two ends of the limb measured by the capture device, the mapping of l in the r frame is l^r = C_k^r · l^k, and the spin angle of the limb is θ; C_s^b describes the mounting attitude of the IMU relative to the limb and satisfies formula (1):

    C_s^b = C_r^b(l^r, θ) · C_g^r · C_s^g                          (1)

in formula (1), C_s^g is calculated according to the attitude determination algorithm of the attitude and heading reference system; l^r and θ constitute an axis-angle description of the limb attitude; the mounting attitude C_s^b of the IMU does not change with time, so formula (1) holds at different instants; taking two instants t_1 and t_2, formula (2) is satisfied; converting the rotation matrices on both sides of the equal sign of formula (2) into quaternions yields an equation containing the two unknowns θ_1 and θ_2; equating the 4 quaternion elements on the two sides gives a system of equations from which θ_1 and θ_2 are determined, and C_s^b is thereby determined;

    C_r^b(l^r(t_1), θ_1) · C_g^r · C_s^g(t_1) = C_r^b(l^r(t_2), θ_2) · C_g^r · C_s^g(t_2)     (2)
2. The IMU installation error calibration method assisted by a human joint position capture device according to claim 1, characterized in that: during the IMU installation error calibration, the body part of the human to be captured moves naturally, ensuring the presence of a rotational component.
3. The IMU installation error calibration method assisted by a human joint position capture device according to claim 1, characterized in that: the joints at the two ends of the limb carrying the IMU are joint 1 and joint 2; λ is the ratio of the distance between the IMU and joint 1 to the distance between joints 1 and 2, and describes the mounting position of the IMU; the spatial coordinates of joints 1 and 2 in the k frame, measured by the human joint position capture device, are denoted p_1^k and p_2^k, the spatial coordinate of the IMU in the k frame is p^k, and the initial spatial coordinate of the IMU in the k frame is p^k(0); the mapping of the displacement obtained by inertial navigation from the IMU output in the g frame is s^g, and the acceleration computed from the IMU output is a^g; the state equation and observation equation of the Kalman filter system are shown in formulas (3) and (4), respectively:

    d/dt [ v^g  s^g  λ ]^T = [ a^g  v^g  0 ]^T                     (3)

    p_1^k(t) − p_1^k(0) = C_g^k · s^g(t) − λ · [ (p_2^k(t) − p_1^k(t)) − (p_2^k(0) − p_1^k(0)) ]      (4)

formulas (3) and (4) are written in discretized form as formulas (5) and (6), respectively, where X(t_n) = [ v^g(t_n)  s^g(t_n)  λ(t_n) ]^T,

    A = [ I_3      0_3   0
          T·I_3    I_3   0
          0        0     1 ],        B = [ T·I_3
                                            0_3
                                            0   ]

U(t_n) = a^g(t_n), T = t_n − t_{n−1}; W is the system noise, whose variance matrix Q(t_n) is set according to the output noise of the IMU; V(t_n) is the observation noise, whose variance matrix R(t_n) is set according to the joint positioning accuracy of the human joint position capture device;

    X(t_n) = A · X(t_{n−1}) + B · U(t_{n−1}) + W(t_{n−1})          (5)

    Z(t_n) = H(t_n) · X(t_n) + V(t_n)                              (6)

according to formulas (5) and (6), the state quantity is optimally estimated by the basic equations of discrete Kalman filtering (7); given the initial values X̂(t_0) and P(t_0), the state estimate X̂(t_n) at time t_n is computed recursively from the measurement Z(t_n), thereby estimating the mounting position λ of the IMU;

    X̂(t_{n/n−1}) = A · X̂(t_{n−1}) + B · U(t_{n−1})                                            (7a)
    P(t_{n/n−1}) = A · P(t_{n−1}) · A^T + Q(t_{n−1})                                           (7b)
    K(t_n) = P(t_{n/n−1}) · H(t_n)^T · [ H(t_n) · P(t_{n/n−1}) · H(t_n)^T + R(t_n) ]^{−1}      (7c)
    X̂(t_n) = X̂(t_{n/n−1}) + K(t_n) · [ Z(t_n) − H(t_n) · X̂(t_{n/n−1}) ]                      (7d)
    P(t_n) = [ I − K(t_n) · H(t_n) ] · P(t_{n/n−1})                                            (7e)
4. The IMU installation error calibration method assisted by a human joint position capture device according to any of claims 1-3, wherein the human joint position capture device is a Kinect or another optics-based human joint position capture device.
CN201910938329.5A 2019-09-30 2019-09-30 IMU installation error calibration method based on human joint position capturing equipment assistance Active CN110646014B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910938329.5A CN110646014B (en) 2019-09-30 2019-09-30 IMU installation error calibration method based on human joint position capturing equipment assistance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910938329.5A CN110646014B (en) 2019-09-30 2019-09-30 IMU installation error calibration method based on human joint position capturing equipment assistance

Publications (2)

Publication Number Publication Date
CN110646014A CN110646014A (en) 2020-01-03
CN110646014B (en) 2023-04-25

Family

ID=68993290

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910938329.5A Active CN110646014B (en) 2019-09-30 2019-09-30 IMU installation error calibration method based on human joint position capturing equipment assistance

Country Status (1)

Country Link
CN (1) CN110646014B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104197987A (en) * 2014-09-01 2014-12-10 北京诺亦腾科技有限公司 Combined-type motion capturing system
CN104658012A (en) * 2015-03-05 2015-05-27 第二炮兵工程设计研究院 Motion capture method based on inertia and optical measurement fusion
CN104834917A (en) * 2015-05-20 2015-08-12 北京诺亦腾科技有限公司 Mixed motion capturing system and mixed motion capturing method
CN107349570A (en) * 2017-06-02 2017-11-17 南京邮电大学 Rehabilitation training of upper limbs and appraisal procedure based on Kinect
CN108268129A (en) * 2016-12-30 2018-07-10 北京诺亦腾科技有限公司 The method and apparatus and motion capture gloves calibrated to multiple sensors on motion capture gloves
CN109297507A (en) * 2018-09-27 2019-02-01 南京邮电大学 The human limb motion capture method for exempting from alignment actions based on inertial sensor
EP3491334A2 (en) * 2016-08-01 2019-06-05 Infinity Augmented Reality Israel Ltd. Method and system for calibrating components of an inertial measurement unit (imu) using scene-captured data


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Tan Caiming et al.; A multi-position initial alignment method for strapdown inertial navigation systems rotating about an arbitrary axis; Journal of Chinese Inertial Technology; Vol. 23, No. 3; June 2015; pp. 293-297 *

Also Published As

Publication number Publication date
CN110646014A (en) 2020-01-03

Similar Documents

Publication Publication Date Title
CN108939512B (en) Swimming posture measuring method based on wearable sensor
CN102245100B (en) Graphical representations
Bo et al. Joint angle estimation in rehabilitation with inertial sensors and its integration with Kinect
Hyde et al. Estimation of upper-limb orientation based on accelerometer and gyroscope measurements
CN106153077A (en) A kind of initialization of calibration method for M IMU human motion capture system
CN107348931B (en) Capsule endoscope space attitude determination system
Spitzley et al. Feasibility of using a fully immersive virtual reality system for kinematic data collection
US20100194879A1 (en) Object motion capturing system and method
JP2004264060A (en) Error correction method in attitude detector, and action measuring instrument using the same
Lin et al. Development of the wireless ultra-miniaturized inertial measurement unit WB-4: Preliminary performance evaluation
CN110327048B (en) Human upper limb posture reconstruction system based on wearable inertial sensor
WO2002037827A9 (en) Method and apparatus for motion tracking of an articulated rigid body
CN103340632A (en) Human joint angle measuring method based on feature point space position
Bai et al. Low cost inertial sensors for the motion tracking and orientation estimation of human upper limbs in neurological rehabilitation
Lin et al. Development of an ultra-miniaturized inertial measurement unit WB-3 for human body motion tracking
CN110609621A (en) Posture calibration method and human motion capture system based on micro-sensor
CN110646014B (en) IMU installation error calibration method based on human joint position capturing equipment assistance
CN115919250A (en) Human dynamic joint angle measuring system
JP2009186244A (en) Tilt angle estimation system, relative angle estimation system, and angular velocity estimation system
CN109297507B (en) Human body limb movement capturing method free of alignment action based on inertial sensor
CN108534772A (en) Attitude angle acquisition methods and device
Lin et al. Using hybrid sensoring method for motion capture in volleyball techniques training
JP3394979B2 (en) Method and apparatus for measuring joint angle
Carneiro et al. Study of an assistive robotics 5-dof system prototype to be taught in undergraduate engineering classes
JP2014117409A (en) Method and apparatus for measuring body joint position

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant