CN109000633A - Human body attitude motion capture algorithm design based on isomeric data fusion - Google Patents

Human body attitude motion capture algorithm design based on isomeric data fusion

Info

Publication number
CN109000633A
Authority
CN
China
Prior art keywords
human body
motion
data fusion
posture
motion capture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611272618.9A
Other languages
Chinese (zh)
Inventor
李孝辉
王哲龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN201611272618.9A
Publication of CN109000633A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C19/00 Gyroscopes; Turn-sensitive devices using vibrating masses; Turn-sensitive devices without moving masses; Measuring angular rate using gyroscopic effects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2218/00 Aspects of pattern recognition specially adapted for signal processing
    • G06F2218/02 Preprocessing
    • G06F2218/04 Denoising

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a human body posture motion capture algorithm based on heterogeneous data fusion, whose steps are as follows: (1) human motion data are acquired by multiple sensing-node posture acquisition modules fixed to characteristic parts of the human body, each sensor node consisting of a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer; (2) the accelerometer, gyroscope and magnetometer signals are modeled, filtered and transformed; (3) the displacement and rotation features in three-dimensional space are solved with a complementary Kalman filter fusion algorithm, the spatial displacement and rotation features are combined to calculate the position of each limb relative to space, and the solved characteristic parameters are mapped to the corresponding body parts to drive the body motion in real time. The invention is devoted to developing an accurate and efficient multi-sensor data fusion algorithm that addresses the low precision and response lag of current inertial motion capture, and provides a reference for the development of related fields.

Description

Human body posture motion capture algorithm design based on heterogeneous data fusion
Technical Field
The invention relates to an application field in which multiple disciplines and technologies intersect, involving electronics, communication, control, computer graphics, ergonomics, navigation and other subjects; in essence it constructs an efficient human-computer interaction method from micro-electromechanical inertial elements, and it is a new research hotspot in the field of human-computer interaction.
Background
Motion capture technology simulates human motion by collecting and processing an experimenter's limb motion data, and even rich facial expression data, and using those data to drive a virtual three-dimensional human body model. More precisely, motion capture is a novel human-computer interaction technology that, with computer assistance, acquires, processes, simulates and stores the motion data of a captured object. It integrates computer graphics, electronics, ergonomics, optics, mechanics and other technologies: the motion data of each key part of the captured object are acquired through measurement, tracking and calculation; the raw data then undergo a series of processing steps such as filtering and fusion; and the resulting attitude information is transmitted through a protocol to drive a virtual three-dimensional computer model, reconstructing the motion of the captured object.
Motion capture based on inertial sensors has a large market prospect, and its novel interaction mechanism is another major innovation in human-computer interaction that will strongly promote the development of the field. Moreover, if current micro inertial sensing technology and wireless communication technology are combined into a system that collects and processes data in real time, and a micro-electromechanical inertial motion capture system with independent intellectual property rights is developed, related fields will be greatly enriched and considerable economic and social benefits will follow.
In the process of implementing the invention, the inventors found that existing motion capture technology suffers at least from attitude data drift, poor remote controllability and low real-time performance.
Disclosure of Invention
The invention aims to provide a human body posture motion capture algorithm based on heterogeneous data fusion that addresses the problems of traditional motion capture. On the basis of traditional algorithms, it designs a posture-feature solution strategy with relatively high precision and strong real-time performance to capture posture information more accurately, and it uses a 3D real-time motion-tracking human-machine interface based on a three-dimensional skeleton stack to verify the accuracy of the algorithm visually.
To achieve this purpose, the technical scheme is as follows. The invention mainly relates to a human body posture motion capture algorithm based on heterogeneous data fusion. Each sensing node consists of an inertial sensor and a microprocessor core; the inertial sensor is a 9-axis micro-sensor unit composed of a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer. The node captures the attitude information of a motion characteristic part in real time, performs filtering and data fusion on the raw attitude information, and solves the initial attitude; the result is then converted, through specific calculation and calibration, into real-time attitude signals suitable for driving the human body model, thereby driving the model to track and display real human motion.
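As a concrete illustration of the per-node data flow just described, the following minimal sketch (hypothetical names and packet layout, not the patent's implementation) shows how a node's 9-axis sample might be packaged and offset-corrected before fusion:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class NodeSample:
    """One 9-axis reading from a sensing node (hypothetical packet layout)."""
    t: float          # timestamp in seconds
    acc: np.ndarray   # three-axis accelerometer output, sensor frame
    gyr: np.ndarray   # three-axis gyroscope output, sensor frame
    mag: np.ndarray   # three-axis magnetometer output, sensor frame

def preprocess(sample: NodeSample, acc_bias: np.ndarray, gyr_bias: np.ndarray) -> NodeSample:
    """Remove estimated constant offsets and normalize the magnetic vector before fusion."""
    return NodeSample(sample.t,
                      sample.acc - acc_bias,
                      sample.gyr - gyr_bias,
                      sample.mag / np.linalg.norm(sample.mag))

# Example usage with made-up readings and offsets.
raw = NodeSample(0.0, np.array([0.10, 0.0, 9.70]),
                 np.array([0.010, 0.0, 0.0]), np.array([0.3, 0.0, 0.5]))
clean = preprocess(raw, acc_bias=np.array([0.05, 0.0, 0.0]),
                   gyr_bias=np.array([0.005, 0.0, 0.0]))
```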
Other advantages of the present invention will be explained in the following description, and the technical solution of the present invention will be further explained by the accompanying drawings and examples.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is the overall algorithm solution flow;
FIG. 2 is the system's overall solution strategy for the characteristic attitude;
FIG. 3 is the solution strategy for the node attitude angle in this embodiment;
FIG. 4 is the solution strategy for node displacement in this embodiment;
FIG. 5 shows the initial dynamic acceleration measurements of the three-axis accelerometer in this embodiment;
FIG. 6 shows the initial dynamic angular-velocity measurements of the three-axis gyroscope in this embodiment;
FIG. 7 shows the initial dynamic magnetic-field measurements of the three-axis magnetometer in this embodiment;
FIG. 8 shows the attitude angle calculated from the gyroscope tested in this embodiment;
FIG. 9 shows the attitude angle calculated from the gyroscope combined with the accelerometer in this embodiment;
FIGS. 10a-c show the angular oscillation at rest after complementary filtering in this embodiment;
FIGS. 11a-c compare the Kalman-filtered attitude with the complementary-filtered attitude in this embodiment;
FIG. 12 shows the human motion capture effect of this embodiment.
Detailed Description
The following description of the preferred embodiments of the present invention is provided in connection with the accompanying drawings, and it should be noted that the preferred embodiments described herein are merely illustrative of the present invention and are not limiting.
The process of human motion can be expressed mainly as the correlation between displacement in a relative space and the posture angles of relative motion between bones. The displacement in relative space appears as the motion displacement vector of a reference root node in space, and the motion of the other nodes is obtained as relative angle changes with respect to that reference. Therefore, the main characteristic quantities in the system are two vectors: a motion displacement vector and a motion angle vector. In this embodiment both vectors are solved within the sensing node, whose integrated MCU receives and processes the raw sensor information.
In the embodiment, the calculation process of the sensing node on the attitude is as follows:
step 1: coordinate definition
First, a geomagnetic coordinate system (GCS), a human body coordinate system (GHS) and a sensor coordinate system (GSS) are defined. The GCS is defined here with the x-axis pointing forward, the y-axis pointing to the right and the z-axis pointing upward, forming a right-handed coordinate system as a whole; the GHS is the coordinate system attached to each limb; and the GSS is the sensor's own coordinate system. For convenience of analysis, the GHS is taken to coincide with the calibrated GSS.
step 2: signal modeling
The sensor selected in this embodiment is the MPU9250, which integrates a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer. First, signal models are established for the sensors:
for a three-axis gyroscope, let the angular velocity beThe gyroscope offset isThe interference of white noise isThe signal measured by the gyroscopeCan be expressed as:
for a three-axis accelerometer, let the acceleration of motion beAcceleration of gravity ofThe accelerometer is offset byThe interference of white noise isThe measurement signal of the accelerometerCan be expressed as:
for a three-axis magnetometer, the earth magnetic field is set asGeomagnetic disturbance ofThe interference of white noise isThe geomagnetic measurement signal may be expressed as:
wherein the bias variation of the gyroscope and the accelerometer can be white Gaussian noiseAnda first order Makove model of the covariance matrix of (1):
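To make the three measurement models concrete, the short sketch below simulates one gyroscope, accelerometer and magnetometer reading as the true quantity plus offset and white noise; the numerical bias and noise values are illustrative assumptions only:

```python
import numpy as np

rng = np.random.default_rng(0)

def measure_gyro(omega, b_g, sigma=0.01):
    """y_g = omega + b_g + n_g, with n_g white Gaussian noise."""
    return omega + b_g + rng.normal(0.0, sigma, 3)

def measure_accel(a_motion, g_vec, b_a, sigma=0.05):
    """y_a = a + g + b_a + n_a."""
    return a_motion + g_vec + b_a + rng.normal(0.0, sigma, 3)

def measure_mag(m_earth, d_m, sigma=0.02):
    """y_m = m + d_m + n_m."""
    return m_earth + d_m + rng.normal(0.0, sigma, 3)

# Example: a stationary node, gravity expressed as [0, 0, -9.81] in this model.
y_g = measure_gyro(np.zeros(3), b_g=np.array([0.002, -0.001, 0.0015]))
y_a = measure_accel(np.zeros(3), g_vec=np.array([0.0, 0.0, -9.81]),
                    b_a=np.array([0.05, -0.03, 0.02]))
y_m = measure_mag(np.array([0.3, 0.0, 0.5]), d_m=np.zeros(3))
```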
the following figures 5, 6, 7 are the dynamic measurements of the initial accelerometer, gyroscope and magnetometer, respectively:
and step 3: displacement resolving strategy
During motion, quaternions are used to represent the orientation in order to avoid singularities. Following established practice, and using the gyroscope signal model above, the predicted orientation $\hat q_t$ of each limb in the global coordinate system at time $t$ can be expressed as:

$\hat q_t = \left(I_4 + \tfrac{\Delta t}{2}\,\Omega(y_g)\right) q_{t-1}$

where $q_t = [q_1, q_2, q_3, q_4]$, $\Delta t$ is the sampling interval, $q_{t-1}$ is the quaternion representation of the orientation at time $t-1$, and $\Omega(y_g)$ is a 4×4 skew-symmetric matrix built from the gyroscope output.
Here $\Omega(y_g)$ is built from the cross-product (skew-symmetric) operator $[y_g\times]$. The predicted velocity and displacement are obtained by integrating the acceleration:

$\hat v_t = v_{t-1} + \left(\hat R_t\,y_a - g\right)\Delta t, \qquad \hat p_t = p_{t-1} + \hat v_t\,\Delta t$

where $g$ is the gravity vector in the GHS and $\hat R_t$ is the direction cosine matrix described by the predicted quaternion $\hat q_t$.
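The prediction step can be sketched as follows. This is a minimal illustration under stated assumptions (quaternion stored scalar-first as [w, x, y, z], gravity expressed in the global frame, and the accelerometer model $y_a = a + g + b_a + n_a$ from step 2); it is not the patent's exact implementation:

```python
import numpy as np

def omega_matrix(w):
    """4x4 skew-symmetric matrix for scalar-first quaternion kinematics."""
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [wx,  0.0,  wz, -wy],
                     [wy, -wz,  0.0,  wx],
                     [wz,  wy, -wx,  0.0]])

def quat_to_dcm(q):
    """Direction cosine matrix (sensor -> global) from a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def predict(q, v, p, y_g, y_a, dt, g=np.array([0.0, 0.0, -9.81])):
    """One strapdown prediction step: orientation, velocity, displacement."""
    q_pred = (np.eye(4) + 0.5 * dt * omega_matrix(y_g)) @ q
    q_pred /= np.linalg.norm(q_pred)          # re-normalize the quaternion
    # Rotate the measurement to the global frame and remove gravity
    # (consistent with the y_a = a + g model above).
    a_global = quat_to_dcm(q_pred) @ y_a - g
    v_pred = v + a_global * dt
    p_pred = p + v_pred * dt
    return q_pred, v_pred, p_pred
```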
Although the related displacement parameters can be computed at this point, the curve in FIG. 5 shows that the offset and noise present in the acceleration sensor make it difficult to obtain accurate data, and the integration process continuously accumulates these errors; the motion displacement obtained in this way is therefore inaccurate and exhibits large data drift, so it still needs to be filtered. Here a complementary Kalman filter is adopted to perform fusion filtering on the displacement parameters; the overall solution strategy is shown in FIG. 4. First, the state error model and the measurement model are established:
$\delta x_t = \Phi_t\,\delta x_{t-1} + w_t, \qquad z_t = M_t\,\delta x_t + v_t$

where $w_t$ is the process noise with covariance matrix $Q$, $\Phi_t$ is the state transition matrix, $z_t$ is the measurement error, $v_t$ is the measurement noise with covariance matrix $R$, and $M_t$ is the measurement matrix. Taking one segment of the human body as an example, the state error vector $\delta x_t$ collects the orientation, bias, velocity and displacement errors. The true motion quaternion $q_t$ can then be expressed through the predicted value $\hat q_t$ and the error value $\delta q_t$:

$q_t = \hat q_t \otimes \delta q_t$
by further processing, real-time updated values of the positioning direction error, the velocity error and the displacement error can be obtained:
whereinI3Is a 3 x 3 order identity matrix,the cross product operator after the offset is removed for the accelerometer.
The input measurements of the complementary Kalman filter are computed by normalizing the acceleration and geomagnetic vectors. Finally, the orientation, velocity and displacement at time $t$ are corrected with the correction equations:

$q_t = \hat q_t \otimes \delta q_t, \qquad v_t = \hat v_t + \delta v_t, \qquad p_t = \hat p_t + \delta p_t$

where $q_t$, $\hat q_t$ and $\delta q_t$ are the true, predicted and error values of the orientation at time $t$; $v_t$, $\hat v_t$ and $\delta v_t$ the true, predicted and error velocities; and $p_t$, $\hat p_t$ and $\delta p_t$ the true, predicted and error displacements. Finally, a Kalman filter update gives the final displacement update equation:

$p_t = \hat p_t + K_t\left(z_t - U\,\hat x_t\right)$

$U = [\,I_3\ \ -I_3\,]$ (28)

where $p_t$ is the displacement in the global coordinate system after Kalman filtering, $K_t$ is the Kalman gain, $\hat p_t$ is the predicted displacement vector, $w_t$ is the process noise, $U$ is the measurement matrix and $v_t$ is the measurement noise; the resulting displacement is obtained from these quantities.
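As an illustration of this correction step, the sketch below implements a generic error-state Kalman measurement update with the measurement matrix $U = [I_3\ \ -I_3]$ of equation (28); the state layout and noise values are assumptions chosen for the example, not the patent's parameters:

```python
import numpy as np

def kalman_update(x_pred, P_pred, z, U, R):
    """Generic Kalman measurement update: x = x_pred + K (z - U x_pred)."""
    S = U @ P_pred @ U.T + R                  # innovation covariance
    K = P_pred @ U.T @ np.linalg.inv(S)       # Kalman gain
    x = x_pred + K @ (z - U @ x_pred)
    P = (np.eye(len(x_pred)) - K @ U) @ P_pred
    return x, P

# Example: 6-element error state [velocity error (3), displacement error (3)].
U = np.hstack([np.eye(3), -np.eye(3)])        # measurement matrix U = [I3  -I3]
R = 0.01 * np.eye(3)                          # assumed measurement noise covariance
x_pred = np.zeros(6)
P_pred = 0.1 * np.eye(6)
z = np.array([0.02, -0.01, 0.005])            # example measurement residual
x_upd, P_upd = kalman_update(x_pred, P_pred, z, U, R)
```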
Step 4: Angle resolving strategy
The traditional attitude-calculation strategy was developed from inertial navigation. Early approaches integrated the gyroscope output to obtain the attitude angle, but integration drift makes the calculated angle deviate so much that it is difficult to use in practice. Combining the gyroscope with the accelerometer was tried next, yet the heading angle still drifts considerably because it lacks the constraint of the geomagnetic field; the attitude angles calculated from the three-axis gyroscope alone and from the gyroscope combined with the accelerometer are shown in FIGS. 8 and 9, respectively. Solving the attitude from the accelerometer and magnetometer alone was also attempted, but that solution has poor dynamic response and is sensitive to noise and vibration, so the gyroscope feedback must be fused in when solving the angle of a motion node in order to improve attitude stability. In the fusion process, the preprocessed accelerometer and magnetometer data are first combined through the FQA (factored quaternion algorithm) to obtain an initial attitude angle. To avoid singularities, the Euler angles are converted into a quaternion $[q_0, q_1, q_2, q_3]$, which serves as a reference. A complementary filter is then introduced to fuse and solve the attitude, yielding a stable attitude quaternion. The solved attitude information is accurate, but small-amplitude oscillation remains in the static case; the motion curves obtained by converting the quaternion back to Euler angles are shown in FIGS. 10a-c.
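For illustration, the sketch below computes an initial attitude from accelerometer and magnetometer data in the common tilt-then-heading way (roll and pitch from gravity, yaw from the tilt-compensated magnetic field). It is a simplified stand-in for the FQA step described above, with its own sign conventions assumed:

```python
import numpy as np

def attitude_from_acc_mag(acc, mag):
    """Roll/pitch from gravity, yaw from the tilt-compensated magnetometer (radians)."""
    ax, ay, az = acc / np.linalg.norm(acc)
    roll = np.arctan2(ay, az)
    pitch = np.arctan2(-ax, np.sqrt(ay * ay + az * az))
    mx, my, mz = mag / np.linalg.norm(mag)
    # Rotate the magnetic vector back onto the horizontal plane.
    mxh = (mx * np.cos(pitch) + my * np.sin(roll) * np.sin(pitch)
           + mz * np.cos(roll) * np.sin(pitch))
    myh = my * np.cos(roll) - mz * np.sin(roll)
    yaw = np.arctan2(-myh, mxh)
    return roll, pitch, yaw

# Example: a level sensor roughly facing magnetic north.
roll, pitch, yaw = attitude_from_acc_mag(np.array([0.0, 0.0, 9.81]),
                                         np.array([0.3, 0.0, -0.5]))
```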
in order to further improve the anti-interference performance, on this basis, Kalman filtering is performed on the attitude quaternion to improve the anti-interference performance, and a resolving strategy is shown in fig. 3.
In the solution process, an attitude quaternion $Q = [q_0, q_1, q_2, q_3]$ from the geomagnetic coordinate system to the human body coordinate system is defined, and a quaternion error quantity is introduced in terms of its vector part $q = [q_1\ q_2\ q_3]$. The quaternion complementary filtering algorithm can be written in the proportional-integral feedback form:

$\dot{\hat Q} = \tfrac{1}{2}\,\hat Q \otimes p\!\left(y_g - b_g + k_p\,e\right), \qquad \dot b_g = -k_i\,e$

where $p(\cdot)$ denotes the pure quaternion formed from a three-vector, $y_g$ is the three-axis output of the gyroscope, $b_g$ is the estimated gyroscope drift, $e$ is the error term derived from the accelerometer and magnetometer reference, and $k_p > 0$, $k_i > 0$. The attitude quaternion solved by the complementary filter is then further corrected by a Kalman filter through the model and measurement noise matrices $Q$ and $R$, which are built from the parameters q_acc, q_gyr and r_meas, with

$R = [\,r\_meas\,]$ (33)

During the experiment, q_acc = 0.001, q_gyr = 0.003 and r_meas = 0.001. The optimal attitude-angle recursion at time $k$ is:

$\hat x_k = \Phi_{k,k-1}\,\hat x_{k-1} + K_k\left(z_k - C_k\,\Phi_{k,k-1}\,\hat x_{k-1}\right)$

where $\Phi_{k,k-1}$ is the state transition matrix, $K_k$ is the Kalman gain, $C_k$ is the measurement matrix, and the filter state is suitably initialized. After Kalman filtering, the optimal estimated angle is obtained. Converting between the quaternion and Euler angles yields the Kalman-filtered Euler angles, whose oscillation is smaller than that of the complementary filter alone, as shown in FIGS. 11a-c.
In this embodiment, because the wearing position and the test subject differ from trial to trial, an initial posture calibration is required before each test: the subject performs a set of specific actions so that the rotation matrix between each bone's coordinate system and the coordinate system of the sensor attached to that bone can be obtained. Taking the lower-limb system as an example, according to the sensor-node layout the lower limb requires 3 sensing nodes, defined as foot J, shank K and thigh L. The initial calibration proceeds as follows:
(1) The lower limbs are extended horizontally to the side: from this static posture, the gravitational accelerations $g_J$ and $g_K$ of the foot and shank sensing nodes are obtained, and the y-axis component $Y_J$ of the foot rotation matrix $q_J^b$ and the y-axis component $Y_K$ of the shank rotation matrix $q_K^b$ are calibrated by normalizing the measured gravity directions.
(2) Starting from posture (1), the lower limbs are extended forward to the maximum and then swung backward to the maximum: from this dynamic posture, the positive angular-velocity readings $\omega^{+}$ and the reverse readings $\omega^{-}$ of the thigh, shank and foot sensing nodes are obtained, from which the x-axis component $X_J$ of the foot rotation matrix $q_J^b$ and the x-axis component $X_K$ of the shank rotation matrix $q_K^b$ are calibrated.
further we can derive the rotation matrix of the foot and lower leg:
qJ b=(XJ,XJ×(YJ×XJ),YJ×XJ) (39)
qK b=(XK,XK×(YK×XK),YK×XK) (40)
(3) The lower limbs are held upright and then extended forward to the maximum extent: the thigh angular velocity $\omega_L$ obtained during this motion is used to calibrate the y-axis component $Y_L$ of the thigh rotation matrix $q_L^b$.
(4) The lower limbs return to the upright state and then extend to the side, i.e. posture (1) is recovered: from this posture, the gravitational acceleration $g_L^T$ of the thigh in the upright state and the gravitational acceleration $g_L^U$ of the thigh when the lower limb is raised straight ahead are obtained, from which the z-axis component $Z_L$ of the thigh rotation matrix $q_L^b$ is derived. The thigh rotation matrix then follows as:

$q_L^b = \left(Z_L\times Y_L,\ Y_L,\ Y_L\times(Z_L\times Y_L)\right)$ (43)
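The constructions in equations (39), (40) and (43) amount to assembling an orthonormal rotation matrix from two calibrated axis directions via cross products. A minimal sketch of that construction follows (column ordering and normalization details are assumptions for illustration):

```python
import numpy as np

def rotation_from_axes(x_axis, y_axis):
    """Build an orthonormal rotation matrix from calibrated x and y axes,
    mirroring the cross-product construction of Eq. (39): columns
    (X, X x (Y x X), Y x X), each normalized."""
    x = x_axis / np.linalg.norm(x_axis)
    z = np.cross(y_axis, x)                  # Y x X
    z /= np.linalg.norm(z)
    y = np.cross(x, z)                       # X x (Y x X), re-orthogonalized
    y /= np.linalg.norm(y)
    return np.column_stack([x, y, z])

# Example: slightly non-orthogonal calibration measurements.
R = rotation_from_axes(np.array([1.0, 0.02, 0.0]), np.array([0.0, 1.0, 0.05]))
print(np.allclose(R.T @ R, np.eye(3)))       # orthonormality check
```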
To obtain the accurate position of the human skeleton model from the motion data measured by the inertial sensors, the relative position between each attitude measurement unit and the limb it is attached to, and the relative positions between the moving limbs, must be calibrated. Calibration amounts to determining the rotation matrix between each MEMS sensor coordinate system and the coordinate system of the limb it measures, as well as the rotation matrix between two connected moving limbs. From the viewpoint of rigid-body dynamics, two connected bones are treated as rigid bodies; the posture calculation is illustrated for the (m-1)-th bone and the m-th bone connected to it.
Let $q_{m-1}^a$ be the rotation matrix of the sensor coordinate system attached to the $(m-1)$-th bone with respect to the geomagnetic coordinate system, and $q_{m-1}^b$ the rotation matrix of the $(m-1)$-th bone with respect to the coordinate system of the sensor mounted on it. $q_{m-1}^a$ can be determined from the initial data measured by the sensor, and $q_{m-1}^b$ is obtained during the initial calibration, so the posture matrix $q_{m-1}$ of the bone can be calculated as:

$q_{m-1} = q_{m-1}^a\, q_{m-1}^b$

The rotation matrix $q_{m-1}^m$ from the $(m-1)$-th bone to the $m$-th bone is then:

$q_{m-1}^m = q_m^{-1}\, q_{m-1}$
from formulas (10) and (11):
let Qm-1 mFor the transformation matrix from the m-1 th bone to the m-th bone in the rigid limb of the human body, the coordinate transformation property of the matrix can be obtained as follows:
wherein T ism-1 mIs a translation matrix from the m-1 th bone to the m-th bone, and Tm-1 mEach element is real three-dimensional data of the mth bone. Herein, the translation vector of the conversion of the sensor coordinate system to the human body coordinate system and the coordinate translation component of the geomagnetism to the sensing node are components upward of the coordinate system integrated with the acceleration value.
By chaining the transformation matrices, the transformation from the 1st bone to the $m$-th bone can be calculated as:

$Q_1^m = Q_{m-1}^m\, Q_{m-2}^{m-1}\, Q_{m-3}^{m-2}\cdots Q_2^3\, Q_1^2$ (48)

Thus, from the position of the first (root) sensor node, the transformation matrix of the $m$-th bone can be calculated, which drives the motion of the entire human body model.
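A short sketch of this chaining step, composing 4x4 homogeneous transforms along the kinematic chain as in equation (48) (the helper names are hypothetical; in practice each per-joint transform would come from the calibrated rotation and measured translation):

```python
import numpy as np

def homogeneous(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    Q = np.eye(4)
    Q[:3, :3] = R
    Q[:3, 3] = t
    return Q

def chain_transform(joint_transforms):
    """Compose per-bone transforms Q_1^2, Q_2^3, ... into Q_1^m as in Eq. (48)."""
    Q = np.eye(4)
    for Q_step in joint_transforms:
        Q = Q_step @ Q          # left-multiply: later bones are applied after earlier ones
    return Q

# Example: three bones, each rotated 30 degrees about z and offset 0.4 m along x.
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q_1m = chain_transform([homogeneous(Rz, np.array([0.4, 0.0, 0.0]))] * 3)
```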
This embodiment has explained the design principle and implementation process of the human body posture motion capture algorithm based on heterogeneous data fusion, and FIG. 12 shows the motion capture effect of the embodiment. Motion capture systems also have broad application prospects, in fields currently including film and television production, game design, sports training and robot control, so the work has considerable economic and academic reference value.
Finally, it should be noted that the above description is only a preferred embodiment of the present invention, and the invention is not limited to the disclosed embodiment and drawings. All equivalents and modifications that do not depart from the spirit of the invention are intended to fall within its scope.

Claims (4)

1. A human body posture motion capture algorithm based on heterogeneous data fusion, characterized in that angular velocity, acceleration and geomagnetic signals in three-dimensional space are collected in real time by 9-axis inertial sensors, and the motion of the human body in three-dimensional space is described as displacement in a relative space together with relative angular motion between bones; the acceleration and angular velocity are low-pass and high-pass filtered, the 9-dimensional information is then fused with a complementary Kalman filtering core algorithm, and the displacement and angle vectors of the human motion are solved; the three-dimensional human body model applies inverse kinematics to these signals and performs limb posture calibration, converting them into real posture signals that can drive the model.
2. The human body posture motion capture algorithm based on heterogeneous data fusion of claim 1, wherein each sensor node unit comprises an inertial sensor and a corresponding microprocessor and can independently calculate the posture information of its motion node in real time.
3. The human body posture motion capture algorithm based on heterogeneous data fusion of claim 1, wherein the algorithm not only incorporates an inertial navigation algorithm for multi-sensor data fusion, but also realizes real-time tracking and reconstruction of the motion of each node.
4. The human body posture motion capture algorithm based on heterogeneous data fusion of claim 1, wherein, by solving the posture information of the multiple moving nodes, the 3D human body model in the host computer can be driven in real time and the posture information of each node can be displayed in real time, enhancing the visualization and credibility of the system.
CN201611272618.9A 2017-06-06 2017-06-06 Human body attitude motion capture algorithm design based on isomeric data fusion Pending CN109000633A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611272618.9A CN109000633A (en) 2017-06-06 2017-06-06 Human body attitude motion capture algorithm design based on isomeric data fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611272618.9A CN109000633A (en) 2017-06-06 2017-06-06 Human body attitude motion capture algorithm design based on isomeric data fusion

Publications (1)

Publication Number Publication Date
CN109000633A true CN109000633A (en) 2018-12-14

Family

ID=64572641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611272618.9A Pending CN109000633A (en) 2017-06-06 2017-06-06 Human body attitude motion capture algorithm design based on isomeric data fusion

Country Status (1)

Country Link
CN (1) CN109000633A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110393533A (en) * 2019-07-25 2019-11-01 森博迪(深圳)科技有限公司 A kind of combination inertia and infrared wearing-type motion capture system and method
CN111158482A (en) * 2019-12-30 2020-05-15 华中科技大学鄂州工业技术研究院 Human body motion posture capturing method and system
CN111158482B (en) * 2019-12-30 2023-06-27 华中科技大学鄂州工业技术研究院 Human body motion gesture capturing method and system
CN111895997A (en) * 2020-02-25 2020-11-06 哈尔滨工业大学 Human body action acquisition method based on inertial sensor without standard posture correction
CN113984051A (en) * 2020-04-30 2022-01-28 深圳市瑞立视多媒体科技有限公司 IMU and rigid body pose fusion method, device, equipment and storage medium
CN111956230A (en) * 2020-08-14 2020-11-20 山东省肿瘤防治研究院(山东省肿瘤医院) Attitude capturing method and system based on inertial measurement unit in endoscopic surgery
CN112037312A (en) * 2020-11-04 2020-12-04 成都市谛视科技有限公司 Real-time human body posture inverse kinematics solving method and device
CN112037312B (en) * 2020-11-04 2021-02-09 成都市谛视科技有限公司 Real-time human body posture inverse kinematics solving method and device
CN113393561A (en) * 2021-05-26 2021-09-14 完美世界(北京)软件科技发展有限公司 Method, device and storage medium for generating limb action expression packet of virtual character
CN114533039A (en) * 2021-12-27 2022-05-27 重庆邮电大学 Human body joint position and angle calculating method based on redundant sensors
CN114533039B (en) * 2021-12-27 2023-07-25 重庆邮电大学 Human joint position and angle resolving method based on redundant sensor

Similar Documents

Publication Publication Date Title
CN109000633A (en) Human body attitude motion capture algorithm design based on isomeric data fusion
CN108939512B (en) Swimming posture measuring method based on wearable sensor
Valenti et al. A linear Kalman filter for MARG orientation estimation using the algebraic quaternion algorithm
CN108627153B (en) Rigid body motion tracking system based on inertial sensor and working method thereof
CN105203098B (en) Agricultural machinery all-attitude angle update method based on nine axis MEMS sensors
CN103968827B (en) A kind of autonomic positioning method of wearable body gait detection
CN110986939B (en) Visual inertia odometer method based on IMU (inertial measurement Unit) pre-integration
Zhu et al. A real-time articulated human motion tracking using tri-axis inertial/magnetic sensors package
Yun et al. Design, implementation, and experimental results of a quaternion-based Kalman filter for human body motion tracking
CN107478223A (en) A kind of human body attitude calculation method based on quaternary number and Kalman filtering
CN108225308A (en) A kind of attitude algorithm method of the expanded Kalman filtration algorithm based on quaternary number
US20100250177A1 (en) Orientation measurement of an object
CN108225370B (en) Data fusion and calculation method of motion attitude sensor
CN109724602A (en) A kind of attitude algorithm system and its calculation method based on hardware FPU
Sun et al. Adaptive sensor data fusion in motion capture
CN102937450B (en) A kind of relative attitude defining method based on gyro to measure information
CN109284006B (en) Human motion capturing device and method
CN107532907A (en) Posture detection equipment
CN113793360A (en) Three-dimensional human body reconstruction method based on inertial sensing technology
CN116027905A (en) Double kayak upper limb motion capturing method based on inertial sensor
CN107782309A (en) Noninertial system vision and double tops instrument multi tate CKF fusion attitude measurement methods
Zhang et al. Micro-IMU-based motion tracking system for virtual training
CN108734762B (en) Motion trail simulation method and system
CN110609621A (en) Posture calibration method and human motion capture system based on micro-sensor
CN108121890A (en) A kind of navigation attitude information fusion method based on linear Kalman filter

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20181214