CN111895997B - Human body action acquisition method based on inertial sensor without standard posture correction - Google Patents


Info

Publication number
CN111895997B
CN111895997B (application CN202010117861.3A)
Authority
CN
China
Prior art keywords
joint
imu
angular velocity
human body
axis
Prior art date
Legal status
Active
Application number
CN202010117861.3A
Other languages
Chinese (zh)
Other versions
CN111895997A (en)
Inventor
衣淳植
姜峰
杨炽夫
王学嘉
张龙海
张浩
李芳卓
Current Assignee
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN202010117861.3A priority Critical patent/CN111895997B/en
Publication of CN111895997A publication Critical patent/CN111895997A/en
Application granted granted Critical
Publication of CN111895997B publication Critical patent/CN111895997B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B5/1122 Determining geometric values, e.g. centre of rotation or angular range of movement, of movement trajectories
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C1/00 Measuring angles


Abstract

The invention discloses a human body action acquisition method based on an inertial sensor that requires no standard posture correction. Step 1: arrange the IMUs and establish physiological kinematic constraints for the joint. Step 2: using the IMUs of step 1, once the number of IMU sampling points exceeds 100, run an optimization routine on the constraint equations and estimate the joint axis and joint position vectors with the Gauss-Newton method. Step 3: using the estimated joint axis and joint position vectors, solve two sets of joint angles, one from the acceleration information and one from the integral of the angular velocity. Step 4: take the weighted average of the two joint angles of step 3 by complementary filtering to obtain the solved joint angle. The invention aims to decode the IMU signals of the lower limbs of a human body and to solve the hip, knee and ankle joint angles of the lower limbs in real time according to a kinematic model of the human lower limbs.

Description

Human body action acquisition method based on inertial sensor without standard posture correction
Technical Field
The invention belongs to the technical field of human motion measurement; in particular, it relates to a human body action acquisition method based on an inertial sensor without standard posture correction.
Background
Nowadays, people demand ever more convenience and comfort in life and work. Using a flexible lower-limb exoskeleton to assist human movement, partially replacing the work of the muscular system and thereby reducing the energy consumption of the human body, has therefore attracted growing attention. In human-assistance research, more and more researchers are studying how to optimize the assistance curve and improve the assistance effect and user experience of exoskeletons. In current research, however, it is difficult to conveniently acquire the lower-limb joint angles of the experimenter, so most researchers can only assist the wearer with an angle curve derived from statistical gait laws or from a complex joint-angle algorithm prepared in advance. As a result, the exoskeleton assistance curve can hardly reflect individual differences and gait changes of the user and cannot be adjusted in real time according to the user's activity state. These problems greatly reduce the assistance effect of the exoskeleton, weaken the user experience, and urgently need to be overcome.
Kan et al. of Stanford University built a dedicated mechanical device to measure body motion: joint angles are measured by encoders at the joints, joint angular velocities are measured by gyroscopes and computed from the relative relationship of the limb segments, and angular accelerations are obtained by differentiating the angular velocities. This approach measures human motion signals with a series of common sensors, and the desired data can be obtained with simple processing. During testing, however, the passive mechanical linkage is hard to align with the human joint axis, which distorts the joint-angle information; the device also generalizes poorly across users and may even injure the wearer.
The optical motion-capture system is currently the most accurate equipment for measuring human motion. During measurement, optical markers are attached to the experimenter's body, and the optical cameras in the system capture the trajectories of the markers as they move with the body. Accompanying software then computes the human motion information, including the joint angles. Although this method achieves the highest accuracy, the equipment is very expensive, operation is relatively cumbersome, and online computation is not possible. Moreover, the optical cameras can generally only be used indoors under good lighting conditions. These disadvantages limit its practicality.
In recent years, IMU-based motion-capture technology has gradually come into view. Unlike the two methods above, an IMU cannot directly measure the desired human motion information. Instead, after the IMU is rigidly attached to the body, its integrated 3-axis gyroscope, 3-axis accelerometer and 3-axis magnetometer measure the angular velocity and acceleration of the IMU and the magnetic field of the space in which it is located. After this motion information is collected, complex algorithms are applied to solve for the desired human motion information.
Among IMU-based joint-angle methods, the more accurate ones generally require the human posture to be corrected before the experiment starts: calibration actions are used to align the sensor with the body segment, or to acquire the orientation relation between the sensor and the world coordinate system, so that the sensor can then be placed anywhere on the limb. Such methods usually require a special calibration tool and complicated operation, and the test cannot continue when the sensor shifts position: recalibration is needed before the joint-angle computation can resume.
Disclosure of Invention
The invention provides a human body action acquisition method based on an inertial sensor without standard posture correction, and aims to decode the IMU signals of the lower limbs of a human body and to solve the hip, knee and ankle joint angles of the lower limbs in real time according to a kinematic model of the human lower limbs.
The invention is realized by the following technical scheme:
a human body action acquisition method based on an inertial sensor and without standard posture correction comprises the following specific steps:
step 1: arranging an IMU and establishing joint physiological kinematic constraint;
step 2: by utilizing the IMU in the step 1, when the number of sampling points of the IMU exceeds 100, running an optimization program according to a constraint equation, and estimating a joint axis and a joint position vector by utilizing a Gauss-Newton method;
and step 3: respectively solving two groups of joint angles by utilizing the solved joint axis and joint position vector through acceleration information and angular velocity integral;
and 4, step 4: and (4) solving the weighted average of the two joint angles in the step (3) through complementary filtering to obtain the joint angle.
Further, step 1 specifically comprises placing one IMU on the thigh and one on the calf; the IMUs may be placed at any position on the thigh or the calf of the human body. Their sensing information, namely the measured angular velocity and acceleration information, is acquired, and the joint physiological constraint equations are established as follows:
||ω1 × j1|| − ||ω2 × j2|| = 0 (1)
where ω1, ω2 are the angular velocities measured by the IMUs on the thigh and the calf, respectively, and j1, j2 are the coordinate representations of the joint axis on the thigh and the shank, respectively. The joint axis is the axis of joint rotation; its coordinate expression differs between the two IMU-fixed coordinate systems:
||a1 − ω1 × (ω1 × r1) − ω̇1 × r1|| − ||a2 − ω2 × (ω2 × r2) − ω̇2 × r2|| = 0 (2)
where a1, a2 are the accelerations measured by the IMUs on the thigh and the calf, respectively, and r1, r2 are the joint position vectors from the origins of the IMU-fixed coordinate systems on the thigh and the shank to an arbitrary point on the joint rotation axis.
Further, step 2 specifically takes the left-hand sides of the two joint physiological kinematic constraint equations (1) and (2) as the objective function, letting
ji = [cos(θi)cos(ui), cos(θi)sin(ui), sin(θi)], i = 1, 2 (3)
ri = [xi, yi, zi], i = 1, 2 (4)
After the number of sampling points exceeds 100, all sampling points are substituted into the error function to obtain an error vector composed of the errors of all sampling points. Candidate solutions for the unknowns are searched iteratively, updating the error vector each time, while data points that do not satisfy the conditions are filtered out according to the joint flexion-extension axis vector after each iteration. After repeated iterations, the optimal solution of the unknowns is obtained when the norm of the error vector reaches its minimum, yielding the optimal joint axis vectors and joint position vectors.
Further, step 3 is specifically as follows: after the joint axis vectors j1, j2 are obtained from step 2, the joint angle can be obtained by angular-velocity integration; since the difference between the angular velocities of the two limb segments along the joint axis direction is the rotational angular velocity of the joint, the time integral of this difference along the joint axis is:
q_w = ∫(ω1·j1 − ω2·j2) dt (5)
since the angular velocity integral amplifies the measurement noise, the angular drift obtained by the angular velocity integral is severe and depends on the acceleration a 1 ,a 2 The projection of the first two components on the joint position vector can solve the joint angle according to the acceleration dip angle, and the calculation is as follows:
Figure BDA0002392045090000031
further, the step 4 is specifically the step of obtaining q in the step 3 w And q is a Because the angular velocity and the acceleration respectively contain low-frequency and high-frequency noises, the angular velocity and the acceleration are fused by complementary filtering to filter the noises, wherein the filter coefficient gamma is selected to be 0.01,
q=γ·q a +(1-γ)·q w (7)。
the invention has the beneficial effects that:
according to the invention, no correction action and special IMU placement direction are needed, and the automatic calculation of the IMU relative to the human body direction under any action of the human body can be rapidly realized.
Drawings
FIG. 1 is a simplified model diagram of the physiological angular velocity constraint of a joint according to the present invention.
FIG. 2 is a schematic diagram of a joint physiological acceleration constraint model of the present invention.
FIG. 3 is a flow chart of a method of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
A human body action acquisition method based on an inertial sensor without standard posture correction comprises the following specific steps:
Step 1: arrange the IMUs and establish physiological kinematic constraints for the joint;
Step 2: using the IMUs of step 1, once the number of IMU sampling points exceeds 100, run an optimization routine on the constraint equations and estimate the joint axis and joint position vectors with the Gauss-Newton method;
Step 3: using the estimated joint axis and joint position vectors, solve two sets of joint angles, one from the acceleration information and one from the integral of the angular velocity;
Step 4: take the weighted average of the two joint angles of step 3 by complementary filtering to obtain the solved joint angle.
Further, step 1 specifically comprises placing one IMU on the thigh and one on the calf; the IMUs may be placed at any position on the thigh or the calf of the human body. Their sensing information, namely the measured angular velocity and acceleration information, is acquired, and the joint physiological constraint equations are established as follows:
||ω1 × j1|| − ||ω2 × j2|| = 0 (1)
where ω1, ω2 are the angular velocities measured by the IMUs on the thigh and the calf, respectively, and j1, j2 are the coordinate representations of the joint axis on the thigh and the shank, respectively. As shown in fig. 2, the joint axis is the axis of joint rotation, and its coordinate expression differs between the two IMU-fixed coordinate systems:
||a1 − ω1 × (ω1 × r1) − ω̇1 × r1|| − ||a2 − ω2 × (ω2 × r2) − ω̇2 × r2|| = 0 (2)
where a1, a2 are the accelerations measured by the IMUs on the thigh and the calf, respectively, and r1, r2 are the joint position vectors from the origins of the IMU-fixed coordinate systems on the thigh and the shank to an arbitrary point on the joint rotation axis, as shown in fig. 3.
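For concreteness, the two constraints above can be checked numerically. The following sketch is not part of the patent; it is a minimal NumPy illustration (function names are my own) that evaluates the residuals of constraints (1) and (2) for a single sampling point, with the angular accelerations ω̇1, ω̇2 passed in as `dw1`, `dw2`:

```python
import numpy as np

def gyro_residual(w1, w2, j1, j2):
    # Constraint (1): for a hinge joint, the angular-velocity components
    # perpendicular to the joint axis have equal magnitude in both frames.
    return np.linalg.norm(np.cross(w1, j1)) - np.linalg.norm(np.cross(w2, j2))

def accel_residual(a1, a2, w1, w2, dw1, dw2, r1, r2):
    # Constraint (2): shifting each measured acceleration to the joint
    # centre (removing centripetal and tangential terms) must give
    # vectors of equal magnitude.
    s1 = a1 - np.cross(w1, np.cross(w1, r1)) - np.cross(dw1, r1)
    s2 = a2 - np.cross(w2, np.cross(w2, r2)) - np.cross(dw2, r2)
    return np.linalg.norm(s1) - np.linalg.norm(s2)

# A trivially consistent sample: both segments rotate about the same axis,
# differing only in the axis-parallel component of angular velocity.
z = np.array([0.0, 0.0, 1.0])
res = gyro_residual(np.array([0.1, 0.2, 0.5]), np.array([0.1, 0.2, 0.9]), z, z)
print(abs(res) < 1e-12)  # perpendicular components identical -> residual 0
```

In an optimization loop, one such pair of residuals per sampling point forms the error vector described in step 2.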
Further, step 2 specifically takes the left-hand sides of the two joint physiological kinematic constraint equations (1) and (2) as the objective function, letting
ji = [cos(θi)cos(ui), cos(θi)sin(ui), sin(θi)], i = 1, 2 (3)
ri = [xi, yi, zi], i = 1, 2 (4)
After the number of sampling points exceeds 100, all sampling points are substituted into the error function to obtain an error vector composed of the errors of all sampling points. Candidate solutions for the unknowns are searched iteratively, updating the error vector each time, while data points that do not satisfy the conditions are filtered out according to the joint flexion-extension axis vector after each iteration. After repeated iterations, the optimal solution of the unknowns is obtained when the norm of the error vector reaches its minimum, yielding the optimal joint axis vectors and joint position vectors.
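As an illustration of the Gauss-Newton estimation in step 2, the sketch below (my own simplification, not the patent's implementation) optimizes only the axis parameters θi, ui of (3) against the gyroscope constraint (1), using a finite-difference Jacobian and no data-point filtering, on synthetic hinge data generated in a shared frame:

```python
import numpy as np

def axis(theta, u):
    # Unit joint-axis vector from the spherical parameterisation (3).
    return np.array([np.cos(theta) * np.cos(u),
                     np.cos(theta) * np.sin(u),
                     np.sin(theta)])

def error_vector(p, W1, W2):
    # One residual of constraint (1) per sampling point; p = [th1, u1, th2, u2].
    j1, j2 = axis(p[0], p[1]), axis(p[2], p[3])
    return (np.linalg.norm(np.cross(W1, j1), axis=1)
            - np.linalg.norm(np.cross(W2, j2), axis=1))

def gauss_newton(p, W1, W2, iters=20, h=1e-6):
    for _ in range(iters):
        e = error_vector(p, W1, W2)
        J = np.empty((e.size, p.size))          # numerical Jacobian
        for k in range(p.size):
            dp = np.zeros_like(p)
            dp[k] = h
            J[:, k] = (error_vector(p + dp, W1, W2) - e) / h
        step, *_ = np.linalg.lstsq(J, -e, rcond=None)
        p = p + step                            # Gauss-Newton update
    return p

# Synthetic hinge: omega2 = omega1 + qdot * j in a shared frame, so the
# true axis satisfies constraint (1) exactly at every sample.
rng = np.random.default_rng(0)
j_true = axis(0.3, 0.7)
W1 = rng.normal(size=(200, 3))
W2 = W1 + rng.normal(size=(200, 1)) * j_true
p_hat = gauss_newton(np.array([0.25, 0.65, 0.25, 0.65]), W1, W2)
```

The estimated axes are recovered up to sign; in practice the patent also estimates the position vectors r1, r2 via constraint (2) and filters inconsistent samples between iterations.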
Further, step 3 is specifically as follows: after the joint axis vectors j1, j2 are obtained from step 2, the joint angle can be obtained by angular-velocity integration; since the difference between the angular velocities of the two limb segments along the joint axis direction is the rotational angular velocity of the joint, the time integral of this difference along the joint axis is:
q_w = ∫(ω1·j1 − ω2·j2) dt (5)
since the angular velocity integral amplifies the measurement noise, the angular drift obtained by the angular velocity integral is severe and depends on the acceleration a 1 ,a 2 The projection of the first two components on the joint position vector can solve the joint angle according to the acceleration dip angle, and the calculation is as follows:
Figure BDA0002392045090000051
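The integral (5) can be sketched as a discrete cumulative sum. This is an illustrative fragment of my own (assuming uniformly sampled angular velocities; the accelerometer-based angle q_a of (6) would be computed analogously per sample and fused in step 4):

```python
import numpy as np

def joint_angle_gyro(W1, W2, j1, j2, dt):
    # Discrete version of (5): project each segment's angular velocity
    # onto its own joint-axis coordinates and integrate the difference.
    qdot = W1 @ j1 - W2 @ j2          # joint angular velocity per sample
    return np.cumsum(qdot) * dt       # drifts over time with gyro bias/noise

# 100 samples at 100 Hz of a joint rotating at a constant 1 rad/s.
j = np.array([0.0, 0.0, 1.0])
W1 = np.tile(j, (100, 1))             # thigh IMU: 1 rad/s about the axis
W2 = np.zeros((100, 3))               # calf IMU stationary
q_w = joint_angle_gyro(W1, W2, j, j, dt=0.01)
print(round(float(q_w[-1]), 6))       # 1.0 rad after one second
```

With real gyroscope data the same integration accumulates bias, which is why the acceleration-based angle q_a is needed as a drift-free complement.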
further, the step 4 is specifically the step of obtaining q in the step 3 w And q is a Because the angular velocity and the acceleration respectively contain low-frequency and high-frequency noise, the angular velocity and the acceleration are fused through complementary filtering, and the noise is filtered, wherein the filtering coefficient gamma is selected to be 0.01.
q=γ·q a +(1-γ)·q w (7)。
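Equation (7) is a per-sample weighted blend. As a minimal sketch (function name my own): with γ = 0.01, each fused sample pulls the smooth but drifting gyro-integrated angle one percent of the way toward the drift-free but noisy accelerometer angle:

```python
import numpy as np

def fuse(q_a, q_w, gamma=0.01):
    # Complementary filter (7): gamma weights the accelerometer-derived
    # angle (no drift, high-frequency noise); 1 - gamma weights the
    # gyro-integrated angle (smooth, low-frequency drift).
    return gamma * np.asarray(q_a) + (1.0 - gamma) * np.asarray(q_w)

# If the gyro angle has drifted to 1.1 rad while the accelerometer says
# 1.0 rad, the fused estimate is 0.01*1.0 + 0.99*1.1 = 1.099 rad.
print(float(fuse(1.0, 1.1)))
```

Applied at every sample, this bounds the integration drift while keeping the low accelerometer noise sensitivity of the gyro path.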

Claims (3)

1. A human body action acquisition method based on an inertial sensor without standard posture correction, characterized by comprising the following specific steps:
step 1: arranging the IMUs and establishing physiological kinematic constraints for the joint;
step 2: using the IMUs of step 1, once the number of IMU sampling points exceeds 100, running an optimization routine on the constraint equations and estimating the joint axis and joint position vectors with the Gauss-Newton method;
step 3: using the estimated joint axis and joint position vectors, solving two sets of joint angles, one from the acceleration information and one from the integral of the angular velocity;
step 4: taking the weighted average of the two joint angles of step 3 by complementary filtering to solve the joint angle;
wherein step 1 specifically comprises placing one IMU on the thigh and one on the calf, at any position on the thigh or the calf of the human body, acquiring the sensing information of the IMUs, namely the measured angular velocity and acceleration information, and establishing the joint physiological constraint equations as follows:
||ω1 × j1|| − ||ω2 × j2|| = 0 (1)
where ω1, ω2 are the angular velocities measured by the IMUs on the thigh and the calf, respectively, and j1, j2 are the coordinate representations of the joint axis on the thigh and the shank, respectively; the joint axis is the axis of joint rotation, and its coordinate expression differs between the two IMU-fixed coordinate systems:
||a1 − ω1 × (ω1 × r1) − ω̇1 × r1|| − ||a2 − ω2 × (ω2 × r2) − ω̇2 × r2|| = 0 (2)
where a1, a2 are the accelerations measured by the IMUs on the thigh and the calf, respectively, and r1, r2 are the joint position vectors from the origins of the IMU-fixed coordinate systems on the thigh and the shank to an arbitrary point on the joint rotation axis;
the step 3 is specifically to obtain a joint axis vector j through the step 2 1 ,j 2 After the coordinates are obtained, the joint angle can be obtained through angular velocity integration, and the difference value of the angular velocity of the limb in the joint axis direction is the rotation angular velocity of the joint, so the integral q of the angular velocity difference value in the joint axis direction in time w Namely:
q w =∫(ω 1 ·j 12 ·j 2 )dt (5)
since the angular velocity integral amplifies the measurement noise, the angular drift obtained by the angular velocity integral is severe and depends on the acceleration a 1 ,a 2 The projection of the first two components on the joint position vector can solve the joint angle q according to the acceleration dip angle a The calculation is as follows:
Figure FDA0003720532140000021
2. The human body action acquisition method based on an inertial sensor without standard posture correction according to claim 1, characterized in that step 2 specifically takes the left-hand sides of the two joint physiological kinematic constraint equations (1) and (2) as the objective function, letting
ji = [cos(θi)cos(ui), cos(θi)sin(ui), sin(θi)], i = 1, 2 (3)
ri = [xi, yi, zi], i = 1, 2 (4)
and, after the number of sampling points exceeds 100, substituting all sampling points into the error function to obtain an error vector composed of the errors of all sampling points, iteratively searching for solutions of the unknowns and updating the error vector, while filtering out data points that do not satisfy the conditions according to the joint flexion-extension axis vector after each iteration; after repeated iterations, the optimal solution of the unknowns is obtained when the norm of the error vector reaches its minimum, thereby solving for the optimal joint axis vectors and joint position vectors.
3. The human body action acquisition method based on an inertial sensor without standard posture correction according to claim 1, characterized in that step 4 specifically fuses the q_w and q_a obtained in step 3; because the angular-velocity and acceleration estimates contain low-frequency and high-frequency noise respectively, they are fused by complementary filtering to suppress the noise, with the filter coefficient γ chosen as 0.01:
q = γ·q_a + (1 − γ)·q_w (7).
CN202010117861.3A 2020-02-25 2020-02-25 Human body action acquisition method based on inertial sensor without standard posture correction Active CN111895997B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010117861.3A CN111895997B (en) 2020-02-25 2020-02-25 Human body action acquisition method based on inertial sensor without standard posture correction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010117861.3A CN111895997B (en) 2020-02-25 2020-02-25 Human body action acquisition method based on inertial sensor without standard posture correction

Publications (2)

Publication Number Publication Date
CN111895997A CN111895997A (en) 2020-11-06
CN111895997B true CN111895997B (en) 2022-10-25

Family

ID=73169801

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010117861.3A Active CN111895997B (en) 2020-02-25 2020-02-25 Human body action acquisition method based on inertial sensor without standard posture correction

Country Status (1)

Country Link
CN (1) CN111895997B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112711332B (en) * 2020-12-29 2022-07-15 上海交通大学宁波人工智能研究院 Human body motion capture method based on attitude coordinates
CN113197572A (en) * 2021-05-08 2021-08-03 解辉 Human body work correction system based on vision
CN114533039B (en) * 2021-12-27 2023-07-25 重庆邮电大学 Human joint position and angle resolving method based on redundant sensor

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107309872A (en) * 2017-05-08 2017-11-03 南京航空航天大学 A kind of flying robot and its control method with mechanical arm

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9597015B2 (en) * 2008-02-12 2017-03-21 Portland State University Joint angle tracking with inertial sensors
TWI549655B (en) * 2012-05-18 2016-09-21 國立成功大學 Joint range of motion measuring apparatus and measuring method thereof
CN104061934B (en) * 2014-06-10 2017-04-26 哈尔滨工业大学 Pedestrian indoor position tracking method based on inertial sensor
CN104613963B (en) * 2015-01-23 2017-10-10 南京师范大学 Pedestrian navigation system and navigation locating method based on human cinology's model
US10121066B1 (en) * 2017-11-16 2018-11-06 Blast Motion Inc. Method of determining joint stress from sensor data
CN106344026A (en) * 2016-09-21 2017-01-25 苏州坦特拉自动化科技有限公司 Portable human joint parameter estimation method based on IMU (inertial measurement unit)
CN106153077B (en) * 2016-09-22 2019-06-14 苏州坦特拉智能科技有限公司 A kind of initialization of calibration method for M-IMU human motion capture system
CN109000633A (en) * 2017-06-06 2018-12-14 大连理工大学 Human body attitude motion capture algorithm design based on isomeric data fusion
US10521703B2 (en) * 2017-06-21 2019-12-31 Caterpillar Inc. System and method for controlling machine pose using sensor fusion
CN108245164B (en) * 2017-12-22 2021-03-26 北京精密机电控制设备研究所 Human body gait information acquisition and calculation method for wearable inertial device
CN108888473B (en) * 2018-05-22 2021-04-09 哈尔滨工业大学 Lower limb joint motion control method based on wearable walking assisting exoskeleton
CN109540126B (en) * 2018-12-03 2020-06-30 哈尔滨工业大学 Inertial vision integrated navigation method based on optical flow method
CN110561391B (en) * 2019-09-24 2022-12-09 中国船舶重工集团公司第七0七研究所 Inertia information feedforward control device and method for lower limb exoskeleton system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107309872A (en) * 2017-05-08 2017-11-03 南京航空航天大学 A kind of flying robot and its control method with mechanical arm

Also Published As

Publication number Publication date
CN111895997A (en) 2020-11-06


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant