CN116019442B - Motion posture assessment system based on UWB/IMU fusion - Google Patents

Motion posture assessment system based on UWB/IMU fusion

Info

Publication number
CN116019442B
CN116019442B (application CN202211595275.5A)
Authority
CN
China
Prior art keywords
tag
uwb
information
gesture
positioning
Prior art date
Legal status
Active
Application number
CN202211595275.5A
Other languages
Chinese (zh)
Other versions
CN116019442A (en)
Inventor
孟琳
张广举
明东
程龙龙
庞珺
Current Assignee
Tianjin University
Original Assignee
Tianjin University
Priority date
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN202211595275.5A priority Critical patent/CN116019442B/en
Publication of CN116019442A publication Critical patent/CN116019442A/en
Application granted granted Critical
Publication of CN116019442B publication Critical patent/CN116019442B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 — Reducing energy consumption in communication networks
    • Y02D30/70 — Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention discloses a motion posture assessment system based on UWB/IMU fusion. Gesture positioning tags are worn on the human body; the IMU data and the gesture positioning tag identifiers are transmitted to a UWB receiving module using UWB wireless transmission, the UWB receiving module forwards the data to a processing unit, the processing unit filters the transmitted data and then applies information fusion techniques, such as improved Kalman filtering with limiting conditions, to obtain the accurate position of each tag, and the motion posture information of the human body is then displayed in three dimensions or transmitted to other platforms. The invention can be applied to fields such as rehabilitation of disabled people, daily life and entertainment, and yields good social and economic value.

Description

Motion posture assessment system based on UWB/IMU fusion
Technical Field
The invention relates to the field of human body motion posture model reconstruction, in particular to a motion posture assessment system and method based on UWB/IMU fusion, and more specifically to a human body motion posture model reconstruction system based on information fusion of UWB and IMU sensors.
Background
Posture assessment in sports rehabilitation runs through the entire process of sports training and rehabilitation treatment, and it can quickly and accurately locate a patient's pain points. For example, dynamic posture can be measured and assessed through dynamic balance tests covering the body, the vestibular system, vision and the distribution of the center of gravity; the postural balance disorder of Parkinson's patients can thus be quantitatively assessed, rehabilitation training can be arranged accordingly, and the gait of Parkinson's patients can be improved. Different gait patterns may indicate a higher likelihood of suffering from a corresponding disease. For example, the drunken gait, clinically called ataxia, is dangerous: balance is poor, falls are likely, and it can also indicate conditions such as brain tumor, cerebral hemorrhage or cerebellar lesions. The magnetic (shuffling) gait shows insufficient support and poor balance, and its walking posture makes falls likely, especially for the elderly; clinically it is a typical symptom of normal pressure hydrocephalus. Posture assessment can also help athletes or special personnel to accurately find their individual weak links and then guide them to perform scientific and effective exercise training and rehabilitation in a targeted way; for example, physical function assessment of middle-distance runners using the Y balance test can predict lower-limb injury well.
The sensing technologies currently used for dynamic human posture tracking include inertial sensors, acoustic sensors, magnetic sensors, mechanical sensors and visual sensors. Inertial tracking systems accumulate errors and drift over time, which degrades the position information and makes posture discrimination errors larger. Acoustic tracking systems can be affected by sound reflections or audio interference, especially in relatively closed spaces. Magnetic-sensor-based tracking systems are constrained by their power supply, which makes it difficult to guarantee operating time and accuracy. Mechanical motion tracking systems require many worn devices, which itself affects the motion posture. Video-based tracking systems are currently the most accurate, but they are strongly affected by occlusion and impose demanding space requirements.
Ultra Wideband (UWB) is a new wireless communication technology that differs greatly from conventional communication technologies. UWB is a carrier-free wireless technology based on extremely narrow pulses, with a high transmission rate, low transmitting power and strong penetration. UWB wireless positioning has the advantages of low power consumption, good multipath resistance, high security, low system complexity and very high positioning accuracy; it offers both real-time and accurate positioning, and its positioning delay is far smaller than that of other indoor positioning technologies such as Bluetooth and WiFi positioning. Common ranging-based positioning methods include Time of Arrival (TOA) positioning, Time Difference of Arrival (TDOA) positioning, RSSI-based positioning and Angle of Arrival (AOA) positioning.
Positioning can be performed by fusing UWB with an Inertial Measurement Unit (IMU). The IMU monitors changes in the motion state, and UWB/IMU fusion positioning accurately locates key parts of the human body, thereby enabling tracking of the human body posture.
Disclosure of Invention
The invention provides a new method for reconstructing a motion posture based on UWB/IMU fusion. Gesture positioning tags are worn on specific parts of the human body such as the waist, chest, head, arms, upper arms, thighs, calves and feet; the IMU data are transmitted by UWB, a UWB receiver receives the UWB signal transmitted by each tag and forwards the read information to a processing unit, the processing unit filters the transmitted data and then uses an information fusion technique to obtain the accurate position of each tag, and the motion posture information of the human body is then displayed in three dimensions or transmitted to other platforms.
The technical scheme of the invention is a motion posture assessment system based on UWB/IMU fusion, which comprises:
the gesture positioning tag senses a motion state, transmits UWB tag information and transmits IMU information to the processing unit;
the UWB receiving module reads the IMU information and the UWB tag information in each gesture positioning tag and then transmits the information to the processing unit;
The processing unit is used for receiving and preprocessing the IMU information of the gesture positioning tags, receiving the UWB tag information, acquiring the position information of each tag by utilizing an information fusion technology, and obtaining gesture information according to priori knowledge;
And the communication display module is used for displaying or transmitting the gesture information acquired by the processing unit to other modules.
Preferably, the gesture positioning tag includes:
an IMU sensor for sensing information of the part it is fixed to, including but not limited to three-dimensional acceleration, three-dimensional angular velocity and magnetic field information;
a UWB tag for identifying the specific body part and transmitting the IMU data;
a single-chip microcomputer module for controlling the operation of the IMU sensor and the UWB module;
a power supply battery for powering the gesture positioning tag;
and a fixing band for fixing the gesture positioning tag at the designated position.
Preferably, the UWB receiving module receives the gesture positioning tag information and reads tag information including, but not limited to, the IMU sensor data, sending time, reading time, signal strength and tag number, and transmits it to the processing unit. The processing unit comprises a wireless communication module, a wired communication module, a filtering module and an information processing unit; it receives and processes the real-time information of the gesture positioning tags, obtains the motion state and position of each tag, and displays or transmits the position information through the communication display module.
The method adopted by the motion posture assessment system based on UWB/IMU fusion comprises the following working procedure: firstly, building the positioning environment, installing the UWB receiving modules, recording their coordinate positions and inputting these coordinates into the processing unit; next, wearing the gesture positioning tags as required in FIG. 2; then calibrating the inertial unit of the IMU in each gesture positioning tag in a static state; and finally calculating the position of each tag with the data fusion algorithm and reconstructing the three-dimensional motion posture model.
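The patent only states that the inertial unit of each gesture positioning tag is calibrated in a static state; the sketch below illustrates one plausible static calibration (zero-rate gyroscope bias and gravity-based accelerometer scale) in Python. The routine, its parameters and the data layout are assumptions, not taken from the patent.

```python
import numpy as np

def calibrate_static(gyro_samples, accel_samples, gravity=9.81):
    """Estimate gyroscope bias and an accelerometer scale factor from a
    recording taken while the tag is motionless (hypothetical routine)."""
    gyro = np.asarray(gyro_samples, dtype=float)    # shape (N, 3), rad/s
    accel = np.asarray(accel_samples, dtype=float)  # shape (N, 3), m/s^2

    gyro_bias = gyro.mean(axis=0)        # at rest the true angular rate is zero
    mean_accel = accel.mean(axis=0)      # at rest only gravity should be sensed
    accel_scale = gravity / np.linalg.norm(mean_accel)
    return gyro_bias, accel_scale

# usage: record a few seconds with the tag lying still, then
# gyro_bias, accel_scale = calibrate_static(gyro_log, accel_log)
```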
Further, the IMU of the gesture positioning tag performs a coordinate transformation to obtain the acceleration; the coordinate transformation uses the quaternion method as shown in formula (1),
a_l' = q a_l q^{-1} (1)
where a_l = [0, a],
a = [a_x, a_y, a_z] is the acceleration output by the gesture positioning tag,
q is the attitude quaternion of the current gesture positioning tag,
and φ, θ and γ are the attitude angles output by the current gesture positioning tag.
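For illustration, a minimal Python sketch of the rotation in formula (1), which maps the tag-frame acceleration into the navigation frame using the attitude quaternion. The Hamilton convention and the [w, x, y, z] component order are assumed; the function names are illustrative.

```python
import numpy as np

def quat_mul(p, q):
    """Hamilton product of two quaternions given as [w, x, y, z]."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def rotate_accel(q, a):
    """Apply a_l' = q * [0, a] * q^-1 (formula (1)) and return the vector part."""
    q = np.asarray(q, dtype=float)
    q = q / np.linalg.norm(q)                            # normalise to a unit quaternion
    a_l = np.concatenate(([0.0], np.asarray(a, float)))  # pure quaternion [0, a]
    q_inv = q * np.array([1.0, -1.0, -1.0, -1.0])        # conjugate = inverse for unit q
    return quat_mul(quat_mul(q, a_l), q_inv)[1:]
```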
Special actions, such as raising the hands, lifting the legs and squatting forward, are set to check that each gesture positioning tag is installed at the correct position and in the forward (normal) orientation, for example that the Z-axis acceleration points downward when squatting.
Further, let the transformed acceleration of the waist movement starting point at time t of the trainer be (a_x1t, a_y1t, a_z1t), the transformed acceleration of the abdomen movement starting point be (a_x2t, a_y2t, a_z2t), and the transformed acceleration of the chest movement starting point be (a_x3t, a_y3t, a_z3t); the acceleration value of the whole human body is then given by formula (2).
Further, the inertial gait model is used:
SL_k = a·f_k + b·S_k + C (7)
where SL_k is the step length of the k-th step;
f_k is the reciprocal of the time taken for the k-th step, i.e., f_k = 1/(t_k − t_{k−1});
S_k is the variance of the acceleration value within the k-th step,
where a_k is the mean acceleration within the k-th step and N is the number of sampling points within the current step;
a, b and C are coefficients to be solved; they are generally related to height, leg length and the exercise environment and can be obtained from experimental experience.
The solution method is as follows: in the current environment, 3-4 sets of gait information are acquired together with the measured step lengths, and the values of a, b and C are solved for, as shown in the sketch below.
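A short sketch of how the coefficients a, b and C of formula (7) could be fitted by least squares from 3-4 calibration steps. The reading of f_k as the reciprocal of the step duration and of S_k as the per-step variance of the acceleration magnitude follows the surrounding text; the data layout and function names are assumptions.

```python
import numpy as np

def fit_step_length_model(step_durations, step_accel_mags, measured_lengths):
    """Fit SL_k = a*f_k + b*S_k + C from a few calibration steps.

    step_durations:   duration of each calibration step, in seconds
    step_accel_mags:  per-step arrays of acceleration magnitudes
    measured_lengths: the measured step lengths, in metres
    """
    f = 1.0 / np.asarray(step_durations, dtype=float)      # f_k
    S = np.array([np.var(a) for a in step_accel_mags])     # S_k
    M = np.column_stack([f, S, np.ones_like(f)])           # rows [f_k, S_k, 1]
    coeffs, *_ = np.linalg.lstsq(M, np.asarray(measured_lengths, float), rcond=None)
    return coeffs                                          # (a, b, C)

def step_length(coeffs, duration, accel_mag):
    """Predict one step length with the fitted model."""
    a, b, C = coeffs
    return a / duration + b * np.var(accel_mag) + C
```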
Further, the current acceleration (a_xit, a_yit, a_zit) of each gesture positioning tag is obtained according to formula (2), and its position is then obtained by integration,
where the initial values of the integration are the coordinates of the gesture positioning tag at time t_0 and its velocity vector at time t_0, as sketched below.
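A minimal sketch of the double integration that recovers a tag's position from its navigation-frame acceleration, starting from the position and velocity at t_0. Plain Euler integration is assumed; the patent does not specify the numerical scheme.

```python
import numpy as np

def integrate_position(accel, dt, p0, v0):
    """Double-integrate acceleration samples (a_xit, a_yit, a_zit) into a
    position trajectory, given the position p0 and velocity v0 at t_0."""
    p = np.array(p0, dtype=float)
    v = np.array(v0, dtype=float)
    trajectory = [p.copy()]
    for a in np.asarray(accel, dtype=float):
        v = v + a * dt          # velocity update
        p = p + v * dt          # position update
        trajectory.append(p.copy())
    return np.array(trajectory)
```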
Further, as shown in FIG. 6, the UWB receiving modules are placed at the four corners of the fixed site and numbered 1, 2, …, 8; their position coordinates are (X_j, Y_j, Z_j), where j = 1, 2, …, 8, and the UWB receiving module on the upper diagonal is used as the base positioning receiving module, with coordinates (X_9, Y_9, Z_9).
The gesture positioning tags are positioned with a method based on the time difference of arrival (TDOA) of the signal. Let the coordinates of gesture positioning tag i be (x_i, y_i, z_i) and the time of flight from gesture positioning tag i to a UWB receiving module be t_i, with i = 1, 2, …, 11. The distance from gesture positioning tag i to UWB receiving module j is denoted d_ij, where i = 1, 2, …, 11 and j = 1, 2, …, 9.
The distance difference D_ij between gesture positioning tag i's distances to UWB receiving module j and to the base positioning receiving module is
D_ij = d_ij − d_i9 = c(t_i − t_9) (12)
where i = 1, 2, …, 11, j = 1, 2, …, 9, and c is the propagation speed of the wireless signal.
From (12) it follows that
d_ij = D_ij + d_i9 (13)
A substitution is then made, with j = 1, 2, …, 9; combining this with formula (5) and formula (7) gives formula (14).
Formula (14) can be converted to
A[x_i, y_i, z_i]^T + C·d_ij + D = 0 (15)
where A, C and D are determined from the equation set formed by formula (14) and solved with the least-squares method.
A function f(x, y, z) is constructed,
and the minimum of |f(x, y, z)| is then sought.
The coordinate values obtained from formulas (8), (9) and (10) are used as the initial values of f(x, y, z), and the optimal solution is found with the Newton gradient method. Limiting conditions are then defined: according to the human body model, the tag wearing positions satisfy the following conditions:
For example, the distance LRM_ij between the positions of tag i and tag j satisfies certain maximum and minimum values, i.e.
LRM_ij > LRM_ijmin
LRM_ij < LRM_ijmax (17)
where LRM_ijmin is the minimum and LRM_ijmax is the maximum distance between the wearing positions of tag i and tag j over all actions, determined in practice from the body measurements.
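A hedged sketch of the position solution described above: the cost |f| sums the squared TDOA residuals over the receiving modules, the linear initial estimate is refined numerically (plain gradient descent stands in here for the Newton gradient method), and the inter-tag distance constraint (17) is checked afterwards. The anchor layout, step size, iteration count and function names are assumptions.

```python
import numpy as np

def tdoa_cost(x, anchors, base, D):
    """|f|: sum of squared TDOA residuals for a candidate tag position x.
    anchors: (M, 3) receiver coordinates, base: (3,) base receiving module 9,
    D: (M,) measured distance differences D_ij = d_ij - d_i9."""
    d = np.linalg.norm(anchors - x, axis=1)   # d_ij
    d9 = np.linalg.norm(base - x)             # d_i9
    r = d - d9 - D
    return float(np.sum(r**2))

def solve_tag_position(x0, anchors, base, D, lr=0.05, iters=300, eps=1e-4):
    """Refine the initial estimate x0 by numerically minimising the TDOA cost."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        grad = np.zeros(3)
        for k in range(3):                    # central-difference gradient
            dx = np.zeros(3)
            dx[k] = eps
            grad[k] = (tdoa_cost(x + dx, anchors, base, D)
                       - tdoa_cost(x - dx, anchors, base, D)) / (2 * eps)
        x -= lr * grad
    return x

def within_body_limits(pos_i, pos_j, lrm_min, lrm_max):
    """Constraint (17): the distance between tags i and j must stay between
    LRM_ijmin and LRM_ijmax derived from the wearer's body measurements."""
    d = np.linalg.norm(np.asarray(pos_i, float) - np.asarray(pos_j, float))
    return lrm_min < d < lrm_max
```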
Kalman filtering is then applied, using the body's own limiting conditions as the filter's limiting conditions, to improve the positioning accuracy.
Further, assume the system is in state k; the system model is as follows:
where X(k) is the position of the tag at time k; U(k) is the control input to the system at time k (here there is no control input); W_k is the process noise of the system, assumed here to follow a Gaussian distribution and to be independent of the state variables; A is the state transition matrix (process gain matrix) describing the target state transition; B is the system gain parameter; Z(k) is the measurement at time k; H is the observation matrix; and V_k is the observation noise, which follows a Gaussian distribution.
where W ~ N(0, Q) and V ~ N(0, R),
X_cal(k) = A·X(k−1) (19)
r_k = Z(k) − H·X_cal(k) (21)
K(k) = P(k)·H^T·[H·P(k)·H^T + R(k)]^{-1} (22)
X(k) = X_cal(k) + K(k)·r_k (23)
where X_cal(k) and X(k) are the predicted value and the Kalman-filtered estimate at time k, P(k) is the covariance matrix of the state estimation and prediction error at time k, K(k) is the Kalman gain at time k, Q is the prediction noise covariance matrix, R is the measurement noise covariance matrix, and r_k is the innovation of Z(k) with its corresponding covariance matrix.
In a normal environment, r_k follows a Gaussian distribution with zero mean; when the signal is abnormal (the human body is occluded or the environment is disturbed), r_k no longer follows a zero-mean Gaussian distribution, and anti-interference information is added,
where the threshold is denoted C and its value is obtained through repeated experiments.
Two adjacent or otherwise special points are selected as the verification condition, for example the upper arm and the forearm.
Here, X(k)_i is the Kalman-filtered position of tag i, which is a column vector.
The result is substituted into R(k) and carried into the calculation of formula (22).
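A compact Python sketch of formulas (19)-(23) with the anti-interference rule: when the innovation r_k exceeds the threshold, the measurement noise covariance R is inflated so that the abnormal UWB measurement is largely ignored. The covariance prediction (not shown in the text) uses the standard form P = A·P·A^T + Q; the inflation factor and the way the threshold is applied are assumptions.

```python
import numpy as np

def kalman_step(x, P, z, A, H, Q, R, r_threshold, inflate=100.0):
    """One Kalman update following formulas (19)-(23) with innovation gating."""
    # prediction
    x_cal = A @ x                          # (19)
    P_cal = A @ P @ A.T + Q                # standard covariance prediction (assumed)

    # innovation
    r = z - H @ x_cal                      # (21)

    # anti-interference: inflate R when the innovation is abnormally large
    R_eff = R * inflate if np.linalg.norm(r) > r_threshold else R

    # gain and update
    K = P_cal @ H.T @ np.linalg.inv(H @ P_cal @ H.T + R_eff)   # (22)
    x_new = x_cal + K @ r                                      # (23)
    P_new = (np.eye(len(x)) - K @ H) @ P_cal
    return x_new, P_new
```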
The beneficial effects are as follows:
1. According to the invention, the information of the gesture positioning tags placed on specific parts of the body is collected and preprocessed, the posture of the main key parts of the human body is obtained through information fusion, and an accurate motion posture model is then built according to the motion time sequence, so that the motion state can be evaluated afterwards.
2. The new method for reconstructing the motion gesture model has the characteristics of high precision and low cost.
3. In the motion posture assessment system based on UWB/IMU fusion, gesture positioning tags are worn on specific parts of the human body such as the waist, chest, head, hands, arms, upper arms, thighs, calves and feet. The IMU data and the gesture positioning tag identifiers are transmitted to a UWB receiving module by UWB wireless transmission, the UWB receiving module forwards the data to a processing unit, the processing unit filters the transmitted data and then applies information fusion techniques such as improved Kalman filtering with limiting conditions to obtain the accurate position of each tag, and the motion posture information of the human body is then displayed in three dimensions or transmitted to other platforms.
4. The invention can be applied to fields such as rehabilitation of disabled people, daily life and entertainment, and yields good social and economic value.
Drawings
FIG. 1 is a block diagram of a UWB/IMU fused motion pose assessment system;
FIG. 2 is a schematic illustration of a locating tag being worn;
FIG. 3 is a schematic diagram of a gesture positioning tag;
FIG. 4 is a schematic diagram of a UWB receiving module;
FIG. 5 is a schematic diagram of a system fusion processing module;
FIG. 6 is a schematic view of the placement of UWB receiving modules;
FIG. 7 is a schematic diagram of positioning based on the signal time difference of arrival (TDOA).
Reference numerals:
Gesture positioning tag 1; inertial sensor 101; UWB module 102; single-chip microcomputer module 103; power supply battery 104; UWB receiving module 2; UWB module 201; control unit 202; communication module 203; power supply battery 204; processing unit 3; communication module 301; IMU filter 302; information fusion module 303; gesture information 304; communication display module 4.
Detailed Description
The invention aims to provide a novel motion posture assessment method based on UWB/IMU fusion, which obtains the accurate position of each tag by means of wearable sensors and information fusion, and then derives the motion posture information of the human body.
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in further detail below.
Example 1
As shown in fig. 1, the motion pose estimation system based on UWB/IMU fusion of the present invention includes: a gesture positioning tag 1, a UWB receiving module 2, a processing unit 3 and a communication display module 4.
Specifically, the gesture positioning tag 1 senses the motion state, provides UWB tag information, and transmits the IMU information to the UWB receiving module 2;
The UWB receiving module 2 is used for receiving the IMU information and the UWB tag information transmitted by each gesture positioning tag, transmitting the information to the processing unit 3 and providing the serial number and the position information of the UWB receiving module;
the processing unit 3 receives the IMU information and the transmitted UWB information of the gesture positioning tags, acquires the position information of each tag by utilizing an information fusion technology, and obtains gesture information according to priori knowledge;
and the communication display module 4 displays or transmits the gesture information acquired by the processing unit 3 to other modules.
According to the invention, human body motion posture reconstruction is achieved by wearing the gesture positioning tags 1 on specific parts of the human body such as the waist, chest, head, arms, upper arms, thighs, calves and feet, transmitting the IMU data and UWB tag information of these parts during motion to the processing unit 3 by wireless transmission, filtering the transmitted data, obtaining the accurate position of each tag with information fusion, and then obtaining the motion posture information of the human body.
Further, as shown in FIG. 2, the gesture positioning tags 1 are worn at the waist, abdomen, chest, head, arms, upper arms, thighs, calves and feet, 11 positions in total; in particular, the waist and abdomen gesture positioning tags must be fixed with one elastic fixing band and at the same height.
Further, as shown in FIG. 3, the gesture positioning tag 1 includes an inertial sensor 101, a UWB module 102, a single-chip microcomputer module 103 and a power supply battery 104.
The inertial sensor 101 obtains three-dimensional information of human body movement and gait information such as stride length and cadence; here the inertial sensor 101 consists of an ADXL345 triaxial accelerometer, an ITG3205 triaxial gyroscope and an HMC5883L triaxial magnetometer.
The UWB module 102 consists of a UWB chip and an antenna and transmits the information of the inertial sensor 101 and the UWB tag to the UWB receiving module 2; the DW1000 chip developed by DecaWave is selected here.
The single-chip microcomputer module 103 mainly consists of a microcontroller and a crystal oscillator, controls the operation of the inertial sensor 101 and the UWB module 102, and uses an STM32L103-series microcontroller.
The power supply battery 104 comprises a wireless charging module, a power management module, a battery and the like, and provides electric energy for the inertial sensor 101, the UWB module 102 and the singlechip module 103.
The fixing band fixes the gesture positioning tag 1 at the specified position; its shape and length differ depending on the position.
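For illustration only, a host-side Python sketch of reading one acceleration sample from the ADXL345 accelerometer described above over I2C. In the actual tag the single-chip microcomputer module 103 performs this in firmware; the I2C bus number is an assumption, while the address, registers and scale follow the ADXL345 datasheet defaults.

```python
from smbus2 import SMBus

ADXL345_ADDR = 0x53       # default I2C address of the ADXL345
REG_POWER_CTL = 0x2D      # power control register
REG_DATAX0 = 0x32         # first of six data registers (X0..Z1)
SCALE_G = 0.0039          # ~3.9 mg/LSB at the default +/-2 g range

def read_accel(bus_id=1):
    """Read one three-axis acceleration sample, in g."""
    with SMBus(bus_id) as bus:
        bus.write_byte_data(ADXL345_ADDR, REG_POWER_CTL, 0x08)   # measurement mode
        raw = bus.read_i2c_block_data(ADXL345_ADDR, REG_DATAX0, 6)

        def s16(lo, hi):
            v = lo | (hi << 8)
            return v - 65536 if v & 0x8000 else v

        ax = s16(raw[0], raw[1]) * SCALE_G
        ay = s16(raw[2], raw[3]) * SCALE_G
        az = s16(raw[4], raw[5]) * SCALE_G
        return ax, ay, az
```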
Preferably, the UWB receiving module 2 is mainly composed of a UWB module 201, a control unit 202, a communication module 203, and a power supply battery 204.
The UWB module 201 is composed of an antenna, a control module and a radio frequency communication module, and is configured to read IMU information and UWB information in the gesture positioning tag 1.
The control unit 202 controls the UWB module 201 and the communication module 203 to operate.
The communication module 203 transmits the UWB reception information and the information of the UWB reception module 2 to a computing center. Preferably, a wireless communication module is selected for easy installation.
The power supply battery 204 comprises a wireless charging module, a power management module and an energy storage module, and supplies power to the UWB module 201, the control unit 202 and the communication module 203.
Preferably, the processing unit 3 mainly comprises a communication module 301, an IMU filtering module 302, an information fusion module 303 and gesture information 304, which provide data filtering, information fusion positioning and three-dimensional motion posture reconstruction.
The communication module 301 receives the data transmitted by the communication module 203 of the UWB receiving module 2 and is also used for external data input and three-dimensional motion posture information interaction, for example inputting the position information of the UWB receiving modules 2 and outputting the three-dimensional motion posture information to a display or to other systems.
The IMU filter 302 filters the IMU data in the gesture positioning tags 1;
the information fusion module 303 processes the gesture positioning tag 1 and UWB data to obtain the position information of each tag;
and the gesture information 304 is used for reconstructing the three-dimensional motion posture model.
Further, the IMU filter 302, the information fusion module 303 and the gesture information 304 may be implemented in hardware, i.e., with a DSP processor and peripheral circuitry, or the data may be transmitted to a server and processed by a data processing server.
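A structural sketch of the data flow inside the processing unit 3 (communication 301 → IMU filtering 302 → information fusion 303 → gesture information 304). All class and field names are illustrative and not taken from the patent; the filtering and fusion callables stand in for the algorithms described above.

```python
from dataclasses import dataclass

@dataclass
class TagPacket:
    """One reading forwarded by a UWB receiving module 2 (illustrative fields)."""
    tag_id: int          # tag number
    receiver_id: int     # UWB receiving module number
    imu: tuple           # raw accel/gyro/mag sample
    tx_time: float       # sending time
    rx_time: float       # reading time
    rssi: float          # information intensity

class ProcessingUnit:
    """Skeleton of processing unit 3."""
    def __init__(self, filter_fn, fuse_fn):
        self.filter_fn = filter_fn   # IMU filtering (302), e.g. low-pass
        self.fuse_fn = fuse_fn       # information fusion (303), e.g. TDOA + Kalman
        self.pose = {}               # gesture information (304): tag_id -> (x, y, z)

    def on_packet(self, pkt: TagPacket):
        imu = self.filter_fn(pkt.imu)
        self.pose[pkt.tag_id] = self.fuse_fn(pkt, imu)
        return self.pose             # handed to the communication display module 4
```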
The method of the motion posture assessment system based on UWB/IMU fusion comprises the following steps: firstly, building the positioning environment, installing the UWB receiving modules, recording their coordinate positions and inputting these coordinates into the processing unit 3; next, wearing the gesture positioning tags 1 as required in FIG. 2; then calibrating the inertial unit of the IMU in each gesture positioning tag 1 in a static state; and finally calculating the position of each tag with the data fusion algorithm and reconstructing the three-dimensional motion posture model.
Further, the IMU of the gesture positioning tag performs a coordinate transformation to obtain the acceleration; the coordinate transformation uses the quaternion method as shown in formula (1),
a_l' = q a_l q^{-1} (1)
where a_l = [0, a],
a = [a_x, a_y, a_z] is the acceleration output by the gesture positioning tag,
q is the attitude quaternion of the current gesture positioning tag,
and φ, θ and γ are the attitude angles output by the current gesture positioning tag.
Special actions, such as raising the hands, lifting the legs and squatting forward, are set to check that each gesture positioning tag is installed at the correct position and in the forward (normal) orientation, for example that the Z-axis acceleration points downward when squatting.
Further, let the transformed acceleration of the waist movement starting point at time t of the trainer be (a_x1t, a_y1t, a_z1t), the transformed acceleration of the abdomen movement starting point be (a_x2t, a_y2t, a_z2t), and the transformed acceleration of the chest movement starting point be (a_x3t, a_y3t, a_z3t); the acceleration value of the whole human body is then given by formula (2).
Further, the inertial gait model is used:
SL_k = a·f_k + b·S_k + C (7)
where SL_k is the step length of the k-th step;
f_k is the reciprocal of the time taken for the k-th step, i.e., f_k = 1/(t_k − t_{k−1});
S_k is the variance of the acceleration value within the k-th step,
where a_k is the mean acceleration within the k-th step and N is the number of sampling points within the current step;
a, b and C are coefficients to be solved; they are generally related to height, leg length and the exercise environment and can be obtained from experimental experience.
The solution method is as follows: in the current environment, 3-4 sets of gait information are acquired together with the measured step lengths, and the values of a, b and C are solved for.
Further, the current acceleration (a_xit, a_yit, a_zit) of each gesture positioning tag is obtained according to formula (2), and its position is then obtained by integration,
where the initial values of the integration are the coordinates of the gesture positioning tag at time t_0 and its velocity vector at time t_0.
Further, as shown in FIG. 6, the UWB receiving modules are placed at the four corners of the fixed site and numbered 1, 2, …, 8; their position coordinates are (X_j, Y_j, Z_j), where j = 1, 2, …, 8, and the UWB receiving module on the upper diagonal is used as the base positioning receiving module, with coordinates (X_9, Y_9, Z_9).
The gesture positioning tags are positioned with a method based on the time difference of arrival (TDOA) of the signal. Let the coordinates of gesture positioning tag i be (x_i, y_i, z_i) and the time of flight from gesture positioning tag i to a UWB receiving module be t_i, with i = 1, 2, …, 11. The distance from gesture positioning tag i to UWB receiving module j is denoted d_ij, where i = 1, 2, …, 11 and j = 1, 2, …, 9.
The distance difference D_ij between gesture positioning tag i's distances to UWB receiving module j and to the base positioning receiving module 9 is
D_ij = d_ij − d_i9 = c(t_i − t_9) (12)
where i = 1, 2, …, 11, j = 1, 2, …, 9, and c is the propagation speed of the wireless signal.
From (12) it follows that
d_ij = D_ij + d_i9 (13)
A substitution is then made, with j = 1, 2, …, 9; combining this with formula (5) and formula (7) gives formula (14).
Formula (14) can be converted to
A[x_i, y_i, z_i]^T + C·d_ij + D = 0 (15)
where A, C and D are determined from the equation set formed by formula (14) and solved with the least-squares method; a function f(x, y, z) is constructed,
and the minimum of |f(x, y, z)| is then sought.
The coordinate values obtained from formulas (8), (9) and (10) are used as the initial values of f(x, y, z), and the optimal solution is found with the Newton gradient method. Limiting conditions are then defined: according to the human body model, the tag wearing positions satisfy the following conditions:
For example, the distance LRM_ij between the positions of tag i and tag j satisfies certain maximum and minimum values, i.e.
LRM_ij > LRM_ijmin
LRM_ij < LRM_ijmax (17)
where LRM_ijmin is the minimum and LRM_ijmax is the maximum distance between the wearing positions of tag i and tag j over all actions, determined in practice from the body measurements.
The limiting conditions are used to improve the Kalman filter, which improves the filter's performance and thus the positioning accuracy.
Kalman filtering is applied, using the body's own limiting conditions as the filter's limiting conditions, to improve the positioning accuracy.
Further, assume the system is in state k; the system model is as follows:
where X(k) is the position of the tag at time k; U(k) is the control input to the system at time k (here there is no control input); W_k is the process noise of the system, assumed here to follow a Gaussian distribution and to be independent of the state variables; A is the state transition matrix (process gain matrix) describing the target state transition; B is the system gain parameter; Z(k) is the measurement at time k; H is the observation matrix; and V_k is the observation noise, which follows a Gaussian distribution.
where W ~ N(0, Q) and V ~ N(0, R),
X_cal(k) = A·X(k−1) (19)
r_k = Z(k) − H·X_cal(k) (21)
K(k) = P(k)·H^T·[H·P(k)·H^T + R(k)]^{-1} (22)
X(k) = X_cal(k) + K(k)·r_k (23)
where X_cal(k) and X(k) are the predicted value and the Kalman-filtered estimate at time k, P(k) is the covariance matrix of the state estimation and prediction error at time k, K(k) is the Kalman gain at time k, Q is the prediction noise covariance matrix, R is the measurement noise covariance matrix, and r_k is the innovation of Z(k) with its corresponding covariance matrix.
In a normal environment, r_k follows a Gaussian distribution with zero mean; when the signal is abnormal (the human body is occluded or the environment is disturbed), r_k no longer follows a zero-mean Gaussian distribution, and anti-interference information is added,
where the threshold is denoted C and its value is obtained through repeated experiments.
Two adjacent or otherwise special points are selected as the verification condition, for example the upper arm and the forearm.
Here, X(k)_i is the Kalman-filtered position of tag i, which is a column vector.
The result is substituted into R(k) and carried into the calculation of formula (22).
The embodiments of the invention do not limit the models of the devices beyond those specified, as long as the devices can perform the functions described above.
Those skilled in the art will appreciate that the drawings are schematic representations of only one preferred embodiment, and that the above-described embodiment numbers are merely for illustration purposes and do not represent advantages or disadvantages of the embodiments.
The foregoing description of the preferred embodiments of the invention is not intended to limit the invention to the precise form disclosed, and any such modifications, equivalents, and alternatives falling within the spirit and scope of the invention are intended to be included within the scope of the invention.

Claims (2)

1. A motion posture assessment system based on UWB/IMU fusion, the system comprising four parts:
1) Gesture positioning tag: sensing a motion state, transmitting UWB tag information and transmitting the IMU information to a processing unit;
the gesture positioning tag includes:
the IMU sensor is used for sensing information of a specific part;
UWB tags for identifying specific locations and transmitting IMU data information,
The singlechip module is used for controlling the IMU sensor and the UWB module to work,
A power supply battery for providing power to the gesture positioning tag;
2) UWB receiving module: reading IMU information and UWB tag information in each gesture positioning tag, and transmitting the information to a processing unit;
The information includes, but is not limited to: IMU sensor data, sending time, reading time, information intensity and label number;
3) And a processing unit: receiving and preprocessing IMU information of the gesture positioning tags, receiving UWB tag information, acquiring position information of each tag by utilizing an information fusion technology, and obtaining gesture information according to priori knowledge; the processing unit comprises a communication module, a filtering module and an information processing unit, and is used for receiving and processing the real-time information of the gesture positioning tag, then acquiring the motion state and the position information of each tag, and displaying or transmitting the position information through a communication display module;
4) And a communication display module: displaying or transmitting the gesture information obtained by the processing unit;
The method adopted by the motion gesture evaluation system based on UWB/IMU fusion is as follows:
firstly, building a positioning environment, installing a UWB receiving module, recording the coordinate position of the UWB receiving module, and inputting the coordinate of the UWB receiving module into a processing unit; then wearing a posture positioning label; calibrating the inertial unit of the gesture positioning tag IMU in a static state, calculating the position of each tag by utilizing a data fusion algorithm, and reconstructing a three-dimensional motion gesture model;
The IMU of the gesture positioning tag performs coordinate conversion to obtain acceleration, and then stride information is obtained;
Coordinate conversion is performed with the quaternion method as shown in formula (1),
a_l' = q a_l q^{-1} (1)
where a_l = [0, a],
a = [a_x, a_y, a_z] is the acceleration output by the gesture positioning tag,
q is the attitude quaternion of the current gesture positioning tag,
and the current gesture positioning tag outputs the attitude angles φ, θ and γ;
Special actions, namely raising the hands, lifting the legs and squatting forward, are set to judge the correctness of the mounting position of each gesture positioning tag and to ensure that every gesture positioning tag is mounted in the forward, i.e. normal, orientation;
setting the transformed acceleration of the waist movement starting point at time t of the trainer as (a_x1t, a_y1t, a_z1t), the transformed acceleration of the abdomen movement starting point as (a_x2t, a_y2t, a_z2t) and the transformed acceleration of the chest movement starting point as (a_x3t, a_y3t, a_z3t), the acceleration value of the whole human body is given by formula (2);
using the inertial gait model,
SL_k = a·f_k + b·S_k + C (7)
where SL_k is the step length of the k-th step;
f_k is the reciprocal of the time taken for the k-th step, i.e., f_k = 1/(t_k − t_{k−1});
S_k is the variance of the acceleration value within the k-th step,
where a_k is the mean acceleration within the k-th step and N is the number of sampling points within the current step;
a, b and C are coefficients to be solved, related to height, leg length and the exercise environment and obtained from experimental experience;
the current acceleration (a_xit, a_yit, a_zit) of each gesture positioning tag is solved according to formula (2), and its position is then obtained by integration,
where the initial values of the integration are the coordinates of the gesture positioning tag at time t_0 and its velocity vector at time t_0;
further, the UWB receiving modules are placed at the four corners of the fixed site and numbered 1, 2, …, 8; their coordinates are (X_j, Y_j, Z_j), j = 1, 2, …, 8, and the UWB receiving module on the upper diagonal is used as the base positioning receiving module, with coordinates (X_9, Y_9, Z_9);
the gesture positioning tags are positioned with a method based on the signal time difference of arrival; the coordinates of gesture positioning tag i are (x_i, y_i, z_i) and the time of flight from gesture positioning tag i to a UWB receiving module is t_i, where i = 1, 2, …, 11; the distance from gesture positioning tag i to UWB receiving module j is denoted d_ij, where i = 1, 2, …, 11 and j = 1, 2, …, 9;
the distance difference D_ij between gesture positioning tag i's distances to UWB receiving module j and to the base positioning receiving module is
D_ij = d_ij − d_i9 = c(t_i − t_9) (12)
where i = 1, 2, …, 11, j = 1, 2, …, 9, and c is the propagation speed of the wireless signal;
from (12) it follows that
d_ij = D_ij + d_i9 (13)
a substitution is then made, with j = 1, 2, …, 9; combining this with formula (5) and formula (7) gives formula (14);
formula (14) can be converted to
A[x_i, y_i, z_i]^T + C·d_ij + D = 0 (15)
where A, C and D are determined from the equation set formed by formula (14) and solved with the least-squares method;
a function f(x, y, z) is constructed,
and the minimum of |f(x, y, z)| is then sought;
the coordinate values obtained from formulas (8), (9) and (10) are used as the initial values of f(x, y, z), and the optimal solution is found with the Newton gradient method.
2. The motion posture assessment system based on UWB/IMU fusion according to claim 1, wherein Kalman filtering is applied with the body's own limiting conditions as the filter's limiting conditions;
limiting conditions are defined: according to the human body model, the tag wearing positions satisfy the following conditions:
the distance LRM_ij between the positions of tag i and tag j satisfies certain maximum and minimum values, i.e.
LRM_ij > LRM_ijmin
LRM_ij < LRM_ijmax (17)
where LRM_ijmin is the minimum and LRM_ijmax is the maximum distance between the wearing positions of tag i and tag j over all actions, determined in practice from the body measurements;
further, assume the system is in state k; the system model is as follows:
where X(k) is the position of the tag at time k; U(k) is the control input to the system at time k (here there is no control input); W_k is the process noise of the system, assumed to follow a Gaussian distribution and to be independent of the state variables; A is the state transition matrix (process gain matrix) describing the target state transition; B is the system gain parameter; Z(k) is the measurement at time k; H is the observation matrix; and V_k is the observation noise, which follows a Gaussian distribution;
where W ~ N(0, Q) and V ~ N(0, R),
X_cal(k) = A·X(k−1) (19)
r_k = Z(k) − H·X_cal(k) (21)
K(k) = P(k)·H^T·[H·P(k)·H^T + R(k)]^{-1} (22)
X(k) = X_cal(k) + K(k)·r_k (23)
where X_cal(k) and X(k) are the predicted value and the Kalman-filtered estimate at time k, P(k) is the covariance matrix of the state estimation and prediction error at time k, K(k) is the Kalman gain at time k, Q is the prediction noise covariance matrix, R is the measurement noise covariance matrix, and r_k is the innovation of Z(k) with its corresponding covariance matrix;
in a normal environment, r_k follows a Gaussian distribution with zero mean; when the signal is abnormal because the human body is occluded or the environment is disturbed, r_k no longer follows a zero-mean Gaussian distribution and anti-interference information is added,
where the threshold is denoted C and its value is obtained through repeated experiments;
two adjacent or otherwise special points are selected as the verification condition;
here, X(k)_i is the Kalman-filtered position of tag i, which is a column vector,
and the result is substituted into R(k) and carried into the calculation of formula (22).
CN202211595275.5A 2022-12-12 2022-12-12 Motion posture assessment system based on UWB/IMU fusion Active CN116019442B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211595275.5A CN116019442B (en) 2022-12-12 2022-12-12 Motion posture assessment system based on UWB/IMU fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211595275.5A CN116019442B (en) 2022-12-12 2022-12-12 Motion posture assessment system based on UWB/IMU fusion

Publications (2)

Publication Number Publication Date
CN116019442A CN116019442A (en) 2023-04-28
CN116019442B true CN116019442B (en) 2024-05-14

Family

ID=86071870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211595275.5A Active CN116019442B (en) 2022-12-12 2022-12-12 Motion posture assessment system based on UWB/IMU fusion

Country Status (1)

Country Link
CN (1) CN116019442B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109284006A (en) * 2018-11-09 2019-01-29 中科数字健康科学研究院(南京)有限公司 A kind of human motion capture device and method
WO2021006812A1 (en) * 2019-07-05 2021-01-14 National University Of Singapore System and method for motion analysis
CN112957033A (en) * 2021-02-01 2021-06-15 山东大学 Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation
CN113900061A (en) * 2021-05-31 2022-01-07 深圳市易艾得尔智慧科技有限公司 Navigation positioning system and method based on UWB wireless positioning and IMU fusion
CN114088091A (en) * 2022-01-21 2022-02-25 北京慧拓无限科技有限公司 Multi-sensor-based underground mine pose fusion method and system
WO2022242075A1 (en) * 2021-05-19 2022-11-24 深圳市优必选科技股份有限公司 Robot positioning method and apparatus, robot and readable storage medium
CN116108873A (en) * 2022-12-12 2023-05-12 天津大学 Motion posture assessment system based on RFID/IMU fusion

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8203487B2 (en) * 2009-08-03 2012-06-19 Xsens Holding, B.V. Tightly coupled UWB/IMU pose estimation system and method
US10415975B2 (en) * 2014-01-09 2019-09-17 Xsens Holding B.V. Motion tracking with reduced on-body sensors set
US10222450B2 (en) * 2015-10-12 2019-03-05 Xsens Holding B.V. Integration of inertial tracking and position aiding for motion capture
EP3786757B1 (en) * 2018-04-25 2022-09-21 SZ DJI Technology Co., Ltd. Camera stabilizer position correction method and device
US11016305B2 (en) * 2019-04-15 2021-05-25 Magic Leap, Inc. Sensor fusion for electromagnetic tracking
US11992961B2 (en) * 2020-11-17 2024-05-28 Ubtech Robotics Corp Ltd Pose determination method, robot using the same, and computer readable storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109284006A (en) * 2018-11-09 2019-01-29 中科数字健康科学研究院(南京)有限公司 A kind of human motion capture device and method
WO2021006812A1 (en) * 2019-07-05 2021-01-14 National University Of Singapore System and method for motion analysis
CN114096193A (en) * 2019-07-05 2022-02-25 新加坡国立大学 System and method for motion analysis
CN112957033A (en) * 2021-02-01 2021-06-15 山东大学 Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation
WO2022242075A1 (en) * 2021-05-19 2022-11-24 深圳市优必选科技股份有限公司 Robot positioning method and apparatus, robot and readable storage medium
CN113900061A (en) * 2021-05-31 2022-01-07 深圳市易艾得尔智慧科技有限公司 Navigation positioning system and method based on UWB wireless positioning and IMU fusion
CN114088091A (en) * 2022-01-21 2022-02-25 北京慧拓无限科技有限公司 Multi-sensor-based underground mine pose fusion method and system
CN116108873A (en) * 2022-12-12 2023-05-12 天津大学 Motion posture assessment system based on RFID/IMU fusion

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on a real-time AGV positioning method combining low-cost IMU and RFID technology; 王爽, 石朝, 曾大懿, 邹益胜; Machinery Design & Manufacture; 2021-05-31 (No. 5); 269-272 *

Also Published As

Publication number Publication date
CN116019442A (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN104922890B (en) Smart motion protector
CN106462665B (en) Wearable electronic device and method of estimating lifestyle metrics
KR101690649B1 (en) Activity classification in a multi-axis activity monitor device
CN104757976B (en) A kind of Human Body Gait Analysis method and system based on Multi-sensor Fusion
US10078795B2 (en) Systems and methods for non-contact tracking and analysis of physical activity using imaging
US20180056123A1 (en) Systems and methods of swimming analysis
CN101330863B (en) Detection and compensation method for monitoring the place of activity on the body
US7634379B2 (en) Newtonian physical activity monitor
Olivares et al. Wagyromag: Wireless sensor network for monitoring and processing human body movement in healthcare applications
CN105388495B (en) Estimating local motion in physical exercise
KR20160091694A (en) Method, apparatus, and system for providing exercise guide information
US20120245714A1 (en) System and method for counting swimming laps
US20120009553A1 (en) Apparatus for assisting swimming training
CN110974242B (en) Gait abnormal degree evaluation method for wearable device and wearable device
JP2010519525A (en) Device and method for detecting path of moving object in two dimensions
Wang et al. Electromyography-based locomotion pattern recognition and personal positioning toward improved context-awareness applications
Horenstein et al. Validation of magneto-inertial measuring units for measuring hip joint angles
US20160030806A1 (en) Exercise ability evaluation method, exercise ability evaluation apparatus, exercise ability calculation method, and exercise ability calculation apparatus
CN116108873B (en) Motion posture assessment system based on RFID/IMU fusion
CN107397535B (en) Comprehensive physical sign monitoring device and system for pregnant woman
Cho Design and implementation of a lightweight smart insole for gait analysis
Suh et al. Kalman-filter-based walking distance estimation for a smart-watch
CN116019442B (en) Motion posture assessment system based on UWB/IMU fusion
Janidarmian et al. Affordable erehabilitation monitoring platform
CN110916639A (en) Method, system, wearable device and computer-readable storage medium for acquiring exercise heart rate recovery rate

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant