CN112957033B - Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation - Google Patents
- Publication number: CN112957033B (application CN202110133657.5A)
- Authority: CN (China)
- Legal status: Active
Classifications
- A61B5/1116 — Determining posture transitions (under A61B5/11, Measuring movement of the entire body or parts thereof)
- A61B5/1112 — Global tracking of patients, e.g. by using GPS
- A61B5/6801 — Arrangements of detecting, measuring or recording means specially adapted to be attached to or worn on the body surface
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Physics & Mathematics (AREA)
- Animal Behavior & Ethology (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Physiology (AREA)
- Dentistry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Radar, Positioning & Navigation (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
A method and system for real-time indoor positioning and motion-posture capture of a human body during human-robot collaboration. The method comprises the following steps: (1) acquiring human pose information in a geographic coordinate system, and converting the position coordinates returned by UWB into that coordinate system; (2) calculating the coordinates of the root point (pelvis) with a forward-kinematics tree according to which foot is in contact with the ground; (3) fusing the root-point coordinates computed from the inertial sensor network with the tag coordinates returned by UWB; (4) calculating the posture and position of every body segment from the fused root-point coordinates. The system comprises a data receiving module, a coordinate conversion module, a data fusion module and a posture calculation module. The invention obtains, in real time, the three-dimensional coordinates of a tag fixed at the root point by ultra-wideband positioning, and fuses them by unscented Kalman filtering to obtain smoother and more accurate root-point coordinates, so that distance and position are determined accurately and the human motion posture is tracked and displayed accurately in real time.
Description
Technical Field
The invention relates to the technical field of human-robot collaboration, and in particular to three-dimensional positioning and posture tracking of a test subject during human-robot collaboration.
Background
During human-robot collaboration, accurate capture and positioning of human motion postures is particularly important, since it bears directly on personnel safety. Existing motion-capture technologies are optical, mechanical, electromagnetic, acoustic, or inertial. Optical motion capture is the most mature, but it is expensive and strongly affected by illumination; mechanical capture systems use a mechanical linkage that is heavy and difficult to calibrate; electromagnetic systems determine position from local field strength and phase and are easily disturbed by magnetic fields in the space, which degrades estimation accuracy; ultrasonic systems compute the spatial position of each sensor from ultrasonic time of flight, but must stay within line of sight and are easily affected by the propagation environment and path. Inertial sensors need no external auxiliary equipment, are light and small, are immune to occlusion, and place few demands on lighting and environment, so they are widely used. A motion-capture system based on inertial sensors only requires the sensors to be worn on designated body segments to acquire attitude data, with displacement obtained by double integration of acceleration. However, although an inertial sensor determines the rotation of each joint very accurately, the error accumulated by integrating acceleration grows with use time, so accurate human displacement cannot be obtained from the inertial sensors alone.
Existing positioning approaches include ultrasonic, Bluetooth, infrared, GPS, ultra-wideband and inertial navigation. Ultrasonic positioning works on the time-of-arrival principle but has poor real-time performance. Bluetooth positioning has low power consumption and integrates easily with current devices, but its accuracy is unstable and easily disturbed. Infrared positioning requires the target to stay within line of sight and is susceptible to ambient light. GPS, the most widely used positioning method, is all-weather, real-time and continuous, but it cannot be used indoors, loses its signal (and therefore the target) under highly dynamic motion, and its cost is high, which hinders multi-anchor deployment.
Disclosure of Invention
To solve the above technical problems, the invention provides a method that combines an ultra-wideband (UWB) sensor with inertial sensors to realize real-time indoor positioning and motion-posture capture of a human body during human-robot collaboration, and a system implementing the method. The inertial sensors provide the attitude of each body segment; the real-time coordinates of the root point (pelvis) are computed from foot-ground contact; the three-dimensional coordinates of a tag fixed at the root point are obtained in real time by ultra-wideband positioning; smoother and more accurate root-point coordinates are obtained by unscented Kalman filter fusion; the whole-body posture is then computed, and the resulting posture and position information is assigned to a human model in software to reproduce the body's motion.
The invention discloses a human body real-time indoor positioning and motion-posture capturing method for human-robot collaboration, comprising the following steps:
(1) Acquiring human pose information in a geographic coordinate system, and converting the position coordinates returned by the ultra-wideband positioning system into that coordinate system;
(2) Calculating the coordinates of the root point (pelvis) with a forward-kinematics tree according to which foot is in contact with the ground;
(3) Fusing the root-point coordinates computed from the inertial sensor network with the tag coordinates returned by UWB;
(4) Calculating the posture and position of every body segment from the fused root-point coordinates.
The coordinate systems used are:
Geographic coordinate system: the initial position is the origin; the positive X axis points east, the positive Y axis points north, and by the right-hand rule the positive Z axis points skyward, i.e. an East-North-Up frame;
Sensor coordinate system: the origin is the centre of the sensor; the positive X axis is parallel to the long side of the sensor and points right, the positive Y axis is parallel to the short side and points forward, and the positive Z axis is perpendicular to the sensor and points up;
Base coordinate system: the local frame of every joint is aligned with the root-point frame, which serves as the base frame; the positive X axis points to the person's left, the positive Y axis points behind the person, and by the right-hand rule the positive Z axis points vertically up;
UWB local positioning system coordinate system: a local frame determined by the base-station layout; base station A0 is selected as the origin, the positive X axis points from A0 to A3, the positive Y axis points from A0 to A1 perpendicular to the X axis, and the Z axis is perpendicular to the XY plane and points skyward.
The specific process of acquiring the pose information of the human body in the geographic coordinate system in step (1) is as follows:
An inertial measurement unit (IMU) suit is fixed on the body so that the inertial sensors automatically capture motion data at the points of attachment. Twelve wireless inertial measurement units are used, fixed respectively on the upper torso, lower torso, left upper arm, right upper arm, left forearm, right forearm, left thigh, right thigh, left calf, right calf, left foot and right foot. The Link Track module provided by NoopLoop is selected as the ultra-wideband (UWB) positioning system: four modules serve as base stations (A0, A1, A2, A3) placed on level ground, and one module is placed on the upper torso as the tag (T0).
The inertial measurement units transmit the processed angular velocities and quaternions to a host computer, which simultaneously collects the tag coordinates from the UWB system. First, while the subject walks, the root-point coordinates are computed from the coordinates of the foot in contact with the ground and fused with the tag position returned by the UWB system, giving pelvis coordinates of higher positioning accuracy. Second, with the pelvis chosen as the root point, the inertial units capture the motion of each body segment in real time; the posture and position of every segment are solved from the tree structure with forward kinematics, and the pose obtained in real time is assigned to a three-dimensional human model to drive its motion.
The specific process of converting the position coordinates returned by UWB into the geographic coordinate system in step (1) is as follows:
Assuming that the relationship between the sensor frame (s) and the base frame (b) remains unchanged during motion, the constant rotation $R^{s}_{b}$ from the base frame to the sensor frame is determined first. The attitude of the sensor frame in the geographic frame (g), $R^{g}_{s}$, is computed by the direction cosine matrix algorithm, and the representation of the base frame in the geographic frame follows from composing the two transforms:
$$R^{g}_{b} = R^{g}_{s}\, R^{s}_{b}.$$
The unit quaternion $q = [\,q_0\; q_1\; q_2\; q_3\,]$ represents the rotation matrix as
$$R = \begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2)\\ 2(q_1 q_2 + q_0 q_3) & q_0^2-q_1^2+q_2^2-q_3^2 & 2(q_2 q_3 - q_0 q_1)\\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2-q_1^2-q_2^2+q_3^2 \end{bmatrix}.$$
The coordinate values obtained in the local positioning system frame are then converted into the geographic frame.
The specific process of calculating the root-point (pelvis) coordinates in step (2) is as follows:
When the left foot touches the ground, the root-point coordinates are obtained as follows:
Since the limb segment lengths of the body are known, the coordinates of the left foot at the initial moment are known, and the root point can be computed from them. Let $p^{la}_{lk}$ denote the fixed position of the left knee joint in the left-ankle frame; its homogeneous transformation is written as
$$T = \begin{bmatrix} R_{lc} & p^{la}_{lk}\\ 0 & 1 \end{bmatrix},$$
where $R_{lc}$ is the rotation matrix of the left calf. The rotation matrix $R_{lt}$ of the left thigh is obtained in the same way, and the displacement of the root point relative to the left ankle at the moment of left-foot contact is
$$d = R_{lc}\, p^{la}_{lk} + R_{lt}\, p^{lk}_{lh} + R_{p}\, p^{lh}_{r},$$
where $p^{lk}_{lh}$ and $p^{lh}_{r}$ are the fixed knee-to-hip and hip-to-root offsets and $R_{p}$ is the pelvis rotation. If the left-ankle coordinates at this moment are $p_{la}$, the real-time root-point coordinates are
$$p_{r} = p_{la} + d.$$
The right-ankle coordinates are then estimated from the root point. The position of the right ankle relative to the root point is
$$p_{ra} - p_{r} = R_{p}\, p^{r}_{rh} + R_{rt}\, p^{rh}_{rk} + R_{rc}\, p^{rk}_{ra},$$
so the real-time right-ankle coordinates are
$$p_{ra} = p_{r} + R_{p}\, p^{r}_{rh} + R_{rt}\, p^{rh}_{rk} + R_{rc}\, p^{rk}_{ra}.$$
When the right foot touches the ground, the root-point and left-ankle coordinates are computed in the same way, with the roles of the two legs exchanged.
When both feet touch the ground, the root-point coordinates are computed separately from the grounded left ankle and from the grounded right ankle, and the mean of the two values is taken as the root-point coordinate.
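The chain just described can be sketched as follows. The segment offsets below are placeholder values, not measurements from the patent, and mirrored left/right offsets are collapsed into one set for brevity.

```python
import numpy as np

# Assumed segment offsets (metres), expressed in each parent's local frame;
# the real values come from the subject's measured limb lengths.
SHANK = np.array([0.0, 0.0, 0.40])        # ankle -> knee
THIGH = np.array([0.0, 0.0, 0.45])        # knee -> hip
HIP_TO_ROOT = np.array([0.10, 0.0, 0.0])  # hip -> pelvis centre

def root_from_grounded_ankle(p_ankle, R_calf, R_thigh, R_pelvis):
    """Forward-kinematics chain from the grounded ankle up to the root point:
    each fixed offset is rotated by its segment's current attitude and summed."""
    return p_ankle + R_calf @ SHANK + R_thigh @ THIGH + R_pelvis @ HIP_TO_ROOT

def root_both_feet(p_left_ankle, rots_left, p_right_ankle, rots_right):
    """Both feet grounded: average the two independent estimates."""
    return 0.5 * (root_from_grounded_ankle(p_left_ankle, *rots_left)
                  + root_from_grounded_ankle(p_right_ankle, *rots_right))
```

With identity rotations (a subject standing upright) the root point is simply the ankle position plus the stacked offsets, which is a quick sanity check for the chain.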
The data fusion of step (3) fuses the XY-axis data with an unscented Kalman filter (UKF) algorithm, while a Kalman filter (KF) provides the filtering estimate of the Z-axis position coordinate computed from the IMU.
The data obtained from the inertial sensors establish the system state equation and the data obtained from UWB establish the system observation equation; the fusion algorithm proceeds as follows:
(1) Assuming an initial state estimate and a variance estimate;
(2) Selecting a Sigma point set according to a proportional correction symmetric sampling strategy;
(3) Carrying out one-step prediction on the Sigma point set;
(4) Calculating a mean value and a covariance by using the prediction result;
(5) Obtaining a new Sigma point set from the predicted value by applying the unscented transform (UT) again;
(6) Predicting an observation value by using a nonlinear observation function;
(7) Calculating an observation prediction mean value and a covariance;
(8) Calculating a filtering gain;
(9) The state and variance are updated.
Filtering estimation is carried out on the Z-axis position coordinate based on IMU calculation, and the specific filtering algorithm flow is as follows:
(1) Assuming an initial state estimate and a variance estimate;
(2) Calculating a priori state estimation value;
(3) Calculating prior error covariance;
(4) Updating the filtering gain;
(5) Updating the observed value;
(6) The error covariance is updated.
The specific process of calculating the posture and position of the whole-body segments in step (4) is as follows:
A local coordinate system centred on the pelvis is established. The position of the left shoulder joint relative to the pelvis frame is known (the segment length is fixed); the relative displacement during motion is computed with the corresponding rotation matrix, giving the left-shoulder coordinates;
With the left shoulder joint as origin, the position of the left elbow joint in the left-shoulder frame is known (the segment length is fixed); the relative displacement during motion is computed with the corresponding rotation matrix, giving the left-elbow coordinates;
With the left elbow joint as origin, the position of the left wrist joint in the left-elbow frame is known (the segment length is fixed); the relative displacement during motion is computed with the corresponding rotation matrix, giving the left-wrist coordinates;
The coordinates of the right shoulder, right elbow and right wrist are derived in the same way;
Next, the lower-body joint coordinates are calculated:
With the pelvis as the origin of the root-point frame, the position of the left hip joint in the root-point frame is known (the segment length is fixed); the rotation matrix computed from the inertial sensor fixed on the lower torso gives the left-hip coordinates;
With the left hip joint as origin, the position of the left knee joint in the left-hip frame is known (the segment length is fixed); the rotation matrix computed from the inertial sensor fixed on the thigh gives the left-knee coordinates;
With the left knee joint as origin, the position of the left ankle joint in the left-knee frame is known (the segment length is fixed); the rotation matrix computed from the inertial sensor fixed on the calf gives the left-ankle coordinates;
The coordinates of the right hip, right knee and right ankle joints are computed in the same way.
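Every step of the joint chain above is the same operation: child = parent + rotation × fixed offset. A minimal sketch, with all names and offset values our own illustrative assumptions:

```python
import numpy as np

def chain_joint(p_parent, R_segment, offset):
    """One step of the kinematic tree: child joint position = parent joint
    position plus the fixed parent->child offset rotated by the segment attitude."""
    return p_parent + R_segment @ offset

def left_arm_joints(p_pelvis, R_torso, R_upper_arm, R_forearm,
                    d_shoulder, d_elbow, d_wrist):
    """Walk pelvis -> shoulder -> elbow -> wrist; the offsets are the fixed
    segment lengths, the rotations come from the worn IMUs."""
    p_shoulder = chain_joint(p_pelvis, R_torso, d_shoulder)
    p_elbow = chain_joint(p_shoulder, R_upper_arm, d_elbow)
    p_wrist = chain_joint(p_elbow, R_forearm, d_wrist)
    return p_shoulder, p_elbow, p_wrist
```

The right arm and both legs reuse `chain_joint` with their own offsets and IMU-derived rotations, exactly as the text derives them "in the same way".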
The system implementing the above human body real-time indoor positioning and motion-posture capturing method specifically comprises:
Data receiving module: receives the quaternions and angular velocities returned by the IMUs and the position information returned by UWB;
Coordinate conversion module: converts the posture information expressed in the human base coordinate system and the position information returned by UWB into the geographic coordinate system;
Data fusion module: computes the root-point coordinates from the pose matrices with the grounded foot as origin, and fuses them with the tag coordinates returned by UWB; the IMU data establish the system state equation and the UWB data establish the system observation equation;
Posture calculation module: computes the position and posture of every body segment from the fused root-point coordinates and submits them to the motion-posture display module of the software, which reconstructs the human motion model in real time.
The invention obtains the posture of each body segment from the inertial sensors, computes the real-time root-point coordinates from foot-ground contact, obtains in real time the three-dimensional coordinates of the tag fixed at the root point by ultra-wideband positioning, and fuses them by unscented Kalman filtering into smoother and more accurate root-point coordinates, thereby achieving accurate human displacement, accurate determination of distance and position, and reproduction of the body's posture and motion.
Drawings
Fig. 1 is a schematic view of the placement of an inertial measurement unit on a person.
Fig. 2 is a diagram of a test environment incorporating UWB.
Fig. 3 is a diagram of the kinematic sequence used to compute the root-point coordinates when the left foot touches the ground.
Fig. 4 is a diagram of the kinematic sequence used to compute the root-point coordinates when the right foot touches the ground.
Fig. 5 is a schematic diagram of computing the posture and coordinates of every body segment with the root point (pelvis) as the centre point.
Detailed Description
The principles and features of the invention are described below in conjunction with the drawings, which are given by way of illustration and are not to be construed as limiting the scope of the invention.
The first part is the inertial measurement suit. The modules are small, light, long-endurance and flexibly configurable, capture whole-body motion, and are placed as shown in Fig. 1: the invention selects twelve wireless inertial measurement units (Xsens MTw Awinda), fixed respectively on the upper torso, lower torso, left upper arm, right upper arm, left forearm, right forearm, left thigh, right thigh, left calf, right calf, left foot and right foot. The second part is the ultra-wideband (UWB) positioning system, the Link Track module provided by NoopLoop, which can act simultaneously as tag, base station or console. To improve three-dimensional positioning accuracy, four modules are used as base stations and one module is placed on the tester's upper torso as the tag; the base-station modules are placed on level ground on tripods, and the whole test system is shown in Fig. 2. Throughout the test, the IMUs transmit the processed angular velocities and quaternions to the host software, which simultaneously acquires the tag coordinates from the UWB system; the subsequent data solving is carried out in the software and is summarised in two parts. First, while the subject walks, the root-point coordinates are computed from the coordinates of the foot in contact with the ground and fused with the tag position returned by UWB, yielding pelvis coordinates of higher positioning accuracy.
Second, with the pelvis selected as the root point, the multi-IMU system captures the motion of each body segment in real time; the posture and position of every segment are solved from the tree structure with forward kinematics, and the real-time pose is assigned to the three-dimensional human model to drive its motion.
Coordinate systems used in this example:
Geographic coordinate system: the initial position is generally taken as the origin; the positive X axis points east, the positive Y axis points north, and by the right-hand rule the positive Z axis points skyward, i.e. an East-North-Up frame;
Sensor coordinate system: the origin is the centre of the sensor; the positive X axis is parallel to the long side of the sensor and points right, the positive Y axis is parallel to the short side and points forward, and the positive Z axis is perpendicular to the sensor and points up;
Base coordinate system: the local frame of every joint is aligned with the root-point frame, which serves as the base frame; the positive X axis points to the person's left, the positive Y axis points behind the person, and by the right-hand rule the positive Z axis points vertically up;
UWB local positioning system coordinate system: a local frame determined by the base-station layout; base station A0 is generally chosen as the origin, the positive X axis points from A0 to A3, the positive Y axis points from A0 to A1 perpendicular to the X axis, and the Z axis is perpendicular to the XY plane and points skyward.
Assuming that the relationship between the sensor frame (s) and the base frame (b) remains unchanged during motion, the constant rotation $R^{s}_{b}$ from the base frame to the sensor frame is determined first. The attitude of the sensor frame in the geographic frame (g), $R^{g}_{s}$, is computed by the direction cosine matrix algorithm, and the representation of the base frame in the geographic frame follows from composing the two transforms:
$$R^{g}_{b} = R^{g}_{s}\, R^{s}_{b}.$$
The unit quaternion $q = [\,q_0\; q_1\; q_2\; q_3\,]$ represents the rotation matrix as
$$R = \begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2)\\ 2(q_1 q_2 + q_0 q_3) & q_0^2-q_1^2+q_2^2-q_3^2 & 2(q_2 q_3 - q_0 q_1)\\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2-q_1^2-q_2^2+q_3^2 \end{bmatrix}.$$
The coordinate values obtained in the local positioning system frame are then converted into the geographic frame.
The pelvis coordinates are computed according to which foot touches the ground.
While walking, a person always has either one foot or both feet in contact with the ground; the two cases are discussed separately.
The angular velocity returned by the sensors fixed on the feet is used to judge which foot is on the ground and which is in the air, and a different forward-kinematics tree is selected for estimating the root-point coordinates depending on which foot is grounded. Each joint carries a local coordinate system aligned with the base coordinate system.
When the left foot touches the ground, as shown in Fig. 3, the root point is obtained as follows:
Since the limb segment lengths of the body are known, the coordinates of the left foot at the initial moment are known, and the root point can be computed from them. Let $p^{la}_{lk}$ be the fixed position of the left knee joint in the left-ankle frame; its homogeneous transformation can be written as
$$T = \begin{bmatrix} R_{lc} & p^{la}_{lk}\\ 0 & 1 \end{bmatrix},$$
where $R_{lc}$ is the rotation matrix of the left calf. The rotation matrix $R_{lt}$ of the left thigh is obtained in the same way, and the displacement of the root point relative to the left ankle at left-foot contact may be expressed as
$$d = R_{lc}\, p^{la}_{lk} + R_{lt}\, p^{lk}_{lh} + R_{p}\, p^{lh}_{r}.$$
If the left-ankle coordinates at this moment are $p_{la}$, the real-time root-point coordinates are
$$p_{r} = p_{la} + d.$$
The right-ankle coordinates are then estimated from the root point; the position of the right ankle relative to the root point is expressed as
$$p_{ra} - p_{r} = R_{p}\, p^{r}_{rh} + R_{rt}\, p^{rh}_{rk} + R_{rc}\, p^{rk}_{ra},$$
so the real-time right-ankle coordinates are
$$p_{ra} = p_{r} + R_{p}\, p^{r}_{rh} + R_{rt}\, p^{rh}_{rk} + R_{rc}\, p^{rk}_{ra}.$$
The computation of the root-point and left-ankle coordinates when the right foot touches the ground follows the same method, as shown in Fig. 4, and is not repeated here.
When both feet touch the ground, the root-point coordinates can be computed from the grounded left ankle and, equally, from the grounded right ankle, and the mean of the two values is used.
The UWB system returns the tag position coordinates.
The base stations are mounted on tripods with antennas pointing up, and the tag is placed at the tester's pelvis. When every module is switched on, the tag and base stations communicate in real time and the tag's real-time position coordinates are returned. Before being fused with the root-point coordinates computed from the IMUs, the coordinate values returned by UWB are first pre-processed. Because UWB is susceptible to multipath effects and non-line-of-sight conditions, both abrupt jumps and values that stay exactly constant tend to occur. If the value at some moment is an abrupt jump, the value of the previous moment is used in its place; if the same value appears many times in succession, occurrences beyond the fifth are directly ignored.
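The pre-processing rule above can be sketched for a single coordinate stream. The jump threshold is an assumed tuning constant, not a value given by the patent:

```python
def preprocess_uwb(samples, jump_thresh=0.5, max_repeat=5):
    """Clean a 1-D stream of UWB readings: replace sudden jumps with the
    previous value, and drop readings that repeat unchanged more than
    `max_repeat` consecutive times (stale tag output).
    `jump_thresh` (metres) is an assumed constant, not from the patent."""
    out, repeat = [], 0
    for s in samples:
        if out and abs(s - out[-1]) > jump_thresh:
            s = out[-1]                 # mutation value: hold the previous value
        if out and s == out[-1]:
            repeat += 1
            if repeat > max_repeat:     # stale: ignore this reading
                continue
        else:
            repeat = 0
        out.append(s)
    return out
```

For example, a 10 m spike between two ~0 m readings is replaced by the held previous value, while a reading frozen for more than five consecutive samples stops being accepted.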
Fusion of the root-point coordinate information in the positioning algorithm.
In the data fusion, an unscented Kalman filter (UKF) fuses the IMU and UWB position data in the X-axis and Y-axis directions. It approximates the probability density of the nonlinear function by the posterior probability density of the samples, which avoids the computational complexity and cost of the extended Kalman filter and effectively improves parameter-estimation accuracy and system stability. Because the error of the UWB coordinates in the Z-axis direction is too large, only a simple Kalman filter (KF) estimate is applied to the Z-axis position coordinate computed from the IMU.
The position coordinate data in the XY-axis directions are fused as follows.
The data obtained from the IMU sensors establish the system state equation and the data obtained from UWB establish the system observation equation; the fusion algorithm proceeds as follows:
(1) Assuming an initial state estimate and a covariance estimate;
(2) Selecting a Sigma point set according to a proportional-correction symmetric sampling strategy;
(3) Carrying out one-step prediction on the Sigma point set;
(4) Calculating the mean and covariance from the prediction results;
(5) Obtaining a new Sigma point set from the predicted value by applying the unscented transform (UT) again;
(6) Predicting the observation with the nonlinear observation function;
(7) Calculating the predicted observation mean and covariance;
(8) Calculating the filter gain;
(9) Updating the state and variance.
The initial state vector is $X(k) = [\,x_x(k)\; x_y(k)\,]^T$, the root-point coordinates in the X- and Y-axis directions computed from the inertial sensor network; since the measured data themselves contain the process noise, the system state equation simplifies to $X(k+1) = F\,X(k)$, where the system state-transition matrix $F$ is the identity in this example. The observation vector is $Z(k) = [\,z_x(k)\; z_y(k)\,]^T$, the tag coordinates in the X- and Y-axis directions of the UWB positioning system; the measured data contain the measurement noise, so the observation equation simplifies to $Z(k) = H\,X(k)$, where the observation matrix $H$ is the identity. During data fusion the UWB measurements correct the IMU-computed values in real time, giving smoother and more accurate root-point coordinates.
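The nine fusion steps can be condensed into one predict/update cycle of a standard unscented Kalman filter. This is a generic sketch, not the patent's code: the function names, the scaled-sigma-point parameters (`alpha`, `beta`, `kappa`) and their defaults are our own assumptions.

```python
import numpy as np

def sigma_points(x, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Scaled symmetric sigma points and weights (the 'proportional-correction
    symmetric sampling strategy'). alpha in (0, 1]; 1e-3 is also common."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)   # matrix square root of (n+lam)P
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + 1 - alpha**2 + beta
    return np.array(pts), Wm, Wc

def ukf_step(x, P, z, f, h, Q, R):
    """One UKF cycle: f is the state function, h the observation function."""
    pts, Wm, Wc = sigma_points(x, P)
    fx = np.array([f(p) for p in pts])      # one-step prediction of each point
    x_pred = Wm @ fx
    P_pred = Q + sum(Wc[i] * np.outer(fx[i] - x_pred, fx[i] - x_pred)
                     for i in range(len(pts)))
    # re-draw sigma points around the prediction (UT again) and push through h
    pts2, Wm2, Wc2 = sigma_points(x_pred, P_pred)
    hz = np.array([h(p) for p in pts2])
    z_pred = Wm2 @ hz
    Pzz = R + sum(Wc2[i] * np.outer(hz[i] - z_pred, hz[i] - z_pred)
                  for i in range(len(pts2)))
    Pxz = sum(Wc2[i] * np.outer(pts2[i] - x_pred, hz[i] - z_pred)
              for i in range(len(pts2)))
    K = Pxz @ np.linalg.inv(Pzz)            # filter gain
    return x_pred + K @ (z - z_pred), P_pred - K @ Pzz @ K.T
```

With `f` and `h` both the identity (as in the simplified equations above, F = H = I), the UKF reduces to the linear Kalman result, which makes the sketch easy to sanity-check.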
Next, filtering estimation is performed on the Z-axis position coordinate calculated from the IMU using a Kalman filtering algorithm.
The specific filtering algorithm flow is as follows:
(7) An initial state estimate and a variance estimate are assumed;
(8) Calculating a priori state estimation value;
(9) Calculating prior error covariance;
(10) Updating the filtering gain;
(11) Updating the observation value;
(12) The error covariance is updated.
Since the state variable of the Kalman filter here is only the Z-axis coordinate from the IMU, the filter equations simplify. The system state equation is X(k) = AX(k-1) with state transition matrix A = [1]; the observation equation is Z(k) = HX(k-1) with observation matrix H = [1]. The error covariance is predicted as P(k) = P(k-1) + Q, the Kalman gain is K(k+1) = P(k)/(P(k) + R), the state update estimate is X(k+1) = X(k) + K(k+1)(Z(k) - X(k)), and the updated error covariance is P(k+1) = (1 - K(k+1))P(k).
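The simplified scalar filter above can be transcribed directly. The noise values q and r, the initial state, and the function name are assumptions for this sketch, not values from the disclosure:

```python
# Scalar Kalman filter for the IMU-derived Z coordinate, with A = H = [1].
def kf_z(measurements, q=1e-3, r=1e-2, x0=0.0, p0=1.0):
    """Filter a sequence of Z-coordinate readings; returns the per-step estimates."""
    x, p = x0, p0
    out = []
    for z in measurements:
        # prior error covariance: P(k) = P(k-1) + Q  (prior state is unchanged, A = 1)
        p = p + q
        # Kalman gain: K(k+1) = P(k) / (P(k) + R)
        k = p / (p + r)
        # state update: X(k+1) = X(k) + K(k+1) * (Z(k) - X(k))
        x = x + k * (z - x)
        # covariance update: P(k+1) = (1 - K(k+1)) * P(k)
        p = (1 - k) * p
        out.append(x)
    return out
```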
At this point, the coordinates of the center point of the pelvis are obtained, and the posture information and position information of the whole-body trunk can be calculated, as shown in Fig. 5. The quaternion returned by each inertial sensor is unitized to express the posture information of the corresponding trunk segment.
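The unitization step can be sketched as follows, converting a raw sensor quaternion into a unit quaternion and the corresponding rotation matrix. The scalar-first ordering q = [q0, q1, q2, q3] and the Hamilton convention are assumptions here, since the disclosure does not state the sensor's quaternion convention:

```python
# Unitize a sensor quaternion and convert it to a 3x3 rotation matrix.
import numpy as np

def quat_to_rotmat(q):
    """Normalize q = [q0, q1, q2, q3] (scalar first) and return the rotation matrix."""
    q0, q1, q2, q3 = np.asarray(q, dtype=float) / np.linalg.norm(q)
    return np.array([
        [1 - 2*(q2**2 + q3**2), 2*(q1*q2 - q0*q3),     2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),     1 - 2*(q1**2 + q3**2), 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),     2*(q2*q3 + q0*q1),     1 - 2*(q1**2 + q2**2)],
    ])
```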
The position and posture information of each trunk segment of the whole body is then calculated from the fused tracking point coordinates.
Establishing a local coordinate system centered on the pelvis, the relative coordinate of the left shoulder joint in the pelvis coordinate system is known and recorded as p_ls. Converting the quaternion returned by the IMU fixed to the upper body into a rotation matrix R_up, the left shoulder joint coordinates can be expressed as:

P_shoulder = P_pelvis + R_up · p_ls
Taking the left shoulder joint as the origin of the coordinate system, the relative coordinate of the left elbow joint in the left shoulder joint coordinate system is known and recorded as p_le. Converting the quaternion returned by the IMU fixed to the left upper arm into a rotation matrix R_arm1, the left elbow joint coordinates can be expressed as:

P_elbow = P_shoulder + R_arm1 · p_le
Taking the left elbow joint as the origin of the coordinate system, the relative coordinate of the left wrist joint in the left elbow joint coordinate system is known and recorded as p_lw. Converting the quaternion returned by the IMU fixed to the left forearm into a rotation matrix R_arm2, the left wrist coordinates can be expressed as:

P_wrist = P_elbow + R_arm2 · p_lw
the coordinates of the right shoulder joint, the right elbow joint and the right wrist are calculated in the same derivation mode.
Next, the coordinates of the joints of the lower body are calculated.
Taking the pelvis as the origin of the heel point coordinate system, the relative position of the left hip joint in the heel point coordinate system is known and recorded as p_lh. Calculating a rotation matrix R_down from the inertial sensor fixed to the lower body, the left hip joint coordinates are obtained as:

P_hip = P_pelvis + R_down · p_lh
Taking the left hip joint as the origin of the local coordinate system, the relative position of the left knee joint in the left hip joint coordinate system is known and recorded as p_lk. Calculating a rotation matrix R_leg1 from the inertial sensor fixed to the thigh, the left knee joint coordinates are obtained as:

P_knee = P_hip + R_leg1 · p_lk
Taking the left knee joint as the origin of the local coordinate system, the relative position of the left ankle joint in the left knee joint coordinate system is known and recorded as p_la. Calculating a rotation matrix R_leg2 from the inertial sensor fixed to the lower leg, the left ankle joint coordinates are obtained as:

P_ankle = P_knee + R_leg2 · p_la
the coordinates of the right hip joint, the right knee joint and the right ankle joint are calculated in the same way.
In this way, the three-dimensional position information and posture information of the person are transmitted to the virtual human body model in the software, realizing real-time tracking of the person's movement posture.
Accordingly, the system in this example specifically includes:
a data receiving module: receiving quaternions and angular velocities returned by the IMU and position information returned by the UWB;
A coordinate conversion module: converting the posture coordinate information in the human body base coordinate system and the position information returned by the UWB into the geographic coordinate system;
a data fusion module: and (3) taking the touch ground leg as an original point, calculating a tracking point coordinate by adopting a pose matrix, fusing the tracking point coordinate with a label coordinate returned by the UWB, establishing a system state equation by using data obtained by the IMU sensor, and establishing a system observation equation by using data obtained by the UWB.
A posture calculation module: calculating the position information and posture information of each trunk segment of the whole body from the fused tracking point coordinates, and submitting them to the motion posture display module of the software to reconstruct the human motion model in real time.
Claims (3)
1. A human body real-time indoor positioning and motion posture capturing method in man-machine cooperation is characterized by comprising the following steps:
(1) Acquiring pose information of a human body under a geographic coordinate system, and converting position coordinates transmitted back by the UWB into the geographic coordinate system;
(2) Calculating the coordinates of the tracking points by using a forward kinematics tree according to the foot touchdown condition;
(3) Performing data fusion on the tracking point coordinate calculated based on the inertial sensor network and the label coordinate returned by the UWB to obtain a fused tracking point coordinate value;
(4) Calculating posture information and position information of the whole trunk according to the fused tracking point coordinate values;
the geographic coordinate system refers to: a coordinate system with the initial position as the origin, the positive X-axis pointing east, the positive Y-axis pointing north, and the positive Z-axis pointing skyward according to the right-hand rule, i.e., the east-north-up coordinate system;
the specific process of acquiring the pose information of the human body under the geographic coordinate system in the step (1) is as follows:
inertial measurement units are fixed on the human body, and the motion data of the body parts they are attached to are automatically captured and collected through the inertial sensors; 12 wireless inertial measurement units are selected and fixed respectively on the upper trunk, the lower trunk, the left and right upper arms, the left and right forearms, the left and right thighs, the left and right calves, and the left and right feet; the Link Track module provided by NoopLoop is selected as the ultra-wideband positioning system, 4 modules are used as base stations, 1 module is placed on the upper trunk of the human body as a tag, and the base station modules are placed on horizontal ground;
the inertial measurement units transmit the processed angular velocities or quaternions to the upper computer, and the tag coordinates in the UWB system are acquired at the same time; first, the coordinates of the tracking point are calculated from the coordinates of the grounded foot while the person walks, and are fused with the tag position information returned by the UWB system to obtain pelvis coordinate information with higher positioning precision; second, the pelvis is selected as the tracking point, the inertial measurement units capture the motion information of each trunk segment in real time, the posture information and position information of each trunk segment are solved according to the tree structure and forward kinematics functions, and the pose information obtained in real time is assigned to the three-dimensional human body model to drive it to move;
the specific process of converting the position coordinates returned by the UWB into the geographic coordinate system in the step (1) is as follows:
determining the relationship between the base coordinate system and the sensor coordinate system, assuming that this relationship remains unchanged during movement; the representation of the sensor coordinate system in the geographic coordinate system is calculated by the direction cosine matrix algorithm, and the representation of the base coordinate system in the geographic coordinate system is then obtained through coordinate transformation between the coordinate systems; the specific conversion relationship is as follows:
the unitized quaternion q = [q_0 q_1 q_2 q_3] is used to represent the rotation matrix; the formula is as follows:
converting coordinate values obtained under a local positioning system coordinate system into a geographic coordinate system;
the specific process of calculating the tracking point coordinates in the step (2) is as follows:
when the left foot touches the ground, the specific process of acquiring the coordinates of the person tracking points is as follows:
since the lengths of the human limbs are known, the coordinates of the left foot at the initial moment are known, and the tracking point coordinates are calculated; the coordinates of the left knee joint in the coordinate system taking the left ankle joint as the origin are set as known values, and the transformation matrix is written as:
the rotation matrix representing the left thigh is obtained in the same way, and the displacement of the tracking point relative to the left ankle when the left foot touches the ground is expressed as:
if the coordinates of the left ankle at this moment are known, the real-time coordinates of the tracking point are as follows:
the coordinate value of the right ankle is then estimated from the tracking point coordinate, and the coordinate of the right ankle relative to the tracking point is expressed as:
the right ankle real-time coordinates are expressed as:
the process of calculating the tracking point coordinate and the left ankle coordinate when the right foot touches the ground is the same as the process of calculating the tracking point coordinate and the right ankle coordinate when the left foot touches the ground;
when both feet touch the ground, the tracking point coordinates are calculated separately from the left ankle and the right ankle touchdowns, and the average of the two coordinate values is taken as the tracking point coordinate when both feet are grounded;
the data fusion in step (3) fuses the data in the X-axis and Y-axis directions using the unscented Kalman filtering algorithm, and performs filtering estimation on the Z-axis position coordinate calculated from the IMU using the Kalman filtering algorithm;
the data obtained by the inertial sensors are used to establish the system state equation, and the data obtained by the UWB are used to establish the system observation equation; the specific fusion algorithm flow is as follows:
(1) Assuming an initial state estimate and a variance estimate;
(2) Selecting a Sigma point set according to a proportional correction symmetric sampling strategy;
(3) Carrying out one-step prediction on the Sigma point set;
(4) Calculating a mean value and a covariance by using the prediction result;
(5) Obtaining a new Sigma point set by using the UT transform again according to the predicted value;
(6) Predicting an observation value by using a nonlinear observation function;
(7) Calculating an observation prediction mean and covariance;
(8) Calculating a filtering gain;
(9) Updating the state and variance;
filtering estimation is performed on the Z-axis position coordinate calculated from the IMU; the specific filtering algorithm flow is as follows:
(1) An initial state estimate and a variance estimate are assumed;
(2) Calculating a priori state estimation value;
(3) Calculating prior error covariance;
(4) Updating the filtering gain;
(5) Updating the observed value;
(6) The error covariance is updated.
2. The human body real-time indoor positioning and motion gesture capturing method in human-computer cooperation according to claim 1, wherein the specific process of calculating the whole body trunk gesture information and the position information in the step (4) is as follows:
establishing a local coordinate system taking a pelvis as a center, wherein the relative coordinate of the left shoulder joint in the pelvis coordinate system is known, and calculating the relative displacement in the motion process by utilizing a rotation matrix so as to further calculate the coordinate of the left shoulder joint;
taking the left shoulder joint as the origin of a coordinate system, knowing the relative coordinate of the left elbow joint coordinate in the left shoulder joint coordinate system, calculating the relative displacement in the motion process by using a rotation matrix, and further calculating the left elbow joint coordinate;
taking the left elbow joint as the origin of a coordinate system, knowing the relative coordinate of the left wrist joint in the left elbow joint coordinate system, and calculating the relative displacement in the motion process by using a rotation matrix so as to calculate the coordinate of the left wrist joint;
calculating the coordinates of the right shoulder joint, the right elbow joint and the right wrist in the same derivation mode;
next, the lower body joint coordinates are calculated:
taking the pelvis as the origin of a heel point coordinate system, calculating a rotation matrix according to an inertial sensor fixed on the lower half body to obtain the left hip joint coordinate, wherein the relative position of the left hip joint coordinate in the heel point coordinate system is known;
taking a left hip joint as a local coordinate system origin, knowing the relative position of a left knee joint coordinate in a left hip joint coordinate system, and calculating a rotation matrix according to an inertial sensor fixed on a thigh to obtain a left knee joint coordinate;
taking the left knee joint as the origin of a local coordinate system, wherein the relative position of the left ankle joint coordinate in the left knee joint coordinate system is known, and calculating a rotation matrix according to an inertial sensor fixed on a shank to obtain the left ankle joint coordinate;
the coordinates of the right hip joint, the right knee joint and the right ankle joint are calculated in the same way.
3. A system for implementing the human body real-time indoor positioning and motion gesture capturing method in human-computer cooperation according to claim 1 or 2, characterized by comprising:
a data receiving module: receiving quaternions and angular velocities returned by the IMU and position information returned by the UWB;
a coordinate conversion module: converting the posture coordinate information in the human body base coordinate system and the position information returned by the UWB into the geographic coordinate system;
a data fusion module: taking the grounded leg as the origin, calculating the tracking point coordinate using the pose matrix and fusing it with the tag coordinate returned by the UWB, wherein the data obtained by the IMU sensor are used to establish the system state equation, and the data obtained by the UWB are used to establish the system observation equation;
and a posture calculation module: calculating the position information and posture information of each trunk segment of the whole body from the fused tracking point coordinates, and submitting them to the motion posture display module of the software to reconstruct the human motion model in real time.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110133657.5A CN112957033B (en) | 2021-02-01 | 2021-02-01 | Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112957033A CN112957033A (en) | 2021-06-15 |
CN112957033B true CN112957033B (en) | 2022-10-18 |
Family
ID=76272566
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110133657.5A Active CN112957033B (en) | 2021-02-01 | 2021-02-01 | Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112957033B (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113487674B (en) * | 2021-07-12 | 2024-03-08 | 未来元宇数字科技(北京)有限公司 | Human body pose estimation system and method |
CN113570711B (en) * | 2021-07-23 | 2023-05-09 | 上海工程技术大学 | Method for calculating three-dimensional coordinates of breast relative to trunk in motion |
CN116072291A (en) * | 2021-10-29 | 2023-05-05 | 华为终端有限公司 | Motion analysis method, motion analysis device, electronic equipment and computer storage medium |
CN114562993A (en) * | 2022-02-28 | 2022-05-31 | 联想(北京)有限公司 | Track processing method and device and electronic equipment |
CN115105059A (en) * | 2022-06-10 | 2022-09-27 | 深圳前海向纺未来科技有限公司 | Method and device for determining whole body posture of human body and intelligent shoes |
CN115824210B (en) * | 2022-11-08 | 2023-12-22 | 华北科技学院 | Fusion positioning method and system for indoor active fire-fighting robot |
CN116058829A (en) * | 2022-12-26 | 2023-05-05 | 青岛大学 | System for displaying human lower limb gesture in real time based on IMU |
CN116211290B (en) * | 2023-02-20 | 2024-01-30 | 汕头大学 | Ankle pump movement posture monitoring and evaluating method and system |
CN117322872A (en) * | 2023-10-26 | 2024-01-02 | 北京软体机器人科技股份有限公司 | Motion capturing method and device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103150016B (en) * | 2013-02-20 | 2016-03-09 | 兰州交通大学 | A kind of many human actions capture system merging ultra broadband location and inertia sensing technology |
US20190228583A1 (en) * | 2018-01-22 | 2019-07-25 | MassVR, LLC | Systems and methods for tracking object location and orientation in virtual reality environments using ultra-wideband signals, inertia measurement units, and reflective markers |
CN109916410B (en) * | 2019-03-25 | 2023-04-28 | 南京理工大学 | Indoor positioning method based on improved square root unscented Kalman filtering |
CN110109470B (en) * | 2019-04-09 | 2021-10-29 | 西安电子科技大学 | Combined attitude determination method based on unscented Kalman filtering and satellite attitude control system |
CN110375730B (en) * | 2019-06-12 | 2021-07-27 | 深圳大学 | Indoor positioning navigation system based on IMU and UWB fusion |
CN110978064B (en) * | 2019-12-11 | 2022-06-24 | 山东大学 | Human body safety assessment method and system in human-computer cooperation |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112957033B (en) | Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation | |
US10679360B2 (en) | Mixed motion capture system and method | |
Zihajehzadeh et al. | UWB-aided inertial motion capture for lower body 3-D dynamic activity and trajectory tracking | |
KR101739996B1 (en) | Moving robot and simultaneous localization and map-buliding method thereof | |
Yun et al. | Estimation of human foot motion during normal walking using inertial and magnetic sensor measurements | |
Zihajehzadeh et al. | A novel biomechanical model-aided IMU/UWB fusion for magnetometer-free lower body motion capture | |
Zhou et al. | Reducing drifts in the inertial measurements of wrist and elbow positions | |
US9357948B2 (en) | Method and system for determining the values of parameters representative of a movement of at least two limbs of an entity represented in the form of an articulated line | |
Tao et al. | A novel sensing and data fusion system for 3-D arm motion tracking in telerehabilitation | |
Yuan et al. | 3-D localization of human based on an inertial capture system | |
KR102226846B1 (en) | System for Positioning Hybrid Indoor Localization Using Inertia Measurement Unit Sensor and Camera | |
US9021712B2 (en) | Autonomous system and method for determining information representative of the movement of an articulated chain | |
EP3067783A1 (en) | Method and system to track human locomotion by relative positional tracking of body parts of the human | |
CN113793360B (en) | Three-dimensional human body reconstruction method based on inertial sensing technology | |
CN106843484B (en) | Method for fusing indoor positioning data and motion capture data | |
JP2013531781A (en) | Method and system for detecting zero speed state of object | |
CN109284006B (en) | Human motion capturing device and method | |
CN108957505A (en) | A kind of localization method, positioning system and portable intelligent wearable device | |
CN110609621B (en) | Gesture calibration method and human motion capture system based on microsensor | |
KR20120131553A (en) | method of motion tracking. | |
Gong et al. | Robust inertial motion tracking through deep sensor fusion across smart earbuds and smartphone | |
CN101531216A (en) | Robot device and method of controlling the same | |
CN109453505B (en) | Multi-joint tracking method based on wearable device | |
CN112256125B (en) | Laser-based large-space positioning and optical-inertial-motion complementary motion capture system and method | |
CN113268141A (en) | Motion capture method and device based on inertial sensor and fabric electronics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||