CN109284006B - Human motion capturing device and method - Google Patents

Human motion capturing device and method

Info

Publication number
CN109284006B
Authority
CN
China
Prior art keywords
motion
human body
limb
data
unit
Prior art date
Legal status
Active
Application number
CN201811334444.3A
Other languages
Chinese (zh)
Other versions
CN109284006A (en)
Inventor
吴健康
吴燃
Current Assignee
Zhongke Digital Health Research Institute Nanjing Co ltd
Original Assignee
Zhongke Digital Health Research Institute Nanjing Co ltd
Priority date
Filing date
Publication date
Application filed by Zhongke Digital Health Research Institute Nanjing Co ltd filed Critical Zhongke Digital Health Research Institute Nanjing Co ltd
Priority to CN201811334444.3A priority Critical patent/CN109284006B/en
Publication of CN109284006A publication Critical patent/CN109284006A/en
Application granted granted Critical
Publication of CN109284006B publication Critical patent/CN109284006B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The invention discloses a human body motion capture device and a corresponding method. The device comprises a motion measurement unit, a motion parameter estimation unit, an initialization unit, a gait detection unit and a displacement fusion unit. The motion measurement unit measures human limb motion data and environment data; the motion parameter estimation unit fuses the limb motion data with the environment data to derive the motion parameters and environment parameters of each limb; the initialization unit fuses the mutual constraint conditions and motion boundary conditions between the limbs to derive the initial operating parameters of the motion capture device; the gait detection unit detects the ground-contact state of the lower limbs at the current moment to obtain gait detection information; and the displacement fusion unit derives and outputs the posture and position information of the overall motion of the human body relative to the ground. The device is portable and practical, is well suited to being made into a wearable motion capture and analysis device, and has wide application in many fields.

Description

Human motion capturing device and method
Technical Field
The application relates to the technical field of human motion sensing, and in particular to a sensor-based human motion capture device and a related human motion capture method.
Background
By sensing and acquiring accurate posture and position information of human motion, the motion trajectories of an athlete's limbs can be analysed to identify problems and improve training, possible diseases can be inferred from changes in gait, advanced 3D games can be built on tracked body motion, and lifelike characters can be created for digital films and virtual worlds from tracked motion postures. However, the randomness and complexity of human movement and the diversity of the surrounding environment pose great challenges to real-time, accurate motion sensing and acquisition. A human motion sensing and acquisition technology that is not limited by space and time and can overcome interference from the external environment is therefore urgently needed, so that posture and position information of human motion can be obtained and reproduced, providing a key technology for applications in health monitoring, rehabilitation training, dance training, fitness analysis, digital film effects, virtual reality, games, human-computer interaction and other fields.
Currently, commonly used motion capture techniques can be broadly divided into two categories.
The first category relies mainly on arrays of high-precision cameras. Such systems use multiple high-precision, high-sampling-rate cameras to capture reflective markers placed on the athlete's joints; the commercial product Vicon is a representative example. Patents in this area include: U.S. patent publication No. 20080192116, "Real-time objects tracking and motion capture in sports event", a real-time moving-object tracking system that uses multiple cameras to detect and track moving objects but does not address the motion details of the objects themselves; U.S. patent No. 7457439, "System and method for motion capture", which recovers three-dimensional body motion from the camera-derived positions of markers on the athlete and a three-dimensional motion model, and compares motion states using that model; Chinese patent "Color tights based on motion acquisition", application No. 00264404, which designs a motion-acquisition suit that encodes human body parts with colour blocks; Chinese patent application No. 03120688, a method for processing passive optical motion capture data, which comprises acquiring synchronized multi-camera images of a subject wearing passive optical markers, obtaining a set of three-dimensional marker coordinates from the acquired data, determining the correspondence of markers across successive acquisitions so as to determine the positions of the body parts to which the markers are attached, determining the angle of each link of the motion model onto which the subject's motion is projected, and computing the subject's pose; and Chinese patent "A calibration method and device for a multi-camera system", application No. 200710062825, a new multi-camera method for reconstructing three-dimensional motion information based on marker points. The drawbacks of such systems are that they require a dedicated laboratory, suffer from lighting and occlusion problems, and are restricted by venue and application scenario; because they use many high-precision, high-sampling-rate cameras they are extremely expensive, structurally complex and inconvenient to use; moreover, they process large amounts of data and cannot capture human motion information in real time.
The second category attaches miniature sensors to the limbs to measure and estimate the three-dimensional azimuth and other motion information of each limb. Miniature sensors are small, low-power, measure directly and are easy to wear; they are not limited by space or time and are well suited to wearable motion analysis devices. Patents in this area include: the U.S. patent "System and Method for Motion Capture in Natural Environments" (IPC8 class: AGO1C2300FI), which places ultrasonic emitters and receivers on each body part to measure the position of that part and calibrates the position measurements with rotation angles measured by inertial sensors, thereby obtaining the body's motion parameters; however, the combined use of ultrasonic sensors and inertial sensors (accelerometers and gyroscopes) makes the whole acquisition system complicated. Chinese patent 200920108961.9 discloses a motion acquisition system using only, or mainly, miniature sensors; the system is based on a human motion model and covers sensor placement and wearing, motion parameter estimation, constraints among motion parameters, and motion reproduction of a three-dimensional human figure. The technical challenges that remain with such systems include: miniature sensors have inherent problems such as large measurement noise and systematic bias; inertial sensors can only measure rates of change (acceleration from an accelerometer, angular velocity from a gyroscope), so directly integrating to estimate position and angular motion produces drift in the estimated motion quantities; and the miniature sensors measure in a distributed manner, with a sensor unit attached to each limb, so the overall posture and position of the body cannot be obtained directly. Chinese patent 201110060074.0, "Human motion capture device", provides a portable motion capture device that estimates human motion parameters with an adaptive sensor data fusion technique and uses an initialization technique and a displacement estimation technique to fuse and estimate the posture and position of the body's overall motion. Its remaining challenges are that the miniature sensors still have inherent problems such as large measurement noise and systematic bias; its complementary Kalman filtering fuses each limb's motion data with the environment data but still produces large errors under strong or long-lasting interference; dynamically selecting among four measurement models over time increases the computational load and reduces the iteration rate; the algorithm is based on a decentralized filtering model, which raises the complexity of the whole system; noise and drift accumulate rapidly over time, so the accumulated error becomes significant within a short period; and the displacement algorithm addresses only a single, relatively simple motion gait.
Disclosure of Invention
The invention aims to solve the following problems: the existing camera-array-based motion capture technology requires a dedicated laboratory and is limited by the application site, suffers from lighting and occlusion, is extremely expensive to build, and produces a huge data volume that is difficult to process in real time; the existing miniature-sensor-based motion capture technology suffers from large measurement noise and systematic bias, its complementary Kalman filtering algorithm cannot eliminate the errors that long-lasting or large interference introduces into the body posture estimate, and its position estimation handles only single, simple gaits, which limits its applications.
In order to achieve the purpose of the invention, the invention provides a human body motion capture device, which comprises a motion measurement unit, a motion parameter estimation unit, an initialization unit, a gait detection unit and a displacement fusion unit, wherein the motion measurement unit is used for measuring human body limb motion data and environment data; the motion parameter estimation unit is used for fusing the human body limb motion data and the environment data and deducing the motion parameters and the environment parameters of the limbs; the initialization unit is used for fusing the mutual limiting conditions and the motion boundary conditions between the limbs of the human body to derive the initial operation parameters of the human motion capture device; the gait detection unit is used for detecting the touchdown state of the lower limb of the human body at the current moment to obtain gait detection information; the displacement fusion unit is used for receiving and fusing the motion parameters of each limb of the human body, the initial operation parameters of the human body motion capture device, the length of each limb, the mutual limiting conditions among the limbs and gait detection information, and deducing and outputting the posture and position information of the overall motion of the human body relative to the ground.
According to a preferred embodiment of the present invention, the motion measurement unit comprises a plurality of micro sensor nodes and at least one control unit, wherein the micro sensor nodes are used for sampling and measuring limb motion data and environment data; the control unit acquires the data of each micro sensor node and sends the data to the motion parameter estimation unit.
According to a preferred embodiment of the present invention, the motion parameter estimation unit uses a complementary kalman filter system to fuse the motion data of each limb with the environmental data, and derives a three-dimensional angle estimation value in each limb motion parameter.
According to a preferred embodiment of the present invention, the gait detection unit is further configured to update the state of the complementary kalman filter system to eliminate accumulated errors generated after the human body is lifted off the ground.
According to a preferred embodiment of the invention, the gait detection unit comprises a support phase detection module which determines which gait phase the foot node is in based on the raw IMU sensing data and the azimuth angle of the currently predicted sensor coordinate system relative to the global coordinate system, and based on the sensor signal characteristics.
According to a preferred embodiment of the present invention, the gait detection unit performs a gait detection algorithm based on signal characteristics, which compensates for errors based on raw sensor data of each lower limb node, the pose of each sensor coordinate system of the lower limb with respect to the global coordinate system, and the state of the complementary kalman filter system output.
According to a preferred embodiment of the invention, the gait detection unit takes the velocity integration mean value in the fixed window during the supporting phase as the drift generated by integration in the window equivalent time on the basis of the gait phase detection algorithm, and removes the drift when the swing phase integration estimates the velocity displacement.
According to the preferred embodiment of the invention, the displacement fusion unit converts the motion parameters of each limb of the human body from the sensor coordinate system to the body coordinate system by utilizing the initial operation parameters of the human body motion capture device to obtain the motion parameters of each limb in the body coordinate system, and generates and outputs the posture and position information of the overall motion of the human body together with the displacement of the human body relative to the ground.
Another aspect of the present invention provides a human motion capture method, including: measuring human body limb movement data and environmental data; fusing the human body limb movement data and the environment data to deduce limb movement parameters and environment parameters; fusing the mutual limiting conditions and the motion boundary conditions between the limbs of the human body to derive the initial operation parameters of the human body motion capture device; detecting the touchdown state of the lower limb of the human body at the current moment to obtain gait detection information; and receiving and fusing the motion parameters of each limb of the human body, the initial operation parameters of the human body motion capture device, the length of each limb, the mutual limiting conditions among the limbs and gait detection information, and deducing and outputting the posture and position information of the overall motion of the human body relative to the ground.
The beneficial effects of the invention are as follows: the invention uses multiple micro sensor nodes attached to the limbs of the human body to measure the motion data and environment data of each limb separately. The miniature sensors are small, low-power, measure directly and are economical; they are convenient to use and not constrained by space or time; the data volume is small, allowing real-time processing and analysis; the device is portable and practical, is well suited to being made into a wearable motion capture and analysis device, has wide application in many fields, and has strong practical value and application prospects.
Drawings
FIG. 1 is a block diagram of the overall logic of a human motion capture device according to one embodiment of the present invention;
FIG. 2 is a block diagram of the motion measurement unit of one embodiment of the present invention;
FIG. 3 is a diagram showing the distribution of micro sensor nodes over the key limbs of a human body according to one embodiment of the present invention;
FIG. 4 is a block diagram of the motion parameter estimation unit according to one embodiment of the present invention;
FIG. 5 is a diagram of three coordinate systems and their relationship in a human motion capture device according to one embodiment of the present invention;
FIG. 6 is a flow chart of a displacement fusion unit according to one embodiment of the present invention;
FIG. 7 is a diagram showing the structure of the lower limb used in the gait detection unit of the invention.
Detailed Description
Specific embodiments of the invention will be described in detail below, it being noted that the embodiments described herein are for illustration only and are not intended to limit the invention. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that: these specific details need not be employed to practice the present invention. In other instances, well-known structures, materials, or methods have not been described in detail in order to avoid obscuring the present invention. Throughout the specification, references to "one embodiment," "an embodiment," "one example," or "an example" mean: a particular feature, structure, or characteristic described in connection with the embodiment or example is included within at least one embodiment of the invention. Thus, the appearances of the phrases "in one embodiment," "in an embodiment," "one example," or "an example" in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable combination and/or sub-combination in one or more embodiments or examples. Furthermore, it will be understood by those of ordinary skill in the art that the term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
Fig. 1 is a block diagram showing the overall logic structure of a motion capture apparatus according to the present invention, which is composed of a motion measuring unit 100, a motion parameter estimating unit 200, an initializing unit 300, a gait detecting unit 400 and a displacement fusing unit 500, as shown in fig. 1, wherein:
the motion measuring unit 100 is provided with a plurality of micro sensor nodes attached to each limb of the human body for measuring and obtaining each limb motion data and environment data;
the structure of the motion measuring unit 100 includes: the motion parameter estimation device comprises a plurality of micro sensor nodes and one or more control units, wherein each micro sensor node is provided with a unique address, all the micro sensor nodes and the control units are connected together through a data bus, the control units select different micro sensor nodes through the address bus, issue control commands to the micro sensor nodes to acquire each path of measurement data of each micro sensor node, and then the control units are connected with the motion parameter estimation unit 200 in a wireless or wired mode to send the acquired data to the motion parameter estimation unit 200.
The microsensor node comprises a microsensor and a microcontroller, wherein: the miniature sensor is one or a plurality of combinations of a miniature three-dimensional accelerometer, a miniature three-dimensional gyroscope, a miniature three-dimensional magnetometer, a miniature ultrasonic range finder or a miniature ultra-wideband range finder and is used for sampling and measuring the motion data and the environmental data of each limb; the motion data comprises three-dimensional acceleration measurement data and three-dimensional angular velocity measurement data, and the environment data is three-dimensional magnetic field intensity measurement data; and the microcontroller controls the micro sensor to sample and measure the motion data and the environment data of each limb, and packages the measured motion data and the measured environment data and sends the packaged motion data and the measured environment data to the control unit in a wireless connection mode.
The motion parameter estimation unit 200 is used for fusing the motion data and the environment data of the motion measurement unit 100 to deduce the motion parameters and the environment parameters of each limb, and fusing the motion data and the environment data of each limb by adopting a complementary Kalman filtering method in the deduction to deduce a three-dimensional angle estimated value in the motion parameters of each limb; the motion parameters of each limb comprise three-dimensional acceleration estimation, three-dimensional speed estimation, three-dimensional displacement estimation, three-dimensional angular velocity estimation and three-dimensional angle estimation, and the environment parameters comprise three-dimensional magnetic field intensity estimation.
The complementary Kalman filtering algorithm establishes an error model that takes the attitude error, displacement error, velocity error, gyroscope bias and magnetic interference as state variables. Using the feedback mechanism of an inertial navigation system, the accelerometer observations compensate the inclination error in the gyroscope integration and the magnetometer observations compensate the magnetic-field component error in the gyroscope integration. Combined with forward human kinematics and fused with the gait detection information, ZUPT (zero velocity update) and ZARU (zero angular rate update) can be executed once a support phase is detected, modifying the covariance matrix of the algorithm and thereby correcting the estimated state of the system. Throughout the process the two bias terms (acceleration bias and gyroscope bias) are kept updated in the filter, while the attitude, velocity and position error states are reset to zero after each iteration, which suppresses the drift error that integration would otherwise introduce into the state estimate. The algorithm makes full use of the accelerometer and magnetometer outputs to compensate the angle estimation errors caused by gyroscope-integration drift and interference, and has a marked corrective effect on real-time tracking of human posture.
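To make the feedback structure above concrete, the following Python sketch shows one possible error-state update in which a detected support phase triggers zero-velocity and zero-angular-rate pseudo-measurements, after which the estimated errors are fed back into the nominal state and implicitly reset. This is an illustrative sketch only; the state layout, noise values and helper names (`zupt_zaru_update`, the `nominal` dictionary) are assumptions, not the exact filter of the invention.

```python
# Illustrative error-state complementary Kalman update with ZUPT/ZARU
# feedback. State ordering, noise values and the 'nominal' dictionary
# layout are assumptions for this sketch, not the invention's exact filter.
import numpy as np
from scipy.spatial.transform import Rotation as R

N = 15  # [attitude err(3), vel err(3), pos err(3), gyro bias(3), accel bias(3)]

def zupt_zaru_update(P, nominal, gyro_meas, support_phase,
                     r_v=1e-3, r_w=1e-3):
    """During a support phase, apply zero-velocity (ZUPT) and
    zero-angular-rate (ZARU) pseudo-measurements, feed the estimated
    errors back into the nominal state, and implicitly reset them."""
    if not support_phase:
        return P, nominal
    H = np.zeros((6, N))
    H[0:3, 3:6] = np.eye(3)        # velocity error is observed directly
    H[3:6, 9:12] = np.eye(3)       # angular-rate residual observes gyro bias
    z = np.concatenate([nominal['vel'],
                        gyro_meas - nominal['gyro_bias']])
    Rm = np.diag([r_v] * 3 + [r_w] * 3)
    S = H @ P @ H.T + Rm
    K = P @ H.T @ np.linalg.inv(S)
    dx = K @ z
    P = (np.eye(N) - K @ H) @ P
    # Feedback: correct the nominal state with the estimated errors.
    nominal['rot'] = R.from_rotvec(-dx[0:3]) * nominal['rot']  # small-angle fix
    nominal['vel'] = nominal['vel'] - dx[3:6]
    nominal['pos'] = nominal['pos'] - dx[6:9]
    nominal['gyro_bias'] = nominal['gyro_bias'] + dx[9:12]
    nominal['accel_bias'] = nominal['accel_bias'] + dx[12:15]
    # The attitude/velocity/position error states are now zero again;
    # only the two bias terms persist across iterations, as described above.
    return P, nominal
```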
The initialization unit 300 is configured to receive the motion parameters and the environmental parameters of each limb, and fuse the length of each limb, the mutual constraint condition between each limb and the motion boundary condition, so as to derive the initial operation parameters of the human motion capture device.
The steps by which the initialization unit 300 derives the initial operating parameters of the human motion capture device are as follows: the human body assumes an initialization posture that satisfies the motion boundary conditions, while the initialization unit 300 receives in real time, from the motion parameter estimation unit 200, the motion parameters and environment parameters of each limb under that posture; according to the mutual constraint conditions among the limbs, a topological relation map is established among the micro sensor nodes attached to the limbs, the motion boundary conditions are fused with the limbs' motion and environment parameters under the initialization posture, filtering is performed with a Bayesian-network dynamic system, and the initial operating parameters of the human motion capture device are derived.
The initial operating parameters include: the three-dimensional angular deviation and three-dimensional position deviation of the sensor coordinate system 210 of each micro sensor node of the motion measurement unit 100 relative to the body coordinate system 220 of the corresponding limb; and the initial three-dimensional angle and initial three-dimensional position of the body coordinate system 220 relative to the global coordinate system 230; wherein the sensor coordinate system 210 is the coordinate system of each micro sensor node itself in the motion measurement unit 100, the body coordinate system 220 is the coordinate system of each limb of the human body, and the global coordinate system 230 is the geodetic coordinate system.
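One simple way to picture the first group of initial parameters, the fixed offset between each sensor coordinate system 210 and its limb's body coordinate system 220, is to compare the sensor's estimated orientation in the global frame with the limb's known orientation during the initialization posture. The sketch below assumes such a known static posture and uses SciPy rotations; it stands in for, and is far simpler than, the Bayesian-network filtering described above.

```python
# Illustrative estimate of the constant sensor-to-body mounting rotation
# from a known initialization posture (an assumption of this sketch).
import numpy as np
from scipy.spatial.transform import Rotation as R

def sensor_to_body_offset(R_gs_init, R_gb_init):
    """R_gs_init: sensor-frame orientation in the global frame (estimated).
    R_gb_init: limb (body-frame) orientation in the global frame, known
    from the initialization posture. Returns the fixed sensor->body rotation."""
    return R_gb_init.inv() * R_gs_init

def limb_orientation(R_gs_t, R_bs):
    """During capture, strip the mounting offset to obtain the limb's
    orientation in the global frame from the sensor's orientation."""
    return R_gs_t * R_bs.inv()

# Example: a sensor mounted 90 degrees about x on a limb held upright
R_bs = sensor_to_body_offset(R.from_euler('x', 90, degrees=True), R.identity())
print(limb_orientation(R.from_euler('x', 90, degrees=True), R_bs).as_euler('xyz'))
# -> approximately [0, 0, 0]: the limb is reported upright, as expected
```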
The motion boundary conditions include: when the human body moves on level ground, the vertical component of the position of the grounded limb part is zero; when the human body walks, side-steps, slides, steps, runs or jumps on level ground, the three-dimensional velocity and three-dimensional angular velocity of the grounded limb part are zero; when the human body stands naturally on level ground looking straight ahead, the plane of the back is approximately perpendicular to the ground and to the direction of gaze; and when the two hands are placed together, if the palms remain facing each other and the ten fingers remain aligned during the motion, the positions of the two hands are approximately equal.
The gait detection unit 400 is configured to detect the ground-contact state of the lower limbs at the current moment and to update the state of the complementary Kalman filter system so as to eliminate the accumulated error produced after the body leaves the ground. It includes a support-phase detection module which, from the raw IMU sensing data, the currently estimated azimuth of the sensor coordinate system relative to the global coordinate system, and the characteristics of the sensor signals, determines which gait phase the foot node is in. To overcome misjudgements caused by the complexity of the motion, abnormal gait and signal noise, the gait detection unit 400 also implements a mechanism for eliminating false detections of the gait phase; combining the support-phase detection module with this misjudgement-elimination mechanism allows the unit to detect the gait state accurately for most gait motions.
The signal-feature-based gait detection algorithm works on the raw sensor data of each lower-limb node (mainly acceleration and angular velocity), the attitude (quaternion) of each lower-limb sensor coordinate system relative to the global coordinate system, and the compensated state error output by the complementary filtering algorithm; on this basis the support-phase detection module detects the motion state and gait phase in real time. Specifically, the raw data in the sensor coordinate system are converted into the global coordinate system and the motion acceleration is separated out. The useful sensor signals (acceleration and gyroscope, or either one) are then extracted and the gait phase is roughly detected with a threshold detection method; the thresholds are mostly selected offline, although an adaptive threshold selection scheme (which must also be completed offline) can be used instead.
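As an illustration of the rough threshold-detection step, a minimal detector might look like the sketch below; the threshold values are purely illustrative and would in practice be chosen offline as noted above.

```python
# Illustrative stance (support-phase) detection by thresholding the
# acceleration and angular-rate magnitudes of a foot-mounted node.
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def detect_support_phase(acc, gyro, acc_tol=0.4, gyro_tol=0.5):
    """acc, gyro: (N, 3) arrays of accelerometer [m/s^2] and gyroscope
    [rad/s] samples. Returns a boolean array that is True where the foot
    is judged to be on the ground (acceleration magnitude close to gravity
    alone, and low angular rate)."""
    acc_mag = np.linalg.norm(acc, axis=1)
    gyro_mag = np.linalg.norm(gyro, axis=1)
    return (np.abs(acc_mag - G) < acc_tol) & (gyro_mag < gyro_tol)
```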
The basic idea of misjudgement elimination is as follows: in gait motion over a short period (for example a few gait cycles), the gait-phase detection results at adjacent sampling instants cannot change abruptly, whereas the data of the IMU sensing unit may contain errors caused by various environmental disturbances, so the detection result based on signal features alone may be misjudged; such isolated misjudgements are therefore rejected.
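That idea can be sketched as a simple majority vote over a short sliding window of per-sample decisions, which suppresses isolated flips; the window length below is an illustrative assumption.

```python
# Illustrative misjudgement elimination: isolated single-sample flips in the
# raw stance decision are suppressed by a majority vote inside a short window.
import numpy as np

def debounce_phase(raw_support, window=9):
    """raw_support: boolean array of per-sample stance decisions.
    Returns a smoothed decision in which short spurious flips are removed."""
    half = window // 2
    padded = np.pad(raw_support.astype(int), half, mode='edge')
    smoothed = np.empty(len(raw_support), dtype=bool)
    for i in range(len(raw_support)):
        # majority vote inside the window centred on sample i
        smoothed[i] = padded[i:i + window].sum() > half
    return smoothed
```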
To reduce the velocity estimation error caused by accumulated integration drift, on the basis of the gait-phase detection algorithm the mean of the integrated velocity within a fixed window during the support phase is taken as the drift produced by integration over the equivalent window time, and this drift is removed when the swing-phase integration estimates velocity and displacement (velocity-displacement calibration). The general idea is: during a gait cycle, the mean integrated velocity is updated for each fixed time window of the stance phase and taken as the drift error; during the swing phase, the last updated drift error is subtracted from the integrated velocity.
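A minimal sketch of this velocity-displacement calibration is given below. Here the drift is represented as a residual acceleration bias estimated from the stance window (the mean integrated velocity divided by the window duration is equivalent to the mean residual acceleration), and that bias is removed while integrating through the following swing phase; the variable names and window handling are assumptions.

```python
# Illustrative ZUPT plus drift calibration: during stance the velocity is
# clamped to zero and the residual acceleration over the window is taken as
# the drift; during swing the drift is removed before integrating.
import numpy as np

def integrate_with_drift_removal(acc_global, support, dt):
    """acc_global: (N, 3) motion acceleration in the global frame [m/s^2].
    support: boolean stance flags from the gait detector.
    dt: sampling interval [s]. Returns corrected velocity and displacement."""
    n = len(acc_global)
    vel = np.zeros((n, 3))
    pos = np.zeros((n, 3))
    bias = np.zeros(3)          # estimated drift, expressed as an acceleration
    stance_acc = []
    for k in range(1, n):
        if support[k]:
            stance_acc.append(acc_global[k])
            bias = np.mean(stance_acc, axis=0)   # update drift each stance sample
            vel[k] = 0.0                         # zero-velocity update
        else:
            stance_acc = []                      # a new swing phase begins
            vel[k] = vel[k - 1] + (acc_global[k] - bias) * dt
        pos[k] = pos[k - 1] + vel[k] * dt
    return vel, pos
```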
The displacement fusion unit 500 is configured to receive and fuse the motion parameters of each limb, the initial operation parameters of the human motion capture device, and the displacement of the human body relative to the ground, and derive and output the posture and position information of the overall motion of the human body. The method comprises the following steps: the motion parameters of each limb of the human body are converted from the sensor coordinate system 210 to the body coordinate system 220 by using the initial operation parameters of the human body motion capture device, so that the motion parameters of each limb in the body coordinate system 220 are obtained, and the motion parameters and the displacement of the human body relative to the ground are generated together to output the gesture and the position information of the overall motion of the human body.
Here, we will describe the workflow and system structure of the motion capture device of the present invention using the micro sensor nodes consisting of three-dimensional micro accelerometers, three-dimensional micro gyroscopes and three-dimensional micro magnetometers as an example.
Fig. 2 is a detailed structural diagram of the motion measuring unit 100 of the human motion capture device, which also shows the signal acquisition and processing flow. The motion measurement unit 100 consists of a plurality of micro sensor nodes and one or several control units. The micro sensor nodes may be connected to the control unit by a wired data bus, and the control unit is in turn connected, wirelessly or by wire, to a host computer (desktop or portable). The motion parameter estimation unit 200 is implemented in software on the host computer. One complete data measurement cycle proceeds as follows: assuming a sampling rate of fs Hz, within each time slot of 1/fs seconds the motion measurement unit 100 completes the following actions: first, the control unit sends a data acquisition instruction, and the microcontroller on each micro sensor node begins acquiring data on receiving it; after the acquisition is complete, the control unit receives the data of each micro sensor node in turn through the data interface; after collecting the data of all the micro sensor nodes, the control unit compresses and packages the data and sends them to the communication interface.
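The per-slot sequence can be pictured with the short loop below. The `bus` and `radio` objects and their method names are hypothetical placeholders for whatever data-bus and communication interface a concrete device uses; only the ordering of the three steps reflects the flow described above.

```python
# Illustrative per-slot acquisition loop for the control unit. The 'bus' and
# 'radio' objects and their methods are hypothetical placeholders.
import time

FS = 100  # sampling rate fs in Hz (illustrative value)

def pack(frame):
    # placeholder packing; a real device would use a compact binary format
    return repr(frame).encode()

def acquire_frame(bus, node_addresses, radio):
    bus.broadcast(b'SAMPLE')               # 1. issue the acquisition command
    frame = {}
    for addr in node_addresses:            # 2. read each addressed node in turn
        frame[addr] = bus.read(addr)
    radio.send(pack(frame))                # 3. compress/pack and transmit

def run(bus, node_addresses, radio):
    period = 1.0 / FS
    while True:
        start = time.time()
        acquire_frame(bus, node_addresses, radio)
        time.sleep(max(0.0, period - (time.time() - start)))
```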
In the above method, three-dimensional micromagnetometers, three-dimensional acceleration sensors, and three-dimensional gyroscopes are all optional. Depending on the application, one or two of them may be selected, or even none of them may be selected, and the corresponding hardware may be deleted, so as to form a new implementation method.
For the human motion capture device of the present invention, fig. 3 shows the distribution of micro sensor nodes over the important limbs of a human body. The limbs comprise: head, upper waist, middle waist, lower waist, left upper arm, left forearm, left hand, right upper arm, right forearm, right hand, left thigh, left calf, left foot, right thigh, right calf and right foot, requiring 16-20 micro sensor nodes. The number of micro sensor nodes can be increased or reduced according to application requirements, and the placement position and orientation of each node are not fixed and can be adjusted according to the application. In this example, each micro sensor node includes a microcontroller, a three-dimensional accelerometer, a three-dimensional gyroscope and a three-dimensional magnetometer.
If the motion measuring unit 100 is worn on each limb of the human body, the motion parameters of each limb can be measured and estimated, and once a skeletal model of the body has been built, human motion can be captured and reproduced in real time. However, if the device is not initialized when the motion measurement unit 100 is worn, the measurement and estimation results will be affected by differences in wearing position, and the distributed measurements cannot directly yield the overall posture and position of the body.
In a human motion capture device based on high-precision cameras, everything is expressed in the geodetic coordinate system. Fig. 5 shows the three coordinate systems used in the present human motion capture device and their relationships: a sensor coordinate system 210, a body coordinate system 220, and a global coordinate system 230. The sensor coordinate system 210 is the coordinate system of each micro sensor node in the motion measurement unit 100; each node has its own coordinate system, independent of the others, its measurement data are obtained in that coordinate system, and the limb motion parameters estimated by the motion parameter estimation unit 200 are likewise relative to the coordinate system of the node attached to that limb. The body coordinate system 220 is the coordinate system of each limb of the human body, and the global coordinate system 230 is the geodetic coordinate system. The motion data and environment data measured by the micro sensor nodes are distributed across their respective sensor coordinate systems; to unify the motion parameters of all limbs under a single coordinate frame so as to obtain the overall motion of the body, and to remove the influence of different wearing positions of the motion measurement unit 100 on motion capture and estimation, the initialization unit 300 of the invention sets the initial values and initial parameters of the device according to the mutual constraint conditions and motion boundary conditions of the limbs. The mutual constraint conditions express that the limbs are connected and linked: for example, the left thigh is connected to the left calf, which is connected to the left foot; the left thigh drives the left calf and left foot to move, but since the thigh and foot are not directly connected, the thigh can only drive the foot through the calf. In addition, a person's motion exists in the geodetic coordinate system and comprises two parts: the motion of each limb in the body coordinate system 220, and the three-dimensional displacement of the person in the global coordinate system 230, i.e. the motion of the body coordinate system 220 relative to the global coordinate system 230. The invention uses gait as the clue for deriving the three-dimensional displacement of the body relative to the geodetic coordinate system.
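The relationship between the three frames can be summarized by composing rotations: a quantity measured in a node's sensor coordinate system 210 is first expressed in the limb's body coordinate system 220 using the mounting offset obtained during initialization, and then in the global coordinate system 230 using the limb's current orientation and the body's displacement. The helper below is an illustrative sketch of that chain.

```python
# Illustrative chain from the sensor frame (210) through the body frame (220)
# to the global/geodetic frame (230).
import numpy as np
from scipy.spatial.transform import Rotation as R

def sensor_to_global(v_sensor, R_bs, R_gb, p_body_global):
    """v_sensor      : 3-vector measured in the sensor frame
    R_bs           : fixed sensor->body mounting rotation from initialization
    R_gb           : current body->global rotation of the limb
    p_body_global  : position of the limb-frame origin in the global frame"""
    v_body = R_bs.apply(v_sensor)              # remove the mounting offset
    return R_gb.apply(v_body) + p_body_global  # place it in the world

# With identity rotations and zero displacement the reading passes through:
v = sensor_to_global(np.array([0.0, 0.0, 9.81]),
                     R.identity(), R.identity(), np.zeros(3))
```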
The motion parameter estimation unit 200, the initialization unit 300, the gait detection unit 400 and the displacement fusion unit 500 will be described in detail below; in describing the motion parameter estimation unit 200, the present invention will be described by taking a three-dimensional angle estimation method of complementary kalman filtering as an example only:
As above, each micro sensor node includes a microcontroller and three micro sensors: a three-dimensional accelerometer, a three-dimensional gyroscope and a three-dimensional magnetometer. Using the complementary Kalman filtering three-dimensional angle estimation method provided by the invention, the accurate three-dimensional angle of the limb measured by each micro sensor node in the sensor coordinate system 210 can be estimated.
The three-dimensional micro accelerometer measures three-dimensional acceleration; under static conditions it measures the gravitational acceleration, and can therefore provide the rotation of the micro sensor node relative to the horizontal plane as a pitch angle (Pitch) and a roll angle (Roll);
the three-dimensional magnetometer measures three-dimensional magnetic field intensity measurement data, can provide Yaw angle (Yaw) of the micro sensor node around the vertical direction, and is similar to a compass in principle;
the three-dimensional gyroscope measures three-dimensional angular velocity measurement data, and three-dimensional rotation angles can be obtained by integrating the three-dimensional angular velocity measurement data.
The accuracy of the three kinds of micro sensor data measured by a micro sensor node is affected in several ways. First, there are the measurement accuracy and errors of the three micro sensors themselves: errors are unavoidable in converting physical quantities into digital signals. Second, the measured data are disturbed: the three-dimensional acceleration measured by the micro accelerometer equals the gravitational acceleration only under static or quasi-static conditions, and fast body motion introduces a large motion acceleration; the three-dimensional magnetometer measures the earth's magnetic field, but the actual measurement can be disturbed by the magnetic fields of surrounding ferromagnetic materials. Third, the accumulation of digital signal errors produces a drift that grows gradually over time when the three-dimensional angular velocity is integrated to obtain an angle. Therefore, before fusing the three kinds of data to obtain angle information, the micro sensor data must be pre-processed by denoising, calibration and temperature compensation, and the various possible disturbances must be considered so as to extract as much usable information as possible. On the other hand, the real-time requirement of the system demands that the fusion method keep the computational complexity as low as possible while guaranteeing estimation accuracy.
Based on the above analysis, the complementary Kalman filtering angle estimation method provided by the motion parameter estimation unit 200 of the invention, whose flowchart is shown in Fig. 4, proceeds as follows. The raw observation data measured by the motion measurement unit 100 are first subjected to pre-processing operations including denoising, calibration and temperature compensation. After pre-processing, the motion parameter estimation unit 200 estimates the other motion parameters in a similar way, which is not repeated here. The motion parameters estimated by the motion parameter estimation unit 200 are the motion parameters and environment parameters of the limb measured by each micro sensor node in the sensor coordinate system 210 at time t; they may include the three-dimensional angle, i.e. the accurate three-dimensional angle estimated by the complementary Kalman filtering method, and may also include three-dimensional position, three-dimensional velocity, three-dimensional acceleration, three-dimensional angle and three-dimensional magnetic field intensity estimates. If the limbs of the human body wear n micro sensor nodes of the motion measurement unit 100, the three-dimensional azimuth of each limb can be obtained with the complementary Kalman filtering three-dimensional angle estimation method. However, the three-dimensional azimuth of each limb estimated by the motion parameter estimation unit 200 is expressed in the sensor coordinate system of the node attached to that limb, and the azimuths of all limbs must be unified into one coordinate frame to obtain the overall motion posture and position of the body. In addition, when the motion measurement unit 100 is worn on the limbs, differences in wearing position affect the measurement and estimation;
for this reason the initialization unit estimates the initial operating parameters of the human motion capture device. To obtain the motion of the body relative to the earth, the three-dimensional displacement of the body in the geodetic coordinate system must be estimated. This displacement can be obtained by double integration of the three-dimensional motion acceleration of each limb. However, the three-dimensional acceleration estimated by the motion parameter estimation unit is the vector sum, in the sensor coordinate system 210, of the gravitational acceleration and the body's motion acceleration; the motion acceleration component must therefore first be separated out and converted from the sensor coordinate system 210 to the global coordinate system 230 using the three-dimensional angle obtained by the motion parameter estimation unit 200, and then integrated twice to obtain the displacement of each limb during motion. Because the integration constants are unknown, the three-dimensional angle estimated by the motion parameter estimation unit 200 contains errors, and the three-dimensional micro accelerometer itself drifts, the error of the integrated position grows roughly quadratically with time, and the position obtained after a few seconds is quite unreliable.
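This quadratic growth follows directly from double integration: a constant acceleration bias b produces a velocity error b·t and a position error of roughly ½·b·t². A small numeric illustration (the bias value is assumed):

```python
# Numeric illustration of quadratic position-error growth under a small
# constant acceleration bias (values are illustrative).
import numpy as np

bias = 0.05                 # m/s^2, a modest accelerometer bias
dt = 0.01                   # 100 Hz sampling
t = np.arange(0, 10 + dt, dt)

vel_err = np.cumsum(np.full_like(t, bias)) * dt        # ~ bias * t
pos_err = np.cumsum(vel_err) * dt                      # ~ 0.5 * bias * t^2

print(f"position error after  1 s: {pos_err[t <= 1][-1]:.3f} m")
print(f"position error after 10 s: {pos_err[-1]:.2f} m")
# Roughly 0.025 m after 1 s, but about 2.5 m after 10 s.
```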
To constrain the drift from growing indefinitely during integration, algorithms that introduce motion boundary conditions, such as zero-velocity updates, are required. The present invention provides a gait detection algorithm based on signal features, namely the gait detection unit 400, to detect the motion state. The basic principle of the zero-velocity update algorithm is that when the body walks, side-steps, slides, steps, runs or jumps on level ground, the three-dimensional velocity and angular velocity of the grounded foot are zero; the period during which the foot is on the ground is called the support phase. During estimation, the foot in the support phase is at rest and its velocity is set to zero, which limits the error accumulation to a single stride and so reduces the accumulated error. The unit detects the motion state and gait phase in real time: the raw data in the sensor coordinate system are converted into the global coordinate system and the motion acceleration is separated out; on this basis the useful sensor signals (acceleration and gyroscope, or either one) are extracted and the gait phase is roughly detected with a threshold detection method, the thresholds mostly being selected offline, although an adaptive threshold selection scheme (also completed offline) can be used.
Fig. 6 shows the flow of the displacement fusion unit method of the invention: the gait detection unit 400 detects which limb of the body is on the ground and determines the landing foot; the gait timing parameters are detected according to the motion boundary conditions; the motion parameters and environment parameters of each limb captured and derived by the motion parameter estimation unit 200 are then fused with the initial operating parameters of the motion capture device derived by the initialization unit 300, the length of each limb, and the mutual constraint conditions between the limbs, and with these parameters the motion boundary condition data are propagated to each limb through human kinematics, so that the displacement of the body relative to the ground is solved. This example takes the pelvis as the root node and uses it to illustrate the basic method of human kinematic propagation, involving each segment of the lower limb. The lower-limb structure used in the displacement estimation unit, shown in fig. 7, comprises the pelvis, the left and right femurs, the left and right tibias and the left and right feet.
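A minimal sketch of the kinematic propagation from the pelvis root through the grounded leg is shown below: given each segment's orientation and length, the foot position follows from the chain, and conversely anchoring the stance foot to its ground point yields the root (pelvis) displacement. Segment lengths and the segment-axis convention are illustrative assumptions.

```python
# Illustrative forward-kinematics chain pelvis -> femur -> tibia -> foot.
# Anchoring the stance foot to the ground gives the pelvis (root) position.
import numpy as np
from scipy.spatial.transform import Rotation as R

SEGMENTS = [('femur', 0.45), ('tibia', 0.40), ('foot', 0.08)]  # lengths in m (assumed)

def foot_position(root_pos, orientations):
    """orientations: dict segment name -> Rotation (body->global).
    Each segment is modelled as a vector along its local -z axis."""
    p = np.asarray(root_pos, dtype=float)
    for name, length in SEGMENTS:
        p = p + orientations[name].apply([0.0, 0.0, -length])
    return p

def root_from_stance_foot(foot_ground_pos, orientations):
    """Invert the chain: with the stance foot fixed on the ground, the
    root (pelvis) position is the ground point minus the chain offset."""
    offset = foot_position(np.zeros(3), orientations)
    return np.asarray(foot_ground_pos, dtype=float) - offset
```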
It should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes described in the context of a single embodiment or with reference to a single figure, in order to streamline the disclosure and aid those skilled in the art in understanding the various aspects of the invention. This manner of description should not, however, be construed as meaning that features appearing only in the exemplary embodiments are essential to the claims.
It should be understood that modules, units, components, etc. included in the apparatus of one embodiment of the present invention may be adaptively changed to arrange them in an apparatus different from the embodiment. The different modules, units or components comprised by the apparatus of the embodiments may be combined into one module, unit or component or they may be divided into a plurality of sub-modules, sub-units or sub-components.
The modules, units, or components of embodiments of the invention may be implemented in hardware, in software running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that embodiments in accordance with the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present invention may also be implemented as a computer program product or a computer readable medium for carrying out part or all of the methods described herein.

Claims (10)

1. A human motion capture device comprises a motion measurement unit, a motion parameter estimation unit, an initialization unit, a gait detection unit and a displacement fusion unit, wherein,
the motion measuring unit is used for measuring the motion data and the environment data of the limbs of the human body;
the motion parameter estimation unit is used for fusing the human body limb motion data and the environment data and deducing the motion parameters and the environment parameters of the limbs; in the derivation, a complementary Kalman filtering method is adopted to fuse the motion data of each limb with the environmental data, and three-dimensional angle estimation values in the motion parameters of each limb are derived; the motion parameters of each limb comprise a three-dimensional acceleration estimation value, a three-dimensional speed estimation value, a three-dimensional displacement estimation value, a three-dimensional angular speed estimation value and a three-dimensional angle estimation value, and the environment parameters comprise a three-dimensional magnetic field intensity estimation value; the complementary Kalman filtering algorithm establishes an error model, takes attitude errors, displacement errors, speed errors, gyroscope bias and magnetic interference as state variables, adopts the feedback mechanism of an inertial navigation system, compensates inclination errors in the gyroscope integration with accelerometer observations and magnetic field component errors in the gyroscope integration with magnetometer observations, combines forward human kinematics, fuses the gait detection information, and can execute zero velocity update and zero angular velocity update after a support phase is detected, modifying the covariance matrix of the algorithm and thereby correcting the estimated state of the system;
the initialization unit is used for fusing the mutual limiting conditions and the motion boundary conditions between the limbs of the human body to derive the initial operation parameters of the human body motion capture device;
the gait detection unit is used for detecting the touchdown state of the lower limb of the human body at the current moment to obtain gait detection information;
the displacement fusion unit is used for receiving and fusing the motion parameters of each limb of the human body, the initial operation parameters of the human body motion capture device, the length of each limb, the mutual limiting conditions among the limbs and gait detection information, and deducing and outputting the posture and position information of the overall motion of the human body relative to the ground.
2. The human motion capture device of claim 1, wherein the motion measurement unit comprises a plurality of microsensor nodes and at least one control unit, wherein,
the miniature sensor nodes are used for sampling and measuring limb movement data and environment data;
the control unit acquires the data of each micro sensor node and sends the data to the motion parameter estimation unit.
3. The human motion capture device of claim 1, wherein the motion parameter estimation unit uses a complementary kalman filter system to fuse each limb motion data with the environmental data to derive the three-dimensional angle estimate in each limb motion parameter.
4. The human motion capture device of claim 3, wherein the gait detection unit is further configured to update the state of the complementary kalman filter system to eliminate accumulated errors generated after the human body is lifted off the ground.
5. The human motion capture device of claim 3, wherein the gait detection unit comprises a support phase detection module that determines which gait phase the foot node is in based on raw IMU sensing data and an azimuth angle of a currently predicted sensor coordinate system relative to a global coordinate system, and based on sensor signal characteristics.
6. The human motion capture device of claim 5, wherein the gait detection unit performs a gait detection algorithm based on the signal characteristics that compensates for errors based on raw sensor data for each lower limb node, the pose of each lower limb sensor coordinate system relative to the global coordinate system, and the state of the complementary kalman filter system output.
7. The human motion capture device of claim 6, wherein the gait detection unit based on a gait phase detection algorithm uses a mean value of velocity integration within a fixed window during the support phase as a drift generated by integration within a window equivalent time, and removes the drift when the swing phase integration estimates the velocity displacement.
8. The human motion capture device of claim 6, wherein the displacement fusion unit converts motion parameters of each limb of the human body from a sensor coordinate system (210) to a body coordinate system (220) by using initial operation parameters of the human motion capture device, obtains each limb motion parameter in the body coordinate system (220), and generates and outputs posture and position information of the overall motion of the human body together with the displacement of the human body relative to the ground.
9. A human motion capture method comprising:
measuring human body limb movement data and environment data;
fusing the human body limb movement data and the environment data to deduce limb movement parameters and environment parameters; in the derivation, a complementary Kalman filtering method is adopted to fuse the motion data of each limb with the environmental data, and three-dimensional angle estimation values in the motion parameters of each limb are derived; the motion parameters of each limb comprise a three-dimensional acceleration estimation value, a three-dimensional speed estimation value, a three-dimensional displacement estimation value, a three-dimensional angular speed estimation value and a three-dimensional angle estimation value, and the environment parameters comprise a three-dimensional magnetic field intensity estimation value; a complementary Kalman filtering algorithm is used to establish an error model, taking attitude errors, displacement errors, speed errors, gyroscope bias and magnetic interference as state variables, adopting the feedback mechanism of an inertial navigation system, compensating inclination errors in the gyroscope integration with accelerometer observations and magnetic field component errors in the gyroscope integration with magnetometer observations, combining forward human kinematics and fusing the gait detection information; when a support phase is detected, zero velocity update and zero angular velocity update are performed and the covariance matrix of the algorithm is modified, thereby correcting the estimated state of the system;
fusing the mutual limiting conditions and the motion boundary conditions between the limbs of the human body to derive the initial operation parameters of the human body motion capture device;
detecting the touchdown state of the lower limb of the human body at the current moment to obtain gait detection information;
and receiving and fusing the motion parameters of each limb of the human body, the initial operation parameters of the human body motion capture device, the length of each limb, the mutual limiting conditions among the limbs and gait detection information, and deducing and outputting the posture and position information of the overall motion of the human body relative to the ground.
10. The human motion capture method of claim 9, wherein the motion parameter estimation unit uses a complementary kalman filter system to fuse the motion data of each limb with the environmental data to derive the three-dimensional angle estimate in each limb motion parameter.
CN201811334444.3A 2018-11-09 2018-11-09 Human motion capturing device and method Active CN109284006B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811334444.3A CN109284006B (en) 2018-11-09 2018-11-09 Human motion capturing device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811334444.3A CN109284006B (en) 2018-11-09 2018-11-09 Human motion capturing device and method

Publications (2)

Publication Number Publication Date
CN109284006A CN109284006A (en) 2019-01-29
CN109284006B true CN109284006B (en) 2024-01-16

Family

ID=65175274

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811334444.3A Active CN109284006B (en) 2018-11-09 2018-11-09 Human motion capturing device and method

Country Status (1)

Country Link
CN (1) CN109284006B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109781104B (en) * 2019-01-31 2021-06-08 深圳创维数字技术有限公司 Motion attitude determination and positioning method and device, computer equipment and medium
CN111353543B (en) * 2020-03-04 2020-09-11 镇江傲游网络科技有限公司 Motion capture data similarity measurement method, device and system
CN112562068B (en) * 2020-12-24 2023-07-14 北京百度网讯科技有限公司 Human body posture generation method and device, electronic equipment and storage medium
CN115024715B (en) * 2022-05-20 2023-06-06 北京航天时代光电科技有限公司 Human motion intelligent measurement and digital training system
CN116019442A (en) * 2022-12-12 2023-04-28 天津大学 Motion posture assessment system based on UWB/IMU fusion

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101579238A (en) * 2009-06-15 2009-11-18 吴健康 Human motion capture three dimensional playback system and method thereof
CN102323854A (en) * 2011-03-11 2012-01-18 中国科学院研究生院 Human motion capture device
CN104613964A (en) * 2015-01-30 2015-05-13 中国科学院上海高等研究院 Pedestrian positioning method and system for tracking foot motion features
CN105043385A (en) * 2015-06-05 2015-11-11 北京信息科技大学 Self-adaption Kalman filtering method for autonomous navigation positioning of pedestrians


Also Published As

Publication number Publication date
CN109284006A (en) 2019-01-29

Similar Documents

Publication Publication Date Title
CN109284006B (en) Human motion capturing device and method
US11402402B2 (en) Systems and methods for human body motion capture
KR101751760B1 (en) Method for estimating gait parameter form low limb joint angles
Zihajehzadeh et al. A novel biomechanical model-aided IMU/UWB fusion for magnetometer-free lower body motion capture
CN108939512B (en) Swimming posture measuring method based on wearable sensor
US20100194879A1 (en) Object motion capturing system and method
Yun et al. Estimation of human foot motion during normal walking using inertial and magnetic sensor measurements
Zhou et al. Reducing drifts in the inertial measurements of wrist and elbow positions
AU2009251176B2 (en) Visual and physical motion sensing for three-dimensional motion capture
CN102323854B (en) Human motion capture device
CN101579238B (en) Human motion capture three dimensional playback system and method thereof
US9357948B2 (en) Method and system for determining the values of parameters representative of a movement of at least two limbs of an entity represented in the form of an articulated line
US20180089841A1 (en) Mixed motion capture system and method
Young et al. IMUSim: A simulation environment for inertial sensing algorithm design and evaluation
Zheng et al. Pedalvatar: An IMU-based real-time body motion capture system using foot rooted kinematic model
Chen et al. Real-time human motion capture driven by a wireless sensor network
CN110609621B (en) Gesture calibration method and human motion capture system based on microsensor
Sun et al. Adaptive sensor data fusion in motion capture
CN113793360B (en) Three-dimensional human body reconstruction method based on inertial sensing technology
CN108338791A (en) The detection device and detection method of unstable motion data
Salehi et al. Body-IMU autocalibration for inertial hip and knee joint tracking
Hindle et al. Inertial-based human motion capture: A technical summary of current processing methodologies for spatiotemporal and kinematic measures
GB2466714A (en) Hybrid visual and physical object tracking for virtual (VR) system
Ahmadi et al. Human gait monitoring using body-worn inertial sensors and kinematic modelling
KR102229070B1 (en) Motion capture apparatus based sensor type motion capture system and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant