CN113268141A - Motion capture method and device based on inertial sensor and fabric electronics - Google Patents

Motion capture method and device based on inertial sensor and fabric electronics

Info

Publication number
CN113268141A
Authority
CN
China
Prior art keywords
human body
posture
inertial sensor
fabric
body part
Prior art date
Legal status
Granted
Application number
CN202110535501.XA
Other languages
Chinese (zh)
Other versions
CN113268141B (en)
Inventor
Zhang Heng (张衡)
Current Assignee
Southwest University
Original Assignee
Southwest University
Priority date
Filing date
Publication date
Application filed by Southwest University
Priority to CN202110535501.XA
Publication of CN113268141A
Application granted
Publication of CN113268141B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B21/22 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant, for measuring angles or tapers; for testing the alignment of axes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation

Abstract

The application discloses a motion capture method based on an inertial sensor and fabric electronics. Data acquisition combines the inertial sensor with fabric electronics, which avoids the errors caused by the inertial sensor not fitting closely to the body and improves wearing comfort. In addition, the method provides a simplified 4-degree-of-freedom limb model based on the characteristics of body movement, uses quaternions to describe the rotation angle of a first human body part relative to the human body and the rotation angle of a second human body part relative to the first human body part, and fuses the inertial sensor's strength in capturing 3-dimensional posture to capture the spatial posture of the limbs. Finally, the method corrects the lower limb posture through inverse kinematic posture fitting, effectively suppressing the abnormal twisting actions caused by inertial sensor drift. In addition, the application provides a motion capture device, equipment and a readable storage medium based on the inertial sensor and fabric electronics, whose technical effects correspond to those of the method.

Description

Motion capture method and device based on inertial sensor and fabric electronics
Technical Field
The present application relates to the field of computer technologies, and in particular, to a motion capture method, device, and apparatus based on an inertial sensor and fabric electronics, and a readable storage medium.
Background
Motion capture refers to a technique for measuring, tracking and recording the motion trajectory of each part of the human body in three-dimensional space using sensors, and representing and storing that trajectory in digital form. As motion capture technology has developed, it has been applied in many fields, such as film and television, virtual reality, human-computer interaction, games, and medical rehabilitation.
An early precursor of motion capture is generally considered to be rotoscoping, a technique developed by the American animator Max Fleischer around 1915, during the First World War, in which live-action footage is projected frame by frame as a base over which animators trace the required motions. "Snow White and the Seven Dwarfs", released in 1937, was an early feature film that used rotoscoping. The human motion capture technologies developed since then include: (1) Mechanical: a mechanical motion capture system consists of several rigid links and joints; angle sensors mounted in the joints measure the joint angles, from which the spatial position and trajectory of the end point of each link can be calculated. Such systems are immune to external interference, inexpensive and accurate, but they impede the user's movement. (2) Electromagnetic: an electromagnetic motion capture system consists of a transmitting source, receivers and a data processing unit. The source generates a time-varying electromagnetic field, and receivers are placed at different positions on the human body. These systems are simple to use, robust and real-time, but they are affected by ferrous metal in floors, walls and ceilings and by noise sources in the capture area; metallic objects and stray magnetic fields inside the operating space degrade performance. (3) Acoustic: the position and posture of the human body are calculated from the delay and phase offset of ultrasonic waves, typically over short distances. The system consists of three parts: a transmitter, a receiver and a processor.
Acoustic systems are cheap, but they have poor real-time performance and accuracy, are susceptible to obstacles between receiver and transmitter, and are disturbed by loud noise and multiple reflections; the propagation speed of sound in air also varies with air pressure, humidity and temperature. (4) Optical: currently the most widely used motion capture technology, and the most mature and accurate, but also expensive. Motion trajectories are captured by tracking specific marker points. These systems offer low latency and high accuracy, but they impose lighting requirements, and markers may be hidden by occlusion. (5) Inertial: an inertial motion capture system is a low-cost system based on MEMS sensors. The spatial position of an object is obtained by double integration of acceleration, so the system is highly autonomous and not easily disturbed by the external environment; however, inertial sensors accumulate large drift errors over time, and conventional auxiliary algorithms can only reduce, not eliminate, this drift.
One existing motion capture scheme is an optical motion capture system. Such a VR motion capture system uses multiple optical cameras to capture the three-dimensional displacement of body parts from marker points. The cameras are expensive; during capture, the combined effect with the three-dimensional scene, characters or props composited later cannot be confirmed in real time, so the performance must take place in a dedicated, empty capture room; and when limbs occlude the markers, the cameras cannot accurately capture the marker positions. Optical motion capture systems are therefore unsuitable for everyday use.
Another existing motion capture scheme is a system based on inertial sensors. It uses no fewer than 31 sensor nodes, so simultaneous transmission from so many nodes is prone to delay, and each sensor is wired. First, so many inertial nodes complicate wearing, and their weight burdens the user. Second, accurate conversion from the sensor coordinate system to the human body coordinate system requires the sensors to be fixed tightly to the body during movement. Mounting the sensors rigidly on the body seems reasonable, but the non-rigid parts of the body let the sensors move relative to it; to reduce the coordinate conversion errors caused by sensor loosening, straps are used to fasten the sensors firmly to each body part, which is uncomfortable on the skin and unsuitable for long capture sessions.
Yet another existing solution is a real-time AR bowling entertainment system based on inertial sensors. An inertial sensor is mounted only on the hand, so only hand motion is captured; the real-time movement of the feet is not considered, and whole-body motion cannot be captured. The inertial sensor is strongly affected by local magnetic fields and metal and is uncomfortable to wear, so the scheme cannot meet the requirement of capturing whole-body actions in real time.
In summary, current human motion capture schemes suffer from several disadvantages: poor real-time performance, high cost, uncomfortable wearing, low precision, and demanding environmental requirements.
Disclosure of Invention
The application aims to provide a motion capture method, a motion capture device, motion capture equipment and a readable storage medium based on an inertial sensor and fabric electronics, and aims to solve the problems of low real-time performance, uncomfortable wearing, low precision and high cost of the current human motion capture scheme. The specific scheme is as follows:
in a first aspect, the present application provides a motion capture method based on inertial sensors and fabric electronics, comprising:
acquiring inertial data by using an inertial sensor deployed on a first human body part, and acquiring activity angle data by using a fabric electronic device deployed on a second human body part;
respectively converting the inertia data and the activity angle data into quaternion forms to obtain human postures so as to describe the rotation angle of the first human body part relative to the human body and the rotation angle of the second human body part relative to the first human body part;
determining a lower limb posture according to the current foot position and the human body posture, wherein the lower limb posture comprises a mass center position; determining the current step length according to the preset leg length and the current centroid height; correcting the lower limb posture through reverse kinematic posture fitting according to the current step length and the centroid position;
and presenting the human body action according to the human body posture and the corrected lower limb posture.
Preferably, the inertial sensors are deployed at the head, waist, upper arms and thighs, and the fabric electronics are deployed at the knee joints and elbow joints.
Preferably, after the acquiring of inertial data by using the inertial sensor deployed on the first human body part and the acquiring of activity angle data by using the fabric electronics deployed on the second human body part, the method further comprises:
and acquiring the inertia data and the activity angle data based on a wireless communication technology.
Preferably, after the converting the inertial data and the motion angle data into quaternion forms respectively to obtain the human body posture, the method further includes:
and correcting the human body posture according to preset motion constraint conditions among the human body parts.
Preferably, the determining the posture of the lower limb according to the current foot position and the posture of the human body includes:
determining the current knee joint position according to the current foot position and the human body posture;
determining a centroid position according to the current knee joint position and the human body posture;
determining the position of the other knee joint according to the position of the mass center and the posture of the human body;
determining the position of the other foot according to the position of the other knee joint and the posture of the human body;
and taking the current foot position, the current knee joint position, the mass center position, the other knee joint position and the other foot position as the lower limb postures.
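The stepwise determination above — current foot to knee, knee to centroid, centroid to the other knee and foot — amounts to a forward walk along the leg segments. The sketch below is a minimal illustration of that chain, assuming straight segments with known lengths and unit direction vectors derived from the posture; the segment lengths and all names are hypothetical, not taken from the patent.

```python
# Hypothetical segment lengths in metres; illustrative only.
CALF_LEN, THIGH_LEN = 0.42, 0.45

def _step(p, d, length, sign):
    """Move from point p along unit direction d by length (sign = +1 up, -1 down)."""
    return tuple(pi + sign * length * di for pi, di in zip(p, d))

def lower_limb_chain(foot, calf_dir, thigh_dir, thigh_dir2, calf_dir2):
    """Walk the chain: current foot -> knee -> centroid -> other knee -> other foot.
    Each direction is a unit vector pointing up its segment, taken from the posture."""
    knee = _step(foot, calf_dir, CALF_LEN, +1)          # current knee from current foot
    centroid = _step(knee, thigh_dir, THIGH_LEN, +1)    # centroid from current knee
    knee2 = _step(centroid, thigh_dir2, THIGH_LEN, -1)  # other knee, going back down
    foot2 = _step(knee2, calf_dir2, CALF_LEN, -1)       # other foot from other knee
    return knee, centroid, knee2, foot2
```

With both legs vertical, the chain returns the other foot to ground level, as expected for a standing posture.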
Preferably, the presenting the human body action according to the human body posture and the corrected lower limb posture includes:
and establishing a character model through Unity 3D, and mapping the human body posture and the corrected lower limb posture to the human body model so as to present human body actions.
In a second aspect, the present application provides a motion capture device based on inertial sensors and fabric electronics, comprising:
a data acquisition module: used for acquiring inertial data by using the inertial sensor deployed on a first human body part, and acquiring activity angle data by using the fabric electronics deployed on a second human body part;
human body posture determination module: the inertial data and the activity angle data are respectively converted into quaternion forms to obtain human postures so as to describe the rotation angle of the first human body part relative to the human body and the rotation angle of the second human body part relative to the first human body part;
a lower limb posture correction module: the lower limb posture is determined according to the current foot position and the human body posture, wherein the lower limb posture comprises a mass center position; determining the current step length according to the preset leg length and the current centroid height; correcting the lower limb posture through reverse kinematic posture fitting according to the current step length and the centroid position;
an action presenting module: used for presenting the human body action according to the human body posture and the corrected lower limb posture.
In a third aspect, the present application provides a motion capture device based on inertial sensors and fabric electronics, comprising:
a memory: for storing a computer program;
a processor: for executing the computer program to implement the inertial sensor and fabric electronics based motion capture method as described above.
In a fourth aspect, the present application provides a readable storage medium having stored thereon a computer program for implementing an inertial sensor and fabric electronics based motion capture method as described above when executed by a processor.
The application provides a motion capture method based on an inertial sensor and fabric electronics, which comprises the following steps: acquiring inertial data by using an inertial sensor deployed on a first human body part, and acquiring activity angle data by using a fabric electronic device deployed on a second human body part; respectively converting the inertia data and the activity angle data into quaternion forms to obtain human postures so as to describe the rotation angle of the first human body part relative to the human body and the rotation angle of the second human body part relative to the first human body part; determining the lower limb posture according to the current foot position and the human body posture, wherein the lower limb posture comprises a mass center position; determining the current step length according to the preset leg length and the current centroid height; correcting the posture of the lower limb through reverse kinematic posture fitting according to the current step length and the position of the mass center; and presenting the human body action according to the human body posture and the corrected lower limb posture.
Therefore, firstly, the method acquires data by combining the inertial sensor with fabric electronics; compared with using inertial sensors alone, this solves the problem that an inertial sensor cannot fit closely to the human body, avoids the errors caused by poor fit, and improves wearing comfort. Secondly, based on the characteristics of body movement, a simplified 4-degree-of-freedom limb model is proposed, and quaternions describe the rotation angle of a first human body part relative to the human body or of a second human body part relative to the first human body part, so that the fabric electronics can replace some of the inertial nodes as a trade-off, fusing the inertial sensor's strength in capturing 3-dimensional posture to capture the spatial posture of the limbs. Thirdly, considering the inherent drift of inertial sensors, the method corrects the lower limb posture through inverse kinematic posture fitting, effectively suppressing abnormal twisting actions caused by inertial sensor drift and other factors.
In addition, the application also provides a motion capture device, equipment and a readable storage medium based on the inertial sensor and the fabric electronics, and the technical effect of the motion capture device corresponds to that of the method, and the detailed description is omitted here.
Drawings
For a clearer explanation of the embodiments of the present application or the technical solutions of the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a first embodiment of a motion capture method based on inertial sensors and fabric electronics provided in the present application;
fig. 2 is a schematic diagram of a motion capture system according to a second embodiment of the motion capture method based on inertial sensors and fabric electronics provided in the present application;
fig. 3 is a schematic view of a whole body sensor wearing in a second embodiment of the motion capture method based on inertial sensors and fabric electronics provided in the present application;
fig. 4 is a schematic view of a step calculation model according to a second embodiment of the motion capture method based on inertial sensors and fabric electronics provided by the present application;
fig. 5 is a schematic diagram of a human lower limb rigid body hinge model according to a second embodiment of the motion capture method based on inertial sensors and fabric electronics provided by the present application;
fig. 6 is a flow chart of a real-time motion mapping of a second embodiment of the motion capture method based on inertial sensors and fabric electronics provided in the present application;
fig. 7 is a functional block diagram of an embodiment of an inertial sensor and fabric electronics based motion capture device as provided herein.
Detailed Description
In order that those skilled in the art will better understand the disclosure, the following detailed description will be given with reference to the accompanying drawings. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The core of the application is to provide a motion capture method, a motion capture device, motion capture equipment and a readable storage medium based on an inertial sensor and fabric electronics, wherein the inertial sensor and the fabric electronics are combined to acquire data, so that errors caused by non-fit of the inertial sensor are avoided, and the wearing comfort level is improved; providing a simplified 4-degree-of-freedom limb model according to the body movement characteristics, fusing the advantages of capturing a 3-dimensional posture by an inertial sensor, and capturing a limb space posture; the lower limb posture is corrected through reverse kinematic posture fitting, and the generation of uncoordinated motion is effectively inhibited.
Referring to fig. 1, a first embodiment of a motion capture method based on an inertial sensor and fabric electronics provided in the present application is described below, where the first embodiment includes:
s101, acquiring inertial data by using an inertial sensor deployed on a first human body part, and acquiring activity angle data by using fabric electronics deployed on a second human body part;
s102, respectively converting the inertia data and the activity angle data into quaternion forms to obtain human body postures so as to describe the rotation angle of the first human body part relative to the human body and the rotation angle of the second human body part relative to the first human body part;
s103, determining the posture of the lower limb according to the current foot position and the posture of the human body, wherein the posture of the lower limb comprises a mass center position; determining the current step length according to the preset leg length and the current centroid height; correcting the posture of the lower limb through reverse kinematic posture fitting according to the current step length and the position of the mass center;
and S104, presenting the human body action according to the human body posture and the corrected lower limb posture.
This embodiment combines the inertial sensor with fabric electronics for data acquisition. Fabric electronics are electronic devices developed on a fabric substrate, aiming to integrate ubiquitous electronic and computing elements into the fabric itself. They retain the inherent feel and wearability of ordinary textile materials, can sense environmental changes, and can adjust one or more of their performance parameters in real time; they are a new class of material that self-adjusts to its environment. The development of fabric electronics greatly benefits wearable computing. A flexible tensile strain sensor based on fabric electronics produces a corresponding change in an electrical property (capacitance or resistance) under tensile strain, and the joint activity angle can be obtained in real time through the mapping between the resistance change produced by stretching at the joint and the angle.
The fabric electronics is worn on the human body in a non-invasive manner, and can replace inertial nodes at specific places of the human body, so that the wearing burden of a user is reduced, long-time action capture is possible, and the system hardware cost is reduced.
According to kinematic analysis, the arm and the leg each have 7 degrees of freedom: the upper arm and thigh have three, the forearm and calf have two, and the hand and foot have two. For the arm, for example, the upper arm has 3-dimensional angle changes in yaw, pitch and roll; the forearm changes in pitch and roll; and the hand changes in pitch and yaw. Note that the roll of the hand and the roll of the forearm are closely coupled, i.e. they form a single degree of freedom. Therefore, in the present application the hand is not considered, and the whole arm is treated as 3 degrees of freedom for the upper arm plus 1 degree of freedom for the forearm; correspondingly, ignoring the foot, the whole leg is treated as 3 degrees of freedom for the thigh plus 1 degree of freedom for the calf. In summary, the present application treats the arm and the leg as 4-degree-of-freedom structures.
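The simplified model above can be written down as a tiny data structure: three proximal angles from an inertial node plus one flexion angle from the fabric sensor. This is a minimal sketch with illustrative field names; the patent does not prescribe any particular representation.

```python
from dataclasses import dataclass

@dataclass
class Limb4DOF:
    """Simplified 4-DOF limb: 3 DOF at the proximal joint (shoulder or hip),
    1 DOF at the distal joint (elbow or knee). Angles in degrees."""
    yaw: float      # proximal segment (upper arm / thigh), from the inertial node
    pitch: float
    roll: float
    flexion: float  # distal joint bending angle, from the fabric sensor

    @property
    def dof(self):
        return 4  # 3 inertial + 1 fabric

# One arm pose: upper arm orientation from the inertial node, elbow from the fabric sensor.
arm = Limb4DOF(yaw=10.0, pitch=45.0, roll=0.0, flexion=90.0)
```

The same structure serves for a leg, with thigh in place of upper arm and knee in place of elbow.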
Based on the characteristics of fabric electronics and the kinematic analysis above, this embodiment deploys the fabric electronics at the elbow joint and the knee joint to acquire the activity angle of the forearm relative to the upper arm or of the calf relative to the thigh. Accordingly, the inertial sensors are deployed at the upper arms and thighs, and to acquire the posture of the whole body, additional sensors can be deployed at the waist, head, and so on.
In summary, the first human body part may be an upper arm, a thigh, the waist, the head, etc., and the second human body part may be an elbow joint or a knee joint.
After the inertial data are acquired by the inertial sensor and the activity angle data are acquired by the fabric electronics, both are transmitted to the end that performs data analysis. To avoid complicated wiring, the data can be transmitted by wireless communication.
Then, the inertial data and the activity angle data are respectively converted into quaternion form; the converted data are called the human body posture. Specifically, the 3-dimensional angle data obtained by an inertial sensor are converted into a quaternion that rotates the upper arm or thigh relative to the body, and the joint activity angle obtained by the fabric electronics is converted into a quaternion (with the remaining two angles set to constant values) that rotates the forearm relative to the upper arm or the calf relative to the thigh. When inertial sensors are placed on the head and waist, the 3-dimensional angles they measure are likewise converted into quaternions that rotate the head and waist relative to the body.
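As an illustration of the conversion just described, the sketch below turns three Euler angles (an inertial node) or one joint angle with the other two held constant (a fabric sensor) into a unit quaternion. The ZYX angle convention and the zero constants are assumptions; the patent does not specify them.

```python
import math

def euler_to_quaternion(yaw, pitch, roll):
    """Convert ZYX Euler angles (radians) to a unit quaternion (w, x, y, z)."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    w = cr * cp * cy + sr * sp * sy
    x = sr * cp * cy - cr * sp * sy
    y = cr * sp * cy + sr * cp * sy
    z = cr * cp * sy - sr * sp * cy
    return (w, x, y, z)

def joint_angle_to_quaternion(flexion):
    """Fabric-electronics joint: one measured flexion angle, the other two
    Euler angles held at a constant value (zero here, as an assumption)."""
    return euler_to_quaternion(0.0, flexion, 0.0)
```

A zero rotation yields the identity quaternion, and any output has unit norm, which is what downstream posture composition relies on.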
To prevent the feet from sinking into the ground or floating above it, the lower limb posture needs to be corrected. Specifically, the current foot position can be known from the historical movement step lengths, and the whole lower limb posture, including but not limited to the centroid position, can be computed step by step from the current foot position and the human body posture (the lower limb activity angles). During movement, the centroid height and the leg length are known quantities, from which the step length can be calculated. In the correction step, the lower limb posture is corrected through inverse kinematic posture fitting according to the current step length and the centroid position, which prevents the feet from ending up below or above the ground.
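The relation between leg length, centroid height and step length can be sketched geometrically: if the two straight legs are treated as the equal sides of an isosceles triangle whose height is the centroid height, the step is twice the half-base. This triangle model is an assumption for illustration; the patent's exact calculation is defined with its Fig. 4 step model.

```python
import math

def step_length(leg_len, centroid_height):
    """Step length from the preset leg length and the current centroid height,
    assuming straight legs forming an isosceles triangle (illustrative model)."""
    if centroid_height >= leg_len:  # legs together or fully extended: no step
        return 0.0
    half_base = math.sqrt(leg_len ** 2 - centroid_height ** 2)
    return 2.0 * half_base
```

Note the monotonic behaviour: as the centroid drops during a stride, the computed step widens, which is why the centroid height suffices to recover the step.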
After the correction, the human body action presentation can be performed according to the human body posture and the corrected lower limb posture, and the action presentation mode is not described in detail here.
According to the motion capture method based on the inertial sensor and fabric electronics provided by this embodiment, data are acquired by combining the inertial sensor with fabric electronics, which solves the problem that an inertial sensor cannot fit closely to the human body, avoids the errors caused by poor fit, and improves wearing comfort. Moreover, based on the characteristics of body movement, the method proposes a simplified 4-degree-of-freedom limb model and uses quaternions to describe the rotation angle of a first human body part relative to the human body or of a second human body part relative to the first human body part, so that the fabric electronics can replace some of the inertial nodes as a trade-off, fusing the inertial sensor's strength in capturing 3-dimensional posture to capture the spatial posture of the limbs. Finally, the method corrects the lower limb posture through inverse kinematic posture fitting, effectively suppressing abnormal twisting actions caused by inertial sensor drift and other factors.
The second embodiment of the motion capture method based on inertial sensors and fabric electronics is described in detail below.
Referring to fig. 2, in the second embodiment, the overall structure of the motion capture system includes: a data acquisition module, a wireless communication module, a human body posture determination module, a lower limb posture correction module and an action presentation module. Each module is described separately below.
(1) Data acquisition module
The data acquisition module uses an inertial sensor (MPU9250) comprising a 3-axis accelerometer, a 3-axis gyroscope and a 3-axis magnetometer. The accelerometer and gyroscope measure 3-axis acceleration and angular velocity respectively, and the magnetometer measures the heading angle of the object in the world coordinate system. The data are resolved into attitude data in quaternion form by the STM32L071 master control chip.
Fabric electronics are electronic materials with the flexibility and comfort of fabric. The elastic tensile strain sensor solves the problem that an inertial sensor cannot stay closely attached to the human body when a non-rigid body part moves, and avoids the sensor-to-body coordinate conversion errors caused by poor attachment; it is particularly useful for non-rigid body parts, and this fabric-based joint angle measurement reduces the number of inertial nodes required.
Specifically, bending the elbow or knee joint stretches the sensor and changes its resistance, and there is a mapping between the resistance value and the joint angle. The fabric electronic sensors can be integrated into an elbow pad and a knee pad to capture the joint motion angle, which in turn gives the posture of the lower arm and lower leg.
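The patent states only that a mapping exists between resistance and joint angle, not its form. Assuming a linear mapping obtained from calibration pairs (a simplification for illustration; a real stretch sensor may need a nonlinear fit), a least-squares sketch:

```python
def fit_resistance_to_angle(samples):
    """Least-squares linear fit angle = a*R + b from (resistance, angle)
    calibration pairs. The linear model is an illustrative assumption."""
    n = len(samples)
    sr = sum(r for r, _ in samples)
    sa = sum(a for _, a in samples)
    srr = sum(r*r for r, _ in samples)
    sra = sum(r*a for r, a in samples)
    a = (n*sra - sr*sa) / (n*srr - sr*sr)
    b = (sa - a*sr) / n
    return a, b

def resistance_to_angle(r, coeffs):
    # Map a raw resistance reading to a joint angle in degrees
    a, b = coeffs
    return a*r + b
```

With calibration pairs such as (100 Ω, 0°), (200 Ω, 75°), (300 Ω, 150°), intermediate readings interpolate linearly.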
As shown in fig. 3, the inertial sensors are deployed on the head, waist, thighs and upper arms, and the fabric electronics are deployed on the lower legs and lower arms.
(2) Wireless communication module
The wireless communication module is the data communication module of the motion capture system. It is a low-power, highly real-time wireless data transmission module, responsible for transmitting the inertial data of each node and the digital signals of the fabric electronics, acquired by the lower computer, to the upper computer.
(3) Human body posture determining module
For body parts with 3 degrees of freedom (head, waist, upper arms and thighs), inertial nodes capture the spatial posture; for joints with 1 degree of freedom (knee and elbow), fabric electronic sensors measure the joint motion angle, giving the motion angle of the lower arm relative to the upper arm or of the lower leg relative to the thigh.
In operation, the 3-dimensional angle obtained by an inertial sensor is converted into quaternion form to rotate the upper arm or thigh relative to the body, and the joint angle obtained by the fabric electronics is converted into quaternion form (with the other two angles set to fixed values) to rotate the lower arm relative to the upper arm and the lower leg relative to the thigh. The 3-dimensional angles of the head and waist nodes are likewise converted into quaternions, rotating the head and waist relative to the body.
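The composition just described, where a 1-degree-of-freedom segment's orientation is its parent's quaternion composed with a single-axis joint rotation (the other two angles fixed), can be sketched as follows. The choice of the local x axis as the hinge axis is an assumption for illustration:

```python
import math

def quat_mul(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_from_axis_angle(axis, angle_rad):
    # Unit quaternion for a rotation of angle_rad about the given unit axis
    x, y, z = axis
    s = math.sin(angle_rad / 2.0)
    return (math.cos(angle_rad / 2.0), x*s, y*s, z*s)

def lower_segment_orientation(q_parent, joint_angle_deg):
    # 1-DOF hinge about the local x axis; the other two angles are held at 0,
    # so the lower arm (or lower leg) rotates relative to its parent segment
    q_joint = quat_from_axis_angle((1.0, 0.0, 0.0), math.radians(joint_angle_deg))
    return quat_mul(q_parent, q_joint)
```

For instance, with the upper arm at identity and a 90° elbow angle from the fabric sensor, the lower arm's quaternion is a pure 90° rotation about the hinge axis.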
To reduce inertial sensor drift during motion, a whole-body kinematic model constraint is applied. Specifically, following the biokinematic model, the whole body is assumed to consist of body segments connected by joints, and each arm or leg is treated as a 3-degree-of-freedom plus 1-degree-of-freedom structure. The motion parameters of adjacent segments are fused and estimated by imposing constraint conditions on the human body model (for example, the elbow joint can only bend within the range of 0°-150°, and can neither hyperextend nor bend in reverse); the specific constraints are listed in Table 1.
TABLE 1
(Table 1, listing the allowed motion range of each joint, is reproduced as an image in the original publication.)
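A constraint such as the elbow's 0°-150° range can be enforced by clamping each fused joint estimate into its allowed interval. The table below is a hypothetical stand-in for Table 1: only the elbow range is stated in the text, and the knee range here is an assumption:

```python
# Hypothetical constraint table in degrees; only the elbow range is given in
# the text, the knee range is an assumed placeholder for Table 1.
JOINT_LIMITS = {
    "elbow": (0.0, 150.0),
    "knee": (0.0, 150.0),
}

def clamp_joint(name, angle_deg):
    # Clamp an estimated joint angle into its allowed range, suppressing
    # hyperextension and reverse bending caused by drift
    lo, hi = JOINT_LIMITS[name]
    return max(lo, min(hi, angle_deg))
```

A drifted elbow estimate of 170° would be clamped back to 150°, and a reverse-bend estimate of -10° back to 0°.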
(4) Lower limb posture correction module
In this embodiment, a waist-mounted Pedestrian Dead Reckoning (PDR) algorithm calculates the step length: the center of gravity moves downward when a person strides, and its downward displacement h is obtained by double integration of the vertical acceleration. As shown in fig. 4, by the Pythagorean theorem, the step length is calculated as follows:
step = 2·√(l² − (l − h)²), where l is the preset leg length and h is the downward displacement of the center of gravity.
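Under the description above — h from double integration of vertical acceleration, step length from the Pythagorean theorem with the preset leg length l — a minimal sketch follows; the zero initial velocity and the simple rectangular integration are assumptions:

```python
import math

def centroid_drop(acc_z, dt):
    # Double (rectangular) integration of vertical acceleration over one
    # stride, assuming zero initial velocity; returns |h|
    v = 0.0
    h = 0.0
    for a in acc_z:
        v += a * dt
        h += v * dt
    return abs(h)

def step_length(leg_len, h):
    # Pythagorean theorem on the two-leg triangle: each leg of length l is a
    # hypotenuse, the vertical side is l - h, so the half-step is
    # sqrt(l^2 - (l - h)^2)
    return 2.0 * math.sqrt(leg_len**2 - (leg_len - h)**2)
```

With h = 0 (no vertical drop) the step length is 0, and it grows monotonically as the center of gravity drops further.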
The supporting leg is identified from the joint angles. As shown in fig. 5, during movement the knee position of the supporting leg is computed first, then the centroid position, then the knee position of the other leg, and finally the position of the other leg. When the moving foot touches the ground, inverse kinematic posture fitting based on the previously computed step length and centroid position prevents the foot from penetrating the ground or failing to land.
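The foot → knee → centroid leg of this chain can be illustrated with planar forward kinematics. The sagittal-plane simplification and the segment-angle convention (angles measured from the vertical) are assumptions for illustration, not the patent's formulation:

```python
import math

def lower_limb_chain(foot, shank_angle_rad, thigh_angle_rad,
                     shank_len, thigh_len):
    """Walk up the kinematic chain of the supporting leg:
    foot position -> knee position -> hip/centroid position.
    2-D sagittal plane; segment angles measured from the vertical."""
    fx, fy = foot
    kx = fx + shank_len * math.sin(shank_angle_rad)
    ky = fy + shank_len * math.cos(shank_angle_rad)
    hx = kx + thigh_len * math.sin(thigh_angle_rad)
    hy = ky + thigh_len * math.cos(thigh_angle_rad)
    return (kx, ky), (hx, hy)
```

With the leg vertical (both angles 0), the knee sits one shank length above the foot and the hip one thigh length above the knee; the same chain run downward from the centroid recovers the other leg.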
(5) Action presenting module
The action presentation module is a 3D scene interface developed with Unity 3D. A character model is built in Unity 3D, the received data are processed and mapped onto the model, and the whole-body motion of the human body is presented in real time.
As shown in fig. 6, in practical application the real-time action mapping proceeds as follows: the serial port is continuously monitored; when a serial interrupt is triggered, the data acquired by the inertial sensors and the fabric electronics are received, processed according to the flow above, and finally mapped onto the 3D model to realize action mapping.
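The monitor → receive → process → map loop can be skeletonized as below. The text frame format ("node_id,w,x,y,z") and both callbacks are hypothetical stand-ins for the serial interrupt handler and the Unity-side model update:

```python
def parse_frame(frame):
    # Hypothetical text frame "node_id,w,x,y,z"; the real wire format
    # is not specified in the patent
    parts = frame.split(",")
    return {"node": int(parts[0]),
            "quat": tuple(float(v) for v in parts[1:5])}

def process_stream(read_frame, update_model):
    """Poll for frames until the source is exhausted, mirroring the
    monitor -> receive -> process -> map flow of fig. 6."""
    while True:
        frame = read_frame()
        if frame is None:
            break                      # no more serial data
        update_model(parse_frame(frame))
```

In a real deployment `read_frame` would wrap the serial port and `update_model` would push the pose onto the Unity character model.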
It can be seen that the motion capture method based on the inertial sensor and the fabric electronics provided by the embodiment has at least the following advantages:
First, compared with an optical capture system, an inertial sensor is relatively inexpensive, unaffected by the environment, and highly autonomous. However, because of the inherent drift of inertial sensors, long-term use accumulates a large error relative to the true value, and traditional rigid hardware nodes cannot adapt well to non-rigid parts of the human body, which degrades both accuracy and long-term wearing comfort. The fabric electronics adopted in this embodiment have the characteristics of fabric: they adapt to non-rigid parts, cause no discomfort, protect key body parts, and their low power consumption permits long-term use, greatly improving wearability.
Secondly, compared with a typical inertial motion capture system, this embodiment transmits data over low-power wireless communication, avoiding complex wiring.
Thirdly, based on the characteristics of body movement, the proposed 4-degree-of-freedom motion structure model allows flexible fabric to replace some of the inertial nodes as a trade-off, while retaining the advantage of inertial sensors in capturing 3-dimensional postures, so that the spatial postures of the limbs are captured.
Fourthly, traditional motion capture technology does not analyze and constrain the motion characteristics of each human joint. This embodiment introduces joint constraints, which effectively suppress uncoordinated motions and the abnormal twisting caused by inertial sensor drift.
The following describes an inertial sensor and fabric electronic based motion capture device provided in an embodiment of the present application, and the inertial sensor and fabric electronic based motion capture device described below and the inertial sensor and fabric electronic based motion capture method described above may be referred to correspondingly.
As shown in fig. 7, the motion capture device based on the inertial sensor and the fabric electronics of the present embodiment includes:
the data acquisition module 701: configured to acquire inertial data using an inertial sensor deployed on a first human body part, and to acquire activity angle data using fabric electronics deployed on a second human body part;
the human body posture determination module 702: configured to convert the inertial data and the activity angle data respectively into quaternion form to obtain the human body posture, describing the rotation angle of the first human body part relative to the human body and the rotation angle of the second human body part relative to the first human body part;
the lower limb posture correction module 703: configured to determine the lower limb posture, including the centroid position, according to the current foot position and the human body posture; to determine the current step length according to the preset leg length and the current centroid height; and to correct the lower limb posture through inverse kinematic posture fitting according to the current step length and the centroid position;
the action presentation module 704: configured to present the human body motion according to the human body posture and the corrected lower limb posture.
The motion capture device based on the inertial sensor and the fabric electronics of the present embodiment is used to implement the motion capture method based on the inertial sensor and the fabric electronics, and therefore specific embodiments in the device can be seen in the above-mentioned embodiment parts of the motion capture method based on the inertial sensor and the fabric electronics, for example, the data acquisition module 701, the human body posture determination module 702, the lower limb posture correction module 703, and the motion presentation module 704 are respectively used to implement steps S101, S102, S103, and S104 in the motion capture method based on the inertial sensor and the fabric electronics. Therefore, specific embodiments thereof may be referred to in the description of the corresponding respective partial embodiments, and will not be described herein.
In addition, since the motion capture device based on the inertial sensor and the fabric electronics of this embodiment is used to implement the motion capture method based on the inertial sensor and the fabric electronics, the function corresponds to that of the above method, and is not described again here.
In addition, the present application also provides a motion capture device based on inertial sensors and fabric electronics, comprising:
a memory: for storing a computer program;
a processor: for executing the computer program to implement the inertial sensor and fabric electronics based motion capture method as described above.
Finally, the present application provides a readable storage medium having stored thereon a computer program for implementing an inertial sensor and fabric electronics based motion capture method as described above when executed by a processor.
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The solutions provided in the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method and core ideas of the present application. Meanwhile, for a person skilled in the art, there may be variations in the specific embodiments and the application scope according to the ideas of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (9)

1. A method for motion capture based on inertial sensors and fabric electronics, comprising:
acquiring inertial data by using an inertial sensor deployed on a first human body part, and acquiring activity angle data by using a fabric electronic device deployed on a second human body part;
respectively converting the inertia data and the activity angle data into quaternion forms to obtain human postures so as to describe the rotation angle of the first human body part relative to the human body and the rotation angle of the second human body part relative to the first human body part;
determining a lower limb posture according to the current foot position and the human body posture, wherein the lower limb posture comprises a mass center position; determining the current step length according to the preset leg length and the current centroid height; correcting the lower limb posture through reverse kinematic posture fitting according to the current step length and the centroid position;
and presenting the human body action according to the human body posture and the corrected lower limb posture.
2. The inertial sensor and fabric electronics-based motion capture method of claim 1, wherein said inertial sensors are deployed on the head, waist, upper arms and thighs, and said fabric electronics are deployed at the knee joints and elbow joints.
3. The inertial sensor and fabric electronics-based motion capture method of claim 1, further comprising, after said acquiring inertial data with the inertial sensor deployed in a first body part and electronically acquiring motion angle data with the fabric deployed in a second body part:
and acquiring the inertia data and the activity angle data based on a wireless communication technology.
4. The inertial sensor and fabric electronics based motion capture method of claim 1, further comprising, after said converting said inertial data and said activity angle data into quaternion form, respectively, to obtain the human body posture:
and correcting the human body posture according to preset motion constraint conditions among the human body parts.
5. The inertial sensor and fabric electronics based motion capture method of claim 1 wherein said determining a lower limb pose from a current foot position and said body pose comprises:
determining the current knee joint position according to the current foot position and the human body posture;
determining a centroid position according to the current knee joint position and the human body posture;
determining the position of the other knee joint according to the position of the mass center and the posture of the human body;
determining the position of the other foot according to the position of the other knee joint and the posture of the human body;
and taking the current foot position, the current knee joint position, the mass center position, the other knee joint position and the other foot position as the lower limb postures.
6. The inertial sensor and fabric electronics based motion capture method of claim 1, wherein said presenting the human body action according to said human body posture and the corrected lower limb posture comprises:
and establishing a character model through Unity 3D, and mapping the human body posture and the corrected lower limb posture to the human body model so as to present human body actions.
7. An inertial sensor and fabric electronics based motion capture device comprising:
a data acquisition module: configured to acquire inertial data using an inertial sensor deployed on a first human body part, and to acquire activity angle data using fabric electronics deployed on a second human body part;
a human body posture determination module: configured to convert the inertial data and the activity angle data respectively into quaternion form to obtain the human body posture, describing the rotation angle of the first human body part relative to the human body and the rotation angle of the second human body part relative to the first human body part;
a lower limb posture correction module: configured to determine the lower limb posture, including the centroid position, according to the current foot position and the human body posture; to determine the current step length according to the preset leg length and the current centroid height; and to correct the lower limb posture through inverse kinematic posture fitting according to the current step length and the centroid position;
an action presenting module: configured to present the human body motion according to the human body posture and the corrected lower limb posture.
8. An inertial sensor and fabric electronics based motion capture device comprising:
a memory: for storing a computer program;
a processor: for executing said computer program for implementing an inertial sensor and fabric electronics based motion capture method according to any of claims 1-6.
9. A readable storage medium, having stored thereon a computer program for implementing the inertial sensor and fabric electronics based motion capture method according to any one of claims 1-6 when executed by a processor.
CN202110535501.XA 2021-05-17 2021-05-17 Motion capture method and device based on inertial sensor and fabric electronics Active CN113268141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110535501.XA CN113268141B (en) 2021-05-17 2021-05-17 Motion capture method and device based on inertial sensor and fabric electronics

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110535501.XA CN113268141B (en) 2021-05-17 2021-05-17 Motion capture method and device based on inertial sensor and fabric electronics

Publications (2)

Publication Number Publication Date
CN113268141A true CN113268141A (en) 2021-08-17
CN113268141B CN113268141B (en) 2022-09-13

Family

ID=77231238

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110535501.XA Active CN113268141B (en) 2021-05-17 2021-05-17 Motion capture method and device based on inertial sensor and fabric electronics

Country Status (1)

Country Link
CN (1) CN113268141B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115881315A (en) * 2022-12-22 2023-03-31 北京壹永科技有限公司 Interactive medical visualization system
WO2023113694A3 (en) * 2021-12-17 2023-08-17 Refract Technologies Pte Ltd Tracking system for simulating body motion

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324288A (en) * 2013-06-21 2013-09-25 武汉纺织大学 System and method for human body movement identification of combined sensor
CN104887237A (en) * 2015-04-14 2015-09-09 南昌大学 Pedestrian navigation method based on human body motion mode monitoring
CN105997097A (en) * 2016-06-22 2016-10-12 武汉纺织大学 Reproduction system and reproduction method for human lower limb movement posture
CN106889991A (en) * 2017-03-17 2017-06-27 浙江大学 A kind of flexible fabric sensor and its method for measuring human body knee joint motion
CN107898466A (en) * 2017-10-17 2018-04-13 深圳大学 A kind of limb motion based on inertial sensor catches system and method
WO2018140429A1 (en) * 2017-01-24 2018-08-02 Blacktop Labs, Llc Method, system, and device for analyzing ankle joint kinematics
CN110418626A (en) * 2016-10-17 2019-11-05 拉科鲁尼亚大学 Assist running gear
CN111318009A (en) * 2020-01-19 2020-06-23 张衡 Somatosensory health entertainment system based on wearable inertial sensing and working method thereof
CN212679569U (en) * 2020-04-26 2021-03-12 南方科技大学 Hip and knee double-joint walking aid exoskeleton

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324288A (en) * 2013-06-21 2013-09-25 武汉纺织大学 System and method for human body movement identification of combined sensor
CN104887237A (en) * 2015-04-14 2015-09-09 南昌大学 Pedestrian navigation method based on human body motion mode monitoring
CN105997097A (en) * 2016-06-22 2016-10-12 武汉纺织大学 Reproduction system and reproduction method for human lower limb movement posture
CN110418626A (en) * 2016-10-17 2019-11-05 拉科鲁尼亚大学 Assist running gear
WO2018140429A1 (en) * 2017-01-24 2018-08-02 Blacktop Labs, Llc Method, system, and device for analyzing ankle joint kinematics
CN106889991A (en) * 2017-03-17 2017-06-27 浙江大学 A kind of flexible fabric sensor and its method for measuring human body knee joint motion
CN107898466A (en) * 2017-10-17 2018-04-13 深圳大学 A kind of limb motion based on inertial sensor catches system and method
CN111318009A (en) * 2020-01-19 2020-06-23 张衡 Somatosensory health entertainment system based on wearable inertial sensing and working method thereof
CN212679569U (en) * 2020-04-26 2021-03-12 南方科技大学 Hip and knee double-joint walking aid exoskeleton

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023113694A3 (en) * 2021-12-17 2023-08-17 Refract Technologies Pte Ltd Tracking system for simulating body motion
CN115881315A (en) * 2022-12-22 2023-03-31 北京壹永科技有限公司 Interactive medical visualization system
CN115881315B (en) * 2022-12-22 2023-09-08 北京壹永科技有限公司 Interactive medical visualization system

Also Published As

Publication number Publication date
CN113268141B (en) 2022-09-13

Similar Documents

Publication Publication Date Title
US8165844B2 (en) Motion tracking system
KR101483713B1 (en) Apparatus and Method for capturing a motion of human
KR101751760B1 (en) Method for estimating gait parameter form low limb joint angles
Roetenberg et al. Xsens MVN: Full 6DOF human motion tracking using miniature inertial sensors
CN101579238B (en) Human motion capture three dimensional playback system and method thereof
CN113268141B (en) Motion capture method and device based on inertial sensor and fabric electronics
CN112957033B (en) Human body real-time indoor positioning and motion posture capturing method and system in man-machine cooperation
CN106843507B (en) Virtual reality multi-person interaction method and system
CN110327048B (en) Human upper limb posture reconstruction system based on wearable inertial sensor
CN110609621B (en) Gesture calibration method and human motion capture system based on microsensor
JP2017503225A (en) Motion capture system
EP2171688A2 (en) Object motion capturing system and method
CN109284006B (en) Human motion capturing device and method
WO2018132999A1 (en) Human body step length measuring method for use in wearable device and measuring device of the method
US9021712B2 (en) Autonomous system and method for determining information representative of the movement of an articulated chain
CN108762488A (en) A kind of single base station portable V R system based on wireless human body motion capture and optical alignment
JP2016006415A (en) Method and apparatus for estimating position of optical marker in optical motion capture
Chen et al. A method to calibrate installation orientation errors of inertial sensors for gait analysis
Shi et al. Human motion capture system and its sensor analysis
JP6205387B2 (en) Method and apparatus for acquiring position information of virtual marker, and operation measurement method
CN109003300A (en) A kind of virtual reality system based on mass center of human body displacement computational algorithm
KR102229070B1 (en) Motion capture apparatus based sensor type motion capture system and method thereof
JP7216222B2 (en) Information processing device, control method for information processing device, and program
Niu et al. A survey on IMU-and-vision-based human pose estimation for rehabilitation
KR20210040671A (en) Apparatus for estimating displacement center of gravity trajectories and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant