CN114974506B - Human body posture data processing method and system - Google Patents


Info

Publication number
CN114974506B
CN114974506B (application CN202210533572.0A)
Authority
CN
China
Prior art keywords
data
posture
gesture
level
human body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210533572.0A
Other languages
Chinese (zh)
Other versions
CN114974506A (en)
Inventor
董方杰
钟代笛
仲元红
徐乾锋
郑良
陈尧
王世伟
黄智勇
周庆
葛亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University
Original Assignee
Chongqing University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University filed Critical Chongqing University
Priority to CN202210533572.0A priority Critical patent/CN114974506B/en
Publication of CN114974506A publication Critical patent/CN114974506A/en
Application granted granted Critical
Publication of CN114974506B publication Critical patent/CN114974506B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30 - ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining for calculating health indices; for individual health risk assessment

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Primary Health Care (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • General Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biomedical Technology (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a human body posture data processing method and system. The method processes instantaneous posture capture data into posture base data of a uniform format, which makes the data convenient to process; at the same time, the data are output as a posture base data table, from which a specific posture can be read intuitively.

Description

Human body posture data processing method and system
Technical Field
The invention relates to a human body posture data processing method and system, and belongs to the technical field of data processing.
Background
Remote health management requires data on a user's activities and behaviors over a period of time. Regular, proper exercise is a necessary condition for improving physical condition and maintaining health, and human body posture is an essential element of movement and behavior: once environmental information is removed, human motion and behavior can be viewed as a time-ordered sequence of postures. Processing posture data therefore helps describe and analyze human motion and behavior.
In conventional telemedicine, information on a person's behavior and activity is obtained by a doctor or manager viewing monitoring video, a process that is time consuming and labor intensive. Recording a patient's activities with health equipment can effectively improve the efficiency of posture-information acquisition and reduce the workload of the personnel involved. The current mainstream approach is to acquire human body posture data simply and rapidly using various sensing devices and computer vision techniques. However, owing to their different acquisition and capture principles, different motion capture devices produce different data formats, and a device-specific processing program is needed to restore the data to a posture later. To minimize data processing time and obtain uniform output in remote medical data transmission, human body posture data need to be transmitted in a uniform format.
Most current human body posture data formats focus on the accuracy of the restored posture and emphasize the spatial positions of the human joint points, expressed either directly or indirectly through joint coordinates, or through translation and rotation components relative to the coordinate axes. For example, the BVH format records the rotation angle of each key point relative to its own coordinate system, and the posture is restored by computing node by node from parent to child. Existing representations contain only the position or rotation information of each node, so a specific posture cannot be distinguished intuitively from the data. In operations such as action judgment and similarity calculation, joint angles must additionally be computed from the joint coordinates; existing data formats therefore neither reflect the posture intuitively nor lend themselves to subsequent processing and calculation.
In view of the foregoing, there is a need for improvements to existing human body posture data processing methods to address the above-described problems.
Disclosure of Invention
The invention aims to provide a human body posture data processing method and system which can intuitively reflect human body postures and are simple and easy to understand.
In order to achieve the above object, the present invention provides a human body posture data processing method, comprising the steps of:
Step 1: defining an initial posture and determining the initial posture data of a posture sequence according to the initial posture;
Step 2: capturing the instantaneous posture of the human body with a motion capture device and acquiring instantaneous posture capture data;
Step 3: simplifying the instantaneous posture capture data of step 2 to obtain simplified data;
Step 4: comparing the simplified data of step 3 with the initial posture data of step 1 to obtain difference data between them;
Step 5: performing posture base data processing on the difference data of step 4 to obtain posture base data;
Step 6: outputting the posture base data to obtain a posture base data table.
As a further improvement of the present invention, the posture sequence in step 1 is obtained as follows: the human body posture is divided, and the postures of the head, trunk, left and right upper arms, left and right forearms, left and right palms, left and right thighs, left and right calves and left and right ankles are defined as posture basic units. The posture basic units are classified by connection order and arranged in order of posture level from small to large; units of the same posture level are arranged from top to bottom and from right to left, yielding the posture sequence.
As a further improvement of the present invention, the classification of the posture basic units by connection order is specifically: the trunk posture is the 0-level posture base, and the corresponding joint is a 0-level joint; the postures of the left and right upper arms and left and right thighs are 1-level posture bases, and the corresponding joints are 1-level joints; the postures of the left and right forearms and left and right calves are 2-level posture bases, and the corresponding joints are 2-level joints; the postures of the head, left and right palms and left and right ankles are 3-level posture bases, and the corresponding joints are 3-level joints.
As a further improvement of the present invention, the format of the transient gesture capturing data in the step 2 is BVH format.
As a further improvement of the present invention, the simplification of the instantaneous pose capturing data in the step 3 is specifically to extract the data of the head, the trunk, the left and right upper arms, the left and right forearms, the left and right palms, the left and right thighs, the left and right calves, and the left and right ankles in the instantaneous pose capturing data, and obtain the simplified data.
As a further improvement of the present invention, the difference data in step 4 is the difference between the simplified data of step 3 and the initial posture data, that is, the rotation angle of each posture basic unit between the instantaneous posture of the human body in step 2 and the initial posture in step 1.
As a further improvement of the present invention, the posture base data processing in the step 5 is to describe each of the posture base units in the difference data using the quantized rotation angle.
As a further improvement of the present invention, when the difference data is subjected to the posture base data processing in step 5, the difference data is not distorted for quantization precision of up to 10 degrees.
As a further improvement of the invention, the posture base data table in step 6 can intuitively reflect a posture without the posture being restored by software.
In order to achieve the above purpose, the invention also provides a human body posture data processing system comprising an acquisition unit, a processing unit and an output unit. The acquisition unit is used for acquiring instantaneous posture capture data during human body movement; the processing unit is used for simplifying the instantaneous posture capture data, comparing it, and then performing posture base data processing to obtain posture base data; the output unit is used for outputting the posture base data as a posture base data table.
The beneficial effects of the invention are as follows:
1. by processing the instantaneous posture data into posture base data of a uniform format, large amounts of data can be processed further;
2. the quantization precision of the posture base data has little influence on the posture, greatly compresses the data volume, and resists distortion;
3. a posture can be read intuitively from the posture base data table without software restoration; the method is simple and clear.
Drawings
Fig. 1 is a flow chart of the present invention.
Fig. 2 is a human skeleton diagram of the present invention.
Fig. 3 is an exploded view of the human body posture of the present invention.
Fig. 4 is a diagram of the joint point hierarchy, initial pose, and geodetic coordinates of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
Referring to fig. 1, the invention discloses a human body posture data processing method, which specifically comprises the following steps:
Step 1: dividing the human body posture and defining the postures of the head, trunk, left and right upper arms, left and right forearms, left and right palms, left and right thighs, left and right calves and left and right ankles as posture basic units; classifying the posture basic units by connection order, wherein the trunk posture is the 0-level posture base and the corresponding joint is a 0-level joint; the postures of the left and right upper arms and left and right thighs are 1-level posture bases and the corresponding joints are 1-level joints; the postures of the left and right forearms and left and right calves are 2-level posture bases and the corresponding joints are 2-level joints; the postures of the head, left and right palms and left and right ankles are 3-level posture bases and the corresponding joints are 3-level joints; arranging the posture basic units in order of posture level from small to large and, within the same level, from top to bottom and from right to left, to obtain a posture sequence; defining an initial posture and determining the initial posture data of the posture sequence according to the initial posture;
Step 2: capturing the instantaneous posture of the human body with the motion capture device and acquiring instantaneous posture capture data, wherein the data format of the instantaneous posture capture data is the BVH format;
step 3: simplifying the instantaneous gesture capturing data in the step 2, and extracting the data of the head, the trunk, the left and right upper arms, the left and right forearms, the left and right palms, the left and right thighs, the left and right calves and the left and right ankles in the instantaneous gesture capturing data to obtain simplified data;
step 4: comparing the simplified data of step 3 with the initial posture data of step 1 to obtain difference data between them, the difference data being the difference between the simplified data and the initial posture data, that is, the rotation angle of each posture basic unit between the instantaneous posture of the human body in step 2 and the initial posture in step 1;
Step 5: carrying out attitude base data processing on the difference data in the step 4, and describing each attitude basic unit in the difference data by adopting the quantized rotation angle to obtain attitude base data;
Step 6: and outputting the attitude base data to obtain an attitude base data table.
In step 5, when the posture base data processing is performed on the difference data, the difference data is not distorted for quantization precision of not more than 10 degrees. In step 6, the posture base data table can intuitively reflect a posture without the posture being restored by software.
The following description will be given in detail with respect to steps 1 to 6.
In step 1, the coordinate system of each posture basic unit is initialized. Specifically, step 1 includes:
S11, defining a posture basic unit, namely dividing the posture of a human body, and defining the postures of a head, a trunk, left and right upper arms, left and right forearms, left and right palms, left and right thighs, left and right calves and left and right ankles as the posture basic unit;
S12, defining posture levels and grading the posture basic units by connection order: the trunk posture is the 0-level posture base, and the corresponding joint is a 0-level joint; the postures of the left and right upper arms and left and right thighs are 1-level posture bases, and the corresponding joints are 1-level joints; the postures of the left and right forearms and left and right calves are 2-level posture bases, and the corresponding joints are 2-level joints; the postures of the head, left and right palms and left and right ankles are 3-level posture bases, and the corresponding joints are 3-level joints; the posture basic units are arranged in order of posture level from small to large;
S13, acquiring initial posture data: when the posture levels are the same, the posture basic units are arranged in order from top to bottom and from right to left to obtain a posture sequence; an initial posture is defined and the initial posture data of the posture sequence is determined according to the initial posture;
s14, establishing a coordinate system of the gesture basic unit, and establishing the coordinate system of the gesture basic unit according to the gesture sequence of the gesture basic unit.
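As a concrete sketch of S11-S13, the posture basic units and their levels can be tabulated and sorted into the posture sequence. The Python names and the exact top-to-bottom, right-to-left ordering below are illustrative assumptions, not data from the patent:

```python
# Illustrative sketch of the posture basic units of S11-S13.
# Names and within-level ordering are assumptions based on the
# description: units sort by level (0 -> 3); within a level,
# top to bottom and right to left.
POSTURE_UNITS = [
    # (name, level)
    ("torso",           0),
    ("right_upper_arm", 1), ("left_upper_arm", 1),
    ("right_thigh",     1), ("left_thigh",     1),
    ("right_forearm",   2), ("left_forearm",   2),
    ("right_calf",      2), ("left_calf",      2),
    ("head",            3),
    ("right_palm",      3), ("left_palm",      3),
    ("right_ankle",     3), ("left_ankle",     3),
]

def posture_sequence(units):
    """Return unit names sorted by posture level (stable within a level)."""
    return [name for name, level in sorted(units, key=lambda u: u[1])]

print(posture_sequence(POSTURE_UNITS))
```

Because Python's sort is stable, units of equal level keep their listed top-to-bottom, right-to-left order, matching the tie-breaking rule of S13.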
Further, step S14 establishes a coordinate system for each posture basic unit; the trunk, arms, thighs and head are taken as examples. The coordinate system of the trunk (0-level posture base) is established as follows: the trunk position is simplified so that it can be represented by the left shoulder, the right shoulder and the waist. The vector from the left shoulder to the right shoulder is taken as the trunk x-axis direction vector; the vector from the waist to the midpoint of the left and right shoulders is the trunk z-axis direction vector; the outer product (cross product) of the trunk z-axis and trunk x-axis direction vectors is the trunk y-axis direction vector, thereby establishing the coordinate system of the trunk position.
The coordinate system of the arm position in the level 1 posture base is established as follows: taking the vector from the elbow to the shoulder as the direction vector of the z axis of the arm part; the vector from the elbow to the wrist is an auxiliary vector U; taking the outer product of the arm z-axis direction vector and the auxiliary vector U as an arm x-axis direction vector; the outer product of the arm z and the arm x-axis direction vector is used as the arm y-axis direction vector.
The coordinate system of the thigh position in the level 1 posture base is established as follows: a knee-to-hip vector is taken as a thigh z-axis direction vector; taking a knee-to-ankle vector as an auxiliary vector V; taking the outer product of the auxiliary vector V and the thigh z-axis direction vector as the thigh x-axis direction vector; the outer product of the thigh z-axis and the x-axis direction vectors is the thigh y-axis direction vector.
The coordinate system establishment of the head position in the 3-level posture base comprises the steps of selecting key points and establishing the coordinate system. Specifically, the head coordinate system needs to be established by considering whether the BVH data to be converted has an eye key point (some data are simplified and do not contain the eye key point). If the eye key points exist, taking the neck-to-head vector direction as the direction vector of the head z-axis; taking the vector direction from the neck to the midpoint of the eyes as an auxiliary vector M; taking the outer product of the auxiliary vector M and the head z-axis direction vector as the head x-axis direction vector; the outer product of the head z and the head x-axis direction vector is used as the head y-axis direction vector.
If the coordinate system of the head position is established without the key point of eyes, the default head does not rotate left and right (namely, rotate around the z axis of the head), and the vector direction from the neck to the head is taken as the direction vector of the z axis of the head; taking the global y-axis direction vector as a head y-axis direction vector; the outer product of the head y-axis and head z-axis direction vectors is the head x-axis direction vector.
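The cross-product constructions of S14 can be written out directly. Below is a minimal sketch of the trunk frame, assuming the left shoulder, right shoulder and waist key points are given as 3-D coordinates (all function and variable names are illustrative):

```python
import math

def sub(a, b):   return [a[i] - b[i] for i in range(3)]
def cross(a, b): return [a[1]*b[2] - a[2]*b[1],
                         a[2]*b[0] - a[0]*b[2],
                         a[0]*b[1] - a[1]*b[0]]
def norm(v):
    n = math.sqrt(sum(c*c for c in v))
    return [c / n for c in v]

def trunk_frame(left_shoulder, right_shoulder, waist):
    """Trunk coordinate system per the description:
    x: left shoulder -> right shoulder,
    z: waist -> midpoint of the two shoulders,
    y: outer product of z and x."""
    x = norm(sub(right_shoulder, left_shoulder))
    mid = [(left_shoulder[i] + right_shoulder[i]) / 2 for i in range(3)]
    z = norm(sub(mid, waist))
    y = norm(cross(z, x))
    return x, y, z
```

For an upright T-pose with the shoulders level and the waist directly below their midpoint, the trunk frame reduces to the global axes; the arm, thigh and head frames of the description follow the same pattern with different key points and auxiliary vectors.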
Step 2 is to acquire instantaneous posture capture data. Specifically, the instantaneous posture of the human body is captured with a motion capture device to obtain instantaneous posture capture data; further, the acquired capture data is subjected to format conversion so that it is in the BVH format.
Step 3 simplifies the instantaneous posture capture data. Specifically, the captured data is divided and classified according to the defined posture basic units; the capture data corresponding to the posture basic units is retained, that is, the data of the head, trunk, left and right upper arms, left and right forearms, left and right palms, left and right thighs, left and right calves and left and right ankles is extracted from the instantaneous posture capture data and the remaining data is deleted, thereby simplifying the capture data and obtaining the simplified data.
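Step 3 amounts to filtering the captured joints down to the fourteen posture basic units. A minimal sketch, assuming the capture data has already been parsed into a mapping from joint name to channel data (the joint names are assumptions; actual BVH files name joints according to their own skeleton definition):

```python
# Joints retained by the simplification of step 3; all other captured
# joints are dropped. These names are illustrative assumptions.
KEPT_JOINTS = {
    "head", "torso",
    "left_upper_arm", "right_upper_arm",
    "left_forearm", "right_forearm",
    "left_palm", "right_palm",
    "left_thigh", "right_thigh",
    "left_calf", "right_calf",
    "left_ankle", "right_ankle",
}

def simplify(capture_data):
    """Keep only the channels of the posture basic units."""
    return {joint: channels
            for joint, channels in capture_data.items()
            if joint in KEPT_JOINTS}
```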
Step 4 compares the simplified data with the initial posture data to obtain the difference data between them. Specifically, step 4 compares the rotation of each posture basic unit between the instantaneous posture of the human body in step 2 and the initial posture in step 1, and obtains the rotation angle.
In a preferred embodiment of the present invention, the rotation angles of the posture basic units between the instantaneous posture of step 2 and the initial posture of step 1 are computed in order from top to bottom and from near to far. Specifically, step 4 includes:
S41, the trunk is rotated from the current posture back to the initial posture, and the rotation angle of the trunk about each axis (the X, Y and Z axes) is recorded.
S42, the key points of the left and right elbows, left and right hands and head of each posture basic unit contained in the upper body are rotated around the z, y and x axes of their local coordinate systems, so that after the rotation the left and right elbows, left and right hands and head of the current posture coincide with the initial posture; the rotation angle of each posture basic unit is calculated.
S43, the postures of the left and right arms and the left and right legs are restored in turn, rotating each current posture around the z, y and x axes of its local coordinate system back to the initial posture, and the rotation angles about these axes are recorded.
S44, the left and right forearms and the left and right calves in the current posture are rotated back to the initial posture, and the rotation angle of each about the x-axis of its local coordinate system is calculated.
After the rotation transformations of steps S41-S44, the current posture coincides with the initial posture. The acquired rotation angles are collated into rotation data, giving the rotation angle of each posture basic unit between the instantaneous posture of the human body in step 2 and the initial posture in step 1; these rotation angles are recorded as the difference data between the current posture and the initial posture, i.e., the difference between the simplified data of step 3 and the initial posture data.
The coordinate system of the human body posture model formed by the posture basic units is defined as the global coordinate system, and a coordinate system whose position relative to its posture basic unit never changes is defined as a local coordinate system. In the present invention, the specific steps for calculating the rotation angles via the local coordinate systems in steps S41-S43 are as follows:
Step A, calculating the local coordinate system of the corresponding posture basic unit, where, viewed along the direction of the rotation axis, clockwise rotation is positive and counterclockwise rotation is negative;
b, rotating the coordinate system of each attitude basic unit around the z axis of the coordinate system, enabling the y axis direction vector of the coordinate system of each attitude basic unit to rotate into a yoz plane under the global coordinate system, and recording the rotation angle at the moment as the rotation angle Rz of the y axis;
C, rotating the coordinate system of each attitude basic unit around the y axis of the coordinate system, enabling the z axis direction vector of the coordinate system of each attitude basic unit to rotate into a yoz plane under the global coordinate system, and recording the rotation angle at the moment as a rotation angle Ry of the z axis;
And D, rotating the coordinate system of each attitude basic unit around the x axis of the coordinate system, enabling the x axis direction vector of the coordinate system of each attitude basic unit to rotate into a zox plane under the global coordinate system, and recording the rotation angle at the moment as the rotation angle Rx of the x axis.
At this point, the rotation angles Rx, Ry and Rz of each posture basic unit around the x, y and z axes of its own coordinate system are obtained.
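Steps A-D amount to decomposing the rotation that carries the global axes onto the local frame into intrinsic z-y-x rotation angles. One common closed-form realization (a sketch, not the patent's literal procedure) reads the angles off the rotation matrix whose columns are the unit local axes expressed in global coordinates:

```python
import math

def zyx_angles(R):
    """Decompose R = Rz(rz) @ Ry(ry) @ Rx(rx) (intrinsic z-y-x order)
    into the three angles, returned in degrees as [rz, ry, rx].
    R is a 3x3 nested list whose columns are the local x, y, z axes
    in global coordinates."""
    rz = math.atan2(R[1][0], R[0][0])
    ry = math.atan2(-R[2][0], math.hypot(R[0][0], R[1][0]))
    rx = math.atan2(R[2][1], R[2][2])
    return [math.degrees(a) for a in (rz, ry, rx)]
```

The closed form is degenerate when ry is exactly +/-90 degrees (gimbal lock); the iterative bring-into-plane procedure of steps B-D faces the same ambiguity there.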
In step S44, the left and right forearms and the left and right calves in the 2-level posture base can only rotate about the x-axes of their own coordinate systems: the forearm can only rotate counterclockwise about its x-axis, and the calf can only rotate clockwise about its x-axis. In step S44, the rotation angle of each forearm is therefore the angle between the elbow-to-shoulder vector and the elbow-to-wrist vector, with negative direction; the rotation angle of each calf is the angle between the knee-to-hip vector and the knee-to-ankle vector, with positive direction.
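For the 2-level posture bases this reduces to a single angle between two bone vectors. A sketch of the forearm case, with illustrative point names and the negative sign convention stated above:

```python
import math

def forearm_angle(shoulder, elbow, wrist):
    """Rotation angle of the forearm in S44: the angle between the
    elbow->shoulder and elbow->wrist vectors, taken negative per the
    convention that the forearm flexes counterclockwise."""
    u = [shoulder[i] - elbow[i] for i in range(3)]
    v = [wrist[i] - elbow[i] for i in range(3)]
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    c = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp rounding error
    return -math.degrees(math.acos(c))
```

A fully extended arm (shoulder, elbow and wrist collinear) gives -180 degrees; a right-angle bend gives -90 degrees. The calf case is identical with knee/hip/ankle points and a positive sign.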
Step 5 performs posture base data processing on the difference data of step 4, describing each posture basic unit in the difference data by its quantized rotation angle to obtain the posture base data. Step 5 is specifically as follows:
S51, taking the inverse of the difference data calculated in step 4;
S52, judging the sign of the difference data corresponding to each posture basic unit to obtain the coded prefix: non-negative values give 1 and negative values give 0;
S53, dividing the data obtained in step S52 by a preset precision value and converting the result into binary coded data, which serves as the suffix of the posture base data;
S54, concatenating the prefix and suffix corresponding to each posture basic unit to form the final posture base data.
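Steps S51-S54 can be sketched as a sign-prefix plus quantized-magnitude encoding. The bit width of the suffix and the exact meaning of the inversion in S51 are not fixed by the description, so the values below (5-degree precision, 6-bit suffix) are illustrative assumptions:

```python
def encode_angle(angle_deg, precision=5, bits=6):
    """Posture base encoding sketch for one posture basic unit:
    prefix bit: 1 for non-negative, 0 for negative (S52);
    suffix: |angle| / precision, rounded and written in binary (S53);
    prefix and suffix concatenated (S54). The 6-bit width is an
    assumption (covers 0..315 degrees at 5-degree precision)."""
    prefix = "1" if angle_deg >= 0 else "0"
    level = round(abs(angle_deg) / precision)
    suffix = format(level, f"0{bits}b")
    return prefix + suffix

print(encode_angle(90))    # 90/5 = 18 -> "1" + "010010"
print(encode_angle(-45))   # 45/5 = 9  -> "0" + "001001"
```

At 5-degree precision each posture basic unit's angle fits in 7 bits per axis, which is the data compression the "beneficial effects" section refers to.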
In step 5, when the difference data undergoes the posture base data processing, the difference data is not distorted for quantization precision of not more than 10 degrees.
Step 6 outputs the posture base data of step 5 to obtain the posture base data table.
Furthermore, the invention also provides a human body posture data processing system comprising an acquisition unit, a processing unit and an output unit. The acquisition unit is used for acquiring instantaneous posture capture data during human body movement; the processing unit is used for simplifying the instantaneous posture capture data, comparing it, and then performing posture base data processing to obtain posture base data; the output unit is used for outputting the posture base data as a posture base data table.
For a human body posture, it is assumed that the shape and the size of each structural part of the human body are not changed under the action of external force and internal force, and the human body posture can be determined by only determining the position and the posture of each joint point of the human body and combining the positions and the postures.
The human body posture is formed by the rotation of the parts of the limbs around the corresponding joint points; the parts that reflect the posture are the head, the trunk and the four limbs. We therefore first select the corresponding joint points to form a human skeleton model (as shown in fig. 2) and normalize its size so that posture data from different human bodies can be processed consistently. The human body posture is decomposed into a head posture, a trunk posture, an upper limb posture and a lower limb posture, where the head posture consists of the head and neck; the trunk posture extends from the chest to the hip bone; the upper limb posture extends from the shoulder to the wrist; and the lower limb posture extends from the hip to the ankle. The postures of the head, trunk, left and right upper arms, left and right forearms, left and right palms, left and right thighs, left and right calves and left and right ankles are defined as posture basic units. The joint points are then divided into levels 0, 1, 2 and 3 by connection order: the trunk posture is the 0-level posture base and the corresponding vertebra joint is the 0-level joint; the postures of the left and right upper arms and left and right thighs are 1-level posture bases and the corresponding shoulder and hip joints are 1-level joints; the postures of the left and right forearms and left and right calves are 2-level posture bases and the corresponding elbow and knee joints are 2-level joints; the postures of the head, left and right palms and left and right ankles are 3-level posture bases and the corresponding cervical, wrist and ankle joints are 3-level joints.
A 1-level joint point rotates around the local coordinate axes of the 0-level joint point; a 2-level joint point rotates around the local coordinate axes of the 1-level joint point; a 3-level joint point rotates around the local coordinate axes of the 2-level joint point. When a lower-level joint rotates around its superior joint, the local coordinate systems of both joints rotate with it. Establishing the posture of each posture basic unit requires an explicit rotation axis, rotation direction and rotation angle. When all posture basic units are combined into a complete posture, every unit can move and some units are superior joint points of others, so the linkage is complex. For convenience of rotation and subsequent processing, the horizontal rightward direction of the human body is taken as the X-axis direction (in the coronal plane), the horizontal forward direction as the Y-axis direction (in the sagittal plane), and the vertical upward direction as the Z-axis direction (the intersection line of the coronal and sagittal planes); a local coordinate system is thus established at each joint and used to represent the posture of the joint point and the subsequent rotation axes connected to it.
Referring to fig. 3, the head moves by rotating about the local coordinate system of the cervical joint and has 3 degrees of freedom: it can rotate about the X, Y and Z axes of its superior joint point, the cervical joint, corresponding to nodding forward and backward, tilting left and right, and rotating left and right. The trunk moves by rotating about the local coordinate system of the spine joint and likewise has 3 degrees of freedom about the X, Y and Z axes of its superior joint point, corresponding to bending forward and backward, leaning left and right, and rotating left and right. The upper arm/thigh moves by rotating about the local coordinate system of the shoulder/hip joint and has 3 degrees of freedom about the X, Y and Z axes of its superior joint point, corresponding to flexion and extension, abduction and adduction, and internal and external rotation of the arm/leg. The forearm/calf has only one degree of freedom (the X-axis), corresponding to its flexion and extension.
To ensure uniformity and accuracy of the results, we specify the order of rotation to be first around the X-axis, then around the Y-axis, and finally around the Z-axis.
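Under the assumption that the X→Y→Z order refers to extrinsic (fixed) axes — the text does not say intrinsic or extrinsic — the combined rotation of one joint can be sketched as the matrix product Rz·Ry·Rx:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def joint_rotation(ax, ay, az):
    """First rotate about X, then Y, then Z (extrinsic axes assumed)."""
    return rot_z(az) @ rot_y(ay) @ rot_x(ax)
```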
The posture of each part is determined by the rotation angle of the corresponding joint point about the local coordinate axes of its superior joint point, and the overall posture is a combination of individual posture base units. For example, the posture of the arm is a combination of the postures of three posture base units: the upper arm, the forearm and the palm rotate in turn about the local coordinate axes of their superior joints.
The posture base units are arranged in order of posture level from low to high; units of the same level are arranged from top to bottom and from right to left. The result is a posture sequence: the combination of the postures of all posture base units of one complete posture.
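A hypothetical sketch of this ordering rule; the rank tables for level, vertical position and side are illustrative choices of ours, not values fixed by the text:

```python
# Ordering rule: ascending posture level; within a level, top-to-bottom,
# then right before left. Rank values below are illustrative.
LEVEL = {"torso": 0,
         "right_upper_arm": 1, "left_upper_arm": 1,
         "right_thigh": 1, "left_thigh": 1,
         "right_forearm": 2, "left_forearm": 2,
         "right_calf": 2, "left_calf": 2,
         "head": 3, "right_palm": 3, "left_palm": 3,
         "right_ankle": 3, "left_ankle": 3}

HEIGHT = {"head": 0, "torso": 1,
          "right_upper_arm": 2, "left_upper_arm": 2,
          "right_forearm": 3, "left_forearm": 3,
          "right_palm": 4, "left_palm": 4,
          "right_thigh": 5, "left_thigh": 5,
          "right_calf": 6, "left_calf": 6,
          "right_ankle": 7, "left_ankle": 7}

def side(name):
    """Right side sorts before left side."""
    return 0 if name.startswith("right") else 1

def posture_sequence(units):
    return sorted(units, key=lambda u: (LEVEL[u], HEIGHT[u], side(u)))
```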
The posture in which the human body stands upright with the hands hanging naturally is defined as the initial posture; at this moment, the local coordinate system of each joint of the human body coincides with the geodetic coordinate system (as shown in fig. 4). The posture data of the posture sequence in the initial posture are then determined, giving the initial posture data.
The instantaneous posture of the human body is then captured with a motion capture device to obtain instantaneous posture capture data. Here the data are in BVH format, but the invention is not limited to BVH; the instantaneous posture capture data may also be in ASF, AMC, HTR or another format.
Instantaneous posture capture data in BVH format contain data for many unused joints, such as the eyes, fingers and toes, and are therefore simplified before use: the data for the head, the trunk, the left and right upper arms, the left and right forearms, the left and right palms, the left and right thighs, the left and right calves and the left and right ankles are extracted from the instantaneous posture capture data, giving the simplified data.
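The simplification step amounts to filtering each captured frame down to the 14 posture base units. A minimal sketch — the joint names are illustrative, since real BVH skeletons use the capture software's own names:

```python
# The 14 posture base units retained after simplification; all other BVH
# joints (eyes, fingers, toes, ...) are dropped.
KEPT_JOINTS = {
    "head", "torso",
    "left_upper_arm", "right_upper_arm", "left_forearm", "right_forearm",
    "left_palm", "right_palm", "left_thigh", "right_thigh",
    "left_calf", "right_calf", "left_ankle", "right_ankle",
}

def simplify(capture_frame):
    """Keep only the rotation channels of the 14 posture base units."""
    return {j: angles for j, angles in capture_frame.items() if j in KEPT_JOINTS}
```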
The simplified data are compared with the initial posture data to obtain the difference data between them, i.e. the rotation angle of each posture base unit of the instantaneous human body posture relative to the initial posture.
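Assuming each unit's pose is stored as a tuple of per-axis angles, the difference data reduce to a per-unit, per-axis subtraction:

```python
def difference_data(simplified, initial):
    """Per-unit rotation of the instantaneous posture relative to the initial posture."""
    return {j: tuple(c - i for c, i in zip(simplified[j], initial[j]))
            for j in simplified}
```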
The difference data are then subjected to posture base data processing: each posture base unit in the difference data is encoded in the form 0/1_0000/1111, giving the posture base data. Here 0/1 is a prefix consisting of a 1-bit binary number that indicates the direction of rotation about the rotation axis: 0 means clockwise rotation and 1 means counterclockwise rotation. 0000/1111 is a suffix consisting of a 4-bit binary number with a value range of 0000-1111; it describes the angle of rotation about the coordinate axis, with each unit representing 10 degrees (for example, 0011 represents 30 degrees). The "_" is not used in practice; it merely separates the prefix and the suffix for readability. The following is a comparison table of the posture base data processing.
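A sketch of the 0/1_0000/1111 encoding. The text fixes only that 0 means clockwise and 1 means counterclockwise; mapping the sign of the angle onto these directions (negative = clockwise) is our assumption:

```python
def encode_angle(angle_deg, step=10):
    """Encode a signed rotation angle as a 1-bit sign prefix + 4-bit magnitude.

    Prefix "0" = clockwise (negative angle, by our assumption),
    "1" = counterclockwise. Each suffix unit represents `step` degrees
    (10 by default), so the magnitude must fit in 0..15 units.
    """
    units = round(abs(angle_deg) / step)
    if not 0 <= units <= 0b1111:
        raise ValueError("angle out of range for the 4-bit suffix")
    prefix = "1" if angle_deg >= 0 else "0"
    return prefix + format(units, "04b")

def decode_angle(code, step=10):
    """Inverse transformation: recover the signed angle from a 5-bit code."""
    sign = 1 if code[0] == "1" else -1
    return sign * int(code[1:], 2) * step
```

For example, `encode_angle(30)` yields "10011", matching the 0011 = 30 degrees example in the text.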
After all the posture base units have been processed in turn, they are combined in the order of the posture sequence and output as a posture base data table. The following table is an example; such a table can intuitively reflect a posture without restoring the posture in software.
Posture base data table at a certain moment (quantization precision: 10 degrees)
In the posture base data processing of the difference data, the suffix 0000/1111 is binary and each unit represents 10 degrees. Experiments were conducted with one unit representing 1 degree, 2 degrees, 5 degrees and 10 degrees: after the posture base data were obtained, the inverse transformation was applied to restore them to a posture, and the restored posture data were compared with the initial posture data by computing the mean error and variance of the corresponding joint coordinates. At all four quantization precisions (1, 2, 5 and 10 degrees), the computed mean errors and variances differ by less than 10%, showing that the difference data are not distorted as long as the quantization precision does not exceed 10 degrees.
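The distortion check can be approximated by quantizing a set of angles at each candidate precision and computing the mean absolute error and its variance. This is an illustrative sketch operating on angle values rather than on restored joint coordinates:

```python
def quantize(angle, step):
    """Round an angle to the nearest multiple of the quantization step."""
    return round(angle / step) * step

def quantization_error_stats(angles, step):
    """Mean absolute error and its variance after quantizing at a given step."""
    errs = [abs(a - quantize(a, step)) for a in angles]
    mean = sum(errs) / len(errs)
    var = sum((e - mean) ** 2 for e in errs) / len(errs)
    return mean, var
```

Since rounding error is bounded by half the step, the mean error at a 10-degree precision cannot exceed 5 degrees, consistent with the claim that the data remain undistorted within this range.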
In summary, by processing the instantaneous posture data, the invention converts posture data into posture base data with a uniform format, which facilitates further processing of large amounts of data; the quantization precision of the posture base data has little influence on the posture, greatly compresses the data volume, and resists distortion; and a posture can be read intuitively from the posture base data table without software restoration, which is simple and clear.
The above embodiments are only for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made thereto without departing from the spirit and scope of the technical solution of the present invention.

Claims (7)

1. A human body posture data processing method is characterized in that: the method specifically comprises the following steps:
Step 1: defining an initial gesture and determining initial gesture data of a gesture sequence according to the initial gesture;
step 2: capturing the instantaneous posture of the human body by using a motion capture device, and acquiring instantaneous posture capture data;
step 3: simplifying the instantaneous gesture capturing data in the step 2 to obtain simplified data;
step 4: comparing the simplified data in the step 3 with the initial posture data in the step 1 to obtain difference data between the simplified data and the initial posture data;
step 5: carrying out posture base data processing on the difference data in the step 4 to obtain posture base data;
step 6: outputting the gesture base data to obtain a gesture base data table;
The posture sequence in step 1 is obtained as follows: the human body posture is divided, and the postures of the head, the trunk, the left and right upper arms, the left and right forearms, the left and right palms, the left and right thighs, the left and right calves and the left and right ankles are defined as posture base units; the posture base units are classified by level according to their connection order and arranged in order of posture level from low to high, units of the same level being arranged from top to bottom and from right to left, so as to obtain the posture sequence;
The posture base units are classified according to the connection order as follows: the trunk posture is a level-0 posture base, and the corresponding joint is a level-0 joint; the postures of the left and right upper arms and the left and right thighs are level-1 posture bases, and the corresponding joints are level-1 joints; the postures of the left and right forearms and the left and right calves are level-2 posture bases, and the corresponding joints are level-2 joints; the postures of the head, the left and right palms and the left and right ankles are level-3 posture bases, and the corresponding joints are level-3 joints;
The posture base data processing in step 5 describes each posture base unit in the difference data using a quantized rotation angle.
2. The human body posture data processing method of claim 1, characterized in that: the format of the instantaneous posture capture data in step 2 is the BVH format.
3. The human body posture data processing method of claim 1, characterized in that: the simplifying of the instantaneous gesture capturing data in the step 3 specifically includes extracting data of a head, a trunk, left and right upper arms, left and right forearms, left and right palms, left and right thighs, left and right calves, and left and right ankles in the instantaneous gesture capturing data, so as to obtain the simplified data.
4. The human body posture data processing method of claim 1, characterized in that: the difference data in step 4 are the difference between the simplified data in step 3 and the initial posture data, i.e. the rotation angle of each posture base unit of the instantaneous human body posture in step 2 relative to the initial posture in step 1.
5. The human body posture data processing method of claim 1, characterized in that: and (5) when the difference data is subjected to gesture-based data processing in the step (5), the difference data is not distorted within the quantization precision range of 0-10 degrees.
6. The human body posture data processing method of claim 1, characterized in that: the posture base data table in step 6 can intuitively reflect a posture without restoring the posture in software.
7. A human body posture data processing system, characterized in that: the system comprises an acquisition unit, a processing unit and an output unit, wherein the acquisition unit is used for acquiring instantaneous posture capture data during human body movement; the processing unit is used for simplifying the instantaneous posture capture data, performing comparison processing, and then performing posture base data processing to obtain posture base data; and the output unit is used for outputting the posture base data as a posture base data table;
The processing unit is also used for defining an initial gesture and determining initial gesture data of a gesture sequence according to the initial gesture;
The simplification and subsequent comparison processing of the instantaneous posture capture data are specifically performed as follows: the instantaneous posture capture data are simplified to obtain simplified data; the simplified data are compared with the initial posture data to obtain difference data between them; and posture base data processing is performed on the difference data to obtain the posture base data;
The posture sequence is specifically obtained as follows: the human body posture is divided, and the postures of the head, the trunk, the left and right upper arms, the left and right forearms, the left and right palms, the left and right thighs, the left and right calves and the left and right ankles are defined as posture base units; the posture base units are classified by level according to their connection order and arranged in order of posture level from low to high, units of the same level being arranged from top to bottom and from right to left, so as to obtain the posture sequence;
The posture base units are classified according to the connection order as follows: the trunk posture is a level-0 posture base, and the corresponding joint is a level-0 joint; the postures of the left and right upper arms and the left and right thighs are level-1 posture bases, and the corresponding joints are level-1 joints; the postures of the left and right forearms and the left and right calves are level-2 posture bases, and the corresponding joints are level-2 joints; the postures of the head, the left and right palms and the left and right ankles are level-3 posture bases, and the corresponding joints are level-3 joints;
The posture base data processing describes each posture base unit in the difference data using a quantized rotation angle.
CN202210533572.0A 2022-05-17 2022-05-17 Human body posture data processing method and system Active CN114974506B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210533572.0A CN114974506B (en) 2022-05-17 2022-05-17 Human body posture data processing method and system

Publications (2)

Publication Number Publication Date
CN114974506A CN114974506A (en) 2022-08-30
CN114974506B true CN114974506B (en) 2024-05-03

Family

ID=82984176

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210533572.0A Active CN114974506B (en) 2022-05-17 2022-05-17 Human body posture data processing method and system

Country Status (1)

Country Link
CN (1) CN114974506B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117503120B (en) * 2023-12-18 2024-04-16 北京铸正机器人有限公司 Human body posture estimation method and system

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6152890A (en) * 1997-10-30 2000-11-28 Hauptverband Der Gewerblichen Berufsgenossenschaften E.V. Method and device for the recording, presentation and automatic classification of biomechanical load variables measured on a freely moving test person during a work shift
JP2004150900A (en) * 2002-10-29 2004-05-27 Japan Aviation Electronics Industry Ltd Attitude angle detector, and method of acquiring attaching angle correction level
KR20130110441A (en) * 2012-03-29 2013-10-10 한국과학기술원 Body gesture recognition method and apparatus
CN103761996A (en) * 2013-10-18 2014-04-30 中广核检测技术有限公司 Nondestructive testing robot intelligent testing method based on virtual reality technology
CN109948472A (en) * 2019-03-04 2019-06-28 南京邮电大学 A kind of non-intrusion type human thermal comfort detection method and system based on Attitude estimation
CN110516114A (en) * 2019-08-28 2019-11-29 北京理工大学 Motion database automatic marking method and terminal based on posture base
CN112990089A (en) * 2021-04-08 2021-06-18 重庆大学 Method for judging human motion posture
CN113143257A (en) * 2021-02-09 2021-07-23 国体智慧体育技术创新中心(北京)有限公司 Generalized application system and method based on individual movement behavior hierarchical model
CN113255487A (en) * 2021-05-13 2021-08-13 中国民航大学 Three-dimensional real-time human body posture recognition method
CN114373223A (en) * 2021-12-27 2022-04-19 华南理工大学 Method for constructing 3D attitude estimation model combining static and dynamic joint relation

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5945419B2 (en) * 2012-01-10 2016-07-05 本田技研工業株式会社 Leg motion trajectory generator for legged mobile robots.
US10902618B2 (en) * 2019-06-14 2021-01-26 Electronic Arts Inc. Universal body movement translation and character rendering system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Kinect-based human pose estimation optimization and animation generation; Zhao Wei et al.; Journal of Computer Applications; 2021-12-24; Vol. 42, No. 09, pp. 2830-2837 *
Automated assessment of functional movement screening based on human pose estimation; Xu Qianfeng et al.; Proceedings of the 34th China Simulation Conference and the 21st Asia Simulation Conference; 2022-12-09; pp. 1018-1025 *
Development of an electromechanical motion capture system based on ergonomics; Guang Zhenyu; China Master's Theses Full-text Database (Engineering Science and Technology II); 2019-07-15; No. 07; pp. C030-105 *
Research on human posture recognition and capture technology based on body area networks; Yu Jingchao; China Master's Theses Full-text Database (Medicine and Health Sciences); 2019-12-15; No. 12; pp. E054-15 *

Also Published As

Publication number Publication date
CN114974506A (en) 2022-08-30

Similar Documents

Publication Publication Date Title
Molet et al. A real time anatomical converter for human motion capture
CN107485844B (en) Limb rehabilitation training method and system and embedded equipment
CN111460875A (en) Image processing method and apparatus, image device, and storage medium
Cerveri et al. Finger kinematic modeling and real-time hand motion estimation
US9600934B2 (en) Augmented-reality range-of-motion therapy system and method of operation thereof
CN110349081B (en) Image generation method and device, storage medium and electronic equipment
US20230021860A1 (en) Systems and methods for approximating musculoskeletal dynamics
WO2022227664A1 (en) Robot posture control method, robot, storage medium and computer program
CN114974506B (en) Human body posture data processing method and system
CN112435731B (en) Method for judging whether real-time gesture meets preset rules
Wei et al. Real-time 3D arm motion tracking using the 6-axis IMU sensor of a smartwatch
Wei et al. Real-time limb motion tracking with a single imu sensor for physical therapy exercises
CN115240247A (en) Recognition method and system for detecting motion and posture
CN115346670A (en) Parkinson's disease rating method based on posture recognition, electronic device and medium
CN114782497A (en) Motion function analysis method and electronic device
CN111369626B (en) Mark point-free upper limb movement analysis method and system based on deep learning
WO2021149629A1 (en) Posture diagnosis system, posture diagnosis method, and data set for posture diagnosis
WO2019152566A1 (en) Systems and methods for subject specific kinematic mapping
CN114712835A (en) Supplementary training system based on two mesh human position appearance discernments
CN210302240U (en) Augmented reality AR wrist joint rehabilitation evaluation and training system
Lin et al. A vision-based compensation detection approach during robotic stroke rehabilitation therapy
Bastico et al. Continuous Person Identification and Tracking in Healthcare by Integrating Accelerometer Data and Deep Learning Filled 3D Skeletons
Cha et al. Mobile. Egocentric human body motion reconstruction using only eyeglasses-mounted cameras and a few body-worn inertial sensors
KR20210076559A (en) Apparatus, method and computer program for generating training data of human model
Dao et al. Real-time rehabilitation system of systems for monitoring the biomechanical feedbacks of the musculoskeletal system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant