CN115310484A - Posture expansion state semantic processing method and system - Google Patents


Info

Publication number
CN115310484A
Authority
CN
China
Prior art keywords
state
posture
human body
attitude
rotation
Prior art date
Legal status
Granted
Application number
CN202210939186.1A
Other languages
Chinese (zh)
Other versions
CN115310484B (en)
Inventor
靳虎
伍光伟
钟代笛
仲元红
葛亮
周庆
黄智勇
庄洁
Current Assignee
Chongqing University
Original Assignee
Chongqing University
Priority date
Filing date
Publication date
Application filed by Chongqing University
Priority to CN202210939186.1A
Publication of CN115310484A
Application granted
Publication of CN115310484B
Legal status: Active


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Dentistry (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a posture expansion state semantic processing method and system. In the method, human body posture data is segmented into posture bases for representation, and an extended posture base data set is added on top of a basic posture base data set to describe the human posture accurately, so that human posture data can be processed conveniently and visually.

Description

Posture expansion state semantic processing method and system
Technical Field
The invention relates to a posture expansion state semantic processing method and system, and belongs to the technical field of data processing.
Background
With the development of technology, human body motion data can now be used in fields such as medical health and exercise rehabilitation to support related review and analysis work. By processing and analyzing human motion data, physical conditions that would otherwise need to be observed and analyzed by eye can be understood more conveniently and quickly, so that medical measures and correction suggestions can be given promptly.
With the development of micro-sensors, human motion analysis based on MEMS capture has also been studied extensively. However, whether video-based or MEMS-based, the captured human motion data focuses on a precise description of human movement, represented by the rotation and translation of each key point (bone) of the body relative to some root key point (bone) or its connected parent key point (bone).
Motion data in this format is not optimal for fields such as medical health and exercise rehabilitation. The rotation and translation quantities it describes are all high-precision angle values, which are suitable for accurately describing the spatial position of the human skeleton but contain no semantic information and are inconvenient to search. Moreover, scenes with different requirements all describe postures at the same precision, which is excessive in some scenarios.
A data processing method based on posture bases uses the semantic information of each limb to facilitate posture judgment and query and to reduce the amount of computation. However, compared with other existing data formats (such as BVH and C3D), such a posture is only a relatively coarse semantic description: the precise posture cannot be restored from it, and the posture restoration requirements of some scenes cannot be met.
Existing data description methods for motion capture typically use ASF and AMC data to describe postures. The ASF data describes skeleton information and defines the initial posture of the motion; the AMC data describes skeleton movement, expressing the rotation and translation of the root bone through Euler angles and translation quantities, with the Euler angles describing the change of each bone relative to the initial posture. The BVH format defines, in the file header, the key-point names describing the posture and the positions of the key points relative to their parent key points, and then describes the rotation of each child key point relative to its parent using Euler angles and translations.
At present, motion data is stored directly in formats such as ASF/AMC, BVH, and C3D. Postures cannot be distinguished directly from such data: these formats focus on restoring the precision of the sampled motion but contain no semantic information. Judging and identifying human actions therefore requires restoring the whole data into a human skeleton and then performing manual judgment, which is inefficient and labor-intensive. In scenarios where massive data must be processed, this consumes a great deal of time or requires a dedicated processing program, and is thus not convenient enough; the posture data also cannot be searched or queried; and describing postures at a single precision adds extra data processing in application scenarios that could be represented in a simplified way.
In view of the above, it is necessary to improve existing posture semantic processing methods to solve these problems.
Disclosure of Invention
The object of the invention is to provide a posture expansion state semantic processing method and system. By adding an extended state semantic description, the method can accurately describe and restore the posture while representing the human posture, thereby facilitating the visualization of human posture data and realizing progressive data visualization.
To achieve the above object, the invention provides a posture expansion state semantic processing method, comprising:
S1, acquiring human body posture data and segmenting it into posture bases according to different human body parts;
S2, assigning rotation-axis state semantics and spatial-position state semantics based on the spatial position state of each posture base relative to the neutral position, and obtaining a basic posture base data set;
S3, assigning extended state semantics based on the spatial position state of each posture base relative to the coronal axis and/or the vertical axis, and obtaining an extended posture base data set;
S4, representing the human body posture with the basic posture base data set to obtain a preliminary human body posture model;
S5, refining the preliminary human body posture model with the extended posture base data set to obtain a high-precision human body posture model.
As a further improvement of the invention, S1 specifically comprises:
S11, acquiring the human body posture data with a motion acquisition device;
S12, dividing the human body posture data according to human body parts and segmenting the posture bases.
As a further improvement of the invention, in S11 the motion acquisition device is a sensor module with an inertial measurement unit comprising a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer.
As a further improvement of the invention, in S2 the assignment of rotation-axis state semantics comprises assigning each posture base a coronal-axis rotation state, a vertical-axis rotation state, and a sagittal-axis rotation state relative to the neutral position; and the assignment of spatial-position state semantics comprises assigning a data value to the rotation state of each posture base to obtain the basic posture base data set.
As a further improvement of the invention, S3 specifically describes the motion angle of each posture base with a rotation value to obtain the extended posture base data set.
As a further improvement of the invention, PU_ANGLE_UNIT denotes the minimum angle unit used in the extended state description, and the final angle value of each posture base's rotation about its rotation axis = minimum angle unit × rotation value.
As a further improvement of the invention, the minimum angle unit can be selected according to the position of the posture base and/or the required restoration precision of the high-precision human body posture model.
As a further improvement of the invention, S4 specifically arranges the basic posture base data sets in sequence according to the spatial ordering of the posture bases to obtain the preliminary human body posture model.
As a further improvement of the invention, S5 specifically comprises:
S51, constructing the neutral position as the initial posture;
S52, restoring the angle values from the initial posture according to the spatial ordering of the posture bases;
S53, correcting the preliminary human body posture model according to the restored angle values.
To achieve the same purpose, the invention also provides a posture expansion state semantic processing system comprising a motion acquisition device, a processing unit, and an output unit. The motion acquisition device acquires the instantaneous posture of the human body during motion, and the processing unit executes the posture expansion state semantic processing method described above.
The beneficial effects of the invention are as follows: the posture expansion state semantic processing method segments human body posture data into posture bases for representation and adds an extended posture base data set on top of the basic posture base data set to describe the human posture accurately, so that human posture data can be processed conveniently and visually.
Drawings
FIG. 1 is a flow chart of the posture expansion state semantic processing method of the invention.
FIG. 2 is a diagram of the human skeleton division used in the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the structures and/or processing steps closely related to the aspects of the present invention are shown in the drawings, and other details not closely related to the present invention are omitted.
In addition, it should be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Referring to FIG. 1, the posture expansion state semantic processing method of the invention comprises:
S1, acquiring human body posture data and segmenting it into posture bases according to different human body parts;
S2, assigning rotation-axis state semantics and spatial-position state semantics based on the spatial position state of each posture base relative to the neutral position, and obtaining a basic posture base data set;
S3, assigning extended state semantics based on the spatial position state of each posture base relative to the coronal axis and/or the vertical axis, and obtaining an extended posture base data set;
S4, representing the human body posture with the basic posture base data set to obtain a preliminary human body posture model;
S5, refining the preliminary human body posture model with the extended posture base data set to obtain a high-precision human body posture model.
S1 to S5 are described in detail below.
S1 specifically comprises:
S11, acquiring the human body posture data with a motion acquisition device;
S12, dividing the human body posture data according to human body parts (as shown in FIG. 2) and segmenting the posture bases.
Further, the motion acquisition device is a sensor module with an inertial measurement unit comprising a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer.
In S2, the assignment of rotation-axis state semantics comprises assigning each posture base a coronal-axis rotation state, a vertical-axis rotation state, and a sagittal-axis rotation state relative to the neutral position; and the assignment of spatial-position state semantics comprises assigning a data value to the rotation state of each posture base to obtain the basic posture base data set.
Specifically, PU0_TRUNK is defined as the posture attribute value describing the trunk posture base. Based on the rotation-axis state semantics, the spatial position state of the trunk unit relative to the neutral position is represented by three characters: one indicates the state of rotation about the coronal axis (neutral / forward lean / backward lean), one indicates the state of rotation about the vertical axis (neutral / left rotation / right rotation), and one indicates the state of rotation about the sagittal axis (neutral / left flexion / right flexion). Based on the local posture spatial-state semantics, a data value is assigned to the spatial position state of the trunk posture relative to the neutral position, with each character taking a value in [0,5]: 0 indicates the trunk is upright; 1, inverted; 2, leaning forward; 3, leaning backward; 4, tilted about a single axis; 5, tilted about two axes.
PU1_LEFT_UPPERARM is defined as the posture attribute value describing the left upper arm posture base. Based on the rotation-axis state semantics, the spatial position state of the left upper arm relative to the neutral position is represented by three characters: one indicates rotation about the coronal axis (neutral / forward flexion / backward extension), one indicates rotation about the vertical axis (neutral / internal rotation / external rotation), and one indicates rotation about the sagittal axis (neutral / abduction / adduction). Based on the local posture spatial-state semantics, a data value of one character in [0,6] is assigned to the spatial position state of the left upper arm relative to the neutral position: 0, neutral; 1, forward flexion; 2, backward extension; 3, internal rotation; 4, external rotation; 5, abduction; 6, adduction.
PU1_RIGHT_UPPERARM is defined analogously as the posture attribute value describing the right upper arm posture base, with the same three-character rotation-axis states and the same data values in [0,6].
PU1_LEFT_THIGH and PU1_RIGHT_THIGH are defined as the posture attribute values describing the left and right thigh posture bases. Their rotation-axis states and data values follow the same pattern as the upper arms: three characters for rotation about the coronal axis (neutral / forward flexion / backward extension), the vertical axis (neutral / internal rotation / external rotation), and the sagittal axis (neutral / abduction / adduction), and one data-value character in [0,6]: 0, neutral; 1, forward flexion; 2, backward extension; 3, internal rotation; 4, external rotation; 5, abduction; 6, adduction.
PU2_LEFT_FOREARM and PU2_RIGHT_FOREARM are defined as the posture attribute values describing the left and right forearm posture bases. Based on the rotation-axis state semantics, the spatial position state of each forearm relative to the neutral position is represented by two characters: one indicates rotation about the coronal axis (neutral / flexion / hyperextension) and one indicates rotation about the vertical axis (neutral / internal rotation / external rotation). Based on the local posture spatial-state semantics, a data value of one character in [0,4] is assigned: 0, neutral; 1, forward flexion; 2, backward extension; 3, internal rotation; 4, external rotation.
PU2_LEFT_SHANK and PU2_RIGHT_SHANK are defined analogously as the posture attribute values describing the left and right lower leg posture bases, with the same two-character rotation-axis states and the same data values in [0,4].
PU3_HEAD is defined as the posture attribute value describing the head posture base. Based on the rotation-axis state semantics, the spatial position state of the head relative to the neutral position is represented by three characters: one indicates rotation about the coronal axis (neutral / head lowered / head raised), one indicates rotation about the vertical axis (neutral / left rotation / right rotation), and one indicates rotation about the sagittal axis (neutral / left tilt / right tilt). Based on the local posture spatial-state semantics, a data value of one character in [0,6] is assigned: 0, neutral; 1, head lowered; 2, head raised; 3, rotated left; 4, rotated right; 5, tilted left; 6, tilted right.
PU3_LEFT_HAND and PU3_RIGHT_HAND are defined as the posture attribute values describing the left and right hand posture bases. Based on the rotation-axis state semantics, the spatial position state of each hand relative to the neutral position is represented by one character indicating rotation about the coronal axis (neutral / forward flexion / backward extension). Based on the local posture spatial-state semantics, a data value of one character in [0,2] is assigned: 0, neutral; 1, forward flexion; 2, backward extension.
PU3_LEFT_FOOT and PU3_RIGHT_FOOT are defined as the posture attribute values describing the left and right foot posture bases. Based on the rotation-axis state semantics, the spatial position state of each foot relative to the neutral position is represented by two characters: the first indicates rotation about the coronal axis (neutral / forward flexion / backward extension) and the second indicates rotation about the vertical axis (neutral / left rotation / right rotation). Based on the local posture spatial-state semantics, a data value of one character in [0,4] is assigned: 0, neutral; 1, forward flexion; 2, backward extension; 3, left rotation; 4, right rotation.
The 3rd character in the first segment of each pose attribute value represents the spatial ordering of the pose base; characters ordered from small to large indicate increasing distance of the pose base from the center of the human body.
S3 specifically comprises: describing the motion angle of each pose base with a rotation value to obtain the extended pose-base data set. In the present invention, the extended pose-base data set is denoted PU_ANGLE_UNIT and serves as the minimum angle unit when the state-extension description is used; the final angle of rotation of each pose base about a rotation axis = minimum angle unit × data value. The value of the minimum angle unit can be selected according to the position of the pose base and/or the restoration precision required of the high-precision human body posture model. Furthermore, by setting different minimum angle units in the extended pose-base data set, a progressive representation of human posture data can be presented: when previewing human postures, a larger minimum angle unit gives a rough visual description; when a specific posture needs to be observed in detail, a smaller minimum angle unit describes it more finely; and when a posture must be reconstructed and visualized with high precision, a high-precision minimum angle unit is used.
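The relation "final angle = minimum angle unit × data value", and the progressive precision it enables, can be sketched as follows. The function names and unit values are hypothetical examples, not values taken from the invention:

```python
# Sketch of the extension rule stated above. A larger minimum angle unit
# quantizes the pose more coarsely (quick preview); a smaller unit
# restores it more precisely. Unit values here are illustrative only.
def quantize(angle_deg: float, min_angle_unit: float) -> int:
    """Data value stored for a pose base: nearest multiple of the unit."""
    return round(angle_deg / min_angle_unit)

def restore(data_value: int, min_angle_unit: float) -> float:
    """Final angle of rotation = minimum angle unit x data value."""
    return data_value * min_angle_unit

actual = 47.0  # degrees about one rotation axis
coarse = restore(quantize(actual, 15.0), 15.0)  # 45.0 - rough preview
fine = restore(quantize(actual, 1.0), 1.0)      # 47.0 - high precision
```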
S4 specifically comprises: arranging the basic pose-base data sets in sequence according to the spatial ordering of the pose bases to obtain the preliminary human body posture model.
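Assuming each attribute value carries its spatial ordering as the 3rd character of its first segment, as described above, arranging the basic data sets could be sketched as follows (all identifiers and sample strings are hypothetical):

```python
# Hypothetical sketch: sort basic pose-base attribute values by the
# spatial-ordering character (3rd character), smaller characters lying
# closer to the body center, then concatenate into a preliminary model.
def preliminary_model(pose_values: list[str]) -> str:
    return "".join(sorted(pose_values, key=lambda v: v[2]))

# Three made-up values whose 3rd characters are 3, 1 and 2:
model = preliminary_model(["TR3a", "AR1b", "LG2c"])  # "AR1bLG2cTR3a"
```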
S5 specifically comprises the following steps:
S51, constructing the neutral position as the initial posture;
S52, restoring the angle values of the initial posture according to the spatial ordering of the pose bases;
S53, correcting the preliminary human body posture model according to the restored angle values.
Specifically, S5 comprises: first, constructing the neutral position as the initial posture. Then, following the spatial ordering of the pose bases, the upper body (comprising the trunk pose base, the left and right arm pose bases, and the head pose base) is treated as a whole and rotated about the coronal, vertical, and sagittal axes of the waist, the rotation angle about each axis being the data value of the trunk's spatial position state relative to the neutral position × the minimum angle unit.
Each arm (comprising the upper-arm, forearm, and hand pose bases) is treated as a whole and rotated about the coronal, vertical, and sagittal axes of the corresponding shoulder, the rotation angle about each axis being the rotation value of the arm × the minimum angle unit; the left and right arm pose bases are processed in the same way.
Furthermore, the forearm and hand pose bases are treated as a whole and rotated about the coronal and vertical axes of the corresponding elbow, the rotation angle about each axis being the rotation value of the forearm pose base × the minimum angle unit; the left and right forearm and hand pose bases are processed in the same way. The hand pose base is rotated about the coronal axis of the corresponding wrist, the rotation angle being the rotation value of the hand pose base × the minimum angle unit; the left and right hand pose bases are processed in the same way. The head pose base is rotated about the coronal, vertical, and sagittal axes of the neck, the rotation angle about each axis being the rotation value of the head pose base × the minimum angle unit.
Each leg (comprising the thigh, shank, and foot pose bases) is treated as a whole and rotated about the coronal, vertical, and sagittal axes of the corresponding hip, the rotation angle about each axis being the rotation value of the thigh pose base × the minimum angle unit; the left and right leg pose bases are processed in the same way. The shank and foot pose bases are treated as a whole and rotated about the coronal and vertical axes of the corresponding knee joint, the rotation angle about each axis being the rotation value of the shank pose base × the minimum angle unit; the left and right shank and foot pose bases are processed in the same way. The foot pose base is rotated about the coronal and vertical axes of the corresponding ankle joint, the rotation angle about each axis being the rotation value of the foot pose base × the minimum angle unit; the left and right foot pose bases are processed in the same way.
After the pose-base motions are restored in the above order, the posture finally presented is the restored posture.
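The hierarchical, center-outward restoration order described above might be sketched as follows. The joint names, data class, and per-axis layout are assumptions for illustration; a real implementation would compose 3-D rotations (matrices or quaternions) at each joint rather than merely report per-axis angles:

```python
from dataclasses import dataclass

@dataclass
class JointRotation:
    """Hypothetical per-joint record of data values about each axis."""
    coronal: int = 0
    vertical: int = 0
    sagittal: int = 0

def restore_chain(joints: list[JointRotation], unit: float) -> list[tuple]:
    """Angles (coronal, vertical, sagittal) in degrees for each joint,
    applied in order from the body center outward (e.g. waist -> shoulder
    -> elbow -> wrist); each angle = data value x minimum angle unit."""
    return [(j.coronal * unit, j.vertical * unit, j.sagittal * unit)
            for j in joints]

# Waist, shoulder and elbow with made-up data values, unit = 5 degrees:
arm_chain = [JointRotation(2, 1, 0), JointRotation(6, 0, 1), JointRotation(4, 2, 0)]
angles = restore_chain(arm_chain, 5.0)
# [(10.0, 5.0, 0.0), (30.0, 0.0, 5.0), (20.0, 10.0, 0.0)]
```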
The invention further provides a posture expansion state semantic processing system comprising a motion acquisition device, a processing unit, and an output unit, wherein the motion acquisition device is used to acquire the instantaneous posture of the human body during human movement, and the processing unit is used to execute the above posture expansion state semantic processing method.
In summary, the posture expansion state semantic processing method provided by the invention divides human posture data into pose bases for representation, adds an extended pose-base data set on top of the basic pose-base data set to describe the human posture accurately, and allows convenient visualization of human posture data.
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the spirit and scope of the present invention.

Claims (10)

1. A posture expansion state semantic processing method, characterized by comprising the following steps:
s1, collecting human body posture data, and segmenting a posture base according to different human body parts;
s2, giving rotating shaft state semantics and space position state semantics based on the space position state of each attitude base relative to the neutral position, and acquiring a basic attitude base data set;
s3, giving expanded state semantics based on the spatial position state of each attitude basis relative to the coronal axis and/or the vertical axis, and acquiring an expanded attitude basis data set;
s4, representing the human body posture by using the basic posture base data group to obtain a preliminary human body posture model;
and S5, carrying out fine description on the preliminary human body posture model by using the extended posture base data set to obtain a high-precision human body posture model.
2. The posture expansion state semantic processing method according to claim 1, wherein S1 specifically comprises:
s11, collecting the human body posture data by using an action collecting device;
and S12, dividing the human body posture data according to the human body part, and dividing the posture base.
3. The posture expansion state semantic processing method according to claim 2, characterized in that: in S11, the motion acquisition device is a sensor module having an inertial measurement unit, the inertial measurement unit comprising a 3-axis gyroscope, a 3-axis accelerometer, and a 3-axis magnetometer.
4. The posture expansion state semantic processing method according to claim 1, characterized in that: in S2, giving the rotation-axis state semantics comprises assigning to each pose base a coronal-axis rotation state, a vertical-axis rotation state, and a sagittal-axis rotation state relative to the neutral position; and giving the spatial position state semantics comprises assigning a data value to the rotation state of each pose base to obtain the basic pose-base data set.
5. The posture expansion state semantic processing method according to claim 1, wherein S3 specifically comprises: describing the motion angle of the pose base using a rotation value to obtain the extended pose-base data set.
6. The posture expansion state semantic processing method according to claim 5, characterized in that: the extended pose-base data set is denoted PU_ANGLE_UNIT and is the minimum angle unit when the state-extension description is used, and the final angle of rotation of each pose base about a rotation axis = minimum angle unit × rotation value.
7. The posture expansion state semantic processing method according to claim 1, characterized in that the value of the minimum angle unit is selected according to the position of the pose base and/or the restoration precision of the high-precision human body posture model.
8. The posture expansion state semantic processing method according to claim 1, characterized in that S4 specifically comprises: arranging the basic pose-base data sets in sequence according to the spatial ordering of the pose bases to obtain the preliminary human body posture model.
9. The posture expansion state semantic processing method according to claim 1, wherein S5 specifically comprises:
S51, constructing the neutral position as the initial posture;
S52, restoring the angle values of the initial posture according to the spatial ordering of the pose bases;
S53, correcting the preliminary human body posture model according to the restored angle values.
10. A posture expansion state semantic processing system, characterized by comprising a motion acquisition device, a processing unit, and an output unit, wherein the motion acquisition device is used to acquire the instantaneous posture of a human body during human movement; and the processing unit is used to execute the posture expansion state semantic processing method of any one of claims 1 to 9.
CN202210939186.1A 2022-08-05 2022-08-05 Attitude expansion state semantic processing method and system Active CN115310484B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210939186.1A CN115310484B (en) 2022-08-05 2022-08-05 Attitude expansion state semantic processing method and system


Publications (2)

Publication Number Publication Date
CN115310484A true CN115310484A (en) 2022-11-08
CN115310484B CN115310484B (en) 2024-04-09

Family

ID=83860993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210939186.1A Active CN115310484B (en) 2022-08-05 2022-08-05 Attitude expansion state semantic processing method and system

Country Status (1)

Country Link
CN (1) CN115310484B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020070812A1 (en) * 2018-10-03 2020-04-09 株式会社ソニー・インタラクティブエンタテインメント Skeleton model update device, skeleton model update method, and program
CN112750199A (en) * 2021-01-13 2021-05-04 电子科技大学 Energy optimization-based panda three-dimensional model reconstruction method
CN113143257A (en) * 2021-02-09 2021-07-23 国体智慧体育技术创新中心(北京)有限公司 Generalized application system and method based on individual movement behavior hierarchical model
CN114758039A (en) * 2020-12-28 2022-07-15 北京陌陌信息技术有限公司 Sectional driving method and equipment of human body model and storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Tian Feng; Liu Xianmei; Shen Xukun: "A three-dimensional human motion retrieval method based on motion posture", Computer Simulation, no. 11, pages 52-56 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant