CN113450903A - Human body action mapping method and device, computer equipment and storage medium

Human body action mapping method and device, computer equipment and storage medium

Info

Publication number
CN113450903A (application CN202110724802.7A; granted as CN113450903B)
Authority
CN
China
Prior art keywords
joint, target, robot, adjacent, measurement unit
Legal status: Granted
Application number: CN202110724802.7A
Original language: Chinese (zh)
Other versions: CN113450903B
Inventor
刘天鑫
蒿杰
梁俊
刘嘉瑞
舒琳
黄玉清
叶渐豪
Current Assignee
Xintiao Technology Guangzhou Co ltd
Institute of Automation of Chinese Academy of Science
Guangdong Institute of Artificial Intelligence and Advanced Computing
Original Assignee
Xintiao Technology Guangzhou Co ltd
Institute of Automation of Chinese Academy of Science
Guangdong Institute of Artificial Intelligence and Advanced Computing
Application filed by Xintiao Technology Guangzhou Co ltd, Institute of Automation of Chinese Academy of Science, and Guangdong Institute of Artificial Intelligence and Advanced Computing
Priority claimed from CN202110724802.7A
Publication of CN113450903A
Application granted
Publication of CN113450903B
Status: Active

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or operation of medical equipment or devices, for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the operation of medical equipment or devices, for remote operation
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0081 Programme-controlled manipulators with master teach-in means
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671 Programme controls characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation

Abstract

The embodiment of the invention provides a human body action mapping method and apparatus, a computer device and a storage medium. In the method, inertial data are read from a plurality of inertial measurement units attached to body parts of a user, with a body joint of the user spaced between two adjacent inertial measurement units; the inertial data are fused into the attitudes of the inertial measurement units in space; the joint angle of the body joint is calculated from the attitudes corresponding to the two adjacent inertial measurement units; the joint angle is mapped to a robot joint of a robot; and the robot is driven to move until the adjacent robot joints reach the joint angle. Because the joint angles of the body joints are calculated directly from the attitudes of the inertial measurement units and mapped to the robot joints, per-frame kinematics calculations in Cartesian space are avoided, which simplifies computation, reduces time consumption and improves following precision and response speed.

Description

Human body action mapping method and device, computer equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of automation, in particular to a human body action mapping method, a human body action mapping device, computer equipment and a storage medium.
Background
With the development of science and technology, human body actions can be captured, mapped to a robot and executed by the robot, which is widely applicable in many service scenarios.
For example, in a fire fighting and disaster relief scenario, a fire fighting robot can help fire fighters carry out rescue work remotely; in a medical scenario, the actions of a doctor can be remotely mapped to a medical robot in a hospital thousands of kilometers away, realizing remote surgery or virtual surgery.
The methods for extracting human body motion mainly include optical motion capture, curvature sensor motion capture, and IMU (Inertial Measurement Unit) motion capture. Optical motion capture is troubled by problems of illumination, occlusion, precision and the like, while curvature sensor motion capture requires the sensor to be attached to the joint with high precision and has limited measurable degrees of freedom. IMU motion capture is therefore the approach most commonly applied to capturing human body motion at present.
After the human body action is extracted, the robot follows it. Typically, the human body and the robot are matched in Cartesian space: after the data generated by the human action are collected, kinematics must be solved in real time for every frame. This calculation is complex and time-consuming, so the following precision is low and the response speed is slow.
Disclosure of Invention
The embodiment of the invention provides a human body action mapping method, a human body action mapping device, computer equipment and a storage medium, and aims to solve the problems that robot follow-up calculation is complex and time-consuming.
In a first aspect, an embodiment of the present invention provides a human body motion mapping method, including:
reading inertial data from a plurality of inertial measurement units, wherein the plurality of inertial measurement units are attached to body parts of a user and at least some pairs of adjacent inertial measurement units are separated by a body joint of the user;
fusing the inertial data into a pose of the inertial measurement unit in space;
calculating joint angles of the body joints according to the postures corresponding to the two adjacent inertial measurement units;
mapping the joint angle to a robot joint of a robot;
and driving the robot to move until the adjacent robot joints reach the joint angle.
In a second aspect, an embodiment of the present invention further provides a human body motion mapping apparatus, including:
the inertial data reading module is used for reading inertial data from a plurality of inertial measurement units, where the plurality of inertial measurement units are attached to body parts of a user and at least some pairs of adjacent inertial measurement units are separated by a body joint of the user;
the attitude fusion module is used for fusing the inertial data into the attitude of the inertial measurement unit in the space;
the joint angle calculation module is used for calculating joint angles of the body joints according to the postures corresponding to the two adjacent inertia measurement units;
a joint angle mapping module for mapping the joint angle to a robot joint of a robot;
and the robot driving module is used for driving the robot to move until the adjacent robot joints reach the joint angle.
In a third aspect, an embodiment of the present invention further provides a computer device, where the computer device includes:
one or more processors;
a memory for storing one or more programs,
and when the one or more programs are executed by the one or more processors, they cause the one or more processors to implement the human body action mapping method as described in the first aspect.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the human motion mapping method according to the first aspect is implemented.
In this embodiment, inertial data are read from a plurality of inertial measurement units attached to body parts of a user, with a body joint of the user spaced between two adjacent inertial measurement units; the inertial data are fused into the attitudes of the inertial measurement units in space; the joint angle of the body joint is calculated from the attitudes corresponding to the two adjacent inertial measurement units; the joint angle is mapped to a robot joint of a robot; and the robot is driven to move until the adjacent robot joints reach the joint angle. Because the joint angles of the body joints are calculated directly from the attitudes of the inertial measurement units and mapped to the robot joints, per-frame kinematics calculations in Cartesian space are avoided, which simplifies computation, reduces time consumption and improves following precision and response speed.
Drawings
Fig. 1 is a flowchart of a human body action mapping method according to an embodiment of the present invention;
fig. 2A is a diagram illustrating an example of an inertial measurement unit mounted on a human body according to an embodiment of the present invention;
fig. 2B is an exemplary diagram of a robot according to an embodiment of the present invention;
fig. 2C is a schematic diagram of another inertial measurement unit mounted on a human body according to an embodiment of the present invention;
fig. 2D is a diagram illustrating another robot according to an embodiment of the present invention;
fig. 3 is a flowchart of a human body action mapping method according to a second embodiment of the present invention;
fig. 4 is a schematic structural diagram of a human body motion mapping device according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of a computer device according to a fourth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a human body action mapping method according to the first embodiment of the present invention. This embodiment is applicable to the case where IMUs are used to acquire the motions of user-specified joints, calculate their poses and map them to the robot joints of a robot in real time. The method may be executed by a human body action mapping apparatus, which may be implemented in software and/or hardware and configured in a computer device, such as a personal computer, server or workstation. The method specifically includes the following steps:
step 101, reading inertial data from a plurality of inertial measurement units.
In this embodiment, a plurality of inertial measurement units may be configured in advance. An inertial measurement unit is a sensor, or a collection of sensors, that measures the data generated when its carrier moves (i.e. inertial data). Inertial measurement units include the following types:
1. six-axis inertia measurement unit
The six-axis inertial measurement unit is provided with a three-axis gyroscope and a three-axis accelerometer. It can sense the carrier's rotation in 3 degrees of freedom, namely pitch (rotation around the X axis, also called pitch angle), yaw (rotation around the Y axis, also called yaw angle) and roll (rotation around the Z axis, also called roll angle), and can also sense the carrier's acceleration in 3 degrees of freedom.
2. Nine-axis inertia measurement unit
The nine-axis inertial measurement unit is provided with a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer. In addition to sensing the carrier's pitch, yaw and roll and its acceleration in 3 degrees of freedom, it can also sense the carrier's magnetic field strength in 3 degrees of freedom.
Since the nine-axis inertial measurement unit acquires more data dimensions than the six-axis unit, mapping human body actions with nine-axis units is more accurate than with six-axis units; the nine-axis inertial measurement unit may therefore be selected for mapping human body actions.
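For illustration only, a nine-axis reading can be modeled as a record of three 3-degree-of-freedom vectors; the field names below are assumptions for this sketch, not part of the original disclosure.

from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ImuSample:
    gyro: Vec3   # angular velocity around X/Y/Z, in degrees/second
    accel: Vec3  # acceleration per axis, as a ratio of gravity (g)
    mag: Vec3    # magnetic field strength per axis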
In particular implementations, a plurality of inertial measurement units may be attached to a user's body part, which may include the head, hands, feet, back, and so forth.
Further, inertial measurement units can be attached at different positions of the same body part. Taking the bones as partitions, at least one inertial measurement unit can be attached to the body part where each bone is located, in particular at the edge of a body joint (a body joint being a movable bone connection), so that at least some pairs of adjacent inertial measurement units are separated by a body joint of the user.
Taking an arm as an example, as shown in fig. 2A, a default stationary inertial measurement unit is set for the back of a user 210 as a reference, 4 inertial measurement units are respectively mounted on the arm of the user 210, an inertial measurement unit 211 is mounted between a shoulder joint (body joint) and an elbow joint (body joint), an inertial measurement unit 212 and an inertial measurement unit 213 are mounted between the elbow joint (body joint) and a wrist joint (body joint), the inertial measurement unit 212 is located at the edge of the elbow joint, the inertial measurement unit 213 is located at the edge of the wrist joint, and an inertial measurement unit 214 is mounted on the back of the hand.
Then, a shoulder joint is spaced between the inertial measurement unit with the back stationary and the inertial measurement unit 211, an elbow joint is spaced between the inertial measurement unit 211 and the inertial measurement unit 212, and a wrist joint is spaced between the inertial measurement unit 213 and the inertial measurement unit 214.
Taking the palm as another example, as shown in fig. 2C, 13 inertial measurement units are respectively mounted on the back of the palm of the user, and for ease of understanding, fig. 2C is shown in the form of a palm center, in this example, a default stationary inertial measurement unit 2300 is set for the back center of the palm of the user as a reference.
The inertia measurement unit 2311, the inertia measurement unit 2312 and the inertia measurement unit 2313 are respectively mounted on the thumb of the user, a finger joint is arranged between the inertia measurement unit 2311 and the inertia measurement unit 2312 at an interval, a finger joint is arranged between the inertia measurement unit 2312 and the inertia measurement unit 2313 at an interval, and a finger joint is arranged between the inertia measurement unit 2300 and the inertia measurement unit 2313 at an interval.
An inertia measurement unit 2321, an inertia measurement unit 2322 and an inertia measurement unit 2323 are respectively mounted on the index finger of the user, a finger joint is arranged between the inertia measurement unit 2321 and the inertia measurement unit 2322 at an interval, a finger joint is arranged between the inertia measurement unit 2322 and the inertia measurement unit 2323 at an interval, and a finger joint is arranged between the inertia measurement unit 2300 and the inertia measurement unit 2323 at an interval.
An inertia measurement unit 2331, an inertia measurement unit 2332 and an inertia measurement unit 2333 are respectively mounted on the middle finger of the user, one knuckle is arranged between the inertia measurement unit 2331 and the inertia measurement unit 2332, one knuckle is arranged between the inertia measurement unit 2332 and the inertia measurement unit 2333, and one knuckle is arranged between the inertia measurement unit 2300 and the inertia measurement unit 2333.
An inertia measurement unit 2341, an inertia measurement unit 2342 and an inertia measurement unit 2343 are respectively mounted on a ring finger of a user, a knuckle is arranged between the inertia measurement unit 2341 and the inertia measurement unit 2342 at an interval, a knuckle is arranged between the inertia measurement unit 2342 and the inertia measurement unit 2343 at an interval, and a knuckle is arranged between the inertia measurement unit 2300 and the inertia measurement unit 2343 at an interval.
Of course, the mounting manner of the inertial measurement unit is only an example, and when the embodiment of the present invention is implemented, other mounting manners of the inertial measurement unit may be set according to actual situations, which is not limited in the embodiment of the present invention. In addition, besides the mounting manner of the inertial measurement unit, a person skilled in the art may also use other mounting manners of the inertial measurement unit according to actual needs, and the embodiment of the present invention is not limited thereto.
During operation, on the one hand, a start instruction can be sent to each of the plurality of inertial measurement units, which respond by acquiring inertial data in real time, such as angular velocity in 3 degrees of freedom, acceleration in 3 degrees of freedom and magnetic field strength in 3 degrees of freedom. On the other hand, the user performs actions as required by the service scenario; at this time, the inertial data can be read from each inertial measurement unit and cached in memory.
And 102, fusing the inertial data into the attitude of the inertial measurement unit in the space.
Generally, an inertial measurement unit measures several types of inertial data, each of which captures angular relationships to some extent. Different types of inertial data can therefore be fused into the rotation attitude of the inertial measurement unit in three-dimensional space through attitude fusion algorithms such as the complementary filter, Madgwick, Mahony, the Kalman filter and the extended Kalman filter, yielding the rotation attitude in three-dimensional space of the body part where each inertial measurement unit is located while the user performs an action.
The attitude may be expressed as quaternions, Euler angles, rotation matrices, and the like.
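As a minimal sketch of the complementary-filter idea named above (one of several cited fusion algorithms), the following fuses roll and pitch from gyroscope and accelerometer data only; the 0.98 blend factor is an assumed tuning value.

import math

def complementary_update(roll, pitch, gyro, accel, dt, alpha=0.98):
    # Integrate angular velocity (degrees/second) for the short-term estimate
    roll_gyro = roll + gyro[0] * dt
    pitch_gyro = pitch + gyro[1] * dt
    # Derive a drift-free but noisy estimate from the gravity direction
    ax, ay, az = accel
    roll_acc = math.degrees(math.atan2(ay, az))
    pitch_acc = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    # Blend: trust the gyroscope short-term, the accelerometer long-term
    return (alpha * roll_gyro + (1 - alpha) * roll_acc,
            alpha * pitch_gyro + (1 - alpha) * pitch_acc)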
In one embodiment of the present invention, step 102 comprises the steps of:
step 1021, determining the type of the inertial data.
Step 1022, calibrating the inertial data according to type.
Since the inertial measurement unit measures several types of inertial data, each with its own errors, different calibration modes can be set in advance for the different types, i.e. the different sensors in the inertial measurement unit are calibrated separately.
For the inertial data detected by the inertial measurement unit, the type can be determined from the sensor that produced it (such as a gyroscope, accelerometer or magnetometer); the corresponding calibration mode is queried according to the type, and the inertial data are calibrated accordingly.
In one example, if the type of the inertial data is the angular velocity acquired by the gyroscope (in degrees/second), a first offset error may be queried. The first offset error can be obtained in advance by reading the angular velocities acquired by the gyroscope many times (for example, 1000 times) while the gyroscope (inertial measurement unit) is stationary, and calculating, for each axis, the average of these angular velocities.
Thereafter, the first offset error is subtracted from each angular velocity acquired by the gyroscope as a calibrated angular velocity.
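A sketch of this bias estimation and its application, assuming samples recorded while the sensor is stationary (e.g. 1000 readings); numpy supplies the per-axis mean:

import numpy as np

def estimate_gyro_bias(samples):
    # samples: (N, 3) angular velocities in degrees/second, sensor stationary
    return np.mean(np.asarray(samples), axis=0)   # first offset error per axis

def calibrate_gyro(raw, bias):
    return np.asarray(raw) - bias                 # calibrated angular velocity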
In another example, if the type of the inertial data is the acceleration collected by an accelerometer, expressed as a ratio relative to gravity (9.8 m/s²), a second bias error bias and a first scale error scale can be queried. If the accelerometer was not well calibrated before leaving the factory, the accelerometer (inertial measurement unit) can be placed stationary on each of its 6 faces while accelerations are collected, and the second bias error bias and the first scale error scale of each axis can then be solved by ellipsoid fitting based on the least-squares method.

Thereafter, the difference between the acceleration and the second bias error bias is multiplied by the first scale error scale to obtain the calibrated acceleration.
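A sketch of the six-face procedure, simplified to a per-axis estimate instead of the full least-squares ellipsoid fit (an assumption made for brevity); each axis reads approximately +1 g and -1 g on opposite faces:

def fit_axis_bias_scale(face_readings):
    # face_readings: dict axis -> (reading at +1 g, reading at -1 g)
    out = {}
    for axis, (plus_g, minus_g) in face_readings.items():
        bias = (plus_g + minus_g) / 2.0       # second bias error
        scale = 2.0 / (plus_g - minus_g)      # first scale error
        out[axis] = (bias, scale)
    return out

def calibrate_accel(raw, bias, scale):
    return (raw - bias) * scale               # calibrated acceleration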
In yet another example, if the type of the inertial data is the magnetic field strength collected by a magnetometer, the magnetometer (inertial measurement unit) may be moved in a figure-of-eight through three-dimensional space in advance while the magnetic field strengths it records during the rotation are collected; sampling may continue for a period of time, such as 30 seconds, to ensure a sufficient amount of data.
During calibration, the magnetic field strengths recorded while the inertial measurement unit rotates in space are acquired, and the calibration of the magnetic field strength comprises the following two parts:
1. deburring
In this part, the one-dimensional discrete magnetic field strength is differentiated to obtain a series of change rates: for the current time, the magnetic field strength at the previous time is subtracted from the magnetic field strength at the current time, giving the change rate at the current time.
The average value mean and the standard deviation std of the change rates are calculated, and several standard deviations are extended above and below the average, giving a change range [mean - m*std, mean + n*std], where m and n are positive numbers that may be equal (e.g. m = n = 3) or unequal.
This change range is a confidence interval for the change rate, against which the change rate at each time can be compared.
If the rate of change is within the range of change, the rate of change may be considered normal.
If the change rate exceeds the change range, that is, it is smaller than the lower limit (mean - m*std) or larger than the upper limit (mean + n*std), the change rate is considered abnormal; the corresponding time is recorded, and the magnetic field strength at that time is deleted.
2. Correction of
After the magnetic field strengths belonging to burrs are deleted, for each of the axes (X, Y and Z) a first target strength Max and a second target strength Min are queried from the remaining magnetic field strengths, where the first target strength Max is the largest magnetic field strength and the second target strength Min is the smallest.
For each of the axes, the average of the first target strength Max and the second target strength Min is calculated as the third offset error bias on that axis, i.e. bias = (Max + Min) / 2.
For each of the axes, the difference between the first target strength Max and the second target strength Min is taken as the variation amplitude (Max - Min) of the magnetic field strength on that axis.
The variation amplitudes on the axes are compared, and the largest is selected as the target amplitude, denoted A.
For each axis, the ratio between its variation amplitude and the target amplitude is calculated as the second proportional error scale: with variation amplitude Ax on the X axis, scale_x = Ax / A; with Ay on the Y axis, scale_y = Ay / A; and with Az on the Z axis, scale_z = Az / A.
The difference between the original magnetic field strength and the third offset error bias is multiplied by the second proportional error scale to obtain the corrected magnetic field strength, which can be regarded as the actual magnetic field strength: with original strength m, m_true = (m - bias) * scale.
First-order complementary filtering is then performed between the corrected magnetic field strength m_true at the current time and the corrected magnetic field strength at the previous time, and the result is taken as the calibrated magnetic field strength.
The magnetic field strength calibration in this embodiment has low computational complexity and good stability, and supports real-time online calibration.
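A sketch of this two-part magnetometer calibration (deburring by change-rate bounds, then min/max correction with first-order complementary filtering); m = n = 3 and the 0.9 filter coefficient are assumed values:

import numpy as np

def deburr(mag, m=3, n=3):
    # mag: (N, 3) magnetic field strengths sampled during the rotation
    mag = np.asarray(mag, dtype=float)
    rate = np.diff(mag, axis=0)                       # discrete change rates
    mean, std = rate.mean(axis=0), rate.std(axis=0)
    ok = np.all((rate >= mean - m * std) & (rate <= mean + n * std), axis=1)
    return mag[1:][ok]                                # drop burr samples

def fit_mag_correction(mag):
    mx, mn = mag.max(axis=0), mag.min(axis=0)         # first/second target strengths
    bias = (mx + mn) / 2.0                            # third offset error per axis
    amp = mx - mn                                     # variation amplitude per axis
    scale = amp / amp.max()                           # second proportional error, as stated above
    return bias, scale

def calibrate_mag(raw, prev_true, bias, scale, alpha=0.9):
    m_true = (np.asarray(raw) - bias) * scale         # corrected field strength
    if prev_true is None:
        return m_true
    return alpha * prev_true + (1 - alpha) * m_true   # first-order complementary filter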
Of course, the above calibration methods are only examples, and when the embodiment of the present invention is implemented, other calibration methods may be set according to the actual situation. For example, to eliminate errors introduced when mounting the inertial measurement unit, an included angle may be calculated between the accelerometer's acceleration and the magnetometer's magnetic field strength and the offset error subtracted from it, i.e. the acceleration and the magnetic field strength are mapped onto the X, Y and Z axes; this is not limited in the embodiment of the present invention. In addition, a person skilled in the art may use other calibration methods according to actual needs, and the embodiment of the present invention is not limited to this either.
And 1023, if the calibration is finished, fusing the inertial data into the attitude of the inertial measurement unit in the space based on a filtering mode.
Different types of inertial data have different confidences: over short time scales, the gyroscope's angular velocity is more trustworthy than the accelerometer's acceleration and the magnetometer's magnetic field strength. In the filtering approach, the acceleration and the magnetic field strength are therefore used to correct the angular velocity, fusing all the inertial data into the attitude of the inertial measurement unit in three-dimensional space. Filtering has low time complexity and low space complexity, and achieves online real-time fusion while maintaining high accuracy.
Taking the RTQF (Richards-tech) algorithm fusing the inertial data of a six-axis inertial measurement unit as an example: the attitude calculated from the accelerometer and magnetometer is taken as the measured value, Runge-Kutta integration is performed with the gyroscope data to obtain a quaternion, and spherical interpolation of the quaternions yields the final attitude.
Considering that the attitude measured by an inertial measurement unit has a deviation, the attitude acquired while the inertial measurement unit is stationary may be recorded in advance as the deviation attitude. For each inertial measurement unit, the inverse of its deviation attitude is calculated and multiplied by its current attitude, thereby updating the attitude and eliminating the initial deviation.
In this regard, the update process may be written as:

Q_update = Q_dev^-1 * Q

where Q_dev is the deviation attitude, Q_dev^-1 is the inverse of the deviation attitude, Q is the attitude before the update, and Q_update is the updated attitude.
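A sketch of this initial-deviation removal with quaternions; scipy's Rotation is used here as an assumed quaternion backend, not a dependency named in the original:

from scipy.spatial.transform import Rotation as R

def remove_initial_deviation(q_dev, q):
    # q_dev: attitude recorded while the IMU is stationary (deviation attitude)
    # q:     current fused attitude; both as (x, y, z, w) quaternions
    return (R.from_quat(q_dev).inv() * R.from_quat(q)).as_quat()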
And 103, calculating joint angles of the body joints according to the corresponding postures of the two adjacent inertia measurement units.
When a body joint of the user lies between two adjacent inertial measurement units, the bones forming the body joint are relatively straight overall, so the angle between the two adjacent inertial measurement units can be regarded as the angle between the body parts where they are located, i.e. the joint angle of the body joint formed by those body parts. By converting the attitudes corresponding to the two adjacent inertial measurement units into an included angle in three-dimensional space, the joint angle between the body parts where the two adjacent inertial measurement units are located can be calculated.
In one embodiment of the present invention, step 103 may comprise the steps of:
and step 1031, inquiring two postures corresponding to two adjacent inertia measurement units as a first target posture and a second target posture.
In this embodiment, for two adjacent inertial measurement units separated by a body joint of the user, their attitudes may be recorded as the first target attitude and the second target attitude, respectively.
And step 1032, determining the rotation mode of two adjacent inertia measurement units in one or more degrees of freedom as the rotation mode.
In this embodiment, the mode in which the inertial measurement units rotate in one or more degrees of freedom may be determined in advance based on the type of the inertial measurement unit and taken as the rotation mode. The rotation mode may be recorded in the computer device in a hard-coded manner or as configurable parameters, which is not limited in this embodiment.
For example, a six-axis inertial measurement unit may be set to a single degree of freedom about the X, Y or Z axis, denoted kx (rotation about the X axis), ky (rotation about the Y axis) and kz (rotation about the Z axis).
For another example, a nine-axis inertial measurement unit may be set to two degrees of freedom, namely the X and Z axes or the Y and Z axes, denoted xz (rotation about the X and Z axes) and yz (rotation about the Y and Z axes).
Step 1033, calculating a relationship of the second target pose with respect to the first target pose as a third target pose.
In general, a default stationary inertial measurement unit is set on a body part of the user as a reference in three-dimensional space, for example on the user's back. Taking the stationary inertial measurement unit as the starting point, the relationship of the succeeding inertial measurement unit's attitude (the second target attitude) with respect to the preceding inertial measurement unit's attitude (the first target attitude) is calculated, giving the third target attitude.
In a specific implementation, the inverse of the first target attitude may be calculated and the second target attitude multiplied by it to obtain the third target attitude. Denoting the first target attitude Q1 and the second target attitude Q2, the inverse of the first target attitude is Q1^-1, and the third target attitude Q = Q2 * Q1^-1.
And 1034, rotating the reference vector representing the rotation mode around the third target posture to obtain a target vector.
In this embodiment, a vector representing the rotation mode may be set as the reference vector. For convenience of operation the reference vector may be a unit vector, and to improve accuracy the components on axes other than the rotation mode's degrees of freedom may be set to 0. That is, in the reference vector, the dimensions corresponding to the degrees of freedom about which the two adjacent inertial measurement units rotate are set to 0, and the dimensions corresponding to the degrees of freedom about which they do not rotate are set to 1.
For example, for the rotation pattern ky, the reference vector V may be set to (1, 0, 0), for the rotation pattern kx, the reference vector V may be set to (0, 1, 0), for the rotation pattern kz, the reference vector V may be set to (0, 1, 0), for the rotation pattern xz, the reference vector V may be set to (0, 1, 0), and for the rotation pattern yz, the reference vector V may be set to (1, 0, 0).
And rotating the reference vector through the operator operation of the third target posture to obtain a target vector.
In a specific implementation, an inverse of the third target pose may be calculated, and the third target pose, the reference vector, and the inverse of the third target pose may be multiplied to obtain a target vector.
Denoting the third target attitude Q and the reference vector V, the inverse of the third target attitude is Q^-1, and the target vector V_new = Q * V * Q^-1.
And 1035, calculating an included angle between the target vector and the reference vector as a joint angle between two adjacent body parts.
The included angle between the target vector and the unit reference vector can be regarded as the included angle formed in three-dimensional space between the two adjacent inertial measurement units, and is taken as the joint angle formed in three-dimensional space between the body parts where the two adjacent inertial measurement units are located.
In a specific implementation, the joint angle between two adjacent body parts is obtained by taking the arccosine of the dot product of the target vector and the reference vector: with target vector V_new and reference vector V, the joint angle θ = acos(V_new · V). The direction of the joint angle θ is determined from V and V_new according to the right-hand rule; in general, the direction of θ is the same as the Z-axis direction of V_new.
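A sketch combining steps 1031 to 1035, again with scipy's Rotation as an assumed quaternion backend; the reference vectors follow the rotation modes listed above:

import numpy as np
from scipy.spatial.transform import Rotation as R

REFERENCE_VECTORS = {
    "kx": np.array([0.0, 1.0, 0.0]),
    "ky": np.array([1.0, 0.0, 0.0]),
    "kz": np.array([0.0, 1.0, 0.0]),
    "xz": np.array([0.0, 1.0, 0.0]),
    "yz": np.array([1.0, 0.0, 0.0]),
}

def joint_angle(q1, q2, mode):
    # q1, q2: first/second target attitudes as (x, y, z, w) quaternions
    q_rel = R.from_quat(q2) * R.from_quat(q1).inv()   # third target attitude Q = Q2 * Q1^-1
    v = REFERENCE_VECTORS[mode]
    v_new = q_rel.apply(v)                            # V_new = Q * V * Q^-1
    return np.degrees(np.arccos(np.clip(np.dot(v_new, v), -1.0, 1.0)))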
Step 104, mapping the joint angle to a robot joint of the robot.
In the robot, robot joints fitting some or all of the user's body joints may be provided, where fitting means that the relative positions between the robot joints are the same as the relative positions between the body joints, and the rotation mode of each robot joint is the same as that of the corresponding body joint.
Taking the robot arm 220 as an example of a robot, as shown in fig. 2B, the robot arm includes a shoulder joint 221, an elbow joint 222 and a wrist joint 223 as robot joints.
Taking a mechanical palm as another example of the robot, as shown in fig. 2D, 12 robot joints are provided. The robot joints of the mechanical thumb include Joint_11 = <2400, 2413>, Joint_12 = <2413, 2412> and Joint_13 = <2412, 2411>; those of the mechanical index finger include Joint_21 = <2400, 2423>, Joint_22 = <2423, 2422> and Joint_23 = <2422, 2421>; those of the mechanical middle finger include Joint_31 = <2400, 2433>, Joint_32 = <2433, 2432> and Joint_33 = <2432, 2431>; and those of the mechanical ring finger include Joint_41 = <2400, 2443>, Joint_42 = <2443, 2442> and Joint_43 = <2442, 2441>.
And mapping the currently measured joint angle of the human body joint to a robot joint of the robot so that the robot joint of the robot follows the joint angle of the human body joint in real time.
In a specific implementation, a mapping relationship between robot joints and body joints may be preset, in which a body joint is identified by the first numbers of the inertial measurement units used to calculate it and the robot joint is identified by a second number; that is, the mapping relationship records the mapping from the first numbers of the inertial measurement units to the second number of the robot joint of the robot.
For example, as shown in fig. 2A and 2B, the joint angle between the stationary inertial measurement unit on the back and the inertial measurement unit 211 may be mapped to the shoulder joint 221, the joint angle between the inertial measurement unit 211 and the inertial measurement unit 212 to the elbow joint 222, and the joint angle between the inertial measurement unit 213 and the inertial measurement unit 214 to the wrist joint 223.
For another example, as shown in fig. 2C and 2D, the finger joint spaced between the inertial measurement unit 2311 and the inertial measurement unit 2312 may be mapped to robot joint Joint_11 = <2400, 2413>, the finger joint between units 2312 and 2313 to Joint_12 = <2413, 2412>, and the finger joint between units 2300 and 2313 to Joint_13 = <2412, 2411>.

The finger joint spaced between units 2321 and 2322 may be mapped to Joint_21 = <2400, 2423>, the finger joint between units 2322 and 2323 to Joint_22 = <2423, 2422>, and the finger joint between units 2300 and 2323 to Joint_23 = <2422, 2421>.

The finger joint spaced between units 2331 and 2332 may be mapped to Joint_31 = <2400, 2433>, the finger joint between units 2332 and 2333 to Joint_32 = <2433, 2432>, and the finger joint between units 2300 and 2333 to Joint_33 = <2432, 2431>.

The finger joint spaced between units 2341 and 2342 may be mapped to Joint_41 = <2400, 2443>, the finger joint between units 2342 and 2343 to Joint_42 = <2443, 2442>, and the finger joint between units 2300 and 2343 to Joint_43 = <2442, 2441>.
Then, when the joint angle of a body joint is mapped in real time, the preset mapping relationship can be queried: the two first numbers of the two adjacent inertial measurement units are looked up, the second number jointly mapped by those two first numbers is found in the mapping relationship, and the joint angle of the two adjacent inertial measurement units is thus mapped to the robot joint corresponding to that second number.
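A sketch of this number-based lookup; the dictionary below only illustrates the pairings from figs. 2A to 2D, the key names are assumptions about representation, and robot.set_target is a hypothetical driver call:

JOINT_MAP = {
    ("imu_back", "211"): "shoulder_221",
    ("211", "212"): "elbow_222",
    ("213", "214"): "wrist_223",
    ("2311", "2312"): "Joint_11",
    ("2312", "2313"): "Joint_12",
    # ... the remaining finger joints follow the same pattern
}

def map_joint(imu_a, imu_b, angle, robot):
    # Look up the second number jointly mapped by the two first numbers
    joint = JOINT_MAP[(imu_a, imu_b)]
    robot.set_target(joint, angle)   # hypothetical robot-driver API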
And 105, driving the robot to move until the adjacent robot joints reach the joint angle.
Given the joint angles of all the robot joints, the robot's control box can be started to execute the moving operation until the adjacent robot joints reach the joint angles, so that the robot follows the user in completing the action, realizing remote control of the robot.
In a specific implementation, the trajectory along which a robot joint reaches the joint angle can be planned online in real time and input into the robot's control box, and the robot is controlled to rotate along the trajectory until the adjacent robot joints reach the joint angle, making the robot's motion smoother.
In this example, a polynomial interpolation may be used to plan the trajectory.
The joint angles calculated from the inertial measurement units are raw data capturing the motion of the body joints. Because the robot's motion performance is constrained, feeding this raw data directly to the robot as commands often causes overshoot, vibration and similar phenomena.
In this embodiment, smooth trajectories are planned in real time for all the robot joints by a method based on polynomial interpolation, so the robot can closely follow the human motion.
Specifically, the angle of the robot joint's current feedback state is used as the starting point p0; the sign of the joint angle obtained from the inertial measurement unit determines the direction, and the extreme position angle in that direction is used as the end point p1. The starting velocity is the robot's current feedback velocity v0, and the end velocity v1 is set to 0. At fixed time intervals (T, e.g. 40 ms), curve interpolation is performed with a polynomial on the attitude given by the inertial measurement unit. The trajectory interpolated from the five parameters (p0, p1, v0, v1, T) ensures that the robot's velocity and acceleration do not change abruptly during motion.
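A sketch of one interpolation segment built from the boundary conditions (p0, p1, v0, v1, T); four conditions are satisfied by a cubic polynomial, the order being an assumption since the original does not state it:

def plan_segment(p0, p1, v0, v1, T):
    # Cubic p(t) = a0 + a1*t + a2*t^2 + a3*t^3 with
    # p(0) = p0, p'(0) = v0, p(T) = p1, p'(T) = v1
    a0, a1 = p0, v0
    a2 = (3 * (p1 - p0) - (2 * v0 + v1) * T) / T ** 2
    a3 = (-2 * (p1 - p0) + (v0 + v1) * T) / T ** 3
    return lambda t: a0 + a1 * t + a2 * t ** 2 + a3 * t ** 3

# Sample a 40 ms segment every 8 ms (both values illustrative)
p = plan_segment(p0=0.0, p1=30.0, v0=5.0, v1=0.0, T=0.040)
waypoints = [p(k * 0.008) for k in range(6)]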
Based on the result of the above trajectory planning, this embodiment may use the robot's control system to establish a TCP (Transmission Control Protocol) client, receive the server's network packets in time, and pass them to the robot's protocol analysis module through a queue.
The protocol analysis module parses the joint angles of the robot's joints according to the existing protocol and passes them through queues to the control modules of the respective robot joints.
In the case of the mechanical arm, the control module performs trajectory planning on the joint angle in combination with the position and velocity states fed back by the arm, generates the path for the subsequent moments and issues it to the robot joints through a ROS node. Motion control of the robot joints is based on CB3, and the Application Programming Interface (API) uses the servoj command, so the control mode is direct and the motion of the mechanical arm is stable.
In the case of the mechanical palm, the control module issues the corresponding joint angles to the robot joints through the ROS node.
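A sketch of such a TCP client loop; the host, port and packet layout (one little-endian float32 per joint) are assumptions, since the original protocol is not disclosed:

import queue
import socket
import struct

angle_queue = queue.Queue()

def tcp_client(host="192.168.1.10", port=30002, n_joints=6):
    frame = n_joints * 4                         # assumed: one float32 per joint
    buf = b""
    with socket.create_connection((host, port)) as sock:
        while True:
            data = sock.recv(4096)
            if not data:
                break
            buf += data
            while len(buf) >= frame:
                packet, buf = buf[:frame], buf[frame:]
                angles = struct.unpack("<%df" % n_joints, packet)
                angle_queue.put(angles)          # hand off to the control modules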
In this embodiment, inertial data are read from a plurality of inertial measurement units attached to body parts of a user, with a body joint of the user spaced between two adjacent inertial measurement units; the inertial data are fused into the attitudes of the inertial measurement units in space; the joint angle of the body joint is calculated from the attitudes corresponding to the two adjacent inertial measurement units; the joint angle is mapped to a robot joint of a robot; and the robot is driven to move until the adjacent robot joints reach the joint angle. Because the joint angles of the body joints are calculated directly from the attitudes of the inertial measurement units and mapped to the robot joints, per-frame kinematics calculations in Cartesian space are avoided, which simplifies computation, reduces time consumption and improves following precision and response speed.
Example two
Fig. 3 is a flowchart of a human body action mapping method according to the second embodiment of the present invention. On the basis of the foregoing embodiment, this embodiment adds operations for presenting human body actions in virtual reality. The method specifically includes the following steps:
step 301, reading inertial data from a plurality of inertial measurement units.
The plurality of inertial measurement units are attached to body parts of the user, and a body joint of the user is spaced between two adjacent inertial measurement units.
Step 302, fusing the inertial data into the attitude of the inertial measurement unit in the space.
Step 303, loading a service scene.
And 304, generating a virtual part matched with the body part according to the posture in the service scene.
In some service scenarios, a 3D engine may be started, for example the Unity engine, the Unreal engine, the Cocos2d-x engine or a self-developed engine, and the scenes related to the service are rendered by the 3D engine. The scene may be a virtual scene or a scene collected from the robot's real surroundings, which is not limited in this embodiment.
In the service scenario, a virtual part identical to the user's body part, i.e. the body part displayed in a virtual reality manner, may be loaded.
The attitudes of the inertial measurement units can be passed into the 3D engine in real time, in the form of quaternions or the like, and mapped onto the virtual parts corresponding to the body parts where the inertial measurement units are located, so that the virtual parts take on those attitudes and the user's actions are displayed in real time. Rendering virtual parts with the same attitude in real time helps guarantee the following effect in some service scenarios.
Taking the Unity engine as an example, the engine receives a data packet over the network, unpacks the data and converts it into Unity quaternions, then performs coordinate-axis transformation on the quaternions, transforming each sensor coordinate system of the inertial measurement units into Unity's world coordinate system for subsequent processing.
The Unity engine also performs calibration alignment of the attitudes, because the positions and attitudes of the inertial measurement units vary somewhat between users. A script in the Unity engine guides the user through the calibration process and records the calibration data: during calibration, the interface displays a model of the calibration action and guides the user to hold the corresponding pose for a period of time, while the engine records the attitudes of all the inertial measurement units as calibration data.
The calibrated attitudes are distributed within the Unity project. The model simulating the body part can be given corresponding scripts to receive the distributed data, which are then applied to the corresponding model bones; by controlling the rotation attitudes of all the bones of the model, the user's actions are displayed in real time.
From the positions and attitudes of the model bones, the distances and angles between bones are calculated and can be used to judge the current gesture, realizing gesture recognition. In addition, colliders are added to the model to detect collisions between the hand and objects. Combining the current gesture with the contact between objects and the finger colliders, various interactions between objects and the virtual hand can be realized, such as picking up, putting down, throwing and pinching. Specific tools are also given specific usage modes, such as opening and closing scissors or pressing a watering can.
Further, in the Unity engine, the virtual part may be generated using bone-skinning animation (Skinned Mesh), a method in which the vertices of a skinned mesh are dynamically calculated by vertex blending under the control of bones, the motion of each bone being relative to its parent bone and driven by animation keyframe data. A bone-skinning animation typically includes bone hierarchy data, mesh data, mesh skinning data (skin info) and keyframe animation data for the bones.
In the Unity engine, for the attitude of the body part passed to a specified virtual part, the virtual part is rendered using the BindPose algorithm (the bind pose stores the bone transformations of a reference pose before animation, so that a character can be represented with different bone positions on the same animation, simplifying the workflow). For one bone, the bind pose is expressed as:

OneBoneBindPose = bone.worldToLocalMatrix * transform.localToWorldMatrix

i.e. the bind pose is the product of the bone's world-to-local matrix worldToLocalMatrix and the mesh's local-to-world matrix localToWorldMatrix.
The process of calculating how the bones affect the mesh vertices is generally called skinning, and the general process is as follows:
Transform the vertices from mesh space to bone space:

v_bone = v_mesh * bindpose
Then transform the vertices in bone space to mesh space through the current bone's transformation matrix:

v_out = v_bone * boneToMeshMatrix
A vertex may be affected by multiple bones, so each affecting bone's transforms are applied and the results are blended by the weights in boneWeight:
v_out0 = boneToMeshMatrix0 * bindpose0 * v_mesh
v_out1 = boneToMeshMatrix1 * bindpose1 * v_mesh
v_out2 = boneToMeshMatrix2 * bindpose2 * v_mesh
v_out3 = boneToMeshMatrix3 * bindpose3 * v_mesh
v_out = v_out0 * weight0 + v_out1 * weight1 + v_out2 * weight2 + v_out3 * weight3
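A sketch of this four-bone linear blend skinning in matrix form; numpy stands in for the engine's math types, and homogeneous 4x4 matrices are assumed:

import numpy as np

def skin_vertex(v_mesh, bind_poses, bone_to_mesh, weights):
    # v_mesh: vertex in mesh space, shape (3,); per-bone matrices are 4x4
    v = np.append(v_mesh, 1.0)
    out = np.zeros(4)
    for bindpose, b2m, w in zip(bind_poses, bone_to_mesh, weights):
        v_bone = bindpose @ v            # mesh space -> bone space
        out += w * (b2m @ v_bone)        # bone space -> mesh space, weighted
    return out[:3]                       # weights are assumed to sum to 1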
and 305, calculating joint angles of the body joints according to the corresponding postures of the two adjacent inertial measurement units.
Step 306, mapping the joint angle to a robot joint of the robot.
And 307, driving the robot to move until the adjacent robot joints reach the joint angle.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
EXAMPLE III
Fig. 4 is a block diagram of a human body action mapping apparatus provided in the third embodiment of the present invention, which may specifically include the following modules:
an inertial data reading module 401, configured to read inertial data from a plurality of inertial measurement units, where the plurality of inertial measurement units are attached to body parts of a user and at least some pairs of adjacent inertial measurement units are separated by a body joint of the user;
an attitude fusion module 402 for fusing the inertial data into an attitude of the inertial measurement unit in space;
a joint angle calculation module 403, configured to calculate a joint angle of the body joint according to the postures corresponding to two adjacent inertial measurement units;
a joint angle mapping module 404 for mapping the joint angle to a robot joint of a robot;
and a robot driving module 405, configured to drive the robot to move until the adjacent robot joints reach the joint angle.
In one embodiment of the present invention, the gesture fusion module 402 comprises:
a type determination module for determining a type of the inertial data;
the data calibration module is used for calibrating the inertial data according to the type;
and the data filtering module is used for fusing the inertial data into the attitude of the inertial measurement unit in the space based on a filtering mode if the calibration is finished.
In an example of the embodiment of the present invention, the data calibration module includes:
the first calibration parameter query module is used for querying a first offset error if the type is the angular velocity;
the angular velocity calibration module is used for subtracting the first offset error from the angular velocity to obtain a calibrated angular velocity;
and/or,
the second calibration parameter query module is used for querying a second offset error and a first proportional error if the type is the acceleration;
the acceleration calibration module is used for multiplying the difference value between the acceleration and the second bias error by the first proportional error to obtain calibrated acceleration;
and/or,
the magnetic field intensity acquisition module is used for acquiring the magnetic field intensity recorded when the inertial measurement unit rotates in space if the type is the magnetic field intensity;
the change rate calculation module is used for calculating the change rate of the magnetic field intensity;
the variation difference calculation module is used for calculating the average value and the standard deviation of the variation rate;
the variation range calculation module is used for respectively expanding the plurality of standard deviations upwards and expanding the plurality of standard deviations downwards on the basis of the average value to obtain a variation range;
the time recording module is used for recording the time corresponding to the change rate if the change rate exceeds the change range;
a magnetic field strength deleting module for deleting the magnetic field strength at the time;
the extreme value query module is used for respectively querying a first target strength and a second target strength aiming at the plurality of axes, wherein the first target strength is the magnetic field strength with the maximum value, and the second target strength is the magnetic field strength with the minimum value;
the offset error calculation module is used for calculating an average value of the first target intensity and the second target intensity respectively aiming at a plurality of axes to serve as a third offset error;
a variation amplitude calculation module, configured to calculate, for each of the plurality of axes, a difference between the first target intensity and the second target intensity as a variation amplitude of the magnetic field intensity;
the target amplitude selection module is used for selecting the change amplitude with the maximum value as a target amplitude;
a proportional error calculation module, configured to calculate, for the multiple axes, ratios between the variation amplitudes and the target amplitudes, respectively, as second proportional errors;
a magnetic field strength correction module for multiplying the difference between the magnetic field strength and the third offset error by the second proportional error to obtain a corrected magnetic field strength;
and the magnetic field intensity calibration module is used for performing first-order complementary filtering on the magnetic field intensity corrected at the current time and the magnetic field intensity corrected at the last time to serve as the magnetic field intensity after calibration.
In one embodiment of the present invention, the joint angle calculation module 403 includes:
the target attitude query module is used for querying the two postures corresponding to two adjacent inertial measurement units as a first target posture and a second target posture;
the rotation mode determining module is used for determining the mode in which the two adjacent inertial measurement units rotate in one or more degrees of freedom as a rotation mode;
a relative relation calculation module, configured to calculate a relation between the second target posture and the first target posture as a third target posture;
the vector rotation module is used for rotating the reference vector representing the rotation mode by the third target posture to obtain a target vector;
and the included angle calculation module is used for calculating an included angle between the target vector and the reference vector as a joint angle between two adjacent body parts.
In one embodiment of the present invention, the relative relationship calculation module includes:
a first inverse computation module for computing an inverse of the first target pose;
and the first multiplying module is used for multiplying the second target posture by the inverse of the first target posture to obtain a third target posture.
In one embodiment of the invention, the vector rotation module comprises:
a reference vector setting module for setting a vector representing the rotation mode as a reference vector;
a second inverse computation module for computing an inverse of the third target pose;
and the second multiplication module is used for multiplying the third target posture, the reference vector and the inverse of the third target posture to obtain a target vector.
In one embodiment of the present invention, the reference vector setting module includes:
the degree-of-freedom setting module is used for setting, in the reference vector, the dimensions corresponding to the degrees of freedom in which the two adjacent inertial measurement units rotate to 0;
and the non-degree-of-freedom setting module is used for setting, in the reference vector, the dimensions corresponding to the degrees of freedom in which the two adjacent inertial measurement units do not rotate to 1.
In an embodiment of the present invention, the included angle calculating module includes:
and the inverse cosine processing module is used for taking the inverse cosine of the inner product of the target vector and the reference vector to obtain a joint angle between two adjacent body parts.
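Assuming the attitudes are unit quaternions in (w, x, y, z) order — a representation the disclosure does not fix — the joint-angle computation these modules describe can be sketched as follows; for a joint rotating only about the sensor x axis, the 0/1 convention above yields the reference vector (0, 1, 1):

    import numpy as np

    def quat_mul(a, b):
        # Hamilton product of two quaternions (w, x, y, z).
        w1, x1, y1, z1 = a
        w2, x2, y2, z2 = b
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    def quat_inv(q):
        # For a unit quaternion, the inverse is the conjugate.
        return q * np.array([1.0, -1.0, -1.0, -1.0])

    def joint_angle(q_first, q_second, ref):
        # Third target pose: the second pose relative to the first.
        q_rel = quat_mul(q_second, quat_inv(q_first))
        # Rotate the reference vector by the relative pose (quaternion sandwich).
        v = np.concatenate(([0.0], ref))
        target = quat_mul(quat_mul(q_rel, v), quat_inv(q_rel))[1:]
        # Joint angle: arc cosine of the normalized inner product.
        c = np.dot(target, ref) / (np.linalg.norm(target) * np.linalg.norm(ref))
        return np.arccos(np.clip(c, -1.0, 1.0))

    # Example: for an elbow pair rotating about x, ref = np.array([0.0, 1.0, 1.0]).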
In one embodiment of the present invention, the joint angle mapping module 404 includes:
the mapping relation query module is used for querying a preset mapping relation, wherein the mapping relation records the mapping from first numbers of the inertial measurement units to second numbers of the robot joints of the robot;
the first number query module is used for querying the two first numbers of two adjacent inertial measurement units;
the second number query module is used for querying, in the mapping relation, the second number jointly mapped by the two first numbers of the two adjacent inertial measurement units;
and the number mapping module is used for mapping the joint angle of the two adjacent inertial measurement units to the robot joint corresponding to the second number.
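The mapping relation can be as simple as a lookup table keyed by the unordered pair of first numbers; every identifier below is invented purely for illustration:

    # Hypothetical table: pair of IMU first numbers -> robot joint second number.
    JOINT_MAP = {
        frozenset({1, 2}): 7,   # e.g. upper-arm and forearm IMUs -> elbow joint
        frozenset({2, 3}): 8,   # e.g. forearm and hand IMUs -> wrist joint
    }

    def robot_joint_for(first_a, first_b):
        # Query the second number jointly mapped by the two first numbers.
        return JOINT_MAP[frozenset({first_a, first_b})]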
In one embodiment of the present invention, the robot driving module 405 includes:
the trajectory planning module is used for planning a trajectory along which the robot joint reaches the joint angle;
and the trajectory rotation module is used for controlling the robot to rotate according to the trajectory until the adjacent robot joint reaches the joint angle.
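The disclosure names no particular planner; one minimal sketch interpolates linearly from the joint's current angle to the target joint angle, with robot.get_angle and robot.set_angle standing in for a robot API that is assumed rather than specified:

    import numpy as np

    def plan_trajectory(current, target, steps=50):
        # Evenly spaced waypoints from the current angle to the joint angle.
        return np.linspace(current, target, steps)

    def drive_joint(robot, joint_id, joint_angle):
        # Step the (hypothetical) robot joint along the planned trajectory.
        for waypoint in plan_trajectory(robot.get_angle(joint_id), joint_angle):
            robot.set_angle(joint_id, waypoint)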
In one embodiment of the present invention, the apparatus further includes:
the service scene loading module is used for loading service scenes;
and the virtual part generating module is used for generating a virtual part matched with the body part according to the posture in the service scene.
The human body action mapping device provided by the embodiment of the invention can execute the human body action mapping method provided by any embodiment of the invention, and has the functional modules corresponding to the executed method, together with its beneficial effects.
Example four
Fig. 5 is a schematic structural diagram of a computer device according to a fourth embodiment of the present invention. FIG. 5 illustrates a block diagram of an exemplary computer device 12 suitable for use in implementing embodiments of the present invention. The computer device 12 shown in FIG. 5 is only an example and should not impose any limitation on the functionality or scope of use of embodiments of the present invention.
As shown in FIG. 5, computer device 12 is in the form of a general purpose computing device. The components of computer device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, and commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with computer device 12, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, computer device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via network adapter 20. As shown, network adapter 20 communicates with the other modules of computer device 12 via bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, for example, implementing the human body action mapping method provided by the embodiments of the present invention.
Example five
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements each process of the human body action mapping method described above and achieves the same technical effects, which are not repeated here to avoid redundancy.
A computer readable storage medium may include, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (14)

1. A human body action mapping method is characterized by comprising the following steps:
reading inertial data from a plurality of inertial measurement units, wherein the plurality of inertial measurement units are attached to body parts of a user, and some adjacent inertial measurement units are separated from each other by body joints of the user;
fusing the inertial data into a pose of the inertial measurement unit in space;
calculating joint angles of the body joints according to the postures corresponding to the two adjacent inertial measurement units;
mapping the joint angle to a robot joint of a robot;
and driving the robot to move until the adjacent robot joint reaches the joint angle.
2. The method of claim 1, wherein said fusing the inertial data into a pose of the inertial measurement unit in space comprises:
determining a type of the inertial data;
calibrating the inertial data according to the type;
and if the calibration is finished, fusing the inertial data into the attitude of the inertial measurement unit in space in a filtering-based manner.
3. The method of claim 2, wherein said calibrating said inertial data by said type comprises:
if the type is angular velocity, inquiring a first offset error;
subtracting the first offset error from the angular velocity to obtain a calibrated angular velocity;
and/or,
if the type is acceleration, inquiring a second offset error and a first proportional error;
multiplying the difference between the acceleration and the second bias error by the first proportional error to obtain a calibrated acceleration;
and/or,
if the type is the magnetic field intensity, acquiring the magnetic field intensity recorded when the inertial measurement unit rotates in space;
calculating a rate of change of the magnetic field intensity;
calculating the mean and standard deviation of the change rate;
extending a plurality of standard deviations above and below the average value to obtain a variation range;
if the change rate exceeds the change range, recording the time corresponding to the change rate;
deleting the magnetic field intensity at the time;
for each of the plurality of axes, inquiring a first target intensity and a second target intensity, wherein the first target intensity is the magnetic field intensity with the maximum value, and the second target intensity is the magnetic field intensity with the minimum value;
for each of the plurality of axes, calculating the average value of the first target intensity and the second target intensity as a third offset error;
for each of the plurality of axes, taking the difference between the first target intensity and the second target intensity as the variation amplitude of the magnetic field intensity;
selecting the variation amplitude with the largest value as a target amplitude;
for each of the plurality of axes, calculating the ratio between the variation amplitude and the target amplitude as a second proportional error;
multiplying the difference between the magnetic field intensity and the third offset error by the second proportional error to obtain a corrected magnetic field intensity;
and performing first-order complementary filtering on the magnetic field intensity corrected at the current time and the magnetic field intensity corrected at the previous time to obtain the calibrated magnetic field intensity.
4. The method according to any one of claims 1-3, wherein said calculating joint angles of said body joints from said poses corresponding to two adjacent inertial measurement units comprises:
inquiring the two postures corresponding to two adjacent inertial measurement units as a first target posture and a second target posture;
determining a mode of rotation of two adjacent inertial measurement units in one or more degrees of freedom as a rotation mode;
calculating a relationship of the second target pose with respect to the first target pose as a third target pose;
rotating the reference vector representing the rotation mode by the third target posture to obtain a target vector;
and calculating an included angle between the target vector and the reference vector as a joint angle between two adjacent body parts.
5. The method of claim 4, wherein the calculating a relationship of the second target pose relative to the first target pose as a third target pose comprises:
calculating an inverse of the first target pose;
and multiplying the second target posture by the inverse of the first target posture to obtain a third target posture.
6. The method of claim 4, wherein rotating the reference vector representing the turning pattern about the third target pose to obtain a target vector comprises:
setting a vector representing the rotation pattern as a reference vector;
calculating an inverse of the third target pose;
and multiplying the third target posture, the reference vector and the inverse of the third target posture to obtain a target vector.
7. The method according to claim 6, wherein the setting a vector representing the rotation pattern as a reference vector comprises:
in the reference vector, setting the dimensions corresponding to the degrees of freedom in which the two adjacent inertial measurement units rotate to 0;
and in the reference vector, setting the dimensions corresponding to the degrees of freedom in which the two adjacent inertial measurement units do not rotate to 1.
8. The method of claim 4, wherein said calculating an angle between said target vector and said reference vector as a joint angle between two adjacent of said body parts comprises:
and taking the inverse cosine of the inner product of the target vector and the reference vector to obtain a joint angle between two adjacent body parts.
9. The method of any of claims 1-3, 5-8, wherein mapping the joint angle to a robot joint of a robot comprises:
inquiring a preset mapping relation, wherein the mapping relation records the mapping from first numbers of the inertial measurement units to second numbers of the robot joints of the robot;
inquiring the two first numbers of two adjacent inertial measurement units;
in the mapping relation, inquiring the second number jointly mapped by the two first numbers of the two adjacent inertial measurement units;
and mapping the joint angle of the two adjacent inertial measurement units to the robot joint corresponding to the second number.
10. The method of any of claims 1-3, 5-8, wherein the driving the robot to move until the adjacent robot joint reaches the joint angle comprises:
planning a trajectory along which the robot joint reaches the joint angle;
and controlling the robot to rotate according to the trajectory until the adjacent robot joint reaches the joint angle.
11. The method of any one of claims 1-3, 5-8, further comprising:
loading a service scene;
and generating a virtual part matched with the body part according to the posture in the service scene.
12. A human motion mapping device, comprising:
the inertial data reading module is used for reading inertial data from a plurality of inertial measurement units, wherein the plurality of inertial measurement units are attached to body parts of a user, and some adjacent inertial measurement units are separated from each other by body joints of the user;
the attitude fusion module is used for fusing the inertial data into the attitude of the inertial measurement unit in the space;
the joint angle calculation module is used for calculating joint angles of the body joints according to the postures corresponding to the two adjacent inertia measurement units;
a joint angle mapping module for mapping the joint angle to a robot joint of a robot;
and the robot driving module is used for driving the robot to move until the adjacent robot joint reaches the joint angle.
13. A computer device, characterized in that the computer device comprises:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the human body action mapping method as claimed in any one of claims 1-11.
14. A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, implements the human body action mapping method as claimed in any one of claims 1-11.
CN202110724802.7A 2021-06-29 2021-06-29 Human body action mapping method and device, computer equipment and storage medium Active CN113450903B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110724802.7A CN113450903B (en) 2021-06-29 2021-06-29 Human body action mapping method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110724802.7A CN113450903B (en) 2021-06-29 2021-06-29 Human body action mapping method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113450903A true CN113450903A (en) 2021-09-28
CN113450903B CN113450903B (en) 2022-10-04

Family

ID=77813832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110724802.7A Active CN113450903B (en) 2021-06-29 2021-06-29 Human body action mapping method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113450903B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120316683A1 (en) * 2011-06-10 2012-12-13 Samsung Electronics Co., Ltd. Balance control apparatus of robot and control method thereof
EP3173191A2 (en) * 2015-11-27 2017-05-31 Industrial Technology Research Institute Method for estimating posture of robotic walking aid
CN107115114A (en) * 2017-04-28 2017-09-01 王春宝 Human Stamina evaluation method, apparatus and system
CN107943283A (en) * 2017-11-08 2018-04-20 浙江工业大学 Mechanical arm Pose Control system based on gesture identification
CN108127673A (en) * 2017-12-18 2018-06-08 东南大学 A kind of contactless robot man-machine interactive system based on Multi-sensor Fusion
CN109079794A (en) * 2018-09-18 2018-12-25 广东省智能制造研究所 It is a kind of followed based on human body attitude robot control and teaching method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhou Ruiwen: "Research on Key Technologies of a Three-Dimensional Motion Capture System Based on Inertial Measurement Units", China Excellent Doctoral and Master's Dissertations Full-text Database (Master), Medicine and Health Sciences *

Also Published As

Publication number Publication date
CN113450903B (en) 2022-10-04

Similar Documents

Publication Publication Date Title
US11093036B2 (en) Tracking arm movements to generate inputs for computer systems
Du et al. Markerless human–manipulator interface using leap motion with interval Kalman filter and improved particle filter
CN110394780B (en) Simulation device of robot
Borst et al. Realistic virtual grasping
CN108161882B (en) Robot teaching reproduction method and device based on augmented reality
CN110815258B (en) Robot teleoperation system and method based on electromagnetic force feedback and augmented reality
JP5987247B2 (en) Motion capture pointer by data fusion
KR102001214B1 (en) Apparatus and method for dual-arm robot teaching based on virtual reality
Neto et al. High‐level robot programming based on CAD: dealing with unpredictable environments
US11762369B2 (en) Robotic control via a virtual world simulation
JP6826069B2 (en) Robot motion teaching device, robot system and robot control device
Xu et al. Visual-haptic aid teleoperation based on 3-D environment modeling and updating
JP2021000678A (en) Control system and control method
CN112276914B (en) Industrial robot based on AR technology and man-machine interaction method thereof
JP2014054483A (en) Hand motion measuring apparatus
CN113103230A (en) Human-computer interaction system and method based on remote operation of treatment robot
Manou et al. Off-line programming of an industrial robot in a virtual reality environment
CN112847336A (en) Action learning method, action learning device, storage medium and electronic equipment
US11763464B2 (en) Estimation apparatus, learning apparatus, estimation method, learning method, and program
CN107145706B (en) Evaluation method and device for performance parameters of virtual reality VR equipment fusion algorithm
CN113450903B (en) Human body action mapping method and device, computer equipment and storage medium
CN117047771A (en) Flexible control method and device for robot and electronic equipment
CN115946120B (en) Mechanical arm control method, device, equipment and medium
Sousa et al. Computed torque posture control for robotic-assisted tele-echography
Al-Junaid ANN based robotic arm visual servoing nonlinear system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant