CN113492404B - Humanoid robot action mapping control method based on machine vision - Google Patents

Humanoid robot action mapping control method based on machine vision

Info

Publication number
CN113492404B
CN113492404B (application CN202110427319.2A, published as CN202110427319A / CN113492404B)
Authority
CN
China
Prior art keywords
coordinate system
camera
robot
angle
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110427319.2A
Other languages
Chinese (zh)
Other versions
CN113492404A (en)
Inventor
黎学臻
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology Beijing USTB
Original Assignee
University of Science and Technology Beijing USTB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology Beijing USTB filed Critical University of Science and Technology Beijing USTB
Priority to CN202110427319.2A priority Critical patent/CN113492404B/en
Publication of CN113492404A publication Critical patent/CN113492404A/en
Application granted granted Critical
Publication of CN113492404B publication Critical patent/CN113492404B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning

Abstract

The invention provides a humanoid robot action mapping control method based on machine vision, which comprises three parts: human body action data acquisition, conversion between visual data and action data, and robot control. The adopted action mapping control method has good control flexibility and gives the humanoid robot the ability to imitate human behaviour in real time: the robot can imitate human actions in real time without pre-programming, and can even be trained by imitating actions, which improves the autonomy and adaptability of the robot and enables the humanoid robot to complete complex tasks quickly and accurately.

Description

Humanoid robot action mapping control method based on machine vision
Technical Field
The invention relates to the technical field of humanoid robot action mapping, in particular to a humanoid robot action mapping control method based on machine vision.
Background
In recent years, with the continuous development of artificial intelligence and robot technology, the humanoid robot, by virtue of its human-like appearance, has unmatched advantages in human-computer interaction and plays an increasingly important role in teaching, scientific research, industry and other fields. However, a typical humanoid robot has a large number of steering engines, and the traditional pre-programming-based humanoid robot is difficult to control automatically, costly and of limited intelligence, which greatly hinders the application and popularization of humanoid robots. Therefore, it is imperative to explore more intelligent and convenient control modes for the humanoid robot.
Disclosure of Invention
In view of the above, the present invention provides a machine-vision-based action mapping control method for a humanoid robot, in which the robot follows the motion of the human body, and the human, the robot and the action target form a closed loop to realize human-robot interaction. The specific technical scheme is as follows:
a humanoid robot action mapping control method based on machine vision is characterized by comprising three parts, namely human body action data acquisition, visual data and action data conversion and robot control;
acquiring a depth map and an RGB map through an RGB-D camera, obtaining the pixel positions (u, v) of all joint points of the human body in the RGB map, and extracting the pixel value at the same pixel position in the depth map, which is the depth value d of that point;
calculating coordinates of the key points in a pixel coordinate system, an image coordinate system, a camera coordinate system and a world coordinate system by adopting a pinhole model;
the conversion between visual data and motion data is performed in a human-machine motion mapping model: in order to realize the mapping from human body motion to robot motion, a space vector method is adopted, the joint angle of each joint is calculated from the three-dimensional coordinates of the human joints, and the joint angles are mapped onto the humanoid robot, realizing human-machine matching and transmitting the data for control of the robot;
matching the three-dimensional coordinates acquired by the depth camera with the human skeleton model by using a space vector method, performing operation between the coordinates according to a joint connection structure of the skeleton model so as to convert the three-dimensional coordinates into space vectors, and obtaining joint angles of all joints through the operation of outer products of the space vectors;
for the robot control, relevant data of the human arm are obtained through the depth camera and an upper computer analyzes and processes the data to obtain the joint values; the human arm is mapped to the robot arm and the steering engine value data are transmitted to the control unit, which generates the control signal; when a steering engine receives a PWM signal from the controller, the pulse width is compared with the pulse width generated by its internal circuit to obtain two pulses, one of which is output to the driving circuit while the other controls the driving direction; the rotation of the motor drives a potentiometer whose position changes the internally generated pulse width until it equals the pulse width of the external signal; when the motor stops rotating, the steering angle is fixed, the steering engine is held at the corresponding angle, and control of the robot is realized.
Further, the symbols used in the pinhole model are as follows: (u, v) are the coordinates in the pixel coordinate system, (x, y) are the coordinates in the image coordinate system, (x_c, y_c, z_c) are the coordinates in the camera coordinate system, and (x_0, y_0, z_0) are the coordinates in the world coordinate system. The coordinate systems in the pinhole model are described as follows: the pixel coordinate system gives the coordinates of a point in the image in units of pixels; if the pixel values are taken as elements, the picture is stored in the computer in the form of a matrix, and the pixel coordinates of a point are the row index and the column index of the corresponding element in the matrix;
the image coordinate system gives the coordinates of a point in units of physical length (m), and the correspondence between the pixel coordinate system and the image coordinate system is as follows:
O1 is the intersection of the camera optical axis and the image plane, called the image principal point. Defining the image principal point as the origin of the image coordinate system, its coordinates in the pixel coordinate system are (u_0, v_0). If the physical size of each pixel along the x and y axes is dx and dy, the relationship between the two coordinate systems is:
u = x/dx + u_0
v = y/dy + v_0
in matrix form, the following is true:
$$\begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$
the camera coordinate system takes the camera optical center O as its origin; the X_c and Y_c axes are respectively parallel to the x and y axes of the image coordinate system, and the Z_c axis is the optical axis of the camera, perpendicular to the image plane. The intersection of the Z_c axis with the image plane is O1, the image principal point, and OO1 is the camera focal length f;
the correspondence between the image coordinate system and the camera coordinate system is as follows:
x = f * x_c / z_c,  y = f * y_c / z_c
further, the correspondence between the camera coordinate system and the world coordinate system is as follows:
$$\begin{pmatrix} x_c \\ y_c \\ z_c \\ 1 \end{pmatrix} = \begin{pmatrix} R & t \\ 0^T & 1 \end{pmatrix} \begin{pmatrix} x_0 \\ y_0 \\ z_0 \\ 1 \end{pmatrix}$$

where R is the rotation matrix and t is the translation vector between the camera coordinate system and the world coordinate system;
according to the conversion relation between the coordinate systems in the pinhole model, the mapping relation from one point on the image to one point in the space can be established by combining the key point pixel position and the depth value obtained by the depth camera;
according to the kinematics principle, the coordinate transformation matrix expression is as follows:
$$z_c \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} R & t \end{pmatrix} \begin{pmatrix} x_0 \\ y_0 \\ z_0 \\ 1 \end{pmatrix}$$
the mapping relation from one point on the image to one point in the space can be established through the formula.
To solve the camera intrinsic parameters f_x, f_y, c_x and c_y, camera calibration needs to be performed; c_x and c_y are the coordinates of the camera aperture center.
$$M = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}$$
M is called the internal reference matrix; each value in the matrix is related only to the internal parameters of the camera and does not change with the position of the object, and s is the scaling factor of the depth map. A point (u, v, d) then has the corresponding spatial coordinates (x, y, z):
x = (u - c_x) * z / f_x
y = (v - c_y) * z / f_y
d=z*s
further, the yaw angle is calculated as follows:
the yaw degree of freedom corresponds to the motion of the elbow joint of the robot; in the plane formed by the elbow, the wrist and the shoulder, two vectors ES and EW are formed, and the angle they form in this plane is the yaw angle.
Using the idea of the space vector method, the component of the projection of ES in the XOY plane is taken and the included angle between this component and the positive direction of the Y coordinate axis is solved, namely the yaw joint angle; writing ES = S - E and its projection onto the XOY plane as ES' = (ES_x, ES_y, 0), the yaw angle of the left arm is arccos(ES_y / |ES'|).
The yaw angle of the right arm is formed and calculated according to exactly the same principle as that of the left arm, so it can also be calculated with the above formula.
Similarly, using the idea of the space vector method, the component of the projection of ES in the XOY plane is taken and the included angle between this component and the positive direction of the Y coordinate axis is solved, namely the pitch joint angle; the pitch angle of the left arm is calculated in this way, and for the pitch angle of the right arm the process of taking the component in the XOY plane and solving the included angle is completely consistent with the calculation for the left arm, so it can also be obtained with the same algorithm;
the roll degree of freedom determines the state of the whole arm rotating about the straight line connecting the two shoulders; it corresponds to the degree of freedom with which the human shoulder controls the arm to swing back and forth, and also corresponds to the steering engine that rotates the shoulder of the humanoid robot;
since the roll degree of freedom does not correspond to the shoulder degree of freedom, the calculation of the roll degree of freedom cannot be obtained directly using the vector connecting the shoulders;
using the space vector method, the roll degree of freedom is obtained from the outer product of the vector ES and the vector EW and the angle between this outer product and the positive direction of the Y coordinate axis; with EW = W - E and n = ES × EW, the roll angle is calculated as θ3 = arccos(n · j / |n|), where j is the unit vector of the positive Y axis;
when the arm is located on the front side or on the rear side of the body, the calculated θ3 is the same angle, so the two actions cannot be distinguished; at this time it is necessary to judge the front-back relationship along the z axis (depth direction) between the coordinates of the left shoulder and the left elbow and to distinguish the two cases separately;
the calculation method for the right arm is similar, but the direction obtained by performing the outer product operation on the two vectors is exactly opposite to that of the left arm; in order to ensure consistency of the operation, the calculation of the vectors ES and EW is kept unchanged and the vector outer product of the right arm is modified by reversing the order of the two vectors, i.e. taking EW × ES; keeping the formula for θ3 unchanged, the roll degree of freedom of the right arm is then calculated in the same way;
In the actual operation process, the data of the steering engine value always fluctuates, so that the data needs to be filtered by adopting an amplitude limiting filtering method;
the main idea of the amplitude limiting filtering method is as follows: based on empirical judgment, the maximum deviation allowed between two successive samples (denoted A) is determined, and each time a new value is detected a judgment is made: if the difference between the current value and the previous value is less than or equal to A, the current value is valid; if the difference is greater than A, the current value is invalid, it is discarded, and the previous value is used in its place; this method can effectively overcome pulse interference caused by accidental factors while ensuring the existing precision; the steering engine values and joint angles of different steering engines have different mapping relations, and after filtering, the joint angle is converted into a steering engine value according to the characteristics of the humanoid robot's steering engines.
By adopting the technical scheme, the method has the following beneficial effects:
The action mapping control method adopted by the invention has good control flexibility. It gives the humanoid robot the ability to imitate human behaviour in real time: the robot can imitate human actions in real time without pre-programming, and can even be trained by imitating actions, which improves the autonomy and adaptability of the robot and enables the humanoid robot to complete complex tasks quickly and accurately.
Drawings
FIG. 1 is a schematic diagram of the relationship between four coordinate systems according to the present invention;
FIG. 2 is a diagram illustrating a correspondence between a pixel coordinate system and an image coordinate system;
FIG. 3 is a schematic diagram of the correspondence between the image coordinate system and the camera coordinate system according to the present invention;
FIG. 4 is a schematic diagram of a four coordinate system transformation sequence according to the present invention;
FIG. 5 is a schematic view of the calculation of the yaw degree of freedom of the present invention;
FIG. 6 is a flow chart of the robot motion control of the present invention;
fig. 7 is a schematic view of states of a yaw degree of freedom, a roll degree of freedom, and a pitch degree of freedom of the robot according to the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
Example 1: as shown in fig. 1-7: a humanoid robot action mapping control method based on machine vision is disclosed, wherein the scheme is mainly completed by three parts, namely human body action data acquisition, conversion of visual data and action data and robot control.
Collecting human body action data:
firstly, a depth map and an RGB map are acquired through an RGB-D camera, and the pixel positions (u, v) of all joint points of the human body in the RGB map are acquired. Then, the pixel value of the same pixel position is extracted from the depth map, which is the depth value D of the point.
The pinhole model is the most widely used model in camera imaging studies. Under this model, the spatial coordinates and the image coordinates of an object have a linear relationship, so solving for the camera parameters reduces to solving a system of linear equations. The model involves four coordinate systems: the world coordinate system, the camera coordinate system, the image coordinate system and the pixel coordinate system. The coordinates of a key point in any of these coordinate systems can be calculated through the relations among the four coordinate systems in the pinhole model.
Notation used in the pinhole model: (u, v) are coordinates in the pixel coordinate system; (x, y) are coordinates in the image coordinate system; (x_c, y_c, z_c) are coordinates in the camera coordinate system; (x_0, y_0, z_0) are coordinates in the world coordinate system. Coordinate system descriptions in the pinhole model: pixel coordinate system: the coordinates of a point in the image in units of pixels; if the pixel values are taken as elements, the image is stored in the computer as a matrix, and the pixel coordinates of a point are the row index and the column index of the corresponding element in the matrix.
Image coordinate system: the coordinates of a point in the image in units of physical length (such as cm); its positional relation to the pixel coordinate system, and the correspondence between the two systems, are shown in Fig. 2.
In Fig. 2, O1 is the intersection of the camera optical axis and the image plane, called the image principal point. Defining the image principal point as the origin of the image coordinate system, its coordinates in the pixel coordinate system are (u_0, v_0). If the physical size of each pixel along the x and y axes is dx and dy, the relationship between the two coordinate systems is:
u = x/dx + u_0
v = y/dy + v_0
Expressed in matrix form:

$$\begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$
Camera coordinate system: the camera coordinate system takes the camera optical center O as its origin; the X_c and Y_c axes are parallel to the x and y axes of the image coordinate system, and the Z_c axis is the optical axis of the camera, perpendicular to the image plane. The intersection of the Z_c axis with the image plane is O1, the image principal point. OO1 is the camera focal length f.
The correspondence between the image coordinate system and the camera coordinate system is:

x = f * x_c / z_c,  y = f * y_c / z_c
The correspondence between the camera coordinate system and the world coordinate system is:

$$\begin{pmatrix} x_c \\ y_c \\ z_c \\ 1 \end{pmatrix} = \begin{pmatrix} R & t \\ 0^T & 1 \end{pmatrix} \begin{pmatrix} x_0 \\ y_0 \\ z_0 \\ 1 \end{pmatrix}$$

where R is the rotation matrix and t is the translation vector between the camera coordinate system and the world coordinate system.
According to the conversion relation between the coordinate systems in the pinhole model, the mapping relation from one point on the image to one point in the space can be established through the formula by combining the key point pixel position and the depth value obtained by the depth camera. The coordinates of the keypoints in any coordinate system can be found, and the coordinate system conversion order is shown in fig. 4.
According to the kinematics principle, the coordinate transformation matrix expression is:

$$z_c \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} R & t \end{pmatrix} \begin{pmatrix} x_0 \\ y_0 \\ z_0 \\ 1 \end{pmatrix}$$
The mapping relation from one point on the image to one point in the space can be established through the formula.
To solve the camera intrinsic parameters f_x, f_y, c_x and c_y, camera calibration is required; c_x and c_y are the coordinates of the camera aperture center.
$$M = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}$$
M is called the internal reference matrix; each value in the matrix is related only to the internal parameters of the camera and does not change with the position of the object, and s is the scaling factor of the depth map. A point (u, v, d) then has the corresponding spatial coordinates (x, y, z):
x = (u - c_x) * z / f_x
y = (v - c_y) * z / f_y
d=z*s
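As a minimal illustration of this back-projection step, the following Python sketch converts a pixel position and raw depth value into camera-frame coordinates; the intrinsic values in the example call are placeholders standing in for the calibrated parameters, not values taken from the specification.

```python
import numpy as np

def pixel_to_camera(u, v, d, fx, fy, cx, cy, s=1000.0):
    """Back-project a pixel (u, v) with raw depth d into camera coordinates.

    fx, fy, cx, cy are the calibrated camera intrinsics; s is the depth-map
    scaling factor (d = z * s), e.g. 1000 when depth is stored in millimetres.
    """
    z = d / s                      # metric depth
    x = (u - cx) * z / fx          # x = (u - c_x) * z / f_x
    y = (v - cy) * z / fy          # y = (v - c_y) * z / f_y
    return np.array([x, y, z])

# Example with illustrative (placeholder) intrinsics
point = pixel_to_camera(u=320, v=240, d=1500, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```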
Transformation of visual data and motion data
In the human-machine action mapping model, in order to realize the mapping from human body motion to robot motion, the joint angles are calculated by a space vector method: the angle of each joint is obtained from the three-dimensional coordinates of the human joints, the angles are mapped onto the humanoid robot to realize human-machine matching, and the data are transmitted for control of the robot.
By using a space vector method, firstly, three-dimensional coordinates acquired by a depth camera are matched with a human skeleton model, and the coordinates are calculated according to a joint connection structure of the skeleton model, so that the coordinates are converted into space vectors.
The yaw degree of freedom corresponds to the motion of the robot elbow joint. In the plane formed by the elbow, the wrist and the shoulder, two vectors ES and EW are formed, and the angle they form in this plane is the yaw angle.
Using the idea of the space vector method, the component of the projection of ES in the XOY plane is taken and the included angle between this component and the positive direction of the Y coordinate axis is solved, namely the yaw joint angle. Writing ES = S - E and its projection onto the XOY plane as ES' = (ES_x, ES_y, 0), the yaw angle of the left arm is arccos(ES_y / |ES'|).
The yaw angle of the right arm is formed and calculated according to exactly the same principle as the left arm, so it can also be computed with the above formula.
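A minimal Python sketch of this projection-based angle calculation is given below; it assumes that ES points from the elbow to the shoulder and that the angle is measured against the positive Y axis, as described above, and is illustrative rather than a verbatim implementation of the method.

```python
import numpy as np

def yaw_angle(shoulder, elbow):
    """Angle between the XOY-plane projection of ES and the positive Y axis.

    shoulder, elbow: 3D joint positions (x, y, z) obtained from the depth camera.
    Assumes ES = shoulder - elbow, following the description above.
    """
    es = np.asarray(shoulder, dtype=float) - np.asarray(elbow, dtype=float)
    es_xy = np.array([es[0], es[1], 0.0])          # projection onto the XOY plane
    y_axis = np.array([0.0, 1.0, 0.0])
    cos_theta = np.dot(es_xy, y_axis) / np.linalg.norm(es_xy)
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
```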
Similarly, using the idea of the space vector method, the component of the projection of ES in the XOY plane is taken and the included angle between this component and the positive direction of the Y coordinate axis is solved, namely the pitch joint angle; the pitch angle of the left arm is calculated in this way. For the pitch angle of the right arm, the process of taking the component in the XOY plane and solving the included angle is completely consistent with the calculation for the left arm, so it can also be obtained with the same algorithm.
The roll degree of freedom determines the state of the whole arm rotating about the straight line connecting the two shoulders; it corresponds to the degree of freedom with which the human shoulder controls the arm to swing back and forth, and also corresponds to the steering engine that rotates the shoulder of the humanoid robot.
Since the roll degree of freedom does not correspond to the shoulder degree of freedom, the roll degree of freedom calculation cannot be directly obtained using the shoulder-connecting vector.
Using the space vector method, the roll degree of freedom is obtained from the outer product of the vector ES and the vector EW and the angle between this outer product and the positive direction of the Y coordinate axis. With EW = W - E and n = ES × EW, the roll angle is calculated as θ3 = arccos(n · j / |n|), where j is the unit vector of the positive Y axis.
When the arm is located on the front side or on the rear side of the body, the calculated θ3 is the same angle, so these two actions cannot be distinguished; in this case it is necessary to judge the front-back relationship along the z axis (depth direction) between the coordinates of the left shoulder and the left elbow and to distinguish the two cases separately.
The right arm is calculated similarly, but the direction of the outer product of the two vectors is exactly opposite to that of the left arm. To ensure consistency of the operation, the calculation of the vectors ES and EW is kept unchanged and the vector outer product of the right arm is modified by reversing the order of the two vectors, i.e. taking EW × ES; keeping the formula for θ3 unchanged, the roll degree of freedom of the right arm is then calculated in the same way.
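A corresponding sketch of the roll calculation, including the reversed outer-product order for the right arm and the front/back disambiguation mentioned above, is shown below; the sign convention used for the disambiguation is an assumption made for illustration.

```python
import numpy as np

def roll_angle(shoulder, elbow, wrist, right_arm=False):
    """Roll angle from the outer product of ES and EW and the positive Y axis.

    The order of the vectors in the cross product is reversed for the right arm
    so that the result points in a consistent direction for both arms.
    """
    shoulder, elbow, wrist = (np.asarray(p, dtype=float) for p in (shoulder, elbow, wrist))
    es = shoulder - elbow
    ew = wrist - elbow
    n = np.cross(ew, es) if right_arm else np.cross(es, ew)
    y_axis = np.array([0.0, 1.0, 0.0])
    cos_theta = np.dot(n, y_axis) / np.linalg.norm(n)
    theta3 = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))
    # Front/back disambiguation via the depth (z) relation between shoulder and elbow;
    # flipping the sign is one possible convention, assumed here for illustration.
    if elbow[2] > shoulder[2]:
        theta3 = -theta3
    return theta3
```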
In the actual operation process, the calculated steering engine values always fluctuate, so a filtering operation is needed. The scheme adopts an amplitude limiting filtering method.
The main idea of the amplitude limiting filtering method is as follows: based on empirical judgment, the maximum deviation allowed between two successive samples (denoted A) is determined, and each time a new value is detected a judgment is made: if the difference between the current value and the previous value is less than or equal to A, the current value is valid; if the difference is greater than A, the current value is invalid, it is discarded, and the previous value is used in its place. This method can effectively overcome pulse interference caused by accidental factors while ensuring the existing precision.
The steering engine values and joint angles of different steering engines have different mapping relations. After filtering, the joint angle is converted into a steering engine value according to the characteristics of the steering engine of the humanoid robot.
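The following sketch combines the amplitude limiting filter with a simple linear joint-angle-to-steering-engine-value mapping; the threshold A and the mapping range are illustrative placeholders, since the actual values depend on the characteristics of the particular steering engines.

```python
class LimitFilter:
    """Amplitude limiting filter: reject a sample that deviates from the last one by more than A."""

    def __init__(self, max_step_a):
        self.a = max_step_a
        self.last = None

    def update(self, value):
        if self.last is None or abs(value - self.last) <= self.a:
            self.last = value          # accept the new value
        return self.last               # otherwise keep the previous value

def angle_to_servo(angle_deg, angle_min=-90.0, angle_max=90.0, servo_min=500, servo_max=2500):
    """Linearly map a joint angle to a steering engine value (placeholder range)."""
    angle_deg = max(angle_min, min(angle_max, angle_deg))
    ratio = (angle_deg - angle_min) / (angle_max - angle_min)
    return int(servo_min + ratio * (servo_max - servo_min))

# Usage: filter the stream of joint angles, then convert each to a steering engine value
filt = LimitFilter(max_step_a=5.0)       # A = 5 degrees, an illustrative threshold
servo_value = angle_to_servo(filt.update(42.0))
```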
Robot control
After the human arm is mapped to the robot arm, the relevant data of the arm are obtained through the sensor, and the upper computer analyzes and processes the data to obtain the joint values. The joint value data are transmitted to the control unit, and the control unit generates a control signal. When the steering engine receives a PWM signal from the controller, the pulse width is compared with the pulse width generated by its internal circuit to obtain two pulses: one pulse is output to the driving circuit, and the other is used to control the driving direction. The rotation of the motor drives a potentiometer whose position changes the internally generated pulse width until it equals the pulse width of the external signal. When the motor stops rotating, the steering angle is fixed, and the steering engine is held at the corresponding angle, realizing control of the robot.
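Putting the pieces together, the control flow of Fig. 6 can be sketched as a single cycle as below; detect_joints and send_servo_values are hypothetical callables standing in for the pose-estimation module and the link to the robot control unit, and pixel_to_camera, yaw_angle, angle_to_servo and LimitFilter are the helpers sketched earlier.

```python
def mapping_control_step(rgb, depth, intrinsics, detect_joints, send_servo_values, filt):
    """One illustrative control cycle: keypoints -> 3D joints -> joint angle -> filter -> servo value."""
    fx, fy, cx, cy = intrinsics
    joints_px = detect_joints(rgb)                          # e.g. {"shoulder": (u, v), "elbow": ..., "wrist": ...}
    joints_3d = {name: pixel_to_camera(u, v, depth[v, u], fx, fy, cx, cy)
                 for name, (u, v) in joints_px.items()}
    angle = yaw_angle(joints_3d["shoulder"], joints_3d["elbow"])   # one joint shown; the others are analogous
    send_servo_values({"elbow_yaw": angle_to_servo(filt.update(angle))})
```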
Having thus described the basic principles and principal features of the invention, it will be appreciated by those skilled in the art that the invention is not limited by the embodiments described above, which are merely illustrative of the principles of the invention, but is susceptible to various changes and modifications without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.

Claims (3)

1. A humanoid robot action mapping control method based on machine vision is characterized by comprising three parts, namely human body action data acquisition, visual data and action data conversion and robot control;
acquiring a depth map and an RGB map through an RGB-D camera to obtain pixel positions (u, v) of all joint points of a human body in the RGB map, and extracting pixel values of the same pixel position in the depth map, wherein the pixel value is the depth value D of the point;
calculating coordinates of the key points in a pixel coordinate system, an image coordinate system, a camera coordinate system and a world coordinate system by adopting a pinhole model;
the conversion of the visual data and the motion data is performed in a human-machine motion mapping model, in order to realize the mapping from human body motion to robot motion, a space vector method is adopted, the three-dimensional coordinates of joints of a human body are calculated to obtain joint angles of each joint, the joint angles are mapped onto a humanoid robot, the human-machine matching is realized, and data are transmitted for the control of the robot;
matching the three-dimensional coordinates acquired by the depth camera with the human skeleton model by using a space vector method, performing operation between the coordinates according to a joint connection structure of the skeleton model so as to convert the three-dimensional coordinates into space vectors, and obtaining joint angles of all joints through the operation of outer products of the space vectors;
the robot control obtains relevant data of a human body arm through a depth camera, an upper computer analyzes and processes the data to obtain a joint value, the human body arm is mapped to the robot arm, steering engine value data are transmitted to a control unit, the control unit generates a control signal, when a steering engine receives a PWM signal from a controller, the pulse width is compared with the pulse width generated by an internal circuit to obtain two pulses, one pulse is output to a driving circuit, the other pulse is used for controlling the driving direction, the position of a rotary driving potentiometer of a motor changes the future pulse width until the future pulse width is equal to the pulse width output by an external signal, when the motor stops rotating, a rudder angle is fixed, the steering engine is controlled at a corresponding angle, and the control of the robot is realized;
the symbols used in the pinhole model are as follows: (u, v) are the coordinates in the pixel coordinate system, (x, y) are the coordinates in the image coordinate system, (x_c, y_c, z_c) are the coordinates in the camera coordinate system, and (x_0, y_0, z_0) are the coordinates in the world coordinate system; the coordinate systems in the pinhole model are described as follows: a point in the pixel coordinate system has coordinates in the image in units of pixels; if the pixel value is taken as an element, the picture is stored in the computer in the form of a matrix, and the pixel coordinates of a point are the row index and the column index of the corresponding element in the matrix;
the image coordinate system gives the coordinates of a point in units of physical length (m), and the correspondence between the pixel coordinate system and the image coordinate system is as follows:
O1 is the intersection of the camera optical axis and the image plane, called the image principal point; defining the image principal point as the origin of the image coordinate system, its coordinates in the pixel coordinate system are (u_0, v_0); if the physical size of each pixel along the x and y axes is dx and dy, the relationship between the two coordinate systems is:
u = x/dx + u_0
v = y/dy + v_0
in matrix form, the following is true:
$$\begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} 1/dx & 0 & u_0 \\ 0 & 1/dy & v_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ 1 \end{pmatrix}$$
the camera coordinate system takes the camera optical center O as its origin; the X_c and Y_c axes are parallel to the x and y axes of the image coordinate system, the Z_c axis is the optical axis of the camera, perpendicular to the image plane, the intersection point of the Z_c axis and the image plane is O1, i.e. the image principal point, and OO1 is the focal length f of the camera;
the correspondence between the image coordinate system and the camera coordinate system is as follows:
x = f * x_c / z_c,  y = f * y_c / z_c
2. the method for controlling mapping of actions of a humanoid robot based on machine vision as claimed in claim 1, wherein the correspondence between the camera coordinate system and the world coordinate system is as follows:
$$\begin{pmatrix} x_c \\ y_c \\ z_c \\ 1 \end{pmatrix} = \begin{pmatrix} R & t \\ 0^T & 1 \end{pmatrix} \begin{pmatrix} x_0 \\ y_0 \\ z_0 \\ 1 \end{pmatrix}$$

where R is the rotation matrix and t is the translation vector between the camera coordinate system and the world coordinate system;
according to the conversion relation between the coordinate systems in the pinhole model, the mapping relation from one point on the image to one point in the space can be established by combining the key point pixel position and the depth value obtained by the depth camera;
according to the kinematics principle, the coordinate transformation matrix expression is as follows:
$$z_c \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} R & t \end{pmatrix} \begin{pmatrix} x_0 \\ y_0 \\ z_0 \\ 1 \end{pmatrix}$$
the mapping relation from one point on the image to one point in the space can be established through the formula;
to solve the camera intrinsic parameters f_x, f_y, c_x and c_y, camera calibration needs to be performed; c_x and c_y are the coordinates of the camera aperture center;
$$M = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}$$
M is called the internal reference matrix; each value in the matrix is related only to the internal parameters of the camera and does not change with the position of the object, and s is the scaling factor of the depth map; a point (u, v, d) then has the corresponding spatial coordinates (x, y, z):
x = (u - c_x) * z / f_x
y = (v - c_y) * z / f_y
d = z*s.
3. the method for controlling mapping of actions of a humanoid robot based on machine vision as claimed in claim 1, wherein the yaw angle is calculated as follows:
the yaw degree of freedom corresponds to the motion of the elbow joint of the robot; in the plane formed by the elbow, the wrist and the shoulder, two vectors ES and EW are formed, and the angle they form in this plane is the yaw angle;
using the idea of the space vector method, the component of the projection of ES in the XOY plane is taken and the included angle between this component and the positive direction of the Y coordinate axis is solved, namely the yaw joint angle; writing ES = S - E and its projection onto the XOY plane as ES' = (ES_x, ES_y, 0), the yaw angle of the left arm is arccos(ES_y / |ES'|);
the yaw angle of the right arm is formed and calculated according to exactly the same principle as that of the left arm, so it can also be calculated with the above formula;
similarly, using the idea of the space vector method, the component of the projection of ES in the XOY plane is taken and the included angle between this component and the positive direction of the Y coordinate axis is solved, namely the pitch joint angle, and the pitch angle of the left arm is calculated in this way; for the pitch angle of the right arm, the process of taking the component in the XOY plane and solving the included angle is completely consistent with the calculation for the left arm, so it can also be obtained with the same algorithm;
the rolling freedom determines the state of the whole arm rotating along the shoulder-shoulder straight line, and corresponds to the freedom of the human shoulder for controlling the arm to swing back and forth and also corresponds to a steering engine for the shoulder of the humanoid robot to rotate;
since the roll degree of freedom does not correspond to the shoulder degree of freedom, the roll degree of freedom calculation cannot be directly obtained using the shoulder-connected vector;
the roll degree of freedom is obtained, using the space vector method, from the outer product of the vector ES and the vector EW and the angle between this outer product and the positive direction of the Y coordinate axis; with EW = W - E and n = ES × EW, the roll angle is calculated as θ3 = arccos(n · j / |n|), where j is the unit vector of the positive Y axis;
when the arm is located on the front side or on the rear side of the body, the calculated θ3 is the same angle, so the two motions cannot be distinguished; at this time the front-back relationship along the z axis between the coordinates of the left shoulder and the left elbow needs to be determined and the two cases distinguished separately;
the calculation method of the right arm is similar, but the direction obtained by performing the outer product operation on the two vectors is exactly opposite to that of the left arm; to ensure consistency of the operation, the calculation of the vectors ES and EW is kept unchanged, the vector outer product of the right arm is modified by reversing the order of the two vectors, i.e. taking EW × ES, and, keeping the formula for θ3 unchanged, the roll degree of freedom of the right arm is calculated in the same way;
In the actual operation process, the data of the steering engine value always fluctuates, so that the data needs to be filtered by adopting an amplitude limiting filtering method;
the main idea of the amplitude limiting filtering method is as follows: based on empirical judgment, the maximum deviation value allowed between two successive samples is determined as A, and each time a new value is detected a judgment is made: if the difference between the current value and the previous value is less than or equal to A, the current value is valid; if the difference between the current value and the previous value is greater than A, the current value is invalid, the current value is discarded, and the previous value is used instead; this method can effectively overcome pulse interference caused by accidental factors while ensuring the existing precision; the steering engine values and joint angles of different steering engines have different mapping relations; after filtering, the joint angle is converted into a steering engine value according to the characteristics of the steering engine of the humanoid robot.
CN202110427319.2A 2021-04-21 2021-04-21 Humanoid robot action mapping control method based on machine vision Active CN113492404B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110427319.2A CN113492404B (en) 2021-04-21 2021-04-21 Humanoid robot action mapping control method based on machine vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110427319.2A CN113492404B (en) 2021-04-21 2021-04-21 Humanoid robot action mapping control method based on machine vision

Publications (2)

Publication Number Publication Date
CN113492404A CN113492404A (en) 2021-10-12
CN113492404B true CN113492404B (en) 2022-09-30

Family

ID=77997599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110427319.2A Active CN113492404B (en) 2021-04-21 2021-04-21 Humanoid robot action mapping control method based on machine vision

Country Status (1)

Country Link
CN (1) CN113492404B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106078752A (en) * 2016-06-27 2016-11-09 西安电子科技大学 Method is imitated in a kind of anthropomorphic robot human body behavior based on Kinect
CN107901041A (en) * 2017-12-15 2018-04-13 中南大学 A kind of robot vision servo control method based on image blend square
CN107953331A (en) * 2017-10-17 2018-04-24 华南理工大学 A kind of human body attitude mapping method applied to anthropomorphic robot action imitation
CN108098761A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 A kind of the arm arm device and method of novel robot crawl target
CN108098780A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 A kind of new robot apery kinematic system
CN109693235A (en) * 2017-10-23 2019-04-30 中国科学院沈阳自动化研究所 A kind of Prosthetic Hand vision tracking device and its control method
CN110202583A (en) * 2019-07-09 2019-09-06 华南理工大学 A kind of Apery manipulator control system and its control method based on deep learning
CN111152218A (en) * 2019-12-31 2020-05-15 浙江大学 Action mapping method and system of heterogeneous humanoid mechanical arm
KR20210042644A (en) * 2019-10-10 2021-04-20 한국기술교육대학교 산학협력단 Control system and method for humanoid robot

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106078752A (en) * 2016-06-27 2016-11-09 西安电子科技大学 Method is imitated in a kind of anthropomorphic robot human body behavior based on Kinect
CN108098761A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 A kind of the arm arm device and method of novel robot crawl target
CN108098780A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 A kind of new robot apery kinematic system
CN107953331A (en) * 2017-10-17 2018-04-24 华南理工大学 A kind of human body attitude mapping method applied to anthropomorphic robot action imitation
CN109693235A (en) * 2017-10-23 2019-04-30 中国科学院沈阳自动化研究所 A kind of Prosthetic Hand vision tracking device and its control method
CN107901041A (en) * 2017-12-15 2018-04-13 中南大学 A kind of robot vision servo control method based on image blend square
CN110202583A (en) * 2019-07-09 2019-09-06 华南理工大学 A kind of Apery manipulator control system and its control method based on deep learning
KR20210042644A (en) * 2019-10-10 2021-04-20 한국기술교육대학교 산학협력단 Control system and method for humanoid robot
CN111152218A (en) * 2019-12-31 2020-05-15 浙江大学 Action mapping method and system of heterogeneous humanoid mechanical arm

Also Published As

Publication number Publication date
CN113492404A (en) 2021-10-12

Similar Documents

Publication Publication Date Title
CN112132894B (en) Mechanical arm real-time tracking method based on binocular vision guidance
AU2020201554B2 (en) System and method for robot teaching based on RGB-D images and teach pendant
CN109571487B (en) Robot demonstration learning method based on vision
CN107901041B (en) Robot vision servo control method based on image mixing moment
CN111360827B (en) Visual servo switching control method and system
CN110900581B (en) Four-degree-of-freedom mechanical arm vision servo control method and device based on RealSense camera
CN105014677B (en) Vision Mechanical arm control method based on Camshift visual tracking and D-H modeling algorithm
Li et al. A mobile robot hand-arm teleoperation system by vision and imu
CN109079799B (en) Robot perception control system and control method based on bionics
CN107363813A (en) A kind of desktop industrial robot teaching system and method based on wearable device
JP2022542241A (en) Systems and methods for augmenting visual output from robotic devices
CN109940626B (en) Control method of eyebrow drawing robot system based on robot vision
CN110815189B (en) Robot rapid teaching system and method based on mixed reality
Li et al. Visual servoing of wheeled mobile robots without desired images
Gratal et al. Visual servoing on unknown objects
Li et al. Development of kinect based teleoperation of nao robot
CN114536346B (en) Mechanical arm accurate path planning method based on man-machine cooperation and visual detection
Song et al. On-line stable evolutionary recognition based on unit quaternion representation by motion-feedforward compensation
Natale et al. Learning precise 3d reaching in a humanoid robot
JPH0780790A (en) Three-dimensional object grasping system
CN109693235B (en) Human eye vision-imitating tracking device and control method thereof
Gao et al. Kinect-based motion recognition tracking robotic arm platform
CN113492404B (en) Humanoid robot action mapping control method based on machine vision
CN110722547B (en) Vision stabilization of mobile robot under model unknown dynamic scene
CN115741732A (en) Interactive path planning and motion control method of massage robot

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant