CN116175569A - Method for determining relation model of hand-eye matrix, hand-eye calibration method and equipment - Google Patents
Method for determining relation model of hand-eye matrix, hand-eye calibration method and equipment
- Publication number
- CN116175569A (application CN202310125623.0A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- angle
- coordinate
- hand
- eye
- Prior art date
- Legal status: Pending
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T90/00—Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
The embodiments of the application belong to the technical field of robots and provide a method for determining a relationship model of a hand-eye matrix, a hand-eye calibration method and a device. The method for determining the relationship model of the hand-eye matrix comprises the following steps: determining the coordinate offset between the coordinate origin of the mechanical arm base coordinate system and the coordinate origin of the pan-tilt base coordinate system, and determining a coordinate conversion matrix between the pan-tilt base coordinate system and the camera coordinate system; and determining a relationship model of the hand-eye matrix according to the coordinate offset and the coordinate conversion matrix. The relationship model can represent the relationship between the hand-eye matrix and the pan-tilt angle, where the pan-tilt angle is the angle of the pan-tilt relative to the pan-tilt base in different planes, and the hand-eye matrix is used to represent the coordinate conversion relationship from the mechanical arm base coordinate system to the camera coordinate system. By this method, a relationship model between the pan-tilt angle and the hand-eye matrix can be established, thereby simplifying hand-eye calibration of the robot.
Description
Technical Field
The application belongs to the technical field of robots, and particularly relates to a method for determining a relation model of a hand-eye matrix, a hand-eye calibration method and equipment.
Background
With the rapid development of high and new technologies and the deployment of robots in various industries, the degree of intelligence with which mechanical arms cooperate with environmental perception, in particular visual guidance, is continuously improving, and the way robots are used is changing.
In a single-arm or dual-arm humanoid system, a depth camera mounted on a pan-tilt is used for visual positioning. The pan-tilt provides rotary motion with at least two degrees of freedom, horizontal and pitch, so the target can be located from different angles by adjusting the position and posture of the camera, and a wider range can be covered from a fixed position. After positioning is finished, the pose of the target point in the camera coordinate system is converted by the hand-eye matrix into a pose in the mechanical arm base coordinate system; once the mechanical arm has acquired the pose of the target point in its base coordinate system, it can move to the target point and complete the operation.
This visual guidance scheme can generally be used in service robots, inspection robots, industrial robots and mechanical arm systems and, compared with operation positioning with the camera fixed at a single angle and position, provides a solution that lets visually guided robots work over a wider range, more intelligently and in a way closer to human visual observation.
In this process, hand-eye calibration is essential. The hand-eye matrix is usually acquired with the common eye-to-hand method: with the horizontal and pitch angles of the pan-tilt fixed, a checkerboard is identified to calibrate the pose matrix of the camera relative to the mechanical arm base coordinates, i.e. the hand-eye matrix. An accurate hand-eye matrix can be obtained when the rotation angle of the pan-tilt is fixed; however, because the horizontal and pitch angles of the pan-tilt are not fixed during operation, hand-eye calibration has to be performed again every time the camera's horizontal or pitch angle changes. The workload is therefore very large, and a huge amount of storage is also required to store the resulting matrices.
Disclosure of Invention
In view of this, the embodiments of the present application provide a method for determining a relationship model of a hand-eye matrix, a hand-eye calibration method and a device, which determine a relationship model between the hand-eye matrix and the pan-tilt angle and then calibrate the hand-eye matrix based on that relationship model and the pan-tilt angle, so as to reduce the difficulty of hand-eye calibration.
A first aspect of an embodiment of the present application provides a method for determining a relational model of a hand-eye matrix, including:
determining a coordinate offset between a coordinate origin of a mechanical arm base coordinate system and a coordinate origin of a pan-tilt base coordinate system, and determining a coordinate conversion matrix between the pan-tilt base coordinate system and a camera coordinate system, wherein the pan-tilt base coordinate system is a coordinate system established with the rotation axis center of the pan-tilt base as the coordinate origin, the camera coordinate system is a coordinate system established with the focusing center of the camera as the coordinate origin, and the mechanical arm base coordinate system is a coordinate system established with the rotation axis center of the mechanical arm base as the coordinate origin;
and determining a relationship model of a hand-eye matrix according to the coordinate offset and the coordinate conversion matrix, wherein the relationship model can represent the relationship between the hand-eye matrix and a pan-tilt angle, the pan-tilt angle is the angle of the pan-tilt relative to the pan-tilt base in different planes, and the hand-eye matrix is used to represent the coordinate conversion relationship from the mechanical arm base coordinate system to the camera coordinate system.
A second aspect of the embodiments of the present application provides a hand-eye calibration method, which is applied to a terminal device, where the terminal device includes a pan-tilt and a mechanical arm, and a camera is installed on the pan-tilt, and the method includes:
determining a target relation model of the terminal equipment, wherein the target relation model is a relation model between a holder angle of the terminal equipment and a hand-eye matrix, and the hand-eye matrix is used for representing a coordinate conversion relation from a mechanical arm base coordinate system to the camera coordinate system;
inputting a current holder angle into the target relation model to obtain a current target hand-eye matrix, wherein the target hand-eye matrix is used for representing the coordinate conversion relation from the mechanical arm base coordinate system to the camera coordinate system under the current holder angle;
Wherein the target relationship model is established according to the method described in the first aspect.
A third aspect of the embodiments of the present application provides a device for determining a relational model of a hand-eye matrix, including:
the coordinate conversion determining module is used for determining coordinate offset between a coordinate origin of a mechanical arm base coordinate system and a coordinate origin of a tripod head base coordinate system, and determining a coordinate conversion matrix between the tripod head base coordinate system and a camera coordinate system, wherein the tripod head base coordinate system is a coordinate system established by taking the center of a rotation shaft of a tripod head base as the coordinate origin, the camera coordinate system is a coordinate system established by taking the center of focusing of a camera as the coordinate origin, and the mechanical arm base coordinate system is a coordinate system established by taking the center of the rotation shaft of a mechanical arm base as the coordinate origin;
the relation model determining module is used for determining a relation model of a hand-eye matrix according to the coordinate offset and the coordinate transformation matrix, the relation model can represent the relation between the hand-eye matrix and a holder angle, the holder angle is an angle of a holder relative to a holder base on different planes, and the hand-eye matrix is used for representing the coordinate transformation relation from the mechanical arm base coordinate system to the camera coordinate system.
A fourth aspect of the embodiments of the present application provides a hand-eye calibration device, which is applied to a terminal device, where the terminal device includes a pan-tilt and a mechanical arm, a camera is installed on the pan-tilt, and the device includes:
the target relation model determining module is used for determining a target relation model of the terminal equipment, wherein the target relation model is a relation model between a holder angle of the terminal equipment and a hand-eye matrix, and the hand-eye matrix is used for representing a coordinate conversion relation from a mechanical arm base coordinate system to the camera coordinate system;
the target hand-eye matrix calibration module is used for inputting the current holder angle into the target relation model to obtain a current target hand-eye matrix, and the target hand-eye matrix is used for representing the coordinate conversion relation from the mechanical arm base coordinate system to the camera coordinate system under the current holder angle;
wherein the target relationship model is established according to the method described in the first aspect.
A fifth aspect of embodiments of the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the method according to the first or second aspect as described above when executing the computer program.
A sixth aspect of embodiments of the present application provides a computer readable storage medium storing a computer program which when executed by a processor implements a method as described in the first or second aspect above.
A seventh aspect of the embodiments of the present application provides a computer program product, which when run on a terminal device, causes the terminal device to perform the method of the first or second aspect described above.
Compared with the prior art, the embodiment of the application has the following advantages:
the hand-eye matrix is used to represent the coordinate conversion relationship between the mechanical arm base coordinate system and the camera coordinate system, and it can be decomposed into the coordinate conversion relationship between the mechanical arm base coordinate system and the pan-tilt base coordinate system and the coordinate conversion relationship between the pan-tilt base coordinate system and the camera coordinate system. Therefore, the coordinate offset between the coordinate origin of the mechanical arm base coordinate system and the coordinate origin of the pan-tilt base coordinate system can be determined, the coordinate conversion matrix between the pan-tilt base coordinate system and the camera coordinate system can be determined, and the coordinate offset and the coordinate conversion matrix can then be combined into a relationship model of the hand-eye matrix. Because this relationship model contains the parameters related to the pan-tilt angle, it can represent the relationship between the hand-eye matrix and the pan-tilt angle. Based on this decomposition of the hand-eye matrix, the embodiments of the present application can easily obtain the relationship model between the hand-eye matrix and the pan-tilt angle.
According to the method for determining the relationship model of the hand-eye matrix, when the terminal device performs hand-eye calibration, the target relationship model between the pan-tilt angle and the hand-eye matrix can be determined, and hand-eye calibration can then be performed from the target relationship model and the current pan-tilt angle to obtain the current hand-eye matrix. On this basis, hand-eye calibration can be completed quickly from the pan-tilt angle alone; the calibration operator no longer has to recalibrate the hand-eye matrix every time the pan-tilt angle changes, the number of calibrations is greatly reduced, and the operating efficiency and flexibility of the robot are improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the following will briefly introduce the drawings that are required to be used in the embodiments or the description of the prior art.
Fig. 1 is a schematic step flow diagram of a method for determining a relationship model of a hand-eye matrix according to an embodiment of the present application;
FIG. 2 is a schematic diagram of coordinate transformation provided in an embodiment of the present application;
FIG. 3 is a schematic step flow diagram of a method for calibrating eyes and hands according to an embodiment of the present application;
fig. 4 is a schematic diagram of a determining device for a relational model of a hand-eye matrix according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a hand-eye calibration device according to an embodiment of the present application;
fig. 6 is a schematic diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The technical scheme of the present application is described below by specific examples.
Referring to fig. 1, a step flow diagram of a method for determining a relationship model of a hand-eye matrix according to an embodiment of the present application may specifically include the following steps:
s101, determining coordinate offset between a coordinate origin of a mechanical arm base coordinate system and a coordinate origin of a holder base coordinate system, and determining a coordinate conversion matrix between the holder base coordinate system and a camera coordinate system.
The execution body of the embodiment of the application may be a terminal device that needs to determine a hand-eye matrix, and the embodiment of the application does not limit a specific type of the terminal device. The terminal device may be a robot, which may include a cradle head and a robot arm, on which a camera may be mounted. The cradle head comprises a cradle head base, the position of the cradle head base is fixed, and the cradle head can rotate around the center of a rotation shaft of the cradle head base. The robot arm may include a robot arm base, and a rotation axis center of the robot arm base is fixed.
The hand-eye matrix is used to represent the coordinate conversion relationship between the mechanical arm base coordinate system and the camera coordinate system, and it can be decomposed into the coordinate conversion relationship between the mechanical arm base coordinate system and the pan-tilt base coordinate system and the coordinate conversion relationship between the pan-tilt base coordinate system and the camera coordinate system. The pan-tilt base coordinate system is established with the rotation axis center of the pan-tilt base as its coordinate origin; the camera coordinate system is established with the focusing center of the camera as its coordinate origin and the optical axis as its z-axis; the mechanical arm base coordinate system is established with the rotation axis center of the mechanical arm base as its coordinate origin. Because the rotation axis center of the mechanical arm base and the rotation axis center of the pan-tilt base are fixed, the mechanical arm base coordinate system and the pan-tilt base coordinate system are fixed. The mechanical arm base coordinate system and the pan-tilt base coordinate system can share the same x, y and z axis directions, so the coordinate conversion between them reduces to a position offset between their coordinate origins. Therefore, determining the coordinate conversion relationship between the mechanical arm base coordinate system and the pan-tilt base coordinate system is equivalent to determining the coordinate offset between the coordinate origin of the mechanical arm base coordinate system and the coordinate origin of the pan-tilt base coordinate system. The coordinate offset may include offsets along the x, y and z axes.
In one possible implementation, for convenience of processing, the coordinate offset between the coordinate origin of the mechanical arm base coordinate system and the coordinate origin of the pan-tilt base coordinate system may include only the offsets along the x and y axes; the offset along the z axis can be absorbed into the coordinate conversion relationship between the pan-tilt base coordinate system and the camera coordinate system. That is, the coordinate offset may include only a lateral offset and a longitudinal offset.
As the angle of the pan-tilt changes, the angle of the camera changes, and accordingly the focusing center and the optical axis of the camera change; therefore, converting the pan-tilt base coordinate system into the camera coordinate system requires both rotation and displacement of the coordinate system. Based on these rotations and displacements, the coordinate conversion matrix from the pan-tilt base coordinate system to the camera coordinate system can be determined.
Fig. 2 is a schematic diagram of coordinate transformation provided in an embodiment of the present application. As shown in fig. 2, the coordinate conversion from the robot arm base coordinate system to the pan/tilt base coordinate system may include:
and translating the mechanical arm base coordinate system in the x and y directions, so that the coordinate origin of the mechanical arm base coordinate system moves to the z axis where the coordinate origin of the holder base coordinate system is located. Because the positions of the mechanical arm base and the holder base are fixed, the coordinate conversion from the mechanical arm base coordinate system to the holder base coordinate system only needs to be carried out according to the position offset between the coordinate origins.
The conversion between the pan-tilt-based coordinate system to the camera coordinate system may include:
rotating the holder base coordinate system by an angle theta along a z axis of the holder base coordinate system to obtain a first coordinate system; rotating the first coordinate system by beta angle along the y axis of the first coordinate system to obtain a second coordinate system; moving the second coordinate system by h along the z-axis direction of the second coordinate system to obtain a third coordinate system; the third coordinate system is kept unchanged in posture, and the fourth coordinate system is obtained through coordinate translation d along the x direction of the third coordinate system; rotating the fourth coordinate system by an alpha angle along the x axis of the fourth coordinate system to obtain a fifth coordinate system; and moving the fifth coordinate system along the z axis of the fifth coordinate system by s to obtain a camera coordinate system. The first coordinate system, the second coordinate system, the third coordinate system, the fourth coordinate system and the fifth coordinate system are all intermediate coordinate systems in the coordinate conversion process, and each coordinate conversion is performed on the basis of the last coordinate conversion.
Let T_a→b denote the coordinate conversion matrix from coordinate system a to coordinate system b. According to the conversion steps between the pan-tilt base coordinate system and the camera coordinate system described above, the coordinate conversion matrix from the pan-tilt base to the camera origin is obtained as:
T_pan→cam = Rot_z(θ)·Rot_y(β)·Trans_z(h)·Trans_x(d)·Rot_x(α)·Trans_z(s)   (1)
wherein T_pan→cam is the coordinate conversion matrix, θ is the angle by which the pan-tilt base coordinate system is rotated about its z-axis in the conversion from the pan-tilt base coordinate system to the first coordinate system, β is the angle of rotation about the y-axis of the first coordinate system in the conversion from the first coordinate system to the second coordinate system, h is the displacement translated along the z-axis of the second coordinate system in the conversion from the second coordinate system to the third coordinate system, d is the displacement translated along the x-axis of the third coordinate system in the conversion from the third coordinate system to the fourth coordinate system, α is the angle of rotation about the x-axis of the fourth coordinate system in the conversion from the fourth coordinate system to the fifth coordinate system, and s is the displacement translated along the z-axis of the fifth coordinate system in the conversion from the fifth coordinate system to the camera coordinate system. The process of converting the pan-tilt base coordinate system into the camera coordinate system thus consists of: converting from the pan-tilt base coordinate system to the first coordinate system, from the first coordinate system to the second coordinate system, from the second coordinate system to the third coordinate system, from the third coordinate system to the fourth coordinate system, from the fourth coordinate system to the fifth coordinate system, and from the fifth coordinate system to the camera coordinate system.
The h may include a distance between the origin of coordinates of the robot base coordinate system and the origin of coordinates of the pan/tilt base coordinate system in the z axis.
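To make the chain of elementary transforms concrete, the following is a minimal Python/NumPy sketch (not code from the patent) that composes formula (1) from the rotations and translations described above; the helper and parameter names (rot_z, trans, pan_tilt_to_camera, theta, beta, alpha, h, d, s) are illustrative assumptions.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    T = np.eye(4)
    T[1:3, 1:3] = [[c, -s], [s, c]]      # rotation about the x-axis
    return T

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    T = np.eye(4)
    T[0, 0], T[0, 2], T[2, 0], T[2, 2] = c, s, -s, c   # rotation about the y-axis
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    T = np.eye(4)
    T[0:2, 0:2] = [[c, -s], [s, c]]      # rotation about the z-axis
    return T

def trans(x=0.0, y=0.0, z=0.0):
    T = np.eye(4)
    T[0:3, 3] = [x, y, z]                # pure translation
    return T

def pan_tilt_to_camera(theta, beta, alpha, h, d, s):
    """Formula (1): pan-tilt base -> camera, each factor acting in the current
    intermediate frame (first to fifth coordinate systems)."""
    return (rot_z(theta) @ rot_y(beta) @ trans(z=h)
            @ trans(x=d) @ rot_x(alpha) @ trans(z=s))
```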
Let the coordinate conversion from the mechanical arm base to the camera end, i.e. the hand-eye matrix, be T_base→cam. According to the above, T_base→cam can be decomposed into the conversion from the mechanical arm base coordinate system to the pan-tilt base coordinate system and the conversion from the pan-tilt base to the camera origin, namely:
T_base→cam = T_base→pan · T_pan→cam   (2)
wherein T_base→pan, the coordinate conversion from the mechanical arm base to the pan-tilt base, is produced by the offsets in the x and y directions, i.e. T_base→pan = Trans_x(x)·Trans_y(y).
S102, determining a relation model of the hand-eye matrix according to the coordinate offset and the coordinate transformation matrix, wherein the relation model can represent the relation between the hand-eye matrix and the holder angle.
The camera is mounted on the pan-tilt and moves with it. After the pan-tilt is mounted on the pan-tilt base, it can rotate in the horizontal plane around the center of the rotation shaft and can pitch in the vertical plane, so the pan-tilt angle can comprise a horizontal angle and a pitch angle. The horizontal angle and the pitch angle can be characterized by the angular relationship between the pan-tilt and a preset reference position.
When the angle of the pan-tilt changes, the focus center and the optical axis of the camera change, and accordingly, the camera coordinate system changes. It can be seen that the camera coordinate system has an association relationship with the pan-tilt angle, and accordingly, the hand-eye matrix may have a corresponding relationship with the pan-tilt angle.
Therefore, the relationship between the hand-eye matrix and the holder angle can be represented based on the relationship model of the hand-eye matrix determined by the coordinate shift and the coordinate conversion matrix. That is, based on the relational model, the hand-eye matrix can be determined by the pan-tilt angle.
Illustratively, by combining formula (1) and formula (2) above, the relationship between the hand-eye matrix and the pan-tilt angle can be obtained:
T_base→cam = Trans_x(x)·Trans_y(y)·Rot_z(θ)·Rot_y(β)·Trans_z(h)·Trans_x(d)·Rot_x(α)·Trans_z(s)   (3)
wherein T_base→cam is the hand-eye matrix, θ is the angle by which the pan-tilt base coordinate system is rotated about its z-axis in the conversion from the pan-tilt base coordinate system to the first coordinate system, β is the angle of rotation about the y-axis of the first coordinate system in the conversion from the first coordinate system to the second coordinate system, h is the displacement translated along the z-axis of the second coordinate system in the conversion from the second coordinate system to the third coordinate system, d is the displacement translated along the x-axis of the third coordinate system in the conversion from the third coordinate system to the fourth coordinate system, α is the angle of rotation about the x-axis of the fourth coordinate system in the conversion from the fourth coordinate system to the fifth coordinate system, s is the displacement translated along the z-axis of the fifth coordinate system in the conversion from the fifth coordinate system to the camera coordinate system, x is the lateral offset, and y is the longitudinal offset; the process of converting the pan-tilt base coordinate system into the camera coordinate system consists of converting from the pan-tilt base coordinate system to the first coordinate system, from the first coordinate system to the second coordinate system, from the second coordinate system to the third coordinate system, from the third coordinate system to the fourth coordinate system, from the fourth coordinate system to the fifth coordinate system, and from the fifth coordinate system to the camera coordinate system.
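As a hedged illustration of formula (3), the sketch below prepends the x/y offset between the mechanical arm base and the pan-tilt base to the chain built in the previous sketch; the helper names and the layout of the params dictionary are assumptions, not part of the patent.

```python
def hand_eye_matrix(theta, alpha, params):
    """Formula (3): hand-eye matrix (arm base -> camera) as a function of the
    pan-tilt horizontal angle theta and pitch angle alpha; params holds the
    fixed quantities of one robot: beta, x, y, h, d, s."""
    base_to_pan = trans(x=params["x"], y=params["y"])    # arm base -> pan-tilt base (x/y offset)
    pan_to_cam = pan_tilt_to_camera(theta, params["beta"], alpha,
                                    params["h"], params["d"], params["s"])
    return base_to_pan @ pan_to_cam
```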
In the above relationship model, for a given terminal device, the horizontal angle and the pitch angle of the pan-tilt are variables, while the other parameters are fixed values. For example, after the pan-tilt is mounted, its installation angle is fixed, that is, β is a fixed value. Because the other parameters are fixed and only the pan-tilt angle is variable, the relationship model is a relationship model between the hand-eye matrix and the pan-tilt angle.
In the above-described relational model, β, x, y, h, d, s are fixed, but need to be solved. In order to facilitate the solution, the matrix in the relationship model can be converted into the pose of the camera, so as to obtain:
roll = atan2(sinα, cosα) = α   (4)
pitch = atan2(sinβ, cosβ) = β   (5)
yaw = atan2(sinθ, cosθ) = θ   (6)
wherein α is the pitch angle of the pan-tilt angle, β is the installation angle of the camera on the pan-tilt, θ is the horizontal angle of the pan-tilt angle, (roll, pitch, yaw) is the orientation of the camera in the mechanical arm base coordinate system, and x, y, h, d and s are undetermined coefficients. The roll and yaw angles of the camera's final pose in the mechanical arm base coordinate system are therefore the α and θ angles of the pan-tilt angle, and the pitch angle is the value of the calibrated installation angle β.
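For illustration only, the pose relations (4)-(6) correspond to the standard atan2-based recovery of roll, pitch and yaw from the rotation block of the hand-eye matrix under the Z-Y-X convention used above (rotation part Rz(θ)·Ry(β)·Rx(α)); the sketch below assumes the NumPy helpers from the earlier example.

```python
import numpy as np

def matrix_to_rpy(T):
    """Recover (roll, pitch, yaw) from a 4x4 transform whose rotation part is
    Rz(theta)·Ry(beta)·Rx(alpha), matching formulas (4)-(6)."""
    R = T[0:3, 0:3]
    roll = np.arctan2(R[2, 1], R[2, 2])                       # = alpha, formula (4)
    pitch = np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0]))  # = beta,  formula (5)
    yaw = np.arctan2(R[1, 0], R[0, 0])                        # = theta, formula (6)
    return roll, pitch, yaw
```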
Analyzing the position terms of relational expression (3) and deriving, from the coordinate conversion, the position coordinates of the camera relative to the mechanical arm base gives:
cam_x = s·(sinθ·sinα + cosθ·sinβ·cosα) + d·cosθ·cosβ + x   (7)
cam_y = s·(−cosθ·sinα + sinθ·sinβ·cosα) + d·sinθ·cosβ + y   (8)
cam_z = s·cosα·cosβ + h   (9)
Wherein, (cam_x, cam_y, cam_z) is a coordinate of a camera coordinate system corresponding to the coordinate of the mechanical arm coordinate system, alpha is a pitching angle of the pan-tilt angle, theta is a horizontal angle of the pan-tilt angle, beta is an installation angle of the camera on the pan-tilt, x is a transverse offset of the coordinate offsets, y is a longitudinal offset of the coordinate offsets, and h is a displacement required to translate along a z-axis of the pan-tilt coordinate system from the pan-tilt coordinate system to the camera coordinate system.
Based on the above relations, when the pan-tilt angle is fixed, α, β and θ are fixed values; substituting them into formulas (7), (8) and (9) yields 3 equations. The equation set has five unknowns x, y, h, d and s, so if two sets of data are calibrated, 6 equations are obtained, from which the values of x, y, h, d and s can be solved, and the relationship model between the hand-eye matrix and the pan-tilt angle is then obtained.
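Since formulas (7)-(9) are linear in the undetermined coefficients x, y, h, d and s once the angles are known, the coefficients can be recovered by ordinary linear least squares from two or more calibrated camera positions. The following is a hedged sketch of that step; the function name and the sample layout are assumptions.

```python
import numpy as np

def solve_coefficients(samples):
    """samples: iterable of (theta, alpha, beta, cam_x, cam_y, cam_z) measured at
    known pan-tilt angles; returns the coefficients (x, y, h, d, s)."""
    A, b = [], []
    for theta, alpha, beta, cx, cy, cz in samples:
        st, ct = np.sin(theta), np.cos(theta)
        sa, ca = np.sin(alpha), np.cos(alpha)
        sb, cb = np.sin(beta), np.cos(beta)
        # unknown vector is (x, y, h, d, s)
        A.append([1, 0, 0, ct * cb, st * sa + ct * sb * ca]);  b.append(cx)  # formula (7)
        A.append([0, 1, 0, st * cb, -ct * sa + st * sb * ca]); b.append(cy)  # formula (8)
        A.append([0, 0, 1, 0.0,     ca * cb]);                 b.append(cz)  # formula (9)
    sol, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    x, y, h, d, s = sol
    return x, y, h, d, s
```

With two calibration sets this system has 6 equations and 5 unknowns, matching the counting argument above; additional sets simply over-determine it.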
In the embodiment of the present application, when determining the hand-eye matrix, the hand-eye matrix can be decomposed into the coordinate conversion relationship between the mechanical arm base coordinate system and the pan-tilt base coordinate system and the coordinate conversion relationship between the pan-tilt base coordinate system and the camera coordinate system. Therefore, the coordinate offset between the coordinate origin of the mechanical arm base coordinate system and the coordinate origin of the pan-tilt base coordinate system can be determined, the coordinate conversion matrix between the pan-tilt base coordinate system and the camera coordinate system can be determined, and the two can then be combined into a relationship model of the hand-eye matrix. Because the relationship model contains the parameters related to the pan-tilt angle, it can represent the relationship between the hand-eye matrix and the pan-tilt angle. Based on the established relationship model, the hand-eye matrix can be determined from the pan-tilt angle, which facilitates subsequent hand-eye calibration, reduces the calibration difficulty, and reduces the number of required parameters.
Referring to fig. 3, a step flow diagram of another hand-eye calibration method provided in an embodiment of the present application is shown, where the method may be applied to a terminal device, the terminal device may include a pan-tilt and a mechanical arm, and the pan-tilt may be provided with a camera, and the method may specifically include the following steps:
s301, determining a target relation model of the terminal equipment, wherein the target relation model is a relation model between a cradle head angle and a hand-eye matrix of the terminal equipment.
The terminal device may be a robot, which may include a cradle head and a robot arm, on which a camera may be mounted. The cradle head comprises a cradle head base, the position of the cradle head base is fixed, and the cradle head can rotate around the center of a rotation shaft of the cradle head base. The camera of the robot can be equivalent to the 'eye' of the robot, the mechanical arm can be equivalent to the 'hand' of the robot, and the hand-eye calibration can determine the coordinate conversion relation between the mechanical arm base coordinate system and the camera coordinate system of the robot, and the coordinate conversion relation can be a hand-eye matrix. Based on the hand-eye matrix, the robot can convert the pose obtained based on the camera coordinate system into the mechanical arm base coordinate system, so that the mechanical arm operation can be controlled based on the positioning result of the camera.
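As a brief illustration (not taken from the patent), once a hand-eye matrix is available as a 4×4 homogeneous transform, converting a camera-frame target into the mechanical arm base frame is a single matrix-vector product; the variable names below are assumptions.

```python
import numpy as np

def camera_point_to_base(T_base_cam, p_cam):
    """Map a 3D point from the camera frame into the arm base frame using the
    4x4 hand-eye matrix T_base_cam."""
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)   # homogeneous coordinates
    return (T_base_cam @ p)[:3]
```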
The target relationship model is established according to the method of the first aspect. As can be seen from the above embodiment, determining the target relationship model requires solving the undetermined coefficients in the relationship model. To solve the undetermined coefficients, corresponding equations are obtained from several manually calibrated data sets, and these equations are then solved to obtain the coefficients.
For example, at least two initial hand-eye matrices may be determined at a plurality of different initial pan angles. The determination of the initial hand-eye matrix may be based on a calibration plate.
Based on the relationship model of the hand-eye matrix and the plurality of initial hand-eye matrices determined in the previous embodiment, a corresponding plurality of relational expressions can be obtained. The relation model comprises a plurality of coefficients to be determined, and a plurality of relational expressions are used for representing the relation between the coefficients to be determined and the holder angle. Specifically, the relationship model may be converted to obtain a plurality of conversion relationships with respect to the undetermined coefficient; thereby converting the plurality of initial hand-eye matrices into a corresponding plurality of relational expressions with respect to the coefficient to be determined based on the plurality of conversion relations. Wherein the plurality of conversion relationships may include:
roll=atan2(sinα,cosα)=α
pitch=atan2(sinβ,cosβ)=β
yaw=atan2(sinθ,cosθ)=θ
cam_x=s·(sinθ·sinα+cosθsinβcosα)+dcosθcosβ+x
cam_y=s·(-cosθ·sinα+sinθsinβcosα)+dsinθcosβ+y
cam_z=s·cosαcosβ+h
wherein, (cam_x, cam_y, cam_z) is a coordinate of a camera coordinate system corresponding to the coordinate of the mechanical arm coordinate system, alpha is a pitching angle of a tripod head angle, beta is a mounting angle of the camera on the tripod head, theta is a horizontal angle of the tripod head angle, (roll, pitch, yaw) is a pose of the camera under the mechanical arm base coordinate system, x, y, h, d and s are undetermined coefficients.
Based on the initial hand-eye matrixes, substituting the data into the conversion relation can obtain a plurality of corresponding relation formulas.
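As an illustrative sketch (not the patent's code), each initial hand-eye matrix obtained with the calibration board can be turned into one set of relational expressions by reading the camera position from its translation column and pairing it with the pan-tilt angles of that calibration; the result feeds the least-squares solver sketched earlier. Helper names and the fixed installation angle beta are assumptions.

```python
def samples_from_initial_matrices(initial, beta):
    """initial: list of (theta, alpha, T) where T is a 4x4 initial hand-eye matrix
    calibrated at the pan-tilt angles (theta, alpha); beta is the fixed
    installation angle. Returns samples for solve_coefficients()."""
    samples = []
    for theta, alpha, T in initial:
        cam_x, cam_y, cam_z = T[0:3, 3]   # camera position in the arm base frame
        samples.append((theta, alpha, beta, cam_x, cam_y, cam_z))
    return samples

# x, y, h, d, s = solve_coefficients(samples_from_initial_matrices(initial, beta))
```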
After determining the relationship, coefficient values for the plurality of pending coefficients may be calculated based on the plurality of relationships. The initial pan-tilt angle is a pan-tilt angle input by a user, but the initial pan-tilt angle input by the user and the actual pan-tilt angle may have errors, so that the initial pan-tilt angle can be corrected when the coefficient value is calculated. Illustratively, the pan-tilt angle may be corrected based on the following formula:
wherein θ_input is the pitch angle, α_input is the horizontal angle, pitch is the installation angle, α is the corrected pitch angle, β is the corrected installation angle, θ is the corrected horizontal angle, N is the number of initial hand-eye matrices, and N is greater than or equal to 2. One initial hand-eye matrix determines 3 equations, and a total of 5 unknowns need to be solved, so at least 2 initial hand-eye matrices are needed to provide 6 equations, from which the 5 unknowns can be solved.
After correcting the initial pan-tilt angle, a plurality of equations for solving the coefficient values may be determined according to the plurality of conversion relationships and the corrected initial pan-tilt angle. And solving a plurality of equations to obtain the coefficient value.
Substituting the coefficient values into the relationship model yields the target relationship model. The target relationship model is thus a relationship model between the pan-tilt angle and the hand-eye matrix.
S302, inputting a current holder angle into the target relation model to obtain a current target hand-eye matrix, wherein the target hand-eye matrix is used for representing the coordinate conversion relation from the mechanical arm base coordinate system to the camera coordinate system under the current holder angle.
After the target relation model is determined, when the angle of the cradle head changes, the current angle of the cradle head can be input into the target relation model, and the current target hand-eye matrix can be obtained rapidly. Based on the target hand-eye matrix, the terminal device can control the mechanical arm to execute corresponding operation.
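A hedged end-to-end sketch of this online step is shown below: the current pan-tilt angles are substituted into the relation model (via the helpers sketched earlier) to obtain the current hand-eye matrix, which is then used to map a camera-frame target into the arm base frame. All names and numeric values are illustrative assumptions.

```python
import numpy as np

# Assumed, made-up fixed parameters of one robot (beta, x, y, h, d, s):
params = {"beta": 0.05, "x": 0.12, "y": -0.03, "h": 0.40, "d": 0.06, "s": 0.02}

theta_now, alpha_now = np.deg2rad(30.0), np.deg2rad(-15.0)   # current pan-tilt angles
T_now = hand_eye_matrix(theta_now, alpha_now, params)        # current hand-eye matrix
target_in_base = camera_point_to_base(T_now, [0.10, 0.02, 0.55])  # camera-frame target -> arm base frame
```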
In this embodiment, the terminal device only needs to determine its corresponding target relationship model once, after it has been installed; thereafter, whenever the pan-tilt angle of the terminal device changes, the hand-eye matrix can be obtained simply by inputting the current pan-tilt angle into the target relationship model. Based on the method in this embodiment, hand-eye calibration is simplified, no large amount of calculation is needed each time the pan-tilt angle changes, the difficulty of determining the hand-eye matrix is reduced, and computing resources are saved.
It should be noted that, the sequence number of each step in the above embodiment does not mean the sequence of execution sequence, and the execution sequence of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiment of the present application.
The embodiment of the application also provides a robot, which determines the relationship model of the holder angle and the hand-eye matrix by the determination method of the relationship model of the hand-eye matrix, and performs hand-eye calibration by the hand-eye calibration method, so as to determine the hand-eye matrix. The robot can comprise a cradle head and a mechanical arm, and a camera can be installed on the cradle head. The cradle head comprises a cradle head base, the position of the cradle head base is fixed, and the cradle head can rotate around the center of a rotation shaft of the cradle head base. When the cradle head rotates, the camera can be driven to rotate, so that the camera coordinate system is changed. When the camera coordinate system changes, the robot can perform hand-eye calibration through the hand-eye calibration method to obtain a hand-eye matrix.
Referring to fig. 4, a schematic diagram of a determining device for a relationship model of a hand-eye matrix provided in an embodiment of the present application may specifically include a coordinate transformation determining module 41 and a relationship model determining module 42, where:
A coordinate conversion determining module 41, configured to determine a coordinate offset between a coordinate origin of a mechanical arm base coordinate system and a coordinate origin of a pan-tilt base coordinate system, and determine a coordinate conversion matrix between the pan-tilt base coordinate system and a camera coordinate system, where the pan-tilt base coordinate system is a coordinate system established with a rotation axis center of the pan-tilt base as the coordinate origin, the camera coordinate system is a coordinate system established with a focus center of the camera as the coordinate origin, and the mechanical arm base coordinate system is a coordinate system established with a rotation axis center of the mechanical arm base as the coordinate origin;
a relationship model determining module 42, configured to determine a relationship model of a hand-eye matrix according to the coordinate offset and the coordinate transformation matrix, where the relationship model is capable of characterizing a relationship between the hand-eye matrix and a pan-tilt angle, where the pan-tilt angle is an angle of a pan-tilt relative to the pan-tilt base on different planes, and the hand-eye matrix is used to characterize a coordinate transformation relationship from the robot arm base coordinate system to the camera coordinate system.
In one possible implementation, the coordinate conversion matrix between the pan-tilt base coordinate system and the camera coordinate system is:
T_pan→cam = Rot_z(θ)·Rot_y(β)·Trans_z(h)·Trans_x(d)·Rot_x(α)·Trans_z(s)
wherein T_pan→cam is the coordinate conversion matrix, θ is the angle by which the pan-tilt base coordinate system is rotated about the z-axis of the pan-tilt base coordinate system in the conversion from the pan-tilt base coordinate system to the first coordinate system, β is the angle of rotation about the y-axis of the first coordinate system in the conversion from the first coordinate system to the second coordinate system, h is the displacement translated along the z-axis of the second coordinate system in the conversion from the second coordinate system to the third coordinate system, d is the displacement translated along the x-axis of the third coordinate system in the conversion from the third coordinate system to the fourth coordinate system, α is the angle of rotation about the x-axis of the fourth coordinate system in the conversion from the fourth coordinate system to the fifth coordinate system, and s is the displacement translated along the z-axis of the fifth coordinate system in the conversion from the fifth coordinate system to the camera coordinate system; the process of converting the pan-tilt base coordinate system into the camera coordinate system comprises: converting from the pan-tilt base coordinate system to the first coordinate system, from the first coordinate system to the second coordinate system, from the second coordinate system to the third coordinate system, from the third coordinate system to the fourth coordinate system, from the fourth coordinate system to the fifth coordinate system, and from the fifth coordinate system to the camera coordinate system.
In one possible implementation, the coordinate offset includes a lateral offset and a longitudinal offset between the coordinate origin of the mechanical arm base coordinate system and the coordinate origin of the pan-tilt base coordinate system, the pan-tilt angle includes a horizontal angle and a pitch angle, and the relationship model is:
T_base→cam = Trans_x(x)·Trans_y(y)·Rot_z(θ)·Rot_y(β)·Trans_z(h)·Trans_x(d)·Rot_x(α)·Trans_z(s)
wherein T_base→cam is the hand-eye matrix, θ is the angle by which the pan-tilt base coordinate system is rotated about the z-axis of the pan-tilt base coordinate system in the conversion from the pan-tilt base coordinate system to the first coordinate system, β is the angle of rotation about the y-axis of the first coordinate system in the conversion from the first coordinate system to the second coordinate system, h is the displacement translated along the z-axis of the second coordinate system in the conversion from the second coordinate system to the third coordinate system, d is the displacement translated along the x-axis of the third coordinate system in the conversion from the third coordinate system to the fourth coordinate system, α is the angle of rotation about the x-axis of the fourth coordinate system in the conversion from the fourth coordinate system to the fifth coordinate system, s is the displacement translated along the z-axis of the fifth coordinate system in the conversion from the fifth coordinate system to the camera coordinate system, x is the lateral offset, and y is the longitudinal offset; the process of converting the pan-tilt base coordinate system into the camera coordinate system comprises: converting from the pan-tilt base coordinate system to the first coordinate system, from the first coordinate system to the second coordinate system, from the second coordinate system to the third coordinate system, from the third coordinate system to the fourth coordinate system, from the fourth coordinate system to the fifth coordinate system, and from the fifth coordinate system to the camera coordinate system.
In one possible implementation, the apparatus further includes:
the transformation module is used for carrying out matrix transformation on the relation model to obtain a relation model of camera coordinates and the cradle head angle, wherein the relation model is as follows:
cam_x=s·(sinθ·sinα+cosθsinβcosα)+dcosθcosβ+x
cam_y=s·(-cosθ·sinα+sinθsinβcosα)+dsinθcosβ+y
cam_z=s·cosαcosβ+h
wherein, (cam_x, cam_y, cam_z) is a coordinate of the camera coordinate system corresponding to the coordinate of the mechanical arm coordinate system, alpha is a pitching angle of the pan-tilt angles, theta is a horizontal angle of the pan-tilt angles, beta is an installation angle of the camera on the pan-tilt, x is a lateral offset of the coordinate offsets, y is a longitudinal offset of the coordinate offsets, and h is a displacement required to translate along a z-axis of the pan-tilt base coordinate system from the pan-tilt base coordinate system to the camera coordinate system.
Referring to fig. 5, a schematic diagram of a hand-eye calibration device provided in an embodiment of the present application is shown, where the device is applied to a terminal device, the terminal device includes a pan-tilt and a mechanical arm, a camera is installed on the pan-tilt, and the device may specifically include a target relationship model determining module 51 and a target hand-eye matrix calibration module 52, where:
The target relationship model determining module 51 is configured to determine a target relationship model of the terminal device, where the target relationship model is a relationship model between a pan-tilt angle of the terminal device and a hand-eye matrix, and the hand-eye matrix is used to characterize a coordinate conversion relationship from a mechanical arm base coordinate system to the camera coordinate system;
the target hand-eye matrix calibration module 52 is configured to input a current pan-tilt angle into the target relation model to obtain a current target hand-eye matrix, where the target hand-eye matrix is used to represent a coordinate conversion relationship from the mechanical arm base coordinate system to the camera coordinate system under the current pan-tilt angle;
the target relation model is established according to the method for determining the relation model of the hand-eye matrix.
In one possible implementation manner, the target relationship model determining module 51 includes:
the initial calibration sub-module is used for determining at least two initial hand-eye matrixes under a plurality of different initial cradle head angles;
the relation determining submodule is used for obtaining a plurality of corresponding relation formulas based on a relation model of the hand-eye matrix and a plurality of initial hand-eye matrices, the relation model comprises a plurality of coefficients to be determined, and the relation formulas are used for representing the relation between the coefficients to be determined and the holder angles;
A coefficient value calculation sub-module for calculating coefficient values of a plurality of the undetermined coefficients based on a plurality of the relational expressions;
the target relation model determining submodule is used for substituting the coefficient value into the relation model to obtain the target relation model;
the relationship model is a relationship model of the hand-eye matrix established by a determination method of the relationship model of the hand-eye matrix.
In one possible implementation manner, the relational expression determination submodule includes:
the conversion relation determining unit is used for converting the relation model to obtain a plurality of conversion relations about the undetermined coefficient;
and a relation determining unit configured to convert a plurality of the initial hand-eye matrices into a corresponding plurality of relation formulas regarding the undetermined coefficient based on a plurality of the conversion relations.
In one possible implementation, the plurality of conversion relationships includes:
roll=atan2(sinα,cosα)=α
pitch=atan2(sinβ,cosβ)=β
yaw=atan2(sinθ,cosθ)=θ
cam_x=s·(sinθ·sinα+cosθsinβcosα)+dcosθcosβ+x
cam_y=s·(-cosθ·sinα+sinθsinβcosα)+dsinθcosβ+y
cam_z=s·cosαcosβ+h
wherein, (cam_x, cam_y, cam_z) is a coordinate of the camera coordinate system corresponding to the coordinate in the mechanical arm coordinate system, alpha is a pitching angle of the pan-tilt angle, beta is an installation angle of the camera on the pan-tilt, theta is a horizontal angle of the pan-tilt angle, (roll, pitch, yaw) is a pose of the camera in the mechanical arm base coordinate system, x, y, h, d and s are the undetermined coefficients.
In one possible implementation manner, the coefficient value calculation submodule includes:
the correcting unit is used for correcting the initial cradle head angle;
an equation establishing unit for determining a plurality of equations for solving the coefficient values according to the plurality of conversion relations and the corrected initial pan/tilt angles;
and the solving unit is used for solving a plurality of equations to obtain the coefficient value.
In one possible implementation manner, the initial pan/tilt angle includes a horizontal angle, an installation angle, and a pitch angle, and the correction unit includes:
the correcting subunit is configured to correct the pan-tilt angle according to the following formula:
wherein θ_input is the pitch angle, α_input is the horizontal angle, pitch is the installation angle, α is the corrected pitch angle, β is the corrected installation angle, θ is the corrected horizontal angle, N is the number of the initial hand-eye matrices, and N is greater than or equal to 2.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference should be made to the description of the method embodiments.
Fig. 6 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 6, the terminal device 6 of this embodiment includes: at least one processor 60 (only one shown in fig. 6), a memory 61 and a computer program 62 stored in the memory 61 and executable on the at least one processor 60, the processor 60 implementing the steps in any of the various method embodiments described above when executing the computer program 62.
The terminal device 6 may be a robot or other intelligent device. The terminal device may include, but is not limited to, a processor 60, a memory 61. It will be appreciated by those skilled in the art that fig. 6 is merely an example of the terminal device 6 and is not meant to be limiting as to the terminal device 6, and may include more or fewer components than shown, or may combine certain components, or different components, such as may also include input-output devices, network access devices, etc.
The processor 60 may be a central processing unit (Central Processing Unit, CPU), the processor 60 may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), off-the-shelf programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 61 may in some embodiments be an internal storage unit of the terminal device 6, such as a hard disk or a memory of the terminal device 6. The memory 61 may in other embodiments also be an external storage device of the terminal device 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 6. Further, the memory 61 may also include both an internal storage unit and an external storage device of the terminal device 6. The memory 61 is used for storing an operating system, application programs, boot loader (BootLoader), data, other programs, etc., such as program codes of the computer program. The memory 61 may also be used for temporarily storing data that has been output or is to be output.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the various method embodiments described above.
Embodiments of the present application also provide a computer program product which, when run on a terminal device, causes the terminal device to perform the steps of the respective method embodiments described above.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application and are intended to be included in the scope of the present application.
Claims (13)
1. A method for determining a relational model of a hand-eye matrix, comprising:
determining a coordinate offset between a coordinate origin of a mechanical arm base coordinate system and a coordinate origin of a pan-tilt base coordinate system, and determining a coordinate conversion matrix between the pan-tilt base coordinate system and a camera coordinate system, wherein the pan-tilt base coordinate system is a coordinate system established with the rotation-axis center of the pan-tilt base as its coordinate origin, the camera coordinate system is a coordinate system established with the focusing center of the camera as its coordinate origin, and the mechanical arm base coordinate system is a coordinate system established with the rotation-axis center of the mechanical arm base as its coordinate origin;
and determining a relation model of a hand-eye matrix according to the coordinate offset and the coordinate conversion matrix, wherein the relation model represents the relation between the hand-eye matrix and a pan-tilt angle, the pan-tilt angle is an angle of the pan-tilt relative to the pan-tilt base in different planes, and the hand-eye matrix is used for representing the coordinate conversion relation from the mechanical arm base coordinate system to the camera coordinate system.
2. The method of claim 1, wherein the coordinate conversion matrix between the pan-tilt base coordinate system and the camera coordinate system is:
wherein, in the coordinate conversion matrix, θ is the angle by which the pan-tilt base coordinate system is rotated about its z-axis during the conversion from the pan-tilt base coordinate system to a first coordinate system, β is the angle of rotation about the y-axis of the first coordinate system during the conversion from the first coordinate system to a second coordinate system, h is the displacement of the translation along the z-axis of the second coordinate system during the conversion from the second coordinate system to a third coordinate system, d is the displacement of the translation along the x-axis of the third coordinate system required for the conversion from the third coordinate system to a fourth coordinate system, α is the angle of rotation about the x-axis of the fourth coordinate system during the conversion from the fourth coordinate system to a fifth coordinate system, and s is the displacement of the translation along the z-axis of the fifth coordinate system during the conversion from the fifth coordinate system to the camera coordinate system; the process of converting the pan-tilt base coordinate system into the camera coordinate system comprises: converting from the pan-tilt base coordinate system to the first coordinate system, from the first coordinate system to the second coordinate system, from the second coordinate system to the third coordinate system, from the third coordinate system to the fourth coordinate system, from the fourth coordinate system to the fifth coordinate system, and from the fifth coordinate system to the camera coordinate system.
3. The method of claim 1, wherein the coordinate offset comprises a lateral offset and a longitudinal offset between the coordinate origin of the mechanical arm base coordinate system and the coordinate origin of the pan-tilt base coordinate system, the pan-tilt angle comprises a horizontal angle and a pitch angle, and the relation model is:
wherein, in the hand-eye matrix, θ is the angle by which the pan-tilt base coordinate system is rotated about its z-axis during the conversion from the pan-tilt base coordinate system to a first coordinate system, β is the angle of rotation about the y-axis of the first coordinate system during the conversion from the first coordinate system to a second coordinate system, h is the displacement of the translation along the z-axis of the second coordinate system during the conversion from the second coordinate system to a third coordinate system, d is the displacement of the translation along the x-axis of the third coordinate system during the conversion from the third coordinate system to a fourth coordinate system, α is the angle of rotation about the x-axis of the fourth coordinate system during the conversion from the fourth coordinate system to a fifth coordinate system, s is the displacement of the translation along the z-axis of the fifth coordinate system during the conversion from the fifth coordinate system to the camera coordinate system, x is the lateral offset, and y is the longitudinal offset; the process of converting the pan-tilt base coordinate system into the camera coordinate system comprises: converting from the pan-tilt base coordinate system to the first coordinate system, from the first coordinate system to the second coordinate system, from the second coordinate system to the third coordinate system, from the third coordinate system to the fourth coordinate system, from the fourth coordinate system to the fifth coordinate system, and from the fifth coordinate system to the camera coordinate system.
4. A method according to any one of claims 1-3, wherein after the step of determining a relation model of the hand-eye matrix from the coordinate offset and the coordinate conversion matrix, the method further comprises:
performing a matrix transformation on the relation model to obtain a relation model between the camera coordinates and the pan-tilt angle, wherein the relation model is:
cam_x = s·(sin θ·sin α + cos θ·sin β·cos α) + d·cos θ·cos β + x
cam_y = s·(-cos θ·sin α + sin θ·sin β·cos α) + d·sin θ·cos β + y
cam_z = s·cos α·cos β + h
wherein (cam_x, cam_y, cam_z) are the coordinates of the camera coordinate system expressed in the mechanical arm base coordinate system, α is the pitch angle among the pan-tilt angles, θ is the horizontal angle among the pan-tilt angles, β is the installation angle of the camera on the pan-tilt, x is the lateral offset among the coordinate offsets, y is the longitudinal offset among the coordinate offsets, and h is the displacement required to translate along the z-axis of the pan-tilt base coordinate system during the conversion from the pan-tilt base coordinate system to the camera coordinate system.
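A direct transcription of the three expressions of claim 4 into Python (the function name and argument order are illustrative; angles are in radians):

```python
import math

def camera_position(theta, alpha, beta, x, y, h, d, s):
    """Camera coordinates in the mechanical arm base frame, transcribed from
    the cam_x / cam_y / cam_z expressions of claim 4."""
    cam_x = (s * (math.sin(theta) * math.sin(alpha)
                  + math.cos(theta) * math.sin(beta) * math.cos(alpha))
             + d * math.cos(theta) * math.cos(beta) + x)
    cam_y = (s * (-math.cos(theta) * math.sin(alpha)
                  + math.sin(theta) * math.sin(beta) * math.cos(alpha))
             + d * math.sin(theta) * math.cos(beta) + y)
    cam_z = s * math.cos(alpha) * math.cos(beta) + h
    return cam_x, cam_y, cam_z
```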
5. A hand-eye calibration method, characterized in that the method is applied to a terminal device, the terminal device comprises a pan-tilt and a mechanical arm, a camera is mounted on the pan-tilt, and the method comprises:
determining a target relation model of the terminal device, wherein the target relation model is a relation model between a pan-tilt angle of the terminal device and a hand-eye matrix, and the hand-eye matrix is used for representing the coordinate conversion relation from a mechanical arm base coordinate system to a camera coordinate system;
inputting a current pan-tilt angle into the target relation model to obtain a current target hand-eye matrix, wherein the target hand-eye matrix is used for representing the coordinate conversion relation from the mechanical arm base coordinate system to the camera coordinate system at the current pan-tilt angle;
wherein the target relation model is built according to the method of any one of claims 1-4.
6. The method of claim 5, wherein the determining the target relation model of the terminal device comprises:
determining at least two initial hand-eye matrices at a plurality of different initial pan-tilt angles;
obtaining a plurality of corresponding relational expressions based on the relation model of the hand-eye matrix and the plurality of initial hand-eye matrices, wherein the relation model comprises a plurality of undetermined coefficients, and the relational expressions are used for representing the relation between the undetermined coefficients and the pan-tilt angles;
calculating coefficient values of the plurality of undetermined coefficients based on the plurality of relational expressions;
substituting the coefficient values into the relation model to obtain the target relation model;
wherein the relation model is a relation model of a hand-eye matrix established by the method of any one of claims 1-4.
7. The method of claim 6, wherein the obtaining the plurality of corresponding relational expressions based on the relation model of the hand-eye matrix and the plurality of initial hand-eye matrices comprises:
converting the relation model to obtain a plurality of conversion relations about the undetermined coefficients;
converting the plurality of initial hand-eye matrices into the corresponding plurality of relational expressions about the undetermined coefficients based on the plurality of conversion relations.
8. The method of claim 7, wherein the plurality of conversion relations comprises:
roll = atan2(sin α, cos α) = α
pitch = atan2(sin β, cos β) = β
yaw = atan2(sin θ, cos θ) = θ
cam_x = s·(sin θ·sin α + cos θ·sin β·cos α) + d·cos θ·cos β + x
cam_y = s·(-cos θ·sin α + sin θ·sin β·cos α) + d·sin θ·cos β + y
cam_z = s·cos α·cos β + h
wherein (cam_x, cam_y, cam_z) are the coordinates of the camera coordinate system expressed in the mechanical arm base coordinate system, α is the pitch angle among the pan-tilt angles, β is the installation angle of the camera on the pan-tilt, θ is the horizontal angle among the pan-tilt angles, (roll, pitch, yaw) is the pose of the camera in the mechanical arm base coordinate system, and x, y, h, d and s are the undetermined coefficients.
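The three atan2 relations above simply return each angle wrapped to its principal value, tying the pose (roll, pitch, yaw) read from an initial hand-eye matrix back to the pan-tilt angles; the positional relations are the same cam_x/cam_y/cam_z expressions transcribed after claim 4. A minimal transcription of the angle relations only (function name illustrative):

```python
import math

def camera_pose_angles(theta, alpha, beta):
    """Angle relations of claim 8: each atan2(sin, cos) form returns the
    angle itself, wrapped to its principal value."""
    roll = math.atan2(math.sin(alpha), math.cos(alpha))   # = alpha
    pitch = math.atan2(math.sin(beta), math.cos(beta))    # = beta
    yaw = math.atan2(math.sin(theta), math.cos(theta))    # = theta
    return roll, pitch, yaw
```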
9. The method of claim 7 or 8, wherein the calculating coefficient values of the plurality of undetermined coefficients based on the plurality of relational expressions comprises:
correcting the initial pan-tilt angle;
determining a plurality of equations for solving the coefficient values according to the plurality of conversion relations and the corrected initial pan-tilt angle;
and solving the plurality of equations to obtain the coefficient values.
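Claim 9 does not prescribe a particular solver, so the following is only one possible sketch: treating the corrected angles (θ, α, β) as known, the cam_x/cam_y/cam_z relations of claim 8 are linear in the undetermined coefficients x, y, h, d and s, and N ≥ 2 measurements give 3N ≥ 6 equations that can be solved by ordinary least squares. The function name and the way measurements are stacked are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def fit_coefficients(angles, cam_positions):
    """Solve for the undetermined coefficients (x, y, h, d, s) from N >= 2
    measurements. `angles` is a list of corrected (theta, alpha, beta) tuples;
    `cam_positions` holds the matching (cam_x, cam_y, cam_z) values (e.g. the
    translation part of each initial hand-eye matrix). Each measurement gives
    three equations that are linear in (x, y, h, d, s)."""
    rows, rhs = [], []
    for (theta, alpha, beta), (cx, cy, cz) in zip(angles, cam_positions):
        st, ct = np.sin(theta), np.cos(theta)
        sa, ca = np.sin(alpha), np.cos(alpha)
        sb, cb = np.sin(beta), np.cos(beta)
        #            x    y    h    d         s
        rows.append([1.0, 0.0, 0.0, ct * cb,  st * sa + ct * sb * ca]); rhs.append(cx)
        rows.append([0.0, 1.0, 0.0, st * cb, -ct * sa + st * sb * ca]); rhs.append(cy)
        rows.append([0.0, 0.0, 1.0, 0.0,      ca * cb]);                rhs.append(cz)
    coeffs, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    x, y, h, d, s = coeffs
    return x, y, h, d, s
```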
10. The method of claim 9, wherein the initial pan-tilt angle comprises a horizontal angle, an installation angle, and a pitch angle, and wherein the correcting the initial pan-tilt angle comprises:
correcting the pan-tilt angle by the following formula:
wherein θ_input is the pitch angle, α_input is the horizontal angle, pitch is the installation angle, α is the corrected pitch angle, β is the corrected installation angle, θ is the corrected horizontal angle, N is the number of the initial hand-eye matrices, and N is greater than or equal to 2.
11. A robot, characterized in that the robot determines a relation model of a hand-eye matrix and a pan-tilt angle by the method according to any one of claims 1-4, and performs hand-eye calibration by the method according to any one of claims 5-10.
12. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1-4 or 5-10 when executing the computer program.
13. A computer readable storage medium storing a computer program, which when executed by a processor implements the method of any one of claims 1-4 or 5-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310125623.0A CN116175569A (en) | 2023-02-03 | 2023-02-03 | Method for determining relation model of hand-eye matrix, hand-eye calibration method and equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116175569A true CN116175569A (en) | 2023-05-30 |
Family
ID=86436058
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310125623.0A Pending CN116175569A (en) | 2023-02-03 | 2023-02-03 | Method for determining relation model of hand-eye matrix, hand-eye calibration method and equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116175569A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117338329A (en) * | 2023-09-11 | 2024-01-05 | 合肥合滨智能机器人有限公司 | Remote automatic ultrasonic scanning robot eye-seeing hand control device and method |
CN117338329B (en) * | 2023-09-11 | 2024-07-30 | 合肥合滨智能机器人有限公司 | Remote automatic ultrasonic scanning robot eye-seeing hand control device and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||