CN115741666A - Robot hand-eye calibration method, robot and robot operation method - Google Patents

Robot hand-eye calibration method, robot and robot operation method

Info

Publication number
CN115741666A
CN115741666A
Authority
CN
China
Prior art keywords
calibration
hand
robot
eye
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211058016.9A
Other languages
Chinese (zh)
Inventor
廖伟东
张兆彪
韦卓光
高建文
李俊渊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qianhai Ruiji Technology Co ltd
China International Marine Containers Group Co Ltd
CIMC Containers Holding Co Ltd
Original Assignee
Shenzhen Qianhai Ruiji Technology Co ltd
China International Marine Containers Group Co Ltd
CIMC Containers Holding Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Qianhai Ruiji Technology Co ltd, China International Marine Containers Group Co Ltd, CIMC Containers Holding Co Ltd filed Critical Shenzhen Qianhai Ruiji Technology Co ltd
Priority to CN202211058016.9A priority Critical patent/CN115741666A/en
Publication of CN115741666A publication Critical patent/CN115741666A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

The invention discloses a robot hand-eye calibration method and device, robot hand-eye calibration equipment, a computer-readable storage medium, a robot, and a robot operation method. In this scheme, after the hand-eye matrix is obtained, it is further corrected: several calibration points are selected on the calibration plate image and converted with the obtained hand-eye matrix into the tool coordinate system of the robot, giving the coordinate set of the image calibration points under the tool coordinate system; the end tool is controlled to contact the corresponding calibration points on the calibration plate, giving the coordinate set of the actual calibration points under the tool coordinate system; the two coordinate sets are then registered with the iterative nearest neighbor algorithm to obtain a compensation matrix; finally, the hand-eye matrix is corrected with the compensation matrix. The corrected hand-eye matrix has higher precision, so the robot can meet the requirements of high-precision processing.

Description

Robot hand-eye calibration method, robot and robot operation method
Technical Field
The invention relates to the technical field of robots, in particular to a robot hand-eye calibration method, a robot hand-eye calibration device, a computer readable storage medium, a robot and a robot operation method.
Background
With the rapid development of robot technology and three-dimensional vision technology and the upgrading of industrial intelligent manufacturing, robot vision is increasingly applied in scenes such as industrial production and the service industry. Guiding robot operation through a vision system is an important means of making robot operation intelligent.
The hand-eye matrix establishes the relation between the robot coordinate system and the camera coordinate system, and robot hand-eye calibration has been well researched. However, current hand-eye calibration algorithms are severely limited by conditions such as robot precision, camera precision and photographing posture, so the hand-eye matrix obtained by calibration carries a certain error and cannot meet the requirements of high-precision processing such as robot welding, cutting and gluing.
Disclosure of Invention
The invention provides a robot hand-eye calibration method, a robot hand-eye calibration device, a computer readable storage medium, a robot and a robot operation method, and aims to solve the problem that a robot cannot meet high-precision processing requirements due to hand-eye matrix errors.
According to one aspect of the embodiments of the present invention, a robot hand-eye calibration method is disclosed, which comprises:
acquiring a hand-eye matrix; the hand-eye matrix represents a transformation relationship of a camera coordinate system with respect to a tool coordinate system of the robot, the tool coordinate system being used to define the center position and the pose of the end tool;
acquiring a calibration plate image acquired by the camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points to a tool coordinate system of the robot by using the hand-eye matrix to obtain a first coordinate set;
controlling the end tool to contact a plurality of second calibration points on a calibration plate, and acquiring coordinates of the second calibration points under the tool coordinate system to acquire a second coordinate set, wherein the second calibration points and the first calibration points are a plurality of same position points on the calibration plate;
registering the first coordinate set and the second coordinate set by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix;
and correcting the hand-eye matrix by adopting the compensation matrix to obtain a corrected hand-eye matrix.
In an exemplary embodiment, the obtaining the coordinates of the second calibration points in the tool coordinate system obtains a second coordinate set, including:
recording coordinates of the second calibration points under a base coordinate system of the robot;
and according to the pose of the camera when the calibration plate image is acquired, converting the coordinates of the second calibration points under the base coordinate system into the coordinates under the tool coordinate system to obtain a second coordinate set.
In an exemplary embodiment, the transforming the coordinates of the second calibration points in the base coordinate system to the tool coordinate system according to the pose of the camera when the calibration plate image is captured, obtaining the second coordinate set, includes:
multiplying the coordinates of the second calibration points in the base coordinate system by pose data of the calibration plate image acquired by the camera;
and taking the result obtained by the multiplication operation as the second coordinate set.
In an exemplary embodiment, the modifying the hand-eye matrix with the compensation matrix to obtain a modified hand-eye matrix includes:
multiplying the compensation matrix and the hand-eye matrix;
and taking the result obtained by the multiplication operation as the corrected hand-eye matrix.
In an exemplary embodiment, said selecting a plurality of first calibration points on said calibration board image comprises:
selecting at least three first calibration points on the calibration plate image, wherein the at least three first calibration points are not on the same straight line;
said controlling the end tool to contact a plurality of second calibration points on the calibration plate, including:
controlling the end tool to contact at least three second calibration points on a calibration plate, the at least three second calibration points not being on the same line;
the at least three second calibration points correspond to the at least three first calibration points one to one.
In an exemplary embodiment, said selecting a number of first calibration points on said calibration plate image comprises:
selecting four first calibration points on the calibration plate image;
said controlling said end tool to contact a plurality of second calibration points on a calibration plate comprising:
and controlling the end tool to contact four second calibration points on the calibration plate.
In an exemplary embodiment, before acquiring the calibration board image captured by the camera, the method further includes:
controlling the robot to drive the camera to move to the position above the calibration plate;
and controlling the camera to acquire the image of the calibration plate.
In an exemplary embodiment, the acquiring the hand-eye matrix includes:
controlling the robot to drive the camera to move to the position above the calibration plate;
controlling the robot to have at least five different postures, acquiring calibration plate images acquired by the camera under the at least five different postures respectively, and recording pose data of the at least five different postures under a tool coordinate system;
and calculating to obtain the hand-eye matrix by adopting a Zhang Zhengyou calibration method.
According to an aspect of an embodiment of the present invention, there is disclosed a robot including: the robot comprises a robot body, a camera, a tail end tool and a controller, wherein the robot body is provided with a plurality of motion axes; the camera is arranged at the tail end of the robot body; the end tool is arranged at the end of the robot body; the controller is connected with the robot body, the camera and the end tool and is used for controlling the robot body, the camera and the end tool to execute the steps of the hand-eye calibration method.
According to an aspect of an embodiment of the present invention, a robot working method is disclosed, in which an end tool and a camera are provided at the end of the robot. The operation method comprises the following steps:
the robot hand-eye calibration is carried out by adopting the hand-eye calibration method;
controlling the camera to acquire a working object image of the robot;
performing image processing on the operation object image to obtain an operation position;
and controlling the end tool to move to the working position for working.
According to an aspect of the embodiments of the present invention, a robot hand-eye calibration device is disclosed, which includes:
the hand-eye matrix acquisition module is used for acquiring a hand-eye matrix;
the image coordinate acquisition module is used for acquiring a calibration plate image acquired by the camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points into a tool coordinate system of the robot by adopting the hand-eye matrix to obtain a first coordinate set;
the actual coordinate acquisition module is used for controlling the tail end tool to contact a plurality of second calibration points on the calibration plate, acquiring coordinates of the second calibration points under the tool coordinate system and acquiring a second coordinate set, wherein the second calibration points and the first calibration points are a plurality of same position points on the calibration plate;
the registration module is used for registering the first coordinate set and the second coordinate set by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix;
and the correction module is used for correcting the hand-eye matrix by adopting the compensation matrix to obtain a corrected hand-eye matrix.
According to an aspect of the embodiments of the present invention, a robot hand-eye calibration apparatus is disclosed, the robot hand-eye calibration apparatus including:
one or more processors;
a memory for storing one or more programs that, when executed by the one or more processors, cause the robot hand-eye calibration apparatus to implement the aforementioned robot hand-eye calibration method.
According to an aspect of the embodiments of the present invention, a computer-readable storage medium is disclosed, which stores computer-readable instructions, which, when executed by a processor of a computer, cause the computer to execute the aforementioned robot hand-eye calibration method.
The technical scheme provided by the embodiment of the invention at least comprises the following beneficial effects:
according to the technical scheme provided by the invention, after the hand-eye matrix is obtained, the hand-eye matrix is corrected, a plurality of calibration points on the calibration plate image are obtained, the obtained hand-eye matrix is adopted to convert the calibration points into a tool coordinate system of a robot, and a coordinate set of the image calibration points in the tool coordinate system is obtained; controlling the end tool to contact a plurality of calibration points on the calibration plate, and acquiring a coordinate set of the actual calibration points in a tool coordinate system; then, registering the coordinate set of the image calibration point under the tool coordinate system and the coordinate set of the actual calibration point under the tool coordinate system by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix; and finally, correcting the hand-eye matrix by adopting the compensation matrix, wherein the corrected hand-eye matrix has higher precision, and the robot can meet the requirement of high-precision processing.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a block diagram of a robot shown in an exemplary embodiment.
FIG. 2 is a flow chart illustrating a method for robot hand-eye calibration in accordance with an exemplary embodiment.
Fig. 3 is a detailed flowchart of step S101 in the corresponding embodiment of fig. 2.
Fig. 4 is a detailed flowchart of step S102 in the corresponding embodiment of fig. 2.
FIG. 5 is a schematic illustration of a first calibration point on a calibration plate image according to an exemplary embodiment.
Fig. 6 is a detailed flowchart of step S103 in the corresponding embodiment of fig. 2.
Fig. 7 is a detailed flowchart of step S1033 in the corresponding embodiment of fig. 6.
FIG. 8 is a diagram of a second calibration point on a calibration plate in accordance with an exemplary embodiment.
Fig. 9 is a detailed flowchart of step S105 in the corresponding embodiment of fig. 2.
FIG. 10 is a flowchart illustrating a method of robot operation, according to an exemplary embodiment.
FIG. 11 is a block diagram illustrating a robotic hand-eye calibration device in accordance with an exemplary embodiment.
The reference numerals are explained below:
300. a robot; 310. a robot body; 320. a camera; 330. a tip tool; 200. a robot hand-eye calibration device; 210. a hand-eye matrix acquisition module; 220. an image coordinate acquisition module; 230. an actual coordinate acquisition module; 240. a registration module; 250. and a correction module.
Detailed Description
While this invention is susceptible of embodiment in different forms, specific embodiments thereof are shown in the drawings and will herein be described in detail, with the understanding that the present description is to be considered an exemplification of the principles of the invention and is not intended to limit the invention to what is illustrated.
Furthermore, the terms "comprising," "having," and any variations thereof, as referred to in the description of the invention, are intended to cover non-exclusive inclusions. Such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to only those steps or modules recited, but may alternatively include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more features.
In the description of the present invention, the meaning of "a plurality" means two or more unless otherwise specified.
It should be noted that in the embodiments of the present invention, the word "exemplary" is used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" in embodiments of the invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs; rather, use of the word "exemplary" is intended to present relevant concepts in a concrete fashion.
Exemplary embodiments will be described in detail below. The implementations described in the following exemplary examples do not represent all implementations consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the summary of the invention.
FIG. 1 illustrates a robot architecture diagram in accordance with an exemplary embodiment. As shown in fig. 1, the robot 300 includes a robot body 310, a camera 320, and an end tool 330, the camera 320 and the end tool 330 being disposed at an end of the robot body 310; the robot body 310 drives the camera 320 and the end tool 330 to move, so that the camera 320 is used to capture an image of a work object at a short distance, a work position is obtained by analyzing the image of the work object, and the end tool 330 is further driven to move to the work position for work.
It can be understood that the robot 300 further includes a controller, the controller is connected to the robot body 310, the camera 320 and the end tool 330 to control the movement of the robot body 310 and control the camera 320 to capture an image of a work object, and the controller further analyzes the image of the work object to obtain a work position, so as to control the robot body 310 to move and drive the end tool 330 to move to the work position for performing a work. It is understood that the controller may be built in the robot body 310, or may be disposed outside the robot body 310.
The end tool 330 may be a welding gun and the object to be welded is the object to be welded, in which case the robot 300 is a welding robot. The end tool 330 may be a glue gun, and the object to be glued is the object to be glued, and the robot 300 is a gluing robot. The end tool 330 may also be a cutter, and the work object is the object to be cut, in which case the robot 300 is a cutting robot. Of course, the end tool 330 may be other tools that can be disposed at the end of the robot 300 and driven by the robot 300 to perform work, and is not limited to the welding gun, the glue gun, the cutter, and the like.
The embodiment of the invention provides a robot hand-eye calibration method. The hand-eye calibration method comprises the following steps:
acquiring a hand-eye matrix;
acquiring a calibration plate image acquired by a camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points into a tool coordinate system of a robot by adopting a hand-eye matrix to obtain a first coordinate set;
controlling the tail end tool to contact a plurality of second calibration points on the calibration plate, acquiring coordinates of the second calibration points under a tool coordinate system, and acquiring a second coordinate set, wherein the second calibration points and the first calibration points are a plurality of same position points on the calibration plate;
registering the first coordinate set and the second coordinate set by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix;
and correcting the hand-eye matrix by adopting the compensation matrix to obtain a corrected hand-eye matrix.
According to the technical scheme provided by the embodiment of the invention, after the hand-eye matrix is obtained, it is further corrected: several calibration points on the calibration plate image are obtained and converted with the obtained hand-eye matrix into the tool coordinate system of the robot, giving the coordinate set of the image calibration points under the tool coordinate system; the end tool is controlled to contact the corresponding calibration points on the calibration plate, giving the coordinate set of the actual calibration points under the tool coordinate system; the two coordinate sets are then registered with the iterative nearest neighbor algorithm to obtain a compensation matrix; finally, the hand-eye matrix is corrected with the compensation matrix. The corrected hand-eye matrix has higher precision, so the robot can meet the requirements of high-precision processing.
The embodiments of the present invention will be further explained in detail with reference to the drawings in the examples of the present specification.
As shown in fig. 2, an embodiment of the present invention provides a robot hand-eye calibration method, where the robot hand-eye calibration method includes a hand-eye matrix obtaining step and a hand-eye matrix correcting step, where the hand-eye matrix obtaining step includes step S101, and the hand-eye matrix correcting step includes steps S102 to S105.
S101, acquiring a hand-eye matrix.
It is understood that the hand-eye matrix represents the transformation of the camera coordinate system with respect to the tool coordinate system of the robot; it combines the translation and the rotation from the robot arm tip to the camera. In detail, the hand-eye matrix may be a 4 × 4 homogeneous square matrix.
In one exemplary embodiment, as shown in fig. 3, step S101 includes:
and S1011, controlling the robot to drive the camera to move to the upper part of the calibration plate. To take a calibration plate image by the camera.
In detail, the calibration plate is a checkerboard calibration plate.
And S1012, controlling the robot to take at least five different postures, acquiring calibration plate images acquired by the camera under at least five different postures respectively, and recording pose data of at least five different postures under a tool coordinate system.
And S1013, calculating to obtain a hand-eye matrix by adopting a Zhang Zhengyou calibration method.
It can be understood that the robot is controlled to obtain calibration plate images acquired by the camera under at least five different postures so as to be suitable for calculation by adopting a Zhang Zhengyou calibration method, so as to obtain a hand-eye matrix. In other embodiments, other calibration methods may be used to calculate to obtain the hand-eye matrix, and in this case, the robot is not limited to be controlled to obtain the calibration plate images acquired by the camera in at least five different poses.
Given the calibration plate images acquired under at least five different poses and the pose data of those poses under the tool coordinate system, the Zhang Zhengyou calibration method is adopted; how the hand-eye matrix is calculated from these is prior art and is not described herein again.
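The patent defers the calibration details to Zhang's method and prior art. For intuition, any hand-eye solver must satisfy the classical AX = XB relationship between relative tool motions and relative camera observations. The numpy sketch below (eye-in-hand setup, synthetic poses, all numeric values hypothetical) verifies that relationship rather than implementing a solver:

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def make_T(R, t):
    # Build a 4x4 homogeneous transform from a rotation and a translation.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Unknown hand-eye transform X (tool <- camera), chosen arbitrarily here.
X = make_T(rot_z(0.4) @ rot_x(0.2), [0.05, -0.02, 0.10])
# Fixed calibration-plate pose in the robot base frame.
W = make_T(rot_z(1.0), [0.60, 0.10, 0.00])
# Two robot (tool-in-base) poses; B_i is what the camera would observe.
G1 = make_T(rot_z(0.1) @ rot_x(0.3), [0.40, 0.00, 0.50])
G2 = make_T(rot_z(-0.2) @ rot_x(0.1), [0.35, 0.05, 0.55])
B1 = np.linalg.inv(X) @ np.linalg.inv(G1) @ W
B2 = np.linalg.inv(X) @ np.linalg.inv(G2) @ W

A = np.linalg.inv(G2) @ G1          # relative tool motion
B = B2 @ np.linalg.inv(B1)          # relative camera observation
print(np.allclose(A @ X, X @ B))    # True: X satisfies AX = XB
```

With noisy real poses the equality only holds approximately, which is why several (here, at least five) distinct robot postures are collected and the matrix is solved in a least-squares sense.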
S102, acquiring a calibration plate image acquired by a camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points into a tool coordinate system of the robot by using a hand-eye matrix to obtain a first coordinate set.
As can be appreciated, the tool coordinate system is used to define the center position of the end tool and the pose of the end tool.
In one exemplary embodiment, selecting a number of first calibration points on the calibration plate image includes: and selecting at least three first calibration points on the calibration plate image, wherein the at least three first calibration points are not on the same straight line. At least three first calibration points are selected, the obtained first coordinate set comprises a plurality of coordinate points, and the accuracy is higher when the registration is carried out by adopting an iterative nearest neighbor algorithm in the subsequent step S104. And the at least three first calibration points are not in the same line, the method can be applied to the registration of the rotation direction and the translation direction when the iterative nearest neighbor algorithm is adopted for registration.
In one exemplary embodiment, selecting a number of first calibration points on the calibration plate image includes: and selecting four first calibration points on the calibration board image. Four first calibration points are selected, the first coordinate set comprises more coordinate points, when the iterative nearest neighbor algorithm is adopted for registration in the subsequent step S104, the accuracy can be further improved, and meanwhile, the efficiency cannot be influenced due to large calculation amount caused by excessive coordinate points.
In an exemplary embodiment, as shown in fig. 4, step S102 includes:
and S1021, controlling the robot to drive the camera to move to the position above the calibration plate.
S1022, controlling the camera to collect the calibration plate image, and recording the pose of the current robot under the tool coordinate system, denoted T_pose (the 4 × 4 pose matrix at the moment the image is taken).
And S1023, acquiring the calibration board image acquired by the camera, and selecting a plurality of first calibration points on the calibration board image.
And S1024, converting the first calibration point to a tool coordinate system of the robot by adopting a hand-eye matrix to obtain a first coordinate set.
For example, as shown in fig. 5, four first calibration points P0, P1, P2 and P3 are selected, and their coordinates in the camera coordinate system are denoted P_i^cam (i = 0, 1, 2, 3).
In an exemplary embodiment, in step S1024, the first calibration points are converted into the tool coordinate system of the robot using the hand-eye matrix, denoted H. Denoting the coordinates of the four first calibration points under the tool coordinate system as P_i^tool, the conversion can be expressed as:
P_i^tool = H · P_i^cam, i = 0, 1, 2, 3
where P_i^tool represents the coordinates of the first calibration point in the tool coordinate system, H represents the hand-eye matrix, and P_i^cam represents the coordinates of the first calibration point in the camera coordinate system.
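The conversion of step S1024 is a standard homogeneous-coordinate transform. A minimal numpy sketch, where the hand-eye matrix values and point coordinates are hypothetical, not taken from the patent:

```python
import numpy as np

def to_tool_frame(H_hand_eye, pts_cam):
    """Transform Nx3 camera-frame points into the tool frame via a 4x4 hand-eye matrix."""
    pts_h = np.hstack([pts_cam, np.ones((len(pts_cam), 1))])  # homogeneous coordinates
    return (H_hand_eye @ pts_h.T).T[:, :3]

# Hypothetical hand-eye matrix: 30-degree rotation about z plus a small translation (metres).
theta = np.deg2rad(30)
H = np.eye(4)
H[:2, :2] = [[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]]
H[:3, 3] = [0.10, -0.05, 0.20]

# Four image calibration points expressed in the camera frame (hypothetical values).
P_cam = np.array([[0.0, 0.0, 0.5], [0.1, 0.0, 0.5], [0.0, 0.1, 0.5], [0.1, 0.1, 0.5]])
P_tool = to_tool_frame(H, P_cam)
print(P_tool.shape)  # (4, 3)
```

The first point sits on the camera axis, so it maps to the hand-eye translation plus its depth: (0.10, -0.05, 0.70).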
S103, controlling the end tool to contact a plurality of second calibration points on the calibration plate, and acquiring coordinates of the second calibration points in a tool coordinate system to acquire a second coordinate set. The second calibration points and the first calibration points are the same position points on the calibration plate.
In one exemplary embodiment, selecting a number of first calibration points on the calibration plate image includes: selecting at least three first calibration points on the calibration plate image, the at least three first calibration points not being on the same straight line. Correspondingly, controlling the end tool to contact a plurality of second calibration points on the calibration plate includes: controlling the end tool to contact at least three second calibration points on the calibration plate, the at least three second calibration points not being on the same straight line and corresponding one-to-one to the at least three first calibration points. With at least three first and second calibration points, the obtained first and second coordinate sets contain several coordinate points, which gives higher accuracy when registering with the iterative nearest neighbor algorithm in the subsequent step S104. And because the points are not on the same straight line, the registration can recover both the rotation and the translation components.
In one exemplary embodiment, selecting a number of first calibration points on the calibration plate image includes: selecting four first calibration points on the calibration plate image. Correspondingly, controlling the end tool to contact a plurality of second calibration points on the calibration plate includes: controlling the end tool to contact four second calibration points on the calibration plate. With four first and four second calibration points, the first and second coordinate sets contain more coordinate points, which further improves the accuracy of the registration in the subsequent step S104, while the number of points is not so large that the amount of calculation affects efficiency.
In an exemplary embodiment, as shown in fig. 6, step S103 includes:
and S1031, controlling the end tool to contact a plurality of second calibration points on the calibration plate.
And S1032, recording the coordinates of the plurality of second calibration points in the base coordinate system of the robot.
And S1033, converting the coordinates of the second calibration points under the base coordinate system into the coordinates under the tool coordinate system according to the pose of the camera when the camera collects the calibration plate image, and obtaining a second coordinate set.
In an exemplary embodiment, as shown in fig. 7, step S1033 includes:
S10331, multiplying the coordinates of the second calibration points in the base coordinate system by the pose data (specifically, the inverse of the pose matrix) recorded when the camera collected the calibration plate image in step S102.
S10332, the result obtained by the multiplication operation is taken as the second coordinate set.
For example, as shown in fig. 8, the robot is moved in sequence so that the tip of the end tool 330 contacts four second calibration points C0, C1, C2, and C3 on the calibration plate, the four second calibration points corresponding one-to-one to the four first calibration points P0, P1, P2, and P3. The coordinates of the four second calibration points C0, C1, C2, and C3 in the base coordinate system of the robot are denoted $^{base}P_{C_i}$ (i = 0, 1, 2, 3).

In step S1033, in an exemplary embodiment, the coordinates of the second calibration points in the base coordinate system are converted into the tool coordinate system according to the pose of the camera at the time of acquiring the calibration plate image in step S102. Denoting the coordinates of the four second calibration points C0, C1, C2, C3 in the tool coordinate system of the robot as $^{tool}P_{C_i}$, the conversion can be expressed as:

$$^{tool}P_{C_i} = T_{cap}^{-1} \cdot {}^{base}P_{C_i}, \quad i = 0, 1, 2, 3$$

where $^{tool}P_{C_i}$ represents the coordinates of the second calibration point in the tool coordinate system, $^{base}P_{C_i}$ represents the coordinates of the second calibration point in the base coordinate system, and $T_{cap}^{-1}$ represents the inverse of the pose matrix recorded when the camera acquired the calibration plate image in step S102.
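The base-to-tool conversion of step S1033 can be sketched in a few lines; the pose and point values below are purely illustrative, not data from the patent:

```python
import numpy as np

def base_to_tool(points_base, T_base_tool):
    """Convert homogeneous base-frame points into the tool frame by
    left-multiplying with the inverse of the recorded tool pose."""
    return (np.linalg.inv(T_base_tool) @ points_base.T).T

# Recorded pose of the tool in the base frame when the calibration plate
# image was captured: a pure translation, for illustration.
T_base_tool = np.eye(4)
T_base_tool[:3, 3] = [0.2, 0.0, 0.5]

# Four touched calibration points C0..C3 in homogeneous base-frame coordinates.
pts_base = np.array([[0.2, 0.0, 0.5, 1.0],
                     [0.3, 0.0, 0.5, 1.0],
                     [0.3, 0.1, 0.5, 1.0],
                     [0.2, 0.1, 0.5, 1.0]])
pts_tool = base_to_tool(pts_base, T_base_tool)
```

Here C0 happens to coincide with the tool origin, so its tool-frame coordinates come out as (0, 0, 0); in practice the recorded pose would include rotation as well.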
S104, registering the first coordinate set and the second coordinate set by using an iterative nearest neighbor algorithm to obtain a compensation matrix.
In detail, the second coordinate set is used as the source data and the first coordinate set as the data to be registered, and registration is performed to obtain the compensation matrix. It can be understood that the iterative nearest neighbor algorithm, commonly known as ICP (Iterative Closest Point), is a point cloud registration method: it takes two point clouds as input (the source data and the data to be registered) and outputs a transformation (the compensation matrix) that brings the data to be registered into the closest possible coincidence with the source data. The transformation may be rigid or non-rigid, and includes rotation and translation. How to perform registration with the iterative nearest neighbor algorithm is prior art and is not described again here.
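As an illustrative sketch only (not the patent's implementation): because the calibration points already come with one-to-one correspondences, a single ICP step reduces to the closed-form SVD (Kabsch) estimate of the rigid transform mapping the first coordinate set onto the second. The point values simulate a small hand-eye error:

```python
import numpy as np

def rigid_transform(src, dst):
    """SVD (Kabsch) estimate of the rigid transform mapping src onto dst;
    this is the closed-form step at the core of each ICP iteration.
    src and dst are (N, 3) arrays with known one-to-one correspondences."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)                                # homogeneous compensation matrix
    T[:3, :3] = R
    T[:3, 3] = dst_c - R @ src_c
    return T

# Second coordinate set (source data) and first coordinate set (data to be
# registered), offset by a small simulated hand-eye error.
second = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                   [0.1, 0.1, 0.0], [0.0, 0.1, 0.0]])
first = second + np.array([0.002, -0.001, 0.003])
T1 = rigid_transform(first, second)              # compensation matrix
```

With a pure translational error as above, the recovered compensation matrix is the inverse translation; a full ICP loop would additionally re-estimate correspondences between iterations.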
S105, correcting the hand-eye matrix by using the compensation matrix to obtain a corrected hand-eye matrix.
In an exemplary embodiment, as shown in fig. 9, step S105 includes:
S1051, multiplying the compensation matrix by the hand-eye matrix.
S1052, taking the result of the multiplication as the corrected hand-eye matrix.
For example, the compensation matrix obtained in step S104 is denoted $T_1$, and the hand-eye matrix obtained in step S101 is denoted $T_0$. The corrected hand-eye matrix $T_0'$ can then be calculated by the relation:

$$T_0' = T_1 \cdot T_0$$
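Numerically, steps S1051 to S1052 amount to one product of 4x4 homogeneous matrices; the matrix values below are purely illustrative:

```python
import numpy as np

T1 = np.eye(4)                        # compensation matrix from step S104
T1[:3, 3] = [-0.002, 0.001, -0.003]   # e.g. a small translational offset

T0 = np.eye(4)                        # hand-eye matrix from step S101
T0[:3, 3] = [0.05, 0.00, 0.10]        # illustrative camera-to-tool translation

T0_corrected = T1 @ T0                # corrected hand-eye matrix (step S105)
```

Note the order: the compensation matrix left-multiplies the hand-eye matrix, so the correction is applied after the original camera-to-tool transform.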
It should be noted that, in the foregoing embodiment, the sequential execution of steps S101 to S105 is merely exemplary, and in other embodiments, the execution order of partial steps may be adjusted. For example, in one embodiment, step S103 is performed first, then steps S101 and S102 are performed in sequence, and then steps S104 and S105 are performed in sequence.
In addition, in the foregoing embodiment, after step S101 is executed, step S102 again controls the robot to drive the camera above the calibration plate, controls the camera to capture a calibration plate image, and records the corresponding pose of the robot in the tool coordinate system. The calibration plate image acquired by the camera is then obtained, and a plurality of first calibration points are selected on it. It should be understood that, in an embodiment, the first calibration points may instead be extracted directly from one of the calibration plate images already obtained in step S1011, using the pose data recorded in step S1012 for that image; in this case, steps S1021 to S1023 need not be executed in step S102.
Referring to fig. 1, in order to implement the robot hand-eye calibration method according to an embodiment of the present invention, an embodiment of the present invention provides a robot 300. The robot 300 includes a robot body 310, a camera 320, an end tool 330, and a controller (not shown). The robot body 310 has a plurality of motion axes, and the camera 320 and the end tool 330 are disposed at the end of the robot body 310. The controller is connected to the robot body 310, the camera 320 and the end tool 330, and is configured to control them so that the robot 300 can perform all or part of the steps of the robot hand-eye calibration method shown in any one of fig. 2 to 4, 6 to 7 and 9.
For example, the robot body 310 has six axes of motion, i.e., the robot 300 is a six-axis robot.
By way of example, the end tool 330 is a welding gun.
FIG. 10 illustrates a flow chart of a method of robotic work of an exemplary embodiment. As shown in fig. 10, the robot working method includes the steps of:
S401, performing robot hand-eye calibration by using the hand-eye calibration method described above.
S402, controlling the camera 320 to capture a work object image of the robot.
S403, performing image processing on the work object image to obtain the work position.
S404, controlling the end tool 330 to move to the work position to perform the work.
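Steps S401 to S404 can be sketched as a simple detect-and-act loop. All class and function names below are illustrative stand-ins (assumptions), not APIs defined in this document:

```python
import numpy as np

class FakeCamera:
    """Stand-in for camera 320: returns a detected work-object position
    in homogeneous camera-frame coordinates (metres)."""
    def capture(self):
        return np.array([0.01, 0.02, 0.30, 1.0])

class FakeRobot:
    """Stand-in for the robot: records where the end tool was sent."""
    def __init__(self):
        self.tool_position = None
    def move_tool_to(self, p):
        self.tool_position = p[:3]

def run_job(robot, camera, hand_eye):
    obj_cam = camera.capture()        # S402: acquire the work-object observation
    obj_tool = hand_eye @ obj_cam     # S403: camera frame -> tool-frame work position
    robot.move_tool_to(obj_tool)      # S404: move the end tool there and work
    return obj_tool

hand_eye = np.eye(4)                  # S401: corrected hand-eye matrix (illustrative)
hand_eye[:3, 3] = [0.05, 0.0, 0.10]
robot = FakeRobot()
pos = run_job(robot, FakeCamera(), hand_eye)
```

In a real system the calibration of S401 would be performed once, and the capture/process/move cycle repeated per work object.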
Referring next to fig. 11, fig. 11 is a block diagram illustrating a robot hand-eye calibration apparatus 200 according to an exemplary embodiment, where the robot hand-eye calibration apparatus 200 may be applied to a robot to perform all or part of the steps of the robot hand-eye calibration method shown in any one of fig. 2 to 4, 6 to 7, and 9. As shown in fig. 11, the robot hand-eye calibration device 200 includes, but is not limited to: a hand-eye matrix acquisition module 210, an image coordinate acquisition module 220, an actual coordinate acquisition module 230, a registration module 240, and a correction module 250.
The hand-eye matrix obtaining module 210 is configured to obtain a hand-eye matrix.
The image coordinate obtaining module 220 is configured to obtain a calibration plate image collected by the camera, select a plurality of first calibration points on the calibration plate image, and convert the first calibration points into a tool coordinate system of the robot by using a hand-eye matrix, so as to obtain a first coordinate set.
The actual coordinate obtaining module 230 is configured to control the end tool to contact a plurality of second calibration points on the calibration board, obtain coordinates of the plurality of second calibration points in the tool coordinate system, and obtain a second coordinate set. The second calibration points and the first calibration points are the same position points on the calibration plate.
The registration module 240 is configured to perform registration on the first coordinate set and the second coordinate set by using an iterative nearest neighbor algorithm to obtain a compensation matrix.
The correcting module 250 is configured to correct the hand-eye matrix by using the compensation matrix to obtain a corrected hand-eye matrix.
The implementation of the functions and effects of each module in the robot hand-eye calibration apparatus 200 is described in detail in the corresponding steps of the robot hand-eye calibration method, and is not repeated here.
In order to implement the robot hand-eye calibration method provided by the embodiment of the invention, an embodiment of the invention provides a robot hand-eye calibration device, which can be integrated into a robot. The robot hand-eye calibration device comprises a processor and a memory, wherein the memory is used for storing one or more programs, and when the one or more programs are executed by the processor, the robot hand-eye calibration device is enabled to realize the robot hand-eye calibration method provided by the embodiment.
The specific manner in which the processor of the robot hand-eye calibration apparatus performs the operation in this embodiment has been described in detail in the embodiments related to the robot hand-eye calibration method, and will not be described in detail herein.
In detail, the processor may comprise one or more processing units, for example two CPUs. As an example, the robot hand-eye calibration apparatus may comprise a plurality of processors, for example two processors. Each of these processors may be a single-core processor or a multi-core processor. A processor here may refer to one or more devices, circuits, and/or processing cores configured to process data (e.g., computer program instructions).
The memory may be, but is not limited to, a read-only memory (ROM) or other type of static storage device capable of storing static information and instructions, a random access memory (RAM) or other type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
It will be appreciated that the memory may be self-contained, connected to the processor by a bus. The memory may also be integrated with the processor.
In an exemplary embodiment, the memory is used for storing computer-executable instructions corresponding to the software program of the present invention. The processor may implement various functions of the robotic eye calibration device by running or executing software program data stored in the memory.
An embodiment of the present invention also provides a storage medium, which is a computer-readable storage medium and may be, for example, a transitory or non-transitory computer-readable storage medium containing instructions. The storage medium stores computer-readable instructions that, when executed by a processor of a computer, cause the computer to perform the robot hand-eye calibration method described above.
An embodiment of the present invention further provides a computer program product which is directly loadable into a memory and contains software code; the computer program product can be loaded and executed by a computer to implement the robot hand-eye calibration method provided in the foregoing embodiments.
Those skilled in the art will recognize that, in one or more of the examples described above, the functions described in this invention may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable storage medium.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a modular division is merely a logical division, and other divisions may be realized in practice. For example, various elements or components may be combined or may be integrated into another device, or some features may be omitted, or not implemented.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (10)

1. A robot hand-eye calibration method is characterized in that a terminal tool and a camera are arranged at the tail end of a robot, and the hand-eye calibration method comprises the following steps:
acquiring a hand-eye matrix; the hand-eye matrix represents a transformation relationship of a camera coordinate system with respect to a tool coordinate system of the robot, the tool coordinate system being used to define a center position of the tip tool and a pose of the tip tool;
acquiring a calibration plate image acquired by the camera, selecting a plurality of first calibration points on the calibration plate image, and converting the first calibration points to a tool coordinate system of the robot by using the hand-eye matrix to obtain a first coordinate set;
controlling the end tool to contact a plurality of second calibration points on a calibration plate, and acquiring coordinates of the plurality of second calibration points under the tool coordinate system to acquire a second coordinate set, wherein the plurality of second calibration points and the plurality of first calibration points are a plurality of same position points on the calibration plate;
registering the first coordinate set and the second coordinate set by adopting an iterative nearest neighbor algorithm to obtain a compensation matrix;
and correcting the hand-eye matrix by adopting the compensation matrix to obtain a corrected hand-eye matrix.
2. The hand-eye calibration method according to claim 1, wherein the acquiring coordinates of the plurality of second calibration points in the tool coordinate system to obtain a second coordinate set comprises:
recording coordinates of the plurality of second calibration points in a base coordinate system of the robot;
and according to the pose of the camera when the calibration plate image is acquired, converting the coordinates of the second calibration points under the base coordinate system into the coordinates under the tool coordinate system to obtain a second coordinate set.
3. The hand-eye calibration method according to claim 2, wherein the converting coordinates of the second calibration points in the base coordinate system to the tool coordinate system according to the poses of the camera when acquiring the calibration plate image to obtain the second coordinate set comprises:
multiplying the coordinates of the second calibration points in the base coordinate system by pose data of the calibration plate image acquired by the camera;
and taking the result obtained by the multiplication operation as the second coordinate set.
4. The hand-eye calibration method according to claim 1, wherein the correcting the hand-eye matrix by using the compensation matrix to obtain a corrected hand-eye matrix comprises:
multiplying the compensation matrix and the hand-eye matrix;
and taking the result obtained by the multiplication operation as the corrected hand-eye matrix.
5. The hand-eye calibration method according to claim 1,
selecting a plurality of first calibration points on the calibration plate image comprises the following steps:
selecting at least three first calibration points on the calibration plate image, wherein the at least three first calibration points are not on the same straight line;
the controlling the end tool to contact a plurality of second calibration points on the calibration plate comprises:
controlling the end tool to contact at least three second calibration points on a calibration plate, the at least three second calibration points not being on the same line;
the at least three second calibration points correspond to the at least three first calibration points one to one.
6. The hand-eye calibration method according to claim 5,
the selecting a plurality of first calibration points on the calibration plate image comprises:
selecting four first calibration points on the calibration plate image;
the controlling the end tool to contact a plurality of second calibration points on the calibration plate comprises:
controlling the end tool to contact four second calibration points on the calibration plate.
7. The hand-eye calibration method according to claim 1, wherein before acquiring the calibration plate image acquired by the camera, the hand-eye calibration method further comprises:
controlling the robot to drive the camera to move above the calibration plate;
and controlling the camera to acquire the image of the calibration plate.
8. The hand-eye calibration method according to claim 1, wherein the acquiring a hand-eye matrix comprises:
controlling the robot to drive the camera to move above the calibration plate;
controlling the robot to have at least five different postures, acquiring calibration plate images acquired by the camera under the at least five different postures respectively, and recording pose data of the at least five different postures under a tool coordinate system;
and calculating to obtain the hand-eye matrix by adopting a Zhang Zhengyou calibration method.
9. A robot, comprising:
a robot body having a plurality of axes of motion;
a camera disposed at a distal end of the robot body;
a tip tool provided at a tip of the robot body; and
a controller connected to the robot body, the camera and the end tool for controlling the robot body, the camera and the end tool to perform the steps of the hand-eye calibration method as claimed in any one of claims 1 to 8.
10. A method of working a robot having a tip tool and a camera at a tip thereof, the method comprising:
performing robot hand-eye calibration by using the hand-eye calibration method according to any one of claims 1 to 8;
controlling the camera to acquire a working object image of the robot;
performing image processing on the operation object image to obtain an operation position;
and controlling the end tool to move to the working position for working.
CN202211058016.9A 2022-08-31 2022-08-31 Robot hand-eye calibration method, robot and robot operation method Pending CN115741666A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211058016.9A CN115741666A (en) 2022-08-31 2022-08-31 Robot hand-eye calibration method, robot and robot operation method


Publications (1)

Publication Number Publication Date
CN115741666A 2023-03-07

Family

ID=85349439

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211058016.9A Pending CN115741666A (en) 2022-08-31 2022-08-31 Robot hand-eye calibration method, robot and robot operation method

Country Status (1)

Country Link
CN (1) CN115741666A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117428777A (en) * 2023-11-28 2024-01-23 北华航天工业学院 Hand-eye calibration method of bag-removing robot

Citations (5)

Publication number Priority date Publication date Assignee Title
CN108818536A (en) * 2018-07-12 2018-11-16 武汉库柏特科技有限公司 A kind of online offset correction method and device of Robotic Hand-Eye Calibration
WO2020024178A1 (en) * 2018-08-01 2020-02-06 深圳配天智能技术研究院有限公司 Hand-eye calibration method and system, and computer storage medium
CN111644935A (en) * 2020-05-15 2020-09-11 江苏兰菱机电科技有限公司 Robot three-dimensional scanning measuring device and working method
CN112067337A (en) * 2020-09-21 2020-12-11 郑州轻工业大学 Rapid hand-eye calibration device and calibration method based on standard ball binocular robot
CN114519738A (en) * 2022-01-24 2022-05-20 西北工业大学宁波研究院 Hand-eye calibration error correction method based on ICP algorithm


Non-Patent Citations (1)

Title
Zheng Jianhong: "Research on recognition and localization of scattered metal parts for accurate robotic picking", China Master's Theses Full-text Database (Engineering Science and Technology I), 15 February 2021 (2021-02-15), pages 022-1310 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Country or region after: China

Address after: Room 201, Building A, No. 1, Qianwan Road, Qianhai-Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong Province, 518000

Applicant after: SHENZHEN QIANHAI RUIJI TECHNOLOGY CO.,LTD.

Applicant after: CIMC Container (Group) Co.,Ltd.

Applicant after: CHINA INTERNATIONAL MARINE CONTAINERS (GROUP) Ltd.

Address before: Room 201, Building A, No. 1, Qianwan Road, Qianhai-Shenzhen-Hong Kong Cooperation Zone, Shenzhen, Guangdong Province, 518000

Applicant before: SHENZHEN QIANHAI RUIJI TECHNOLOGY CO.,LTD.

Country or region before: China

Applicant before: CIMC CONTAINERS HOLDING Co.,Ltd.

Applicant before: CHINA INTERNATIONAL MARINE CONTAINERS (GROUP) Ltd.