CN114714356A - Method for accurately detecting calibration error of hand eye of industrial robot based on binocular vision - Google Patents


Info

Publication number
CN114714356A
CN114714356A (application CN202210391499.8A)
Authority
CN
China
Prior art keywords: camera, coordinate system, calibration, industrial robot, hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210391499.8A
Other languages
Chinese (zh)
Inventors
尹勇 (Yin Yong)
王家文 (Wang Jiawen)
黄铮 (Huang Zheng)
刘雪冬 (Liu Xuedong)
Current Assignee (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Chongqing Research Institute Of Wuhan University Of Technology
Original Assignee
Chongqing Research Institute Of Wuhan University Of Technology
Priority date (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Filing date
Publication date
Application filed by Chongqing Research Institute Of Wuhan University Of Technology
Priority to CN202210391499.8A
Publication of CN114714356A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
        • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
            • B25J 9/0081 Programme-controlled manipulators with master teach-in means
            • B25J 9/1692 Calibration of manipulator (programme controls characterised by the tasks executed)
            • B25J 9/1697 Vision controlled systems (perception control, multi-sensor controlled systems, sensor fusion)
    • G PHYSICS
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
            • G06T 5/80 Geometric correction (image enhancement or restoration)
            • G06T 7/13 Edge detection (image analysis; segmentation)
            • G06T 7/85 Stereo camera calibration (analysis of captured images to determine intrinsic or extrinsic camera parameters)
            • G06T 2207/10012 Stereo images (image acquisition modality)
            • G06T 2207/20164 Salient point detection; Corner detection (special algorithmic details)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a binocular-vision-based method for accurately detecting the hand-eye calibration error of an industrial robot, which comprises the following steps: acquiring the transformation matrix of a binocular camera relative to the end of the industrial robot by a hand-eye calibration method; fixing the binocular camera at the end of the industrial robot and photographing a universal calibration board with the left and right cameras for calibration, to obtain the left- and right-camera parameters; acquiring the coordinates of two corner points of the universal calibration board in the camera coordinate system with the binocular camera; and converting the coordinates of the two corner points into the industrial robot base coordinate system according to the hand-eye calibration result, calculating the distance between the two corner points in the base coordinate system, and comparing it with the actual distance between the two corner points on the universal calibration board to obtain the calibration error. The method, based on high-precision binocular vision, is convenient to operate, uses readily available materials, involves simple calculation, and can measure the hand-eye calibration error accurately to the millimetre level.

Description

Method for accurately detecting calibration error of hand eye of industrial robot based on binocular vision
Technical Field
The invention relates to the field of machine vision for industrial robots, and in particular to a binocular-vision-based method for accurately detecting the hand-eye calibration error of an industrial robot.
Background
With the rapid development of machine vision, society is rapidly entering an intelligent era: more and more manual work is being replaced by robots, and this trend is unstoppable. The industrial robot is the device most often paired with machine vision, and its "eye", the industrial camera, is indispensable. Accurate hand-eye calibration has therefore become an industrial challenge, and because the hand-eye calibration error is itself difficult to measure, assessing that error is harder still.
At present there are two hand-eye calibration configurations for an industrial robot, distinguished by where the camera is fixed: a camera mounted on the robot end is called eye-in-hand, and a camera mounted on an independent bracket is called eye-to-hand. Although the two configurations differ, the calibration steps are the same, so the error detection method proposed here applies to both. One existing method for detecting the hand-eye calibration error follows directly from the calibration principle: using the calibrated transformation between the camera coordinate system and the end coordinate system, the camera extrinsic data obtained by calibration are substituted back to derive the pose of the industrial robot, which is then compared with the pose reported by the teach pendant to yield the hand-eye calibration error. However, this method cannot fundamentally and accurately compute the error: it re-derives the initial data from the very calibration result that was computed from them, so although the difference between each group of initial pose data and the computed data gives some measure of the error, the approach as a whole lacks persuasive power.
Generally speaking, to detect the hand-eye calibration error, data computed from the calibration result must be compared against reference data based on known parameters, and the absolute or relative error of the two computed; a calibration error obtained this way has reference value and represents the error more intuitively. The accuracy of hand-eye calibration affects the realization of the industrial robot's subsequent vision functions and plays a decisive role in whether the robot can be applied in practice.
An effective solution to the problems in the related art has not been proposed yet.
Disclosure of Invention
In view of the problems in the related art, the invention provides a binocular-vision-based method for accurately detecting the hand-eye calibration error of an industrial robot, so as to overcome the above technical problems in the related art.
Therefore, the invention adopts the following specific technical scheme:
A binocular-vision-based method for accurately detecting the hand-eye calibration error of an industrial robot comprises the following steps:
S1, acquiring the transformation matrix of the binocular camera relative to the end of the industrial robot by an industrial robot hand-eye calibration method;
S2, fixing the binocular camera at the end of the industrial robot, and photographing the universal calibration board with the left and right cameras for calibration, to obtain the left- and right-camera parameters;
S3, acquiring the coordinates of two corner points of the universal calibration board in the camera coordinate system with the binocular camera;
S4, converting the coordinates of the two corner points into the industrial robot base coordinate system according to the hand-eye calibration result, calculating the distance between the two corner points in the base coordinate system, and comparing it with the actual distance between the two corner points on the universal calibration board to obtain the calibration error.
Further, acquiring the transformation matrix of the binocular camera relative to the end of the industrial robot by the industrial robot hand-eye calibration method comprises the following steps:
S11, photographing the universal calibration board at different poses of the industrial robot and recording the corresponding robot pose data from a preset teach pendant; then calibrating the camera with the captured photos, acquiring the rotation vector and translation vector of the calibration board relative to the camera in each picture, and converting them into an RT (rotation-translation) matrix;
S12, calculating the transformation matrix of the camera coordinate system relative to the industrial robot end coordinate system by combining the RT matrix with the transformation matrix of the calibration board relative to the camera.
Further, calculating the transformation matrix of the camera coordinate system relative to the industrial robot end coordinate system by combining the RT matrix with the transformation matrix of the calibration board relative to the camera comprises the following steps:
S121, obtaining the transformation from the end coordinate system to the base coordinate system from the preset teach pendant, denoted baseHtool; obtaining the transformation from the camera coordinate system to the universal calibration board coordinate system by camera calibration, denoted calHcam; and, according to the hand-eye calibration principle, obtaining the following formula:
baseHcal * calHcam = baseHtool * toolHcam
wherein baseHcal is the transformation matrix from the universal calibration board coordinate system to the base coordinate system, and is fixed;
toolHcam is the transformation matrix from the camera coordinate system to the industrial robot end coordinate system, and is the unknown to be solved;
S122, obtaining the following formulas from two sets of pose data and camera calibration data:
baseHcal * calHcam(1) = baseHtool(1) * toolHcam
baseHcal * calHcam(2) = baseHtool(2) * toolHcam.
Further, fixing the binocular camera at the end of the industrial robot and photographing the universal calibration board with the left and right cameras for calibration, to obtain the left- and right-camera parameters, comprises the following steps:
S21, calibrating the cameras with the captured calibration board photos to obtain the parameters of the left and right cameras;
S22, calculating the reprojection matrix Q from the parameters of the left and right cameras.
Further, the parameters of the left and right cameras include their intrinsic parameter matrices, distortion parameters, rotation matrix and translation matrix.
Further, acquiring the coordinates of two corner points of the universal calibration board in the camera coordinate system with the binocular camera comprises the following steps:
S31, fixing the universal calibration board within the field of view of the binocular camera, moving the industrial robot to photograph the calibration board, and acquiring a first group of target images P_left1; then moving the industrial robot to a different pose, photographing the universal calibration board again, and acquiring a second group of target images P_right1;
S32, performing distortion removal and baseline rectification on the first and second groups of target images according to the left- and right-camera parameters, to obtain the rectified first group of target images P_left2 and the rectified second group of target images P_right2;
S33, performing camera calibration on the rectified first and second groups of target images to obtain the corner-point coordinates of the universal calibration board in the pixel coordinate system of each image; then selecting one corner point of the universal calibration board from the rectified first group of target images as the first group of target points in the pixel coordinate system, and selecting another corner point of the universal calibration board from the rectified second group of target images as the second group of target points in the pixel coordinate system;
S34, converting the first and second groups of target points from the pixel coordinate system to the camera coordinate system through the reprojection matrix Q and the disparity value of each group of target points, obtaining the first target point (u_left1, v_left1) and the second target point (u_right1, v_right1) in the camera coordinate system.
Further, converting the coordinates of the two corner points into the industrial robot base coordinate system according to the hand-eye calibration result, calculating the distance between the two corner points in the base coordinate system, and comparing it with the actual distance between the two corner points on the universal calibration board to obtain the calibration error, comprises the following steps:
S41, converting the first and second target points from the camera coordinate system into the industrial robot base coordinate system according to the hand-eye calibration result, to obtain the first and second target points in the base coordinate system, and calculating the distance between them, denoted D_base;
S42, comparing the distance D_base between the first and second target points in the base coordinate system with the actual distance d between the two corner points of the universal calibration board, to obtain the absolute error |D_base - d| and the relative error |D_base - d| / d of the hand-eye calibration.
Further, the disparity value is calculated as follows:
disp = |u_left1 - u_right1|.
Further, the reprojection matrix Q is:

    Q = | 1    0    0          -c'_x1               |
        | 0    1    0          -c'_y1               |
        | 0    0    0           f'                  |
        | 0    0   -1/t'_x     (c'_x1 - c'_x2)/t'_x |

wherein c'_x1 represents the x-axis offset of the left camera optical axis from the projection plane coordinate center;
c'_y1 represents the y-axis offset of the left camera optical axis from the projection plane coordinate center;
f' represents the left camera focal length;
c'_x2 represents the x-axis offset of the right camera optical axis from the projection plane coordinate center;
t'_x represents the translation of the left and right cameras along the x-axis direction.
Further, the first and second target points in the camera coordinate system are converted into the industrial robot base coordinate system according to the hand-eye calibration result by the following formula:

    [X_base, Y_base, Z_base, 1]^T = baseHtool * toolHcam * [X_cam, Y_cam, Z_cam, 1]^T

wherein cam represents the camera coordinate system;
X_cam, Y_cam, Z_cam represent the x, y, z coordinates of the target point in the camera coordinate system;
base represents the base coordinate system;
X_base, Y_base, Z_base represent the x, y, z coordinates of the target point in the industrial robot base coordinate system.
The invention has the following beneficial effects: two different corner points on the universal calibration board are selected and converted into the industrial robot base coordinate system through the hand-eye calibration result; the distance between the two corner points is then computed and compared with the actual corner spacing on the calibration board to obtain the hand-eye calibration error.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly described below. The drawings in the following description represent only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from them without creative effort.
FIG. 1 is a flow chart of the binocular-vision-based method for accurately detecting the hand-eye calibration error of an industrial robot according to an embodiment of the invention;
FIG. 2 is a schematic structural diagram of hand-eye calibration in the method according to an embodiment of the invention;
FIG. 3 is a schematic diagram of the binocular calibration setup in the method according to an embodiment of the invention;
FIG. 4 is a schematic diagram of binocular-vision three-dimensional reconstruction in the method according to an embodiment of the invention.
Detailed Description
For further explanation of the various embodiments, drawings are provided which form a part of the disclosure. These drawings illustrate the embodiments and, together with the description, serve to explain their principle of operation and to enable others of ordinary skill in the art to understand the various embodiments and the advantages of the invention. The figures are not to scale, and like reference numerals generally refer to like elements.
According to an embodiment of the invention, a method for accurately detecting the hand-eye calibration error of an industrial robot based on binocular vision is provided.
The invention is further explained below with reference to the drawings and the detailed description. As shown in fig. 1, in an embodiment of the invention, the binocular-vision-based method for accurately detecting the hand-eye calibration error of an industrial robot, using a universal calibration board, comprises the following steps:
S1, acquiring the transformation matrix of the binocular camera relative to the end of the industrial robot by an industrial robot hand-eye calibration method;
S2, fixing the binocular camera at the end of the industrial robot, and photographing the universal calibration board with the left and right cameras for calibration, to obtain the left- and right-camera parameters;
S3, acquiring the coordinates of two corner points of the universal calibration board in the camera coordinate system with the binocular camera;
S4, converting the coordinates of the two corner points into the industrial robot base coordinate system according to the hand-eye calibration result, calculating the distance between the two corner points in the base coordinate system, and comparing it with the actual distance between the two corner points on the universal calibration board to obtain the calibration error.
In one embodiment, acquiring the transformation matrix of the binocular camera relative to the end of the industrial robot by the industrial robot hand-eye calibration method comprises the following steps:
S11, photographing the universal calibration board at different poses of the industrial robot and recording the corresponding robot pose data from a preset teach pendant; then calibrating the camera with the captured photos, acquiring the rotation vector and translation vector of the calibration board relative to the camera in each picture, and converting them into an RT (rotation-translation) matrix;
Specifically, in order to make the solved toolHcam more accurate and to reduce the error, preferably more than ten sets of data are collected.
S12, calculating the transformation matrix of the camera coordinate system relative to the industrial robot end coordinate system by combining the RT matrix with the transformation matrix of the calibration board relative to the camera.
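The packing of step S11's rotation vector and translation vector into a 4x4 RT matrix can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes the usual axis-angle (Rodrigues) convention for the rotation vector, and the helper names `rodrigues` and `to_rt` are invented here.

```python
import numpy as np

def rodrigues(rvec):
    """Rotation vector (axis * angle) -> 3x3 rotation matrix via the Rodrigues formula."""
    rvec = np.asarray(rvec, dtype=float)
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)                               # near-zero rotation
    k = rvec / theta                                   # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])                 # cross-product matrix of k
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def to_rt(rvec, tvec):
    """Pack a rotation vector and translation vector into a 4x4 homogeneous RT matrix."""
    H = np.eye(4)
    H[:3, :3] = rodrigues(rvec)
    H[:3, 3] = tvec
    return H
```

For example, `to_rt([0, 0, np.pi / 2], [1, 2, 3])` rotates the x axis onto the y axis and translates by (1, 2, 3).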
In one embodiment, calculating the transformation matrix of the camera coordinate system relative to the industrial robot end coordinate system by combining the RT matrix with the transformation matrix of the calibration board relative to the camera comprises the following steps:
S121, obtaining the transformation from the end coordinate system to the base coordinate system from the preset teach pendant, denoted baseHtool; obtaining the transformation from the camera coordinate system to the universal calibration board coordinate system by camera calibration, denoted calHcam; and, according to the hand-eye calibration principle, obtaining the following formula:
baseHcal * calHcam = baseHtool * toolHcam
wherein baseHcal is the transformation matrix from the universal calibration board coordinate system to the base coordinate system, and is fixed;
toolHcam is the transformation matrix from the camera coordinate system to the industrial robot end coordinate system, and is the unknown to be solved;
Specifically, as shown in fig. 2, base denotes the base coordinate system, tool the end coordinate system, cam the camera coordinate system, and cal the universal calibration board coordinate system. Accordingly, baseHtool denotes the transformation from the end coordinate system to the base coordinate system and can be read from the teach pendant of the robot system; as long as the relative position of the industrial robot base and the calibration board does not change, the matrix baseHcal does not change.
S122, obtaining the following formulas from two sets of pose data and camera calibration data:
baseHcal * calHcam(1) = baseHtool(1) * toolHcam
baseHcal * calHcam(2) = baseHtool(2) * toolHcam;
These equations have the classic hand-eye form AX = XB, and the optimal solution for toolHcam can be found by least squares.
In one embodiment, fixing the binocular camera at the end of the industrial robot and photographing the universal calibration board with the left and right cameras for calibration comprises the following steps:
S21, calibrating the cameras with the captured calibration board photos to obtain the parameters of the left and right cameras;
S22, calculating the reprojection matrix Q from the parameters of the left and right cameras;
Specifically, a binocular camera is used because binocular vision has an absolute advantage in three-dimensional reconstruction: with a binocular vision system, the three-dimensional coordinates of a spatial point within the field of view can be accurately recovered in the left camera coordinate system. As shown in fig. 3, the universal calibration board is fixed and photographed with the binocular vision system carried by the industrial robot. The purpose of binocular camera calibration is to acquire the intrinsic parameter matrices, distortion parameters, rotation matrix and translation matrix of the left and right cameras, which play an important role in the subsequent error detection; to ensure accuracy, preferably more than fifteen sets of photos are captured.
In one embodiment, the parameters of the left and right cameras include their intrinsic parameter matrices, distortion parameters, rotation matrix and translation matrix.
In one embodiment, acquiring the coordinates of the two corner points of the universal calibration board in the camera coordinate system with the binocular camera comprises the following steps:
S31, fixing the universal calibration board within the field of view of the binocular camera, moving the industrial robot to photograph the calibration board, and acquiring a first group of target images P_left1; then moving the industrial robot to a different pose, photographing the universal calibration board again, and acquiring a second group of target images P_right1;
S32, performing distortion removal and baseline rectification on the first and second groups of target images according to the left- and right-camera parameters, to obtain the rectified first group of target images P_left2 and the rectified second group of target images P_right2;
S33, performing camera calibration on the rectified first and second groups of target images to obtain the corner-point coordinates of the universal calibration board in the pixel coordinate system of each image; then selecting one corner point of the universal calibration board from the rectified first group of target images as the first group of target points in the pixel coordinate system, and selecting another corner point of the universal calibration board from the rectified second group of target images as the second group of target points in the pixel coordinate system;
S34, converting the first and second groups of target points from the pixel coordinate system to the camera coordinate system through the reprojection matrix Q and the disparity value of each group of target points, obtaining the first target point (u_left1, v_left1) and the second target point (u_right1, v_right1) in the camera coordinate system.
In one embodiment, converting the coordinates of the two corner points into the industrial robot base coordinate system according to the hand-eye calibration result, calculating the distance between the two corner points in the base coordinate system, and comparing it with the actual distance between the two corner points on the universal calibration board to obtain the calibration error, comprises the following steps:
S41, converting the first target point (X_c1, Y_c1, Z_c1) and the second target point (X_c2, Y_c2, Z_c2) from the camera coordinate system into the industrial robot base coordinate system according to the hand-eye calibration result, obtaining the first target point (X_b1, Y_b1, Z_b1) and the second target point (X_b2, Y_b2, Z_b2) in the base coordinate system, and calculating the distance between them, denoted D_base;
S42, comparing the distance D_base with the actual distance d between the two corner points of the universal calibration board, to obtain the absolute error |D_base - d| and the relative error |D_base - d| / d of the hand-eye calibration.
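Steps S41 and S42 reduce to one Euclidean distance and two error formulas. A minimal sketch, in which the function name and the sample coordinates (millimetres) are hypothetical:

```python
import numpy as np

def calibration_error(p1_base, p2_base, d_actual):
    """Distance between two corner points expressed in the robot base frame,
    compared against the known physical corner spacing d_actual on the board.
    Returns (measured distance, absolute error, relative error)."""
    d_base = float(np.linalg.norm(np.asarray(p1_base) - np.asarray(p2_base)))
    abs_err = abs(d_base - d_actual)
    return d_base, abs_err, abs_err / d_actual

# Example: two corners nominally 30 mm apart on the calibration board.
d_base, abs_err, rel_err = calibration_error([100.0, 50.0, 20.0],
                                             [100.0, 79.7, 20.0],
                                             30.0)
```

Here the measured base-frame distance is 29.7 mm against a true spacing of 30 mm, i.e. an absolute error of 0.3 mm and a relative error of 1%.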
As shown in fig. 4, in one embodiment, the disparity value is calculated as follows:
disp = |u_left1 - u_right1|.
The reprojection matrix Q is:

    Q = | 1    0    0          -c'_x1               |
        | 0    1    0          -c'_y1               |
        | 0    0    0           f'                  |
        | 0    0   -1/t'_x     (c'_x1 - c'_x2)/t'_x |

wherein c'_x1 represents the x-axis offset of the left camera optical axis from the projection plane coordinate center;
c'_y1 represents the y-axis offset of the left camera optical axis from the projection plane coordinate center;
f' represents the left camera focal length;
c'_x2 represents the x-axis offset of the right camera optical axis from the projection plane coordinate center;
t'_x represents the translation of the left and right cameras along the x-axis direction;
Specifically, the following can be obtained through the reprojection matrix Q:

    Q * [u, v, disp(u, v), 1]^T = [X', Y', Z', W']^T

wherein u and v represent the abscissa and ordinate of the target point in the pixel coordinate system;
disp(u, v) represents the disparity value of the target point;
W' represents the depth coefficient;
X', Y', Z' represent the x, y, z coordinates of the target point in the camera coordinate system before the depth has been resolved.
The final three-dimensional point coordinates are then obtained as:

    (X, Y, Z) = (X'/W', Y'/W', Z'/W')

wherein X, Y, Z represent the x, y, z coordinates of the target point in the camera coordinate system.
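The back-projection through Q can be sketched as follows, using an OpenCV-style Q layout consistent with the parameters listed above. Everything numeric here is invented for illustration (principal point 320/240, focal length 800 px, 60 mm baseline), and t'_x is taken negative following the OpenCV sign convention so that a positive disparity yields a positive depth.

```python
import numpy as np

def make_q(cx1, cy1, f, cx2, tx):
    """Reprojection matrix Q from rectified stereo parameters (OpenCV-style layout)."""
    return np.array([[1.0, 0.0, 0.0, -cx1],
                     [0.0, 1.0, 0.0, -cy1],
                     [0.0, 0.0, 0.0, f],
                     [0.0, 0.0, -1.0 / tx, (cx1 - cx2) / tx]])

def pixel_to_camera(Q, u, v, disp):
    """Back-project pixel (u, v) with disparity disp to 3D camera-frame coordinates."""
    X, Y, Z, W = Q @ np.array([u, v, disp, 1.0])   # homogeneous result [X', Y', Z', W']
    return np.array([X, Y, Z]) / W                 # divide by the depth coefficient

Q = make_q(cx1=320.0, cy1=240.0, f=800.0, cx2=320.0, tx=-60.0)
p = pixel_to_camera(Q, u=400.0, v=240.0, disp=40.0)
# a point 80 px right of the principal point at disparity 40 -> (120, 0, 1200) mm
```

The recovered depth agrees with the similar-triangles formula: Z = 800 * 60 / 40 = 1200 mm.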
That is, the world three-dimensional coordinates of the corner point selected from the first set of data can be obtained and recorded as (X_w1, Y_w1, Z_w1); similarly, the world three-dimensional coordinates of the corner point selected from the second set of data can be obtained as (X_w2, Y_w2, Z_w2). Because binocular vision generally takes the optical center of the left camera as the origin of the world coordinate system, the restored three-dimensional corner coordinates are already coordinates in the left camera coordinate system; the corner coordinates in the two image coordinate systems obtained after baseline rectification can therefore be converted into camera-frame coordinates, recorded as (X_c1, Y_c1, Z_c1) and (X_c2, Y_c2, Z_c2);
Specifically, in fig. 4, Left camera and Right camera respectively represent the Left and Right cameras;
f represents a camera focal length;
b represents the baseline distance of the two cameras;
p represents any point in a world coordinate system;
l represents an image width size;
PL represents the imaging point of point P under the left camera;
PR represents the imaging point of point P under the right camera;
xL represents the x-coordinate of point PL in the pixel coordinate system;
xR represents the x-coordinate of point PR in the pixel coordinate system;
x1 represents the distance on the x-axis between point PL and the left image center point;
x2 represents the distance on the x-axis between point PR and the right image center point;
z represents the z-coordinate of point P in the world coordinate system.
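The geometry of fig. 4 reduces to similar triangles: the depth of point P is z = f·b / (xL − xR), i.e., focal length times baseline divided by disparity. A minimal sketch, with assumed focal length, baseline, and pixel coordinates (none taken from the patent):

```python
def stereo_depth(f, b, x_l, x_r):
    """Depth of point P from the similar-triangle relation of fig. 4.

    f   : camera focal length (pixels)
    b   : baseline distance between the two cameras (metres)
    x_l : x-coordinate of imaging point PL in the left image (pixels)
    x_r : x-coordinate of imaging point PR in the right image (pixels)

    The disparity x_l - x_r shrinks as P moves away, so z = f * b / disparity.
    """
    disparity = x_l - x_r
    if disparity <= 0:
        raise ValueError("point must lie in front of both cameras")
    return f * b / disparity

# Assumed values: f = 800 px, baseline b = 0.1 m, disparity = 20 px.
z = stereo_depth(f=800.0, b=0.1, x_l=420.0, x_r=400.0)  # 4.0 m
```

Halving the disparity doubles the recovered depth, which is why calibration errors matter more for distant targets.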
In one embodiment, the conversion of the first target point in the camera coordinate system and the second target point in the camera coordinate system to the industrial robot base coordinate system through the hand-eye calibration result is calculated by the following formula:
[Xbase, Ybase, Zbase, 1]^T = baseHtool · toolHcam · [Xcam, Ycam, Zcam, 1]^T
wherein cam represents the camera coordinate system;
Xcam, Ycam, Zcam respectively represent the x, y, z coordinates of the target point in the camera coordinate system;
base represents the base coordinate system;
Xbase, Ybase, Zbase respectively represent the x, y, z coordinates of the target point in the industrial robot base coordinate system.
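The conversion above is a homogeneous-matrix product: the camera-frame point is lifted to [Xcam, Ycam, Zcam, 1]^T and multiplied by the robot-pose and hand-eye transforms. A sketch in pure Python, with hypothetical pure-translation transforms standing in for real calibration data:

```python
def mat4_mul(A, B):
    """4x4 homogeneous matrix product A @ B."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform_point(H, p):
    """Apply homogeneous transform H to a 3-D point p = (x, y, z)."""
    v = [p[0], p[1], p[2], 1.0]
    x, y, z, w = [sum(H[r][c] * v[c] for c in range(4)) for r in range(4)]
    return (x / w, y / w, z / w)

def translation(tx, ty, tz):
    """Homogeneous transform that is a pure translation (identity rotation)."""
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

# Hypothetical transforms standing in for real data: base_H_tool would come
# from the robot pose, tool_H_cam from the hand-eye calibration result.
base_H_tool = translation(0.5, 0.0, 0.8)
tool_H_cam = translation(0.0, 0.1, 0.05)
base_H_cam = mat4_mul(base_H_tool, tool_H_cam)

p_cam = (0.1, 0.2, 1.0)              # target point in the camera frame
p_base = transform_point(base_H_cam, p_cam)  # ≈ (0.6, 0.3, 1.85)
```

A real calibration result would of course carry a rotation block as well; the translation-only matrices here just keep the arithmetic easy to check by hand.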
In summary, according to the technical scheme of the invention, two different corner points on a universal calibration board are selected by a high-precision binocular vision based method and converted into the industrial robot base coordinate system through the hand-eye calibration result; the distance between the two corner points is then calculated in the base coordinate system and compared with the actual corner-point distance of the calibration board to obtain the hand-eye calibration error.
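The error check summarized above is a comparison of Euclidean distances. A minimal sketch (the corner coordinates and the nominal corner spacing are invented for illustration):

```python
import math

def calibration_error(p1_base, p2_base, d_actual):
    """Absolute and relative hand-eye calibration error.

    p1_base, p2_base : the two corner points expressed in the robot base frame
    d_actual         : the known physical distance between the two corners
                       on the calibration board
    """
    d_base = math.dist(p1_base, p2_base)   # Euclidean distance in base frame
    abs_err = abs(d_base - d_actual)
    return abs_err, abs_err / d_actual

# Invented example: corners recovered 30.2 mm apart, board spacing 30 mm.
abs_err, rel_err = calibration_error((0.0, 0.0, 0.0), (0.0302, 0.0, 0.0), 0.030)
# abs_err ≈ 0.2 mm, rel_err ≈ 0.67 %
```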
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. The method for accurately detecting the calibration error of the hand eye of the industrial robot based on binocular vision is characterized by comprising the following steps:
s1, acquiring a conversion relation matrix of the binocular camera relative to the tail end of the industrial robot through an industrial robot hand-eye calibration method;
s2, fixing the binocular camera at the tail end of the industrial robot, and using the left camera and the right camera to shoot the pictures of the general calibration plate for calibration to obtain the parameters of the left camera and the right camera;
s3, acquiring coordinates of two corner points of the universal calibration board in a camera coordinate system by using a binocular camera;
and S4, converting the coordinates of the two corner points into the industrial robot base coordinate system according to the hand-eye calibration result, calculating the distance between the two corner points in the base coordinate system, and comparing it with the actual distance between the two corner points of the universal calibration board to obtain the calibration error.
2. The binocular vision based industrial robot hand-eye calibration error accurate detection method according to claim 1, wherein the obtaining of the transformation relation matrix of the binocular camera relative to the end of the industrial robot through the industrial robot hand-eye calibration method comprises the following steps:
s11, shooting general calibration board photos at different poses of the industrial robot, recording corresponding pose data of the industrial robot from a preset demonstrator, then calibrating a camera by using the shot photos, acquiring a rotation vector and a translation vector of the calibration board relative to the camera in each picture, and converting the rotation vector and the translation vector into an RT rotation translation matrix;
and S12, calculating a conversion relation matrix of the camera coordinate system relative to the industrial robot terminal coordinate system by combining the RT rotation translation matrix and the conversion matrix of the calibration board relative to the camera.
3. The binocular vision based accurate detection method for industrial robot hand-eye calibration errors is characterized in that calculating the conversion relation matrix of the camera coordinate system relative to the industrial robot end coordinate system by combining the RT rotation-translation matrix with the conversion matrix of the calibration board relative to the camera comprises the following steps:
S121, obtaining the conversion relation from the end coordinate system to the base coordinate system through a preset demonstrator, recorded as baseHtool; obtaining the conversion relation from the camera coordinate system to the universal calibration board coordinate system through camera calibration, recorded as calHcam; and obtaining the following formula according to the hand-eye calibration principle:
baseHcal*calHcam = baseHtool*toolHcam
wherein baseHcal is the conversion relation matrix from the universal calibration board coordinate system to the base coordinate system, and is fixed;
toolHcam is the conversion relation matrix from the camera coordinate system to the industrial robot end coordinate system, and is to be solved;
S122, obtaining the following formulas through the two sets of pose data and the camera calibration data:
baseHcal*calHcam(1) = baseHtool(1)*toolHcam
baseHcal*calHcam(2) = baseHtool(2)*toolHcam.
4. the binocular vision based accurate detection method for calibration errors of the hand-eye calibration of the industrial robot according to claim 1, wherein the binocular camera is fixed at the tail end of the industrial robot, the left camera and the right camera are used for shooting photos of a universal calibration board for calibration, and the acquisition of parameters of the left camera and the right camera comprises the following steps:
s21, calibrating the camera through the shot calibration plate picture, and obtaining parameters of the left camera and the right camera;
and S22, calculating a reprojection matrix Q according to the parameters of the left camera and the right camera.
5. The binocular vision based accurate detection method for calibration errors of the hand and the eye of the industrial robot is characterized in that the parameters of the left camera and the right camera comprise an internal parameter matrix, distortion parameters, a rotation matrix and a translation matrix of the left camera and the right camera.
6. The binocular vision based accurate detection method for calibration errors of the hand and the eye of the industrial robot, as claimed in claim 5, wherein the using of the binocular camera to obtain the coordinates of the two corner points of the universal calibration board in the camera coordinate system comprises the following steps:
S31, fixing the universal calibration board within the field of view of the binocular camera, moving the industrial robot to shoot photos of the calibration board, and acquiring a first group of target images Pleft1; moving the industrial robot again to a different pose, shooting the universal calibration board, and acquiring a second group of target images Pright1;
S32, performing distortion removal and baseline correction on the first group of target images and the second group of target images according to the left and right camera parameters to obtain a corrected first group of target images Pleft2 and a corrected second group of target images Pright2;
S33, performing camera calibration on the corrected first group of target images and the corrected second group of target images to obtain the pixel-coordinate-system corner coordinates of the universal calibration board in the images, then selecting one corner point of the universal calibration board from the corrected first group of target images as a first group of target points in the pixel coordinate system, and selecting the other corner point of the universal calibration board from the corrected second group of target images as a second group of target points in the pixel coordinate system;
S34, converting the first group of target points and the second group of target points in the pixel coordinate system to the camera coordinate system through the reprojection matrix Q and the disparity value of each group of target points, obtaining a first target point (uleft1, vleft1) and a second target point (uright1, vright1) in the camera coordinate system.
7. The binocular vision based accurate detection method for calibration errors of the hand-eye of the industrial robot according to claim 6, wherein the step of converting coordinates of two corner points into a basic coordinate system of the industrial robot according to the calibration result of the hand-eye, calculating a distance between the two corner points in the basic coordinate system, and comparing the distance with an actual distance between the two corner points of a universal calibration board to obtain a calculation error comprises the following steps:
S41, converting the first target point in the camera coordinate system and the second target point in the camera coordinate system into the industrial robot base coordinate system according to the hand-eye calibration result to obtain a first target point and a second target point in the base coordinate system, and calculating the distance between the first target point and the second target point in the base coordinate system, recorded as Dbase;
S42, comparing the calculated distance Dbase with the actual distance d between the two corner points of the universal calibration board to obtain the absolute hand-eye calibration error |Dbase - d| and the relative error |Dbase - d|/d.
8. The binocular vision based accurate detection method for calibration errors of the hand eyes of the industrial robot is characterized in that the calculation formula of the parallax value is as follows:
disp = |uleft1 - uright1|.
9. the binocular vision based accurate detection method for calibration errors of the hand eye of the industrial robot, according to claim 1, wherein the reprojection matrix Q is:
Q = [ 1    0      0          -c'x1
      0    1      0          -c'y1
      0    0      0           f'
      0    0    -1/t'x   (c'x1 - c'x2)/t'x ]
wherein c'x1 represents the x-axis offset of the left camera optical axis from the projection plane coordinate center;
c'y1 represents the y-axis offset of the left camera optical axis from the projection plane coordinate center;
f' represents the left camera focal length;
c'x2 represents the x-axis offset of the right camera optical axis from the projection plane coordinate center;
t'x represents the translation amount of the left and right cameras in the x-axis direction.
10. The binocular vision based accurate detection method for calibration errors of the hand-eye of the industrial robot according to claim 9, wherein the calculation formula for converting the first target point in the camera coordinate system and the second target point in the camera coordinate system to the industrial robot base coordinate system according to the calibration result of the hand-eye is as follows:
[Xbase, Ybase, Zbase, 1]^T = baseHtool · toolHcam · [Xcam, Ycam, Zcam, 1]^T
wherein cam represents the camera coordinate system;
Xcam, Ycam, Zcam respectively represent the x, y, z coordinates of the target point in the camera coordinate system;
base represents the base coordinate system;
Xbase, Ybase, Zbase respectively represent the x, y, z coordinates of the target point in the industrial robot base coordinate system.
CN202210391499.8A 2022-04-14 2022-04-14 Method for accurately detecting calibration error of hand eye of industrial robot based on binocular vision Withdrawn CN114714356A (en)


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115070779A (en) * 2022-08-22 2022-09-20 菲特(天津)检测技术有限公司 Robot grabbing control method and system and electronic equipment
CN115383749A (en) * 2022-10-25 2022-11-25 国网瑞嘉(天津)智能机器人有限公司 Calibration method and device for live working equipment, controller and storage medium
CN115990889A (en) * 2023-03-23 2023-04-21 武汉益模科技股份有限公司 Multi-receptive field-based composite positioning method, device and computer system
CN116038720A (en) * 2023-04-03 2023-05-02 广东工业大学 Hand-eye calibration method, device and equipment based on point cloud registration
CN116160454A (en) * 2023-03-28 2023-05-26 重庆智能机器人研究院 Robot tail end plane vision hand-eye calibration algorithm model
CN116902586A (en) * 2023-08-03 2023-10-20 智威士(天津)技术有限公司 Grabbing adjustment method, device, equipment and medium based on vision camera
WO2024027647A1 (en) * 2022-08-02 2024-02-08 深圳微美机器人有限公司 Robot control method and system and computer program product
CN118096894A (en) * 2024-02-06 2024-05-28 宽瑞智能科技(苏州)有限公司 Single-camera calibration method and device for surgical robot



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20220708