CN106767393B - Hand-eye calibration device and method for robot

Hand-eye calibration device and method for robot

Info

Publication number
CN106767393B
CN106767393B
Authority
CN
China
Prior art keywords
robot
coordinate
coordinate system
points
camera
Prior art date
Legal status
Active
Application number
CN201510812064.6A
Other languages
Chinese (zh)
Other versions
CN106767393A (en)
Inventor
张涛
李邦宇
姜楠
张强
陈亮
董状
Current Assignee
Shenyang Siasun Robot and Automation Co Ltd
Original Assignee
Shenyang Siasun Robot and Automation Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenyang Siasun Robot and Automation Co Ltd filed Critical Shenyang Siasun Robot and Automation Co Ltd
Priority to CN201510812064.6A
Publication of CN106767393A
Application granted
Publication of CN106767393B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/0095: Means or methods for testing manipulators
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a hand-eye calibration device and method for a robot, belonging to the technical field of industrial robot calibration. The hand-eye calibration method is applied to a six-axis industrial robot and comprises the following steps: determining feature points for calibrating camera parameters and coordinate points for calibrating the robot coordinate system; identifying the coordinate values of the feature points and coordinate points in the camera coordinate system with the robot in a first posture; identifying the coordinate values of the feature points and coordinate points in the camera coordinate system with the robot in a second posture; and calculating a transformation relation matrix T from the camera coordinate system to the robot tool coordinate system according to the coordinate values identified twice and the coordinate values of the feature points and coordinate points in the robot base coordinate system. The invention only requires the posture of the six-axis industrial robot to be changed twice and four feature points to be collected on the image each time; the transformation relation matrix T from the camera coordinate system to the robot tool coordinate system can then be obtained by calculation, thereby simplifying the calibration process and reducing the calculation difficulty.

Description

Hand-eye calibration device and method for robot
[ technical field ]
The invention relates to the technical field of calibration of industrial robots, in particular to a hand-eye calibration device and method of a robot.
[ background of the invention ]
At present, an industrial robot with an EYE-IN-HAND configuration (camera mounted on the arm) generally adopts the following hand-eye calibration mode: the robot carries the camera on its arm through several posture changes and photographs the same fixed calibration plate in the camera field of view. The position of the calibration plate in the robot world coordinate system remains unchanged. The robot visually identifies the feature points on the calibration plate and, through image processing, obtains the values of the feature points in the camera coordinate system. Through a series of related calculations, a transformation relation matrix T between the camera coordinate system of the industrial camera and the tool coordinate system of the industrial robot can be obtained. With this relation matrix T, points identified in the image can be transformed into points in the base coordinate system of the industrial robot, and the robot guides the corresponding gripper to the target grasping point according to the point identified in the image.
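As a minimal illustration of how such a matrix T is used once calibration is done (the intrinsic matrix K, the depth value and every numeric value below are placeholders for illustration, not values from the patent):

```python
import numpy as np

# Illustrative (hypothetical) calibration results; the numbers only show how the matrices chain.
K = np.array([[1200.0,    0.0, 640.0],      # camera intrinsic matrix (pixels)
              [   0.0, 1200.0, 480.0],
              [   0.0,    0.0,   1.0]])
T_cam_to_tool = np.eye(4)                    # hand-eye matrix T (camera frame -> tool frame)
T_cam_to_tool[:3, 3] = [0.05, 0.00, 0.10]    # e.g. camera offset from the flange, metres
T_tool_to_base = np.eye(4)                   # robot pose T6 (tool frame -> base frame), from the controller
T_tool_to_base[:3, 3] = [0.60, 0.20, 0.40]

# A pixel identified in the image, with a known depth (e.g. from the calibration-plate plane);
# back-project it into the camera frame first.
u, v, depth = 700.0, 500.0, 0.80             # pixel coordinates and depth in metres
p_cam = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])

# Chain of transforms: camera frame -> tool frame -> robot base frame.
p_cam_h = np.append(p_cam, 1.0)              # homogeneous coordinates
p_base = T_tool_to_base @ T_cam_to_tool @ p_cam_h
print("grasp point in robot base frame:", p_base[:3])
```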
The iRVision vision system of FANUC Corporation of Japan already provides an automatic coordinate-system teaching mode, in which the robot can automatically detect and calculate the calibration coordinate system.
However, existing intelligent robots need too many feature points to be collected or identified in the process of completing hand-eye calibration, which makes the calculation very complicated, and the existing technology usually needs nine posture changes to complete the hand-eye calibration of an industrial robot, which makes the calibration procedure cumbersome.
[ summary of the invention ]
The invention aims to provide a hand-eye calibration device and method for a robot so as to reduce the difficulty of calculating the transformation relation from the camera coordinate system to the robot coordinate system.
In order to solve the above technical problems, the present invention provides the following technical solutions.
In one aspect, the present invention provides a hand-eye calibration method for a robot, the method being applied to a six-axis industrial robot, the method including:
determining feature points for calibrating camera parameters and coordinate points for calibrating the base coordinate system of the robot;
identifying the coordinate values of the feature points and the coordinate points in the camera coordinate system with the robot in a first posture;
identifying the coordinate values of the feature points and the coordinate points in the camera coordinate system with the robot in a second posture;
and calculating the transformation relation from the camera coordinate system to the robot tool coordinate system according to the coordinate values identified twice and the coordinate values of the feature points and the coordinate points in the robot base coordinate system.
Further, the method further comprises:
acquiring a target coordinate value under a camera coordinate system;
and calculating the coordinates of the target coordinate values in the robot tool coordinate system through the transformation relation.
Further, a transformation relation from the camera coordinate system to the robot tool coordinate system is calculated by the following formula:
T6 × T × P1 = T61 × T × P2
where T represents the transformation relation matrix to be calculated from the camera coordinate system to the robot tool coordinate system, T6 represents the pose value of the robot in the first posture, P1 represents the coordinate value of the coordinate point in the camera coordinate system with the robot in the first posture, T61 represents the pose value of the robot in the second posture, and P2 represents the coordinate value of the coordinate point in the camera coordinate system with the robot in the second posture.
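A minimal numeric sketch of this constraint (all matrices and the plate point below are arbitrary illustrative values): both sides map the same fixed calibration-plate point into the robot base frame, so they must agree for the correct T.

```python
import numpy as np

def rot_z(a):
    """Rotation about Z by angle a (radians), as a 4x4 homogeneous matrix."""
    c, s = np.cos(a), np.sin(a)
    M = np.eye(4)
    M[:2, :2] = [[c, -s], [s, c]]
    return M

def trans(x, y, z):
    M = np.eye(4)
    M[:3, 3] = [x, y, z]
    return M

# Assumed ground-truth hand-eye matrix T (camera -> tool) and two robot poses
# T6, T61 (tool -> base); the numbers are arbitrary, for illustration only.
T   = trans(0.05, 0.02, 0.10) @ rot_z(0.1)
T6  = trans(0.60, 0.10, 0.50) @ rot_z(0.4)
T61 = trans(0.55, 0.25, 0.45) @ rot_z(-0.3)

# One fixed point on the calibration plate, expressed in the robot base frame.
P_base = np.array([0.70, 0.30, 0.00, 1.0])

# Its coordinates in the camera frame at each posture.
P1 = np.linalg.inv(T6  @ T) @ P_base
P2 = np.linalg.inv(T61 @ T) @ P_base

# The constraint used above: both sides reproduce the same base-frame point.
assert np.allclose(T6 @ T @ P1, T61 @ T @ P2)
assert np.allclose(T6 @ T @ P1, P_base)
print("T6*T*P1 == T61*T*P2 holds for the fixed calibration point")
```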
Further, the method further comprises:
and acquiring coordinate values of the feature points by carrying out image recognition on the images at the feature points, and acquiring the coordinate values of the coordinate points by carrying out image recognition on the images at the coordinate points.
Further, the number of the feature points is four.
Further, the characteristic points and the coordinate points are arranged on the hand-eye calibration plate in a checkerboard pattern.
In another aspect, the present invention provides a hand-eye calibration device for a robot, the device including:
the point determination module is used for determining characteristic points for calibrating camera parameters and coordinate points for calibrating a robot coordinate system;
the first recognition module is used for recognizing the coordinate values of the feature points and the coordinate points in the camera coordinate system with the robot in the first posture;
the second recognition module is used for recognizing the coordinate values of the feature points and the coordinate points in the camera coordinate system with the robot in the second posture;
and the relation calculation module is used for calculating a transformation relation from the camera coordinate system to the robot coordinate system according to the coordinate values identified twice and the coordinate values of the feature points and the coordinate points in the robot coordinate system.
Further, the apparatus further comprises:
the target acquisition module is used for acquiring a target coordinate value in a camera coordinate system;
and the coordinate calculation module is used for calculating the coordinates of the target coordinate values in the robot tool coordinate system through the transformation relation.
Further, the relation calculation module calculates a transformation relation from the camera coordinate system to the robot coordinate system by the following formula:
T6 × T × P1 = T61 × T × P2
where T represents the transformation relation matrix to be calculated from the camera coordinate system to the robot tool coordinate system, T6 represents the pose value of the robot in the first posture, P1 represents the coordinate value of the coordinate point in the camera coordinate system with the robot in the first posture, T61 represents the pose value of the robot in the second posture, and P2 represents the coordinate value of the coordinate point in the camera coordinate system with the robot in the second posture.
Further, the apparatus further comprises:
and the image recognition unit is used for acquiring the coordinate values of the characteristic points by carrying out image recognition on the images positioned at the characteristic points and acquiring the coordinate values of the coordinate points by carrying out image recognition on the images positioned at the coordinate points.
The method has the advantages that the posture of the six-axis industrial robot is only required to be changed twice, the four characteristic points on the image are collected twice, and the transformation relation from the camera coordinate system to the robot coordinate system can be obtained through calculation, so that the calibration process is simplified, and the calculation difficulty is reduced.
[ description of the drawings ]
Fig. 1 is a flowchart of a hand-eye calibration method of a robot according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of hand-eye calibration of a robot according to another embodiment of the present invention;
FIG. 3 is a schematic diagram of two pose transformations of a robot according to one embodiment of the present invention;
FIG. 4 is a schematic front view of a hand-eye calibration plate applied to an embodiment of the present invention;
FIG. 5 is a schematic side view of a hand-eye calibration plate applied to an embodiment of the present invention;
fig. 6 is a block diagram illustrating an exemplary configuration of a hand-eye calibration apparatus of a robot according to an embodiment of the present invention;
fig. 7 is a block diagram illustrating an exemplary configuration of a hand-eye calibration apparatus of a robot according to an embodiment of the present invention.
In the drawings, the reference numerals denote the following components:
1-1: first through hole; 1-2: second through hole; 2: threaded hole; 3: positioning groove; 4: hand-eye calibration plate.
[ detailed description of the embodiments ]
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example 1
Fig. 1 is a flowchart of a hand-eye calibration method for a robot according to an embodiment of the present invention. The method, described below with reference to Fig. 1, is applied to a six-axis industrial robot and, as shown in Fig. 1, includes the following steps:
s100, determining characteristic points for calibrating camera parameters and coordinate points for calibrating a robot coordinate system;
s200, identifying the characteristic point and the coordinate point of the robot in the first posture to obtain a coordinate value in a camera coordinate system;
s300, recognizing the feature points and the coordinate points of the robot in the second posture to obtain coordinate values in a camera coordinate system;
and S400, calculating a transformation relation from the camera coordinate system to the robot tool coordinate system according to the coordinate values of the two identifications, the coordinate values of the feature points and the coordinate points in the camera coordinate system and the attitude values of the two industrial robots. The attitude value refers to a coordinate value of the robot in the tool coordinate system in the first attitude or the second attitude.
Fig. 3 is a schematic diagram of the two pose transformations of the robot according to an embodiment of the present invention.
Preferably, the coordinate values of the feature points are acquired by performing image recognition on the images at the feature points, and the coordinate values of the coordinate points are acquired by performing image recognition on the images at the coordinate points.
The number of feature points is four. The feature points and the coordinate points are arranged on the hand-eye calibration plate in a checkerboard pattern: A, B, C and O serve as the feature points used to calculate the transformation matrix T between the coordinate system of the EYE-IN-HAND industrial robot and the coordinate system of the industrial camera; the 88 black circles in 8 rows and 11 columns on the image are used to calibrate the intrinsic and extrinsic parameters of the industrial camera; and the annular mark with five small white dots on the black circles is used to calibrate the tool coordinate system of the industrial robot.
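As an illustrative sketch of how such a circle grid could be located in the image (OpenCV is an assumption here; the patent does not name an implementation, and the file name and grid ordering of 11 columns by 8 rows are ours):

```python
import cv2

img = cv2.imread("calibration_plate.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file name
pattern_size = (11, 8)   # points per row, points per column of the black-circle grid
found, centers = cv2.findCirclesGrid(img, pattern_size,
                                     flags=cv2.CALIB_CB_SYMMETRIC_GRID)
if found:
    # centers is an (88, 1, 2) array of sub-pixel circle centres in image coordinates
    print("detected", len(centers), "circle centres")
else:
    print("circle grid not found")
```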
Fig. 4 is a schematic front view of the structure of a hand-eye calibration plate 4 applied to an embodiment of the present invention, and Fig. 5 is a schematic side view of the structure of the hand-eye calibration plate. As shown in Fig. 4 and Fig. 5, the hand-eye calibration plate includes holes provided on its front face and holes provided on its side. The patterns on the front face comprise four annular patterns located in the middle of the plate for calibrating the feature points, and annular patterns arranged at the corners of the plate for calibrating the coordinate system.
The diameter of the circular pattern used for calibrating the feature points is larger than the outer-circle diameter of the other circular or annular patterns; the outer-circle diameter of the circular pattern used for calibrating the feature points is 11-13 mm, with 12 mm being optimal. The outer-circle diameter of the annular pattern used for calibrating the coordinate system is 8-10 mm, with 9 mm being optimal. Further, the inner-circle diameter of the annular pattern may be, for example, 2 mm. Optionally, the hand-eye calibration plate may further include through holes for fixing the plate, namely a first through hole 1-1 and a second through hole 1-2.
As shown in Fig. 4 and Fig. 5, the holes for fixing the hand-eye calibration plate include the first through hole 1-1 and the second through hole 1-2 on the front of the plate, and a threaded hole 2 and a positioning groove 3 on the side of the plate. As shown in Fig. 5, two threaded holes 2 and two positioning grooves 3 are arranged opposite each other.
In this example, the EYE-IN-HAND industrial robot changes its posture twice and photographs the calibration plate bearing the four large circles (A, B, C, O) at each posture. The coordinate values of the coordinate points and of the industrial-camera feature points in the camera coordinate system are calculated for each of the two photographing positions. The transformation matrix T between the robot tool coordinate system and the industrial camera coordinate system is then calculated by the algorithm provided in this patent. The photographing point of the industrial robot is then reset, the central grasping point of the workpiece is acquired in the camera coordinate system by a visual recognition algorithm, and this point is transformed into the industrial robot base coordinate system by the coordinate-system transformation. The robot guides the corresponding gripper to the specified point according to the value of the workpiece center point in the robot base coordinate system.
When calculating the transformation relation matrix T from the camera coordinate system to the robot coordinate system, the intrinsic parameters of the camera are calibrated first. The extrinsic parameters of the camera are then calibrated for the current posture of the industrial robot, and the values of the origin point on the calibration plate and of the other three feature points in the camera coordinate system are identified. The robot then changes to the other posture, and the corresponding extrinsic parameters, the value of the origin point and the values of the other three feature points in the camera coordinate system are calibrated and identified again. Finally, the transformation matrix T is obtained from the two attitude values of the robot together with the coordinate values of the four feature points in the camera coordinate system at each of the two postures.
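As an illustrative sketch of the per-posture extrinsic step, assuming OpenCV and that the intrinsics (K, dist) were already estimated, e.g. with cv2.calibrateCamera (the function name and array shapes below are ours, not the patent's):

```python
import cv2

# object_points: known plate coordinates of the circles (plate frame, metres), shape (N, 3)
# image_points:  detected circle centres from the previous step, shape (N, 1, 2)
def plate_points_in_camera_frame(object_points, image_points, K, dist):
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
    if not ok:
        raise RuntimeError("extrinsic calibration failed")
    R, _ = cv2.Rodrigues(rvec)               # rotation plate frame -> camera frame
    # Coordinates of every plate point expressed in the camera coordinate system:
    return (R @ object_points.T + tvec).T    # shape (N, 3)

# The origin O and the A, B, C feature points for this posture are then read off
# these camera-frame coordinates.
```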
Calibration with a hand-eye calibration plate that does not require touching the checkerboard with a calibration needle reduces the human error introduced by manual participation in calibration and shortens the operating time of the calibration method; it overcomes the low accuracy and complicated operation of needle-touch checkerboard hand-eye calibration, improves the intelligence of the vision-guided industrial robot, and lays a foundation for industrial robot designers to research visual displacement and visual control. With this method, the posture of the industrial robot only needs to be changed twice and the feature points on the image are collected twice; the transformation relation from the camera coordinate system to the robot tool coordinate system can then be obtained by calculation, which simplifies the calibration process.
Example two
Fig. 2 is a flowchart of a hand-eye calibration method of a robot according to another embodiment of the present invention. As shown in Fig. 2, in addition to steps S100, S200, S300 and S400, the hand-eye calibration method further includes:
s500, acquiring a target coordinate value in a camera coordinate system;
and S600, calculating the coordinates of the target coordinate values in the robot coordinate system through the transformation relation.
The transformation relation from the camera coordinate system to the robot coordinate system is specifically calculated by the following formula:
T6 × T × P1 = T61 × T × P2   (Equation 6-1)
where T represents the transformation relation matrix to be calculated from the camera coordinate system to the robot tool coordinate system, T6 represents the pose value of the robot in the first posture, P1 represents the coordinate value of the coordinate point in the camera coordinate system with the robot in the first posture, T61 represents the pose value of the robot in the second posture, and P2 represents the coordinate value of the coordinate point in the camera coordinate system with the robot in the second posture.
When the robot is in the first posture, the position of the origin O of the checkerboard in the camera coordinate system is P1, the position of the lower-left-corner annular pattern is P2, and the position of the lower-right-corner annular pattern is P3. When the robot is in the second posture, the position of the origin O of the checkerboard in the camera coordinate system is P4, the position of the lower-left-corner annular pattern is P5, and the position of the lower-right-corner annular pattern is P6. With these known quantities, the corresponding relations can be written down (they appear as equation images in the original publication and are not reproduced here).
the method is obtained by a representation method of an industrial robot kinematics X-Y-Z fixed angular coordinate system:
in this case, T with the equation can be:
Figure BDA0000852722600000071
wherein, the alpha, the beta, the gamma, Ptx,Pty,PtzSix unknowns are to be sought.
Figure BDA0000852722600000072
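For reference, the standard X-Y-Z fixed-angle (roll-pitch-yaw) homogeneous transform is shown below; this is presumably equivalent to the matrix in the original equation image, since its entries match the trigonometric products appearing in Equations 6-1 to 6-3 and the solution variables a1 to a9 used later:

```latex
T =
\begin{bmatrix}
c\alpha\, c\beta & c\alpha\, s\beta\, s\gamma - s\alpha\, c\gamma & c\alpha\, s\beta\, c\gamma + s\alpha\, s\gamma & P_{tx}\\
s\alpha\, c\beta & s\alpha\, s\beta\, s\gamma + c\alpha\, c\gamma & s\alpha\, s\beta\, c\gamma - c\alpha\, s\gamma & P_{ty}\\
-s\beta          & c\beta\, s\gamma                               & c\beta\, c\gamma                               & P_{tz}\\
0 & 0 & 0 & 1
\end{bmatrix},
\qquad
c\alpha = \cos\alpha,\ s\alpha = \sin\alpha,\ \text{and similarly for } \beta,\ \gamma .
```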
From Equation 6-1, left-multiplying both sides by T6⁻¹ gives
T6⁻¹ · T6 · T · P1 = T6⁻¹ · T61 · T · P3,
that is, T · P1 = (T6⁻¹ · T61) · T · P3. The product T6⁻¹ · T61 is a known matrix (shown as an equation image in the original publication), so its entries r11, r12, r13, r14, r21, r22, r23, r24, r31, r32, r33 and r34 are calculated constants.
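A minimal sketch of obtaining these constants (T6 and T61 would come from the robot controller, e.g. via pose_to_matrix() above; the helper name is ours):

```python
import numpy as np

def relative_pose_constants(T6, T61):
    """Return the known matrix T6^-1 * T61 and its entries r11 .. r34."""
    R_rel = np.linalg.inv(T6) @ T61          # 4x4, last row is [0, 0, 0, 1]
    # r11 .. r34 in the equations are the entries of the top three rows:
    (r11, r12, r13, r14), (r21, r22, r23, r24), (r31, r32, r33, r34) = R_rel[:3]
    return R_rel, (r11, r12, r13, r14, r21, r22, r23, r24, r31, r32, r33, r34)
```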
The robot moves between the two postures and the industrial camera is calibrated at each, so the same physical feature point has two different coordinate values in the camera coordinate system; expanding the relation above yields the following three equations (where cα denotes cos α, sα denotes sin α, and similarly for β and γ):
(r11·P3z - P1z)·cα·sβ·cγ + (r11·P3y - P1y)·cα·sβ·sγ + r12·P3z·sα·sβ·cγ + r12·P3y·sα·sβ·sγ + (r11·P3x - P1x)·cα·cβ + r12·P3y·cα·cγ - r12·P3z·cα·sγ + r12·P3x·sα·cβ + (P1y - r11·P3y)·sα·cγ + (r11·P3z - P1z)·sα·sγ + r13·P3z·cβ·cγ + r13·P3y·cβ·sγ - r13·P3x·sβ + (r11 - 1)·Ptx + r12·Pty + r13·Ptz + r14 = 0   (Equation 6-1)
r21·P3z·cα·sβ·cγ + r21·P3y·cα·sβ·sγ + (r22·P3z - P1z)·sα·sβ·cγ + (r22·P3y - P1y)·sα·sβ·sγ + r21·P3x·cα·cβ + (r22·P3y - P1y)·cα·cγ + (P1z - r22·P3z)·cα·sγ + (r22·P3x - P1x)·sα·cβ - r21·P3y·sα·cγ + r21·P3z·sα·sγ + r23·P3z·cβ·cγ + r23·P3y·cβ·sγ - r23·P3x·sβ + r21·Ptx + (r22 - 1)·Pty + r23·Ptz + r24 = 0   (Equation 6-2)
r31·P3z·cα·sβ·cγ + r31·P3y·cα·sβ·sγ + r32·P3z·sα·sβ·cγ + r32·P3y·sα·sβ·sγ + r31·P3x·cα·cβ + r32·P3y·cα·cγ - r32·P3z·cα·sγ + r32·P3x·sα·cβ - r31·P3y·sα·cγ + r31·P3z·sα·sγ + (r33·P3z - P1z)·cβ·cγ + (r33·P3y - P1y)·cβ·sγ + (P1x - r33·P3x)·sβ + r31·Ptx + r32·Pty + (r33 - 1)·Ptz + r34 = 0   (Equation 6-3)
Each feature point on the hand-eye calibration plate, observed at the two robot postures, thus gives two coordinate values in the camera coordinate system and yields the three equations above. In this general form, each product of trigonometric functions can be regarded as a separate unknown and Ptx, Pty, Ptz are also unknowns, giving a linear system in sixteen unknowns that would require at least six feature points to solve.
Let x = sin β, y = cos α·cos γ, z = cos α·sin γ, w = sin α·cos γ, t = sin α·sin γ, p = cos α·cos β, q = sin α·cos β, u = cos β·cos γ and v = cos β·sin γ. With Ptx, Pty and Ptz also treated as unknowns, the general equation can be transformed into:
k1,1·xy + k1,2·xz + k1,3·xw + k1,4·xt + k1,5·p + k1,6·y + k1,7·z + k1,8·q + k1,9·w + k1,10·t + k1,11·u + k1,12·v + k1,13·x + k1,14·Ptx + k1,15·Pty + k1,16·Ptz + k1,17 = 0   (Equation 6-4)
This is an equation of degree two in twelve unknowns, in which every coefficient k is a calculated constant. This reduces the number of feature points required from six to four, since each feature point yields the three equations (Equation 6-1), (Equation 6-2) and (Equation 6-3).
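As a quick sanity check of this substitution (arbitrary test angles; not part of the patent), the four bilinear terms in Equation 6-4 do reproduce the corresponding trigonometric products, while the remaining nine products are the substitution variables themselves:

```python
import numpy as np

alpha, beta, gamma = 0.3, -0.7, 1.1          # arbitrary test angles
ca, sa = np.cos(alpha), np.sin(alpha)
cb, sb = np.cos(beta),  np.sin(beta)
cg, sg = np.cos(gamma), np.sin(gamma)

# substitution variables from the text
x, y, z, w, t = sb, ca*cg, ca*sg, sa*cg, sa*sg
p, q, u, v    = ca*cb, sa*cb, cb*cg, cb*sg

# the four bilinear terms of Equation 6-4 equal the trig products of Equations 6-1 .. 6-3
assert np.isclose(ca*sb*cg, x*y)
assert np.isclose(ca*sb*sg, x*z)
assert np.isclose(sa*sb*cg, x*w)
assert np.isclose(sa*sb*sg, x*t)
print("substitution check passed")
```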
The product terms xy, xz, xw and xt are then eliminated by the addition-and-subtraction elimination method:
Reducing (Equation 6-4) in this way results in the general form (Equation 6-5), and a total of 12 equations of the form (Equation 6-5) can be obtained.
(o1,2·o2,1 - o2,2·o1,1)·p + (o1,3·o2,1 - o2,3·o1,1)·y + (o1,4·o2,1 - o2,4·o1,1)·z + (o1,5·o2,1 - o2,5·o1,1)·q + (o1,6·o2,1 - o2,6·o1,1)·w + (o1,7·o2,1 - o2,7·o1,1)·t + (o1,8·o2,1 - o2,8·o1,1)·u + (o1,9·o2,1 - o2,9·o1,1)·v + (o1,10·o2,1 - o2,10·o1,1)·x + (o1,11·o2,1 - o2,11·o1,1)·Ptx + (o1,12·o2,1 - o2,12·o1,1)·Pty + (o1,13·o2,1 - o2,13·o1,1)·Ptz + (o1,14·o2,1 - o2,14·o1,1) = 0   (Equation 6-5)
Solving the real-coefficient equation set by Gaussian elimination with complete pivoting:
The 12 linear equations in the 12 unknowns can be solved by Gaussian elimination with complete pivoting to obtain x = a1, y = a2, z = a3, w = a4, t = a5, p = a6, q = a7, u = a8, v = a9, Ptx = a10, Pty = a11 and Ptz = a12.
Recovering the kinematic X-Y-Z fixed angles of the industrial robot by trigonometric analysis:
Recall that x = sin β, y = cos α·cos γ, z = cos α·sin γ, w = sin α·cos γ, t = sin α·sin γ, p = cos α·cos β, q = sin α·cos β, u = cos β·cos γ and v = cos β·sin γ.
From a1 = sin β, a2 = cos α·cos γ, a3 = cos α·sin γ, a4 = sin α·cos γ, a5 = sin α·sin γ, a6 = cos α·cos β, a7 = sin α·cos β, a8 = cos β·cos γ and a9 = cos β·sin γ, the angles α, β and γ of the X-Y-Z fixed-angle representation can be computed quadrant by quadrant, and the appropriate quadrant is then selected according to the calculated results.
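One way to realize this quadrant selection is to recover the angles with atan2, which handles the quadrants automatically and is valid when cos β > 0; otherwise the mirrored solution must be chosen, which is the quadrant analysis referred to above. A sketch using the a1 to a9 definitions:

```python
import math

def recover_fixed_angles(a):
    """Recover alpha, beta, gamma from a1..a9 (passed as a[1]..a[9]; a[0] unused).

    Assumes cos(beta) > 0, i.e. the robot postures avoid beta = +/-90 degrees;
    otherwise the mirrored branch (beta -> pi - beta, etc.) must be selected.
    """
    beta  = math.atan2(a[1], math.hypot(a[8], a[9]))   # sin(beta) vs |cos(beta)|
    gamma = math.atan2(a[9], a[8])                     # cos(beta)sin(gamma) / cos(beta)cos(gamma)
    alpha = math.atan2(a[7], a[6])                     # sin(alpha)cos(beta) / cos(alpha)cos(beta)
    return alpha, beta, gamma
```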
This embodiment describes in detail the calculation method that calibrates with four feature points and two postures, which simplifies the calibration process, reduces the number of parameters in the calculation, and thereby reduces the calculation difficulty.
EXAMPLE III
Fig. 6 is a block diagram illustrating an exemplary structure of a hand-eye calibration device of a robot according to an embodiment of the present invention. The device is described below with reference to Fig. 6; as shown in Fig. 6, the hand-eye calibration device 100 of the robot specifically includes:
the point determination module 10 is used for determining characteristic points for calibrating camera parameters and coordinate points for calibrating a robot coordinate system;
the first recognition module 20 is used for recognizing the feature points and the coordinate values of the coordinate points in the first posture of the robot in the camera coordinate system, so as to calculate the camera external parameters;
the second recognition module 30 is configured to recognize the feature point and the coordinate point of the robot in the second posture in the coordinate system of the camera, so as to calculate the camera external parameter;
and the relation calculation module 40 is used for calculating a transformation relation from the camera coordinate system to the robot coordinate system according to the coordinate values identified twice and the coordinate values of the feature points and the coordinate points in the robot coordinate system.
Fig. 7 is a block diagram illustrating an exemplary structure of a hand-eye calibration device of a robot according to an embodiment of the present invention, and as shown in fig. 7, the hand-eye calibration device 100 of the robot further includes:
a target obtaining module 50 for obtaining target coordinate values in a camera coordinate system;
and the coordinate calculation module 60 is used for calculating the coordinates of the target coordinate values in the robot coordinate system through the transformation relation.
The relation calculation module 40 calculates the transformation relation from the camera coordinate system to the robot coordinate system according to the following formula:
T6 × T × P1 = T61 × T × P2
where T represents the transformation relation matrix to be calculated from the camera coordinate system to the robot coordinate system, T6 represents the pose value of the robot in the first posture, P1 represents the coordinate value of the coordinate point in the camera coordinate system with the robot in the first posture, T61 represents the pose value of the robot in the second posture, and P2 represents the coordinate value of the coordinate point in the camera coordinate system with the robot in the second posture.
Further, the hand-eye calibration device 100 of the robot further includes:
and the image recognition unit is used for acquiring the coordinate values of the characteristic points by carrying out image recognition on the images positioned at the characteristic points and acquiring the coordinate values of the coordinate points by carrying out image recognition on the images positioned at the coordinate points.
The invention has the advantages that calibration with a hand-eye calibration plate that does not require touching the checkerboard with a calibration needle reduces the human error introduced by manual participation in calibration, shortens the operating time of the calibration method, overcomes the low hand-eye calibration accuracy and complicated operation of the needle-touch checkerboard method, and improves the intelligence of the vision-guided industrial robot. It also lays a foundation for industrial robot design manufacturers to research visual displacement and visual control. In this embodiment, the posture of the industrial robot only needs to be changed twice and the feature points on the image are collected twice; the transformation relation from the camera coordinate system to the robot coordinate system can then be obtained by calculation, which simplifies the calibration process.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (4)

1. A hand-eye calibration method of a robot is applied to the robot, and is characterized by comprising the following steps:
determining characteristic points for calibrating camera parameters and coordinate points for calibrating the robot;
identifying the feature point and the coordinate point of the robot in a first posture in a coordinate system of a camera;
identifying the feature point and the coordinate point of the robot in a second posture in a coordinate system of the camera;
calculating a transformation relation from a camera coordinate system to a robot tool coordinate system according to the coordinate values identified twice;
the method further comprises the following steps:
acquiring a target coordinate value under a camera coordinate system;
calculating the coordinate of the target coordinate value in the robot tool coordinate system through the transformation relation;
calculating a transformation relation from the camera coordinate system to the robot tool coordinate system by the following formula:
T6 × T × P1 = T61 × T × P2
wherein T represents the transformation relation matrix to be calculated from the camera coordinate system to the robot tool coordinate system, T6 represents a pose value of the robot in the first posture, P1 represents a coordinate value of the coordinate point in the camera coordinate system with the robot in the first posture, T61 represents a pose value of the robot in the second posture, and P2 represents a coordinate value of the coordinate point in the camera coordinate system with the robot in the second posture;
the method further comprises the following steps:
and acquiring the coordinate values of the feature points by carrying out image recognition on the images at the feature points, and acquiring the coordinate values of the coordinate points by carrying out image recognition on the images at the coordinate points.
2. The method for calibrating hands and eyes of a robot according to claim 1, wherein the number of the feature points is four.
3. The hand-eye calibration method of a robot according to claim 1, wherein the feature points and the coordinate points are provided on a hand-eye calibration plate in a checkerboard pattern.
4. A hand-eye calibration device for a robot, the device comprising:
the point determination module is used for determining characteristic points for calibrating camera parameters and coordinate points for calibrating the robot;
the first recognition module is used for recognizing the coordinate values of the feature points and the coordinate points of the robot in the first posture in a camera coordinate system;
the second recognition module is used for recognizing the coordinate values of the feature points and the coordinate points of the robot in the second posture in the camera coordinate system;
the relation calculation module is used for calculating a transformation relation from a camera coordinate system to a robot tool coordinate system according to the coordinate values identified twice;
the device further comprises:
the target acquisition module is used for acquiring a target coordinate value in a camera coordinate system;
the coordinate calculation module is used for calculating the coordinate of the target coordinate value in the robot tool coordinate system through the transformation relation;
the relationship calculation module calculates a transformation relationship from the camera coordinate system to the robot tool coordinate system by the following formula:
T6 × T × P1 = T61 × T × P2
wherein T represents the transformation relation matrix to be calculated from the camera coordinate system to the robot tool coordinate system, T6 represents a pose value of the robot in the first posture, P1 represents a coordinate value of the coordinate point in the camera coordinate system with the robot in the first posture, T61 represents a pose value of the robot in the second posture, and P2 represents a coordinate value of the coordinate point in the camera coordinate system with the robot in the second posture;
the device further comprises:
and the image recognition unit is used for acquiring the coordinate values of the characteristic points by carrying out image recognition on the images at the characteristic points and acquiring the coordinate values of the coordinate points by carrying out image recognition on the images at the coordinate points.
CN201510812064.6A 2015-11-20 2015-11-20 Hand-eye calibration device and method for robot Active CN106767393B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510812064.6A CN106767393B (en) 2015-11-20 2015-11-20 Hand-eye calibration device and method for robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510812064.6A CN106767393B (en) 2015-11-20 2015-11-20 Hand-eye calibration device and method for robot

Publications (2)

Publication Number Publication Date
CN106767393A CN106767393A (en) 2017-05-31
CN106767393B true CN106767393B (en) 2020-01-03

Family

ID=58885052

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510812064.6A Active CN106767393B (en) 2015-11-20 2015-11-20 Hand-eye calibration device and method for robot

Country Status (1)

Country Link
CN (1) CN106767393B (en)

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108015770A (en) * 2017-12-07 2018-05-11 王群 Position of manipulator scaling method and system
CN108195354A (en) * 2017-12-21 2018-06-22 长沙长泰机器人有限公司 A kind of vehicle positioning method and vehicle positioning system based on robot
CN108106535B (en) * 2017-12-21 2020-03-27 长沙长泰机器人有限公司 Line laser calibration method and line laser calibration device based on robot
CN108326850B (en) * 2018-01-10 2021-07-06 温州大学 Method and system for robot to accurately move mechanical arm to reach specified position
CN108346165B (en) * 2018-01-30 2020-10-30 深圳市易尚展示股份有限公司 Robot and three-dimensional sensing assembly combined calibration method and device
CN108527360B (en) * 2018-02-07 2021-11-19 唐山英莱科技有限公司 Position calibration system and method
CN110193849B (en) * 2018-02-27 2021-06-29 北京猎户星空科技有限公司 Method and device for calibrating hands and eyes of robot
CN108436909A (en) * 2018-03-13 2018-08-24 南京理工大学 A kind of hand and eye calibrating method of camera and robot based on ROS
CN108627178B (en) * 2018-05-10 2020-10-13 广东拓斯达科技股份有限公司 Robot eye calibration method and system
CN108748146A (en) * 2018-05-30 2018-11-06 武汉库柏特科技有限公司 A kind of Robotic Hand-Eye Calibration method and system
CN108818535B (en) * 2018-07-05 2020-04-10 苏州汉振深目智能科技有限公司 Robot 3D vision hand-eye calibration method
CN108818536B (en) * 2018-07-12 2021-05-14 武汉库柏特科技有限公司 Online offset correction method and device for robot hand-eye calibration
CN108942934A (en) * 2018-07-23 2018-12-07 珠海格力电器股份有限公司 Determine the method and device of hand and eye calibrating
CN109454634B (en) * 2018-09-20 2022-02-22 广东工业大学 Robot hand-eye calibration method based on plane image recognition
CN109129445B (en) * 2018-09-29 2020-11-10 先临三维科技股份有限公司 Hand-eye calibration method, calibration plate, device, equipment and storage medium for mechanical arm
CN109470138A (en) * 2018-10-22 2019-03-15 江苏集萃微纳自动化系统与装备技术研究所有限公司 The On-line Measuring Method of part
CN109465822A (en) * 2018-10-22 2019-03-15 江苏集萃微纳自动化系统与装备技术研究所有限公司 Based on 3D vision hand and eye calibrating method
CN109702738B (en) * 2018-11-06 2021-12-07 深圳大学 Mechanical arm hand-eye calibration method and device based on three-dimensional object recognition
US11911914B2 (en) 2019-01-28 2024-02-27 Cognex Corporation System and method for automatic hand-eye calibration of vision system for robot motion
CN110009689B (en) * 2019-03-21 2023-02-28 上海交通大学 Image data set rapid construction method for collaborative robot pose estimation
CN110238845B (en) * 2019-05-22 2021-12-10 湖南视比特机器人有限公司 Automatic hand-eye calibration method and device for optimal calibration point selection and error self-measurement
CN110202560A (en) * 2019-07-12 2019-09-06 易思维(杭州)科技有限公司 A kind of hand and eye calibrating method based on single feature point
CN112238453B (en) * 2019-07-19 2021-08-31 上银科技股份有限公司 Vision-guided robot arm correction method
CN110695996B (en) * 2019-10-14 2022-05-31 扬州大学 Automatic hand-eye calibration method for industrial robot
CN111002312A (en) * 2019-12-18 2020-04-14 江苏集萃微纳自动化系统与装备技术研究所有限公司 Industrial robot hand-eye calibration method based on calibration ball
CN110977987B (en) * 2019-12-25 2021-07-20 杭州未名信科科技有限公司 Mechanical arm hand-eye calibration method, device and system
CN111890355B (en) * 2020-06-29 2022-01-11 北京大学 Robot calibration method, device and system
CN112091971B (en) * 2020-08-21 2021-10-12 季华实验室 Robot eye calibration method and device, electronic equipment and system
CN112045682B (en) * 2020-09-02 2022-01-25 亿嘉和科技股份有限公司 Calibration method for solid-state area array laser installation
CN113240751B (en) * 2021-05-18 2023-01-17 广州慧炬智能科技有限公司 Calibration method for robot tail end camera
CN113524147B (en) * 2021-08-02 2022-05-24 北京深度奇点科技有限公司 Industrial robot teaching system and method based on 3D camera
CN115724199B (en) * 2022-12-07 2023-08-01 赛那德科技有限公司 Disordered package coordinate calibration method based on camera vision

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102294695A (en) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 Robot calibration method and calibration system
CN202622812U (en) * 2012-05-25 2012-12-26 山东天泽软控技术股份有限公司 Calibrating plate for visual system of robot
CN104180753A (en) * 2014-07-31 2014-12-03 东莞市奥普特自动化科技有限公司 Rapid calibration method of robot visual system
CN104260112A (en) * 2014-09-18 2015-01-07 西安航天精密机电研究所 Robot hand and eye locating method
CN104354167A (en) * 2014-08-29 2015-02-18 广东正业科技股份有限公司 Robot hand-eye calibration method and device
CN104515502A (en) * 2013-09-28 2015-04-15 沈阳新松机器人自动化股份有限公司 Robot hand-eye stereo vision measurement method
EP2728374B1 (en) * 2012-10-30 2016-12-28 Technische Universität Darmstadt Invention relating to the hand-eye calibration of cameras, in particular depth image cameras

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102294695A (en) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 Robot calibration method and calibration system
CN202622812U (en) * 2012-05-25 2012-12-26 山东天泽软控技术股份有限公司 Calibrating plate for visual system of robot
EP2728374B1 (en) * 2012-10-30 2016-12-28 Technische Universität Darmstadt Invention relating to the hand-eye calibration of cameras, in particular depth image cameras
CN104515502A (en) * 2013-09-28 2015-04-15 沈阳新松机器人自动化股份有限公司 Robot hand-eye stereo vision measurement method
CN104180753A (en) * 2014-07-31 2014-12-03 东莞市奥普特自动化科技有限公司 Rapid calibration method of robot visual system
CN104354167A (en) * 2014-08-29 2015-02-18 广东正业科技股份有限公司 Robot hand-eye calibration method and device
CN104354167B (en) * 2014-08-29 2016-04-06 广东正业科技股份有限公司 A kind of Robotic Hand-Eye Calibration method and device
CN104260112A (en) * 2014-09-18 2015-01-07 西安航天精密机电研究所 Robot hand and eye locating method
CN104260112B (en) * 2014-09-18 2016-05-18 西安航天精密机电研究所 A kind of Robot Hand-eye localization method

Also Published As

Publication number Publication date
CN106767393A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
CN106767393B (en) Hand-eye calibration device and method for robot
JP6966582B2 (en) Systems and methods for automatic hand-eye calibration of vision systems for robot motion
JP6505729B2 (en) Automatic calibration method for robot system using vision sensor
CN109454634B (en) Robot hand-eye calibration method based on plane image recognition
JP2023052266A (en) System and method for combining machine vision coordinate spaces in guided assembly environment
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
JP6886620B2 (en) Calibration method, calibration system and program
CN109559341B (en) Method and device for generating mechanical arm grabbing scheme
CN113379849B (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN109910014B (en) Robot hand-eye calibration method based on neural network
KR20190070875A (en) Calibration and operation of vision-based manipulation systems
CN110238820A (en) Hand and eye calibrating method based on characteristic point
CN110202560A (en) A kind of hand and eye calibrating method based on single feature point
CN112958960B (en) Robot hand-eye calibration device based on optical target
JP2014029664A (en) Image comparison range generation method, positional orientation detection method, image comparison range generation device, positional orientation detection device, robot, robot system, image comparison range generation program and positional orientation detection program
CN114310880A (en) Mechanical arm calibration method and device
CN111583342A (en) Target rapid positioning method and device based on binocular vision
JPH0780790A (en) Three-dimensional object grasping system
CN115446847A (en) System and method for improving 3D eye-hand coordination accuracy of a robotic system
CN107993227B (en) Method and device for acquiring hand-eye matrix of 3D laparoscope
WO2022124232A1 (en) Image processing system and image processing method
KR100784734B1 (en) Error compensation method for the elliptical trajectory of industrial robot
JPH08272451A (en) Calibration method in robot with visual sensor
CN114800520A (en) High-precision hand-eye calibration method
CN114693798A (en) Manipulator control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant