CN108627178A - Robotic Hand-Eye Calibration method and system - Google Patents

Robotic Hand-Eye Calibration method and system

Info

Publication number
CN108627178A
CN108627178A (application CN201810442460.8A)
Authority
CN
China
Prior art keywords
coordinate
machine
coordinates
robot
calibration
Prior art date
Legal status
Granted
Application number
CN201810442460.8A
Other languages
Chinese (zh)
Other versions
CN108627178B (en)
Inventor
孙高磊
吴丰礼
罗小军
李相前
张文刚
Current Assignee
Guangdong Topstar Technology Co Ltd
Original Assignee
Guangdong Topstar Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Topstar Technology Co Ltd
Priority to CN201810442460.8A
Publication of CN108627178A
Application granted
Publication of CN108627178B
Legal status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Manipulator (AREA)

Abstract

This application relates to a robot hand-eye calibration method and system. A camera and a calibration jig are mounted on the end of the robot, and a feature point is provided on a fixed calibration object. The method includes: obtaining first machine coordinates, second machine coordinates and calibration pixel coordinates; obtaining a perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates; obtaining, according to the perspective transformation matrix, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view; and converting the pixel coordinates of a target point into target machine coordinates according to the mapping relationship. With this method, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view is obtained, the pixel coordinates of a target point can be converted into target machine coordinates according to the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved.

Description

Robotic Hand-Eye Calibration method and system
Technical field
This application relates to the technical field of machine vision, and in particular to a robot hand-eye calibration method and a robot hand-eye calibration system.
Background technology
When machine vision is applied in robotics, a camera is often fixed to the end effector of the robot. When the end effector grasps a workpiece, the camera can measure the relative position of the end effector and the workpiece, forming the robot's "hand-eye" vision system.
After installation, the robot "hand-eye" vision system generally requires the focal plane of the camera to be strictly parallel to the plane of the robot's flange. However, errors of varying degrees inevitably occur when the end of the robot and the actuator are installed, so the robot "hand-eye" vision system also carries a large error, and the accuracy with which the robot grasps objects is low.
Summary of the invention
Based on this, it is necessary to provide a robot hand-eye calibration method and system to address the low accuracy with which a robot grasps objects.
A robot hand-eye calibration method, wherein a camera and a calibration jig are mounted on the end of the robot and a feature point is provided on a fixed calibration object, the method comprising the following steps:
obtaining first machine coordinates, second machine coordinates and calibration pixel coordinates, wherein the first machine coordinates are the movement control coordinates of the robot when the end of the calibration jig is aligned with the feature point along a reference direction, the reference direction being the normal direction of the flange mounted on the robot; the second machine coordinates are the movement control coordinates of the robot when the feature point is located at a specified position in the camera field of view; and the calibration pixel coordinates are the position coordinates of the feature point in the camera field of view when the feature point moves with the robot to the specified position in the camera field of view;
obtaining a perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates;
obtaining, according to the perspective transformation matrix, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view, and converting the pixel coordinates of a target point into target machine coordinates according to the mapping relationship, wherein the pixel coordinates of the target point are the position coordinates of the target point in the camera, and the target machine coordinates are the movement control coordinates to which the robot moves corresponding to the target point.
A robot hand-eye calibration system, wherein a camera and a calibration jig are mounted on the end of the robot and a feature point is provided on a fixed calibration object, the system comprising:
a coordinate obtaining module for obtaining first machine coordinates, second machine coordinates and calibration pixel coordinates, wherein the first machine coordinates are the movement control coordinates of the robot when the end of the calibration jig is aligned with the feature point along a reference direction, the reference direction being the normal direction of the flange mounted on the robot; the second machine coordinates are the movement control coordinates of the robot when the feature point is located at a specified position in the camera field of view; and the calibration pixel coordinates are the position coordinates of the feature point in the camera field of view when the feature point moves with the robot to the specified position in the camera field of view;
a perspective transformation matrix obtaining module for obtaining a perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates;
a coordinate transformation module for obtaining, according to the perspective transformation matrix, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view, and converting the pixel coordinates of a target point into target machine coordinates according to the mapping relationship, wherein the pixel coordinates of the target point are the position coordinates of the target point in the camera, and the target machine coordinates are the movement control coordinates to which the robot moves corresponding to the target point.
With the above robot hand-eye calibration method and system, by obtaining the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view, the pixel coordinates of a target point can be converted into target machine coordinates according to the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved.
Description of the drawings
Fig. 1 is an application environment diagram of the robot hand-eye calibration method in one embodiment;
Fig. 2 is a flowchart of the robot hand-eye calibration method in one embodiment;
Fig. 3 is a flowchart of converting to target machine coordinates in one embodiment;
Fig. 4 is a flowchart of obtaining the perspective transformation matrix in one embodiment;
Fig. 5 is a flowchart of solving the perspective transformation matrix in one embodiment;
Fig. 6 is a flowchart of the robot hand-eye calibration method in another embodiment;
Fig. 7 is a structural schematic diagram of the robot hand-eye calibration system in one embodiment;
Fig. 8 is an internal structure diagram of a computer device in one embodiment.
Detailed description of the embodiments
In order to make the objectives, technical solutions and advantages of the application clearer, the application is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the application and are not intended to limit it.
The robot hand-eye calibration method provided by the application can be applied in the application environment shown in Fig. 1, which is the application environment diagram of the robot hand-eye calibration method in one embodiment. The end of robot 10 includes a movable tool arm 11; a camera 20 and a calibration jig 30 can be mounted on the end of robot 10 through the tool arm 11, and a fixed feature point 40 is placed below the end of robot 10. In addition, the tool arm 11 is mounted on the end of robot 10 through a flange 50, so the direction of the calibration jig 30 is parallel to the normal direction of the flange 50.
In one embodiment, as shown in Fig. 2, which is a flowchart of the robot hand-eye calibration method in one embodiment, a robot hand-eye calibration method is provided. Taking its application in the environment of Fig. 1 as an example, a camera and a calibration jig are mounted on the end of the robot and a feature point is provided on a fixed calibration object. The method includes the following steps:
Step S210: obtain first machine coordinates, second machine coordinates and calibration pixel coordinates, wherein the first machine coordinates are the movement control coordinates of the robot when the end of the calibration jig is aligned with the feature point along a reference direction, the reference direction being the normal direction of the flange mounted on the robot; the second machine coordinates are the movement control coordinates of the robot when the feature point is located at a specified position in the camera field of view; and the calibration pixel coordinates are the position coordinates of the feature point in the camera field of view when the feature point moves with the robot to the specified position in the camera field of view.
The calibration jig is indirectly mounted on the end of the robot through the flange, so the placement direction of the calibration jig is parallel to the normal direction of the flange. When the end of the calibration jig is aligned with the feature point along the reference direction, it should be ensured that the calibration jig is aligned with the feature point along its placement direction and that the tip of the calibration jig points at the feature point.
The robot's movement control coordinates can be used to record the spatial position of the tool arm at the robot end and to control the tool arm to move to a specified spatial position. Therefore, the first machine coordinates can be acquired and stored as the robot's movement control coordinates at the moment the end of the calibration jig is aligned with the feature point along the reference direction, and the second machine coordinates can be acquired and stored as the robot's movement control coordinates when the feature point is located at a specified position in the camera field of view.
The field of view of the camera at the robot end can contain the feature point, and the focal length of the camera can be set so that the feature point is identified clearly. The calibration pixel coordinates can be obtained by acquiring the camera image when the feature point has moved with the robot to a specified position in the camera field of view, identifying the position coordinates of the feature point in the camera field of view, and storing them.
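As one possible way of extracting the feature point's position coordinates from the camera image, the sketch below assumes a circular feature point and OpenCV's Hough circle transform; the patent does not prescribe a particular detector, and the function name find_feature_pixel is an illustrative assumption.

```python
import cv2

def find_feature_pixel(gray_image):
    """Return the (u, w) pixel coordinates of a single circular feature point,
    or None if no circle is found. Detector and parameters are illustrative only."""
    blurred = cv2.GaussianBlur(gray_image, (5, 5), 0)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5, minDist=50,
                               param1=100, param2=30, minRadius=5, maxRadius=60)
    if circles is None:
        return None
    u, w, _radius = circles[0][0]   # first detected circle: center and radius
    return float(u), float(w)
```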
There are multiple specified positions, which may include the upper left, top, upper right, left, center, right, lower left, bottom and lower right of the camera field of view. For example, keeping the feature point stationary in the world coordinate system, the robot can be moved so that the feature point appears in turn at the upper left, top, upper right, left, center, right, lower left, bottom and lower right of the camera field of view, and the second machine coordinates and calibration pixel coordinates at these 9 specified positions are obtained; the second machine coordinates and calibration pixel coordinates at any number of these specified positions may also be obtained.
Step S220: obtain a perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates.
From another point of view, it can be understood that when the robot is at the first machine coordinates, in order to move the feature point to a specified position in the camera field of view, the robot actually needs to be controlled to move to the second machine coordinates. A perspective transform relationship therefore exists between the second machine coordinates and the calibration pixel coordinates, and the perspective transformation matrix reflecting this relationship can be calculated from the first machine coordinates, the second machine coordinates and the calibration pixel coordinates.
Step S230: obtain, according to the perspective transformation matrix, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view.
Step S240: convert the pixel coordinates of a target point into target machine coordinates according to the mapping relationship, wherein the pixel coordinates of the target point are the position coordinates of the target point in the camera, and the target machine coordinates are the movement control coordinates to which the robot moves corresponding to the target point.
According to the perspective transformation matrix, the pixel coordinates of the target point can be converted into target machine coordinates, so that the robot can move to the target point and the hand-eye calibration of the robot is completed.
With the above robot hand-eye calibration method, by obtaining the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view, the pixel coordinates of a target point can be converted into target machine coordinates according to the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved.
In one embodiment, as shown in Fig. 3, which is a flowchart of converting to target machine coordinates in one embodiment, the step of converting the pixel coordinates of the target point into target machine coordinates according to the mapping relationship includes the following steps:
Step S241: obtain initial machine coordinates and the pixel coordinates of the target point, wherein the initial machine coordinates are the movement control coordinates at the moment the camera field of view containing the target point is acquired.
The initial machine coordinates are acquired at the same time as the pixel coordinates of the target point.
Step S242: obtain the target machine coordinates according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relationship.
With the robot at the position corresponding to the initial machine coordinates, the target machine coordinates can be obtained from the current pixel coordinates of the target point and the mapping relationship.
With the above robot hand-eye calibration method, the target machine coordinates are obtained according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved.
In one embodiment, as shown in Fig. 4, which is a flowchart of obtaining the perspective transformation matrix in one embodiment, the step of obtaining the perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates includes the following steps:
Step S221: obtain third machine coordinates according to the first machine coordinates and the second machine coordinates, wherein the third machine coordinates are the coordinates of the second machine coordinates projected onto the plane of the first machine coordinates.
By solving the perspective transformation matrix from the third machine coordinates obtained in the plane of the first machine coordinates, the complexity and amount of calculation are reduced and the accuracy is improved.
The first machine coordinates may include an X-axis coordinate, a Y-axis coordinate and a rotation coordinate. Projecting the second machine coordinates onto the plane of the first machine coordinates means projecting the angle corresponding to the rotation coordinate of the second machine coordinates according to the first machine coordinates; the third machine coordinates obtained after the projection are then the X-axis and Y-axis coordinates at the corresponding angle.
For example, the first machine coordinates may include an X-axis coordinate, a Y-axis coordinate and a rotation coordinate, where the rotation coordinate is 0. Projecting the second machine coordinates onto the plane of the first machine coordinates whose rotation coordinate is 0 can then be understood as projecting the second machine coordinates onto the plane whose rotation angle is 0. The third machine coordinates obtained by the projection are coordinates at rotation angle 0, which reduces the complexity and amount of calculation and improves the accuracy.
Step S222: obtain the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates.
According to the one-to-one correspondence between the third machine coordinates and the calibration pixel coordinates, the perspective transformation matrix is solved; the obtained perspective transformation matrix can be used to represent the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view.
With the above robot hand-eye calibration method, the third machine coordinates are obtained according to the first machine coordinates and the second machine coordinates, and the perspective transformation matrix is obtained according to the third machine coordinates and the calibration pixel coordinates, which reduces the complexity and amount of calculation and improves the accuracy.
In one embodiment, the step of obtaining the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates includes the following steps:
obtaining the perspective transformation matrix according to Qi = Pi·A, where i is the serial number of the specified position, Qi is the i-th third machine coordinate, Pi is the i-th calibration pixel coordinate, and A is the perspective transformation matrix.
With the above robot hand-eye calibration method, the perspective transformation matrix is determined and obtained according to the third machine coordinates and the calibration pixel coordinates, which reduces the complexity and amount of calculation and improves the accuracy.
In one embodiment, the step of obtaining the third machine coordinates according to the first machine coordinates and the second machine coordinates includes the following steps:
obtaining the third machine coordinates Qi(xi, yi) by projecting the i-th second machine coordinate into the rotation-0 plane about the first machine coordinates, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, vxi, vyi and vri are respectively the abscissa, ordinate and rotation coordinate of the i-th second machine coordinate, and x0 and y0 are respectively the abscissa and ordinate of the first machine coordinates.
With the above robot hand-eye calibration method, x0 and y0 are respectively the abscissa and ordinate of the first machine coordinates, the rotation coordinate of the first machine coordinates is 0, and the third machine coordinates are calculated from the first machine coordinates whose rotation coordinate is 0, so third machine coordinates whose rotation coordinate is likewise 0 are obtained, which reduces the complexity and amount of calculation and improves the accuracy.
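The exact projection formula appears only as an image in the original text. The sketch below assumes one common convention: the offset of the second machine coordinates from the first machine coordinates is rotated back by the rotation coordinate vri about (x0, y0). Both the convention and the function name project_to_zero_rotation are assumptions, not the patent's verbatim expression.

```python
import math

def project_to_zero_rotation(vx, vy, vr, x0, y0):
    """Project a second machine coordinate (vx, vy, vr) into the rotation-0 plane
    about the first machine coordinates (x0, y0); assumed convention only."""
    dx, dy = vx - x0, vy - y0
    c, s = math.cos(vr), math.sin(vr)
    xi = x0 + c * dx + s * dy   # third machine coordinate, abscissa
    yi = y0 - s * dx + c * dy   # third machine coordinate, ordinate
    return xi, yi
```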
In one embodiment, as shown in Fig. 5, which is a flowchart of solving the perspective transformation matrix in one embodiment, the step of obtaining the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates includes the following steps:
Step S223: establish perspective transform equations according to the third machine coordinates and the calibration pixel coordinates.
According to the relationship between the robot's movement control coordinates and position coordinates in the camera field of view, represented by the third machine coordinates and calibration pixel coordinates at the specified positions, perspective transform equations are established for the subsequent solution of the perspective transformation matrix.
Step S224: solve the perspective transformation matrix according to the perspective transform equations.
The equation relationships determined by the perspective transform equations allow the perspective transformation matrix to be solved accurately.
With the above robot hand-eye calibration method, the perspective transform equations established from the third machine coordinates and the calibration pixel coordinates allow the perspective transformation matrix to be solved accurately, so that the accuracy with which the robot reaches the position of the object to be grasped can subsequently be improved.
In one embodiment, the step of establishing the perspective transform equations according to the third machine coordinates and the calibration pixel coordinates includes the following steps:
Step S225: determine (xi·zi, yi·zi, zi) = (ui, wi, 1)·A as the transformation relationship between the third machine coordinates and the calibration pixel coordinates, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, zi satisfies zi = a13·ui + a23·wi + a33, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, A is the perspective transformation matrix, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
From the third machine coordinates and calibration pixel coordinates at the specified positions, the transformation relationship between the third machine coordinates and the calibration pixel coordinates is obtained; this transformation relationship can be used to represent the relationship between the robot's movement control coordinates and position coordinates in the camera field of view.
Here, the 1 can be regarded as a given reference datum. The reference datum is a constant and may take values other than 1; when the reference datum is 1, however, the calculation is simpler and the accuracy is improved.
Step S226: obtain from the transformation relationship the perspective transform equations xi = (a11·ui + a21·wi + a31)/(a13·ui + a23·wi + a33) and yi = (a12·ui + a22·wi + a32)/(a13·ui + a23·wi + a33), where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
From the perspective transform equations obtained at multiple specified positions, the elements of the perspective transformation matrix can be solved accurately.
In addition, from the perspective transformation matrix solved from the perspective transform equations, the angle between the plane of the flange and the plane of the camera field of view can be calculated, i.e. the error between the focal plane of the camera and the plane of the robot flange after installation. Here, a13 can reflect the angle between the X axis and the U axis, a23 the angle between the Y axis and the W axis, a11 the proportionality coefficient between the X axis and the U axis, a12 the proportionality coefficient between the X axis and the W axis, a21 the proportionality coefficient between the Y axis and the U axis, a22 the proportionality coefficient between the Y axis and the W axis, a31 the translation between the X axis and the U axis, and a32 the translation between the Y axis and the W axis; a33 can be 1 in special cases. The X axis is the coordinate axis of x, the Y axis the coordinate axis of y, the U axis the coordinate axis of u, and the W axis the coordinate axis of w.
With the above robot hand-eye calibration method, by determining the transformation relationship between the third machine coordinates and the calibration pixel coordinates and obtaining the perspective transform equations, the elements of the perspective transformation matrix can be solved accurately and the perspective transformation matrix obtained.
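As a sketch of how the perspective transform equations can be solved in practice, the following assumes the equations as reconstructed above and fixes a33 = 1 as the reference datum (which the text notes can hold in special cases); the function name solve_perspective_matrix is illustrative, not from the patent. Each point pair contributes two linear equations in the remaining eight unknowns.

```python
import numpy as np

def solve_perspective_matrix(third_coords, pixel_coords):
    """Solve A (3x3, a33 fixed to 1) from point pairs (xi, yi) <-> (ui, wi)
    using the linearized perspective transform equations:
        xi*(a13*ui + a23*wi + 1) = a11*ui + a21*wi + a31
        yi*(a13*ui + a23*wi + 1) = a12*ui + a22*wi + a32
    """
    M, b = [], []
    for (xi, yi), (ui, wi) in zip(third_coords, pixel_coords):
        # unknown vector: [a11, a21, a31, a12, a22, a32, a13, a23]
        M.append([ui, wi, 1, 0, 0, 0, -xi * ui, -xi * wi]); b.append(xi)
        M.append([0, 0, 0, ui, wi, 1, -yi * ui, -yi * wi]); b.append(yi)
    p, *_ = np.linalg.lstsq(np.asarray(M, float), np.asarray(b, float), rcond=None)
    a11, a21, a31, a12, a22, a32, a13, a23 = p
    return np.array([[a11, a12, a13],
                     [a21, a22, a23],
                     [a31, a32, 1.0]])  # a33 fixed to the reference datum 1
```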
In one embodiment, the step of obtaining, according to the perspective transformation matrix, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view includes the following steps:
obtaining the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view by composing the perspective transform defined by the matrix elements with the rotation coordinate and translation of the initial machine coordinates, where x and y are respectively the abscissa and ordinate of the robot's movement control coordinates, u and w are respectively the abscissa and ordinate of the position coordinates in the camera field of view, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, xt, yt and rt are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinates, and the initial machine coordinates are the movement control coordinates at the moment the camera field of view containing the target point is acquired.
With the above robot hand-eye calibration method, once the perspective transformation matrix has been solved, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view can be obtained accurately, so that the target machine coordinates can subsequently be obtained, the hand-eye calibration of the robot completed, and the accuracy with which the robot reaches the position of the object to be grasped improved.
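The pixel-to-plane part of this mapping follows directly from the reconstructed perspective transform equations; a minimal sketch is given below (apply_perspective is an illustrative name). The patent's full mapping additionally composes this with the initial machine coordinates (xt, yt, rt); that expression appears only as a formula image in the original and is not reproduced here.

```python
def apply_perspective(A, u, w):
    """Map a pixel coordinate (u, w) to a machine coordinate in the rotation-0
    plane using the reconstructed perspective transform equations; the
    composition with (xt, yt, rt) follows the patent's own formula."""
    a11, a12, a13 = A[0]
    a21, a22, a23 = A[1]
    a31, a32, a33 = A[2]
    z = a13 * u + a23 * w + a33
    x = (a11 * u + a21 * w + a31) / z
    y = (a12 * u + a22 * w + a32) / z
    return x, y
```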
In one embodiment, the step of converting the pixel coordinates of the target point into target machine coordinates according to the mapping relationship includes the following steps:
obtaining the initial machine coordinates and the pixel coordinates of the target point;
obtaining the target machine coordinates according to the mapping relationship, where xf and yf are respectively the abscissa and ordinate of the target machine coordinates, ut and wt are respectively the abscissa and ordinate of the pixel coordinates of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and xt, yt and rt are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinates.
With the above robot hand-eye calibration method, the target machine coordinates are obtained according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved.
The initial machine coordinates are acquired at the same time as the pixel coordinates of the target point. With the robot at the position corresponding to the initial machine coordinates, the target machine coordinates can be obtained from the current pixel coordinates of the target point and the mapping relationship.
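A self-contained numerical sketch of the pixel-to-plane evaluation in this conversion; the matrix and pixel values below are made up purely for illustration, and the final composition with (xt, yt, rt) is left to the patent's own mapping formula.

```python
import numpy as np

# illustrative values only: [[a11,a12,a13],[a21,a22,a23],[a31,a32,a33]]
A = np.array([[0.05,  0.00, 1e-6],
              [0.00,  0.05, 1e-6],
              [120.0, 80.0, 1.0]])
ut, wt = 350.0, 260.0                       # pixel coordinates of the target point

z = A[0, 2] * ut + A[1, 2] * wt + A[2, 2]
x_plane = (A[0, 0] * ut + A[1, 0] * wt + A[2, 0]) / z   # rotation-0 abscissa
y_plane = (A[0, 1] * ut + A[1, 1] * wt + A[2, 1]) / z   # rotation-0 ordinate
# the target machine coordinates (xf, yf) then follow from the patent's mapping
# together with the initial machine coordinates (xt, yt, rt)
```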
In one embodiment, after the step of solving the perspective transformation matrix according to the perspective transform equations, the method further includes the following steps:
obtaining the perspective transformation matrix that minimizes ε, the residual of the perspective transform equations accumulated over the specified positions, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
With the above robot hand-eye calibration method, by obtaining the perspective transformation matrix that minimizes ε, an optimized perspective transformation matrix closest to the actual one can be obtained, further reducing the error and improving the accuracy of the perspective transformation matrix, so that the accuracy with which the robot reaches the position of the object to be grasped can be improved.
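Reading the ε minimization as a least-squares refinement of the matrix elements over all specified positions (an interpretation, since the ε expression appears only as an image), a sketch using SciPy is shown below; the use of scipy.optimize.least_squares and the function name refine_perspective_matrix are implementation choices not named in the patent.

```python
import numpy as np
from scipy.optimize import least_squares

def refine_perspective_matrix(A0, third_coords, pixel_coords):
    """Refine the 8 free elements of A (a33 fixed at 1) by minimizing the sum of
    squared residuals of the perspective transform equations (the epsilon above)."""
    def residuals(p):
        A = np.array([[p[0], p[1], p[2]],
                      [p[3], p[4], p[5]],
                      [p[6], p[7], 1.0]])
        r = []
        for (xi, yi), (ui, wi) in zip(third_coords, pixel_coords):
            z = A[0, 2] * ui + A[1, 2] * wi + A[2, 2]
            x_hat = (A[0, 0] * ui + A[1, 0] * wi + A[2, 0]) / z
            y_hat = (A[0, 1] * ui + A[1, 1] * wi + A[2, 1]) / z
            r.extend([xi - x_hat, yi - y_hat])
        return r
    p0 = np.array([A0[0, 0], A0[0, 1], A0[0, 2],
                   A0[1, 0], A0[1, 1], A0[1, 2],
                   A0[2, 0], A0[2, 1]])
    p = least_squares(residuals, p0).x
    return np.array([[p[0], p[1], p[2]],
                     [p[3], p[4], p[5]],
                     [p[6], p[7], 1.0]])
```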
In another embodiment, as shown in Fig. 6, which is a flowchart of the robot hand-eye calibration method in another embodiment, the robot hand-eye calibration method in this embodiment includes the following steps:
Obtain the first machine coordinates and the second machine coordinates of the robot, obtain the calibration pixel coordinates of the feature point in the camera field of view, and pair the second machine coordinates and calibration pixel coordinates at 9 specified positions. The camera and the calibration jig are mounted on the end of the robot, and a fixed feature point is placed below the robot end. The camera is adjusted so that its focal plane can clearly identify the feature point and the plane in which it lies; the feature point should be easy to identify and unique, and its shape can be a circle, a circular hole, or similar. The end of the calibration jig is aligned with the center of the feature point, and the robot's movement control coordinates at this moment are recorded as the first machine coordinates. The robot is then moved so that the camera field of view changes and the feature point appears in turn at the upper left, top, upper right, left, center, right, lower left, bottom and lower right of the camera field of view; the calibration pixel coordinates at these 9 specified positions are obtained, together with the corresponding 9 second machine coordinates. The calibration pixel coordinates of the feature point can be extracted by acquiring and processing the camera image at each position. According to the first machine coordinates, the 9 second machine coordinates are normalized to obtain the corresponding 9 third machine coordinates; normalization here means projecting the second machine coordinates onto the plane whose rotation angle is 0, so that the obtained third machine coordinates are coordinates at rotation angle 0. The third machine coordinates Qi(xi, yi) are obtained from this projection, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, vxi, vyi and vri are respectively the abscissa, ordinate and rotation coordinate of the i-th second machine coordinate, and x0 and y0 are respectively the abscissa and ordinate of the first machine coordinates.
Calculate the perspective transformation matrix according to the paired second machine coordinates and calibration pixel coordinates. After the third machine coordinates have been obtained from the second machine coordinates, a perspective transform model Qi = Pi·A is established from the third machine coordinates and corresponding calibration pixel coordinates at the same specified position, where i is the serial number of the specified position, Qi is the i-th third machine coordinate, Pi is the i-th calibration pixel coordinate, and A is the perspective transformation matrix. From this model, the transformation relationship between the third machine coordinates and the calibration pixel coordinates is determined, where xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, zi satisfies zi = a13·ui + a23·wi + a33, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, A is the perspective transformation matrix, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively its elements. The perspective transform equations are obtained from the transformation relationship, and from them the perspective transformation matrix that minimizes ε is obtained.
Convert the pixel coordinates of the target point in the camera field of view into target machine coordinates according to the perspective transformation matrix, so that the robot can move to the target point and the hand-eye calibration of the robot is completed. The mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view is obtained, where x and y are respectively the abscissa and ordinate of the robot's movement control coordinates, u and w are respectively the abscissa and ordinate of the position coordinates in the camera field of view, xt, yt and rt are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinates, and the initial machine coordinates are the movement control coordinates at the moment the camera field of view containing the target point is acquired. The target machine coordinates are then obtained, where xf and yf are respectively the abscissa and ordinate of the target machine coordinates, and ut and wt are respectively the abscissa and ordinate of the pixel coordinates of the target point.
With the above robot hand-eye calibration method, the second machine coordinates and calibration pixel coordinates of the feature point at 9 specified positions are used, so an accurate and optimal perspective transformation matrix can be calculated, the pixel coordinates of the target point can be converted into target machine coordinates according to the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved. Meanwhile, from the perspective transformation matrix solved from the perspective transform equations, the angle between the plane of the flange and the plane of the camera field of view can be calculated, i.e. the error between the focal plane of the camera and the plane of the robot flange after installation, where a13 can reflect the angle between the X axis and the U axis, and a23 the angle between the Y axis and the W axis.
It should be understood that although the steps in the flowcharts of Figs. 2 to 6 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, there is no strict restriction on their execution order, and they may be performed in other orders. Moreover, at least some of the steps in Figs. 2 to 6 may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different times; their execution order is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in Fig. 7, which is a structural schematic diagram of the robot hand-eye calibration system in one embodiment, a robot hand-eye calibration system is provided. A camera and a calibration jig are mounted on the end of the robot, and a feature point is provided on a fixed calibration object. The system includes a coordinate obtaining module 310, a perspective transformation matrix obtaining module 320 and a coordinate transformation module 330, wherein:
the coordinate obtaining module 310 is used to obtain first machine coordinates, second machine coordinates and calibration pixel coordinates, wherein the first machine coordinates are the movement control coordinates of the robot when the end of the calibration jig is aligned with the feature point along a reference direction, the reference direction being the normal direction of the flange mounted on the robot; the second machine coordinates are the movement control coordinates of the robot when the feature point is located at a specified position in the camera field of view; and the calibration pixel coordinates are the position coordinates of the feature point in the camera field of view when the feature point moves with the robot to the specified position in the camera field of view;
the perspective transformation matrix obtaining module 320 is used to obtain the perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates;
the coordinate transformation module 330 is used to obtain, according to the perspective transformation matrix, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view, and to convert the pixel coordinates of a target point into target machine coordinates according to the mapping relationship, wherein the pixel coordinates of the target point are the position coordinates of the target point in the camera, and the target machine coordinates are the movement control coordinates to which the robot moves corresponding to the target point.
With the above robot hand-eye calibration system, by obtaining the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view, the pixel coordinates of a target point can be converted into target machine coordinates according to the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved.
In one embodiment, the coordinate transformation module 330 is further used to obtain the initial machine coordinates and the pixel coordinates of the target point, wherein the initial machine coordinates are the movement control coordinates at the moment the camera field of view containing the target point is acquired, and to obtain the target machine coordinates according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relationship.
With the above robot hand-eye calibration system, the target machine coordinates are obtained according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved.
In one embodiment, the perspective transformation matrix obtaining module 320 is further used to obtain third machine coordinates according to the first machine coordinates and the second machine coordinates, wherein the third machine coordinates are the coordinates of the second machine coordinates projected onto the plane of the first machine coordinates, and to obtain the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates.
With the above robot hand-eye calibration system, the third machine coordinates are obtained according to the first machine coordinates and the second machine coordinates, and the perspective transformation matrix is obtained according to the third machine coordinates and the calibration pixel coordinates, which reduces the complexity and amount of calculation and improves the accuracy.
In one embodiment, the perspective transformation matrix obtaining module 320 is further used to obtain the perspective transformation matrix according to Qi = Pi·A, where i is the serial number of the specified position, Qi is the i-th third machine coordinate, Pi is the i-th calibration pixel coordinate, and A is the perspective transformation matrix.
With the above robot hand-eye calibration system, the perspective transformation matrix is determined and obtained according to the third machine coordinates and the calibration pixel coordinates, which reduces the complexity and amount of calculation and improves the accuracy.
In one embodiment, the perspective transformation matrix obtaining module 320 is further used to obtain the third machine coordinates Qi(xi, yi) by projecting the second machine coordinates into the rotation-0 plane about the first machine coordinates, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, vxi, vyi and vri are respectively the abscissa, ordinate and rotation coordinate of the i-th second machine coordinate, and x0 and y0 are respectively the abscissa and ordinate of the first machine coordinates.
With the above robot hand-eye calibration system, x0 and y0 are respectively the abscissa and ordinate of the first machine coordinates, the rotation coordinate of the first machine coordinates is 0, and the third machine coordinates are calculated from the first machine coordinates whose rotation coordinate is 0, so third machine coordinates whose rotation coordinate is likewise 0 are obtained, which reduces the complexity and amount of calculation and improves the accuracy.
In one embodiment, the perspective transformation matrix obtaining module 320 is further used to establish the perspective transform equations according to the third machine coordinates and the calibration pixel coordinates, and to solve the perspective transformation matrix according to the perspective transform equations.
With the above robot hand-eye calibration system, the perspective transform equations established from the third machine coordinates and the calibration pixel coordinates allow the perspective transformation matrix to be solved accurately, so that the accuracy with which the robot reaches the position of the object to be grasped can subsequently be improved.
In one embodiment, the perspective transformation matrix obtaining module 320 is further used to determine (xi·zi, yi·zi, zi) = (ui, wi, 1)·A as the transformation relationship between the third machine coordinates and the calibration pixel coordinates, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, zi satisfies zi = a13·ui + a23·wi + a33, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, A is the perspective transformation matrix, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively its elements; and to obtain from the transformation relationship the perspective transform equations xi = (a11·ui + a21·wi + a31)/(a13·ui + a23·wi + a33) and yi = (a12·ui + a22·wi + a32)/(a13·ui + a23·wi + a33).
With the above robot hand-eye calibration system, by determining the transformation relationship between the third machine coordinates and the calibration pixel coordinates and obtaining the perspective transform equations, the elements of the perspective transformation matrix can be solved accurately and the perspective transformation matrix obtained.
In one embodiment, the coordinate transformation module 330 is further used to obtain the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view according to the perspective transformation matrix and the initial machine coordinates, where x and y are respectively the abscissa and ordinate of the robot's movement control coordinates, u and w are respectively the abscissa and ordinate of the position coordinates in the camera field of view, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, xt, yt and rt are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinates, and the initial machine coordinates are the movement control coordinates at the moment the camera field of view containing the target point is acquired.
With the above robot hand-eye calibration system, once the perspective transformation matrix has been solved, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view can be obtained accurately, so that the target machine coordinates can subsequently be obtained, the hand-eye calibration of the robot completed, and the accuracy with which the robot reaches the position of the object to be grasped improved.
In one embodiment, the coordinate transformation module 330 is further used to obtain the initial machine coordinates and the pixel coordinates of the target point, and to obtain the target machine coordinates according to the mapping relationship, where xf and yf are respectively the abscissa and ordinate of the target machine coordinates, ut and wt are respectively the abscissa and ordinate of the pixel coordinates of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and xt, yt and rt are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinates.
With the above robot hand-eye calibration system, the target machine coordinates are obtained according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relationship, the hand-eye calibration of the robot is completed, and the accuracy with which the robot reaches the position of the object to be grasped can be improved.
In one embodiment, the perspective transformation matrix obtaining module 320 is further used to obtain the perspective transformation matrix that minimizes ε, the residual of the perspective transform equations accumulated over the specified positions, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
With the above robot hand-eye calibration system, by obtaining the perspective transformation matrix that minimizes ε, an optimized perspective transformation matrix closest to the actual one can be obtained, further reducing the error and improving the accuracy of the perspective transformation matrix, so that the accuracy with which the robot reaches the position of the object to be grasped can be improved.
For the specific limitations of the robot hand-eye calibration system, reference may be made to the limitations of the robot hand-eye calibration method above, which are not repeated here. Each module in the above robot hand-eye calibration system may be implemented in whole or in part by software, hardware or a combination thereof. The above modules may be embedded in or independent of the processor of a computer device in the form of hardware, or stored in the memory of the computer device in the form of software, so that the processor can call and execute the operations corresponding to each module.
In one embodiment, a computer device is provided. The computer device may be a server, and its internal structure may be as shown in Fig. 8. The computer device includes a processor, a memory, a network interface and a database connected through a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store data. The network interface of the computer device is used to communicate with an external terminal through a network connection. When the computer program is executed by the processor, a robot hand-eye calibration method is implemented.
Those skilled in the art will understand that the structure shown in Fig. 8 is only a block diagram of part of the structure related to the solution of the application and does not constitute a limitation on the computer device to which the solution is applied; a specific computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
obtaining first machine coordinates, second machine coordinates and calibration pixel coordinates, wherein the first machine coordinates are the movement control coordinates of the robot when the end of the calibration jig is aligned with the feature point along a reference direction, the reference direction being the normal direction of the flange mounted on the robot; the second machine coordinates are the movement control coordinates of the robot when the feature point is located at a specified position in the camera field of view; and the calibration pixel coordinates are the position coordinates of the feature point in the camera field of view when the feature point moves with the robot to the specified position in the camera field of view;
obtaining a perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates;
obtaining, according to the perspective transformation matrix, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view, and converting the pixel coordinates of a target point into target machine coordinates according to the mapping relationship, wherein the pixel coordinates of the target point are the position coordinates of the target point in the camera, and the target machine coordinates are the movement control coordinates to which the robot moves corresponding to the target point.
In one embodiment, the processor further implements the following steps when executing the computer program:
obtaining the initial machine coordinates and the pixel coordinates of the target point, wherein the initial machine coordinates are the movement control coordinates at the moment the camera field of view containing the target point is acquired; and obtaining the target machine coordinates according to the initial machine coordinates, the pixel coordinates of the target point and the mapping relationship.
In one embodiment, the processor further implements the following steps when executing the computer program:
obtaining third machine coordinates according to the first machine coordinates and the second machine coordinates, wherein the third machine coordinates are the coordinates of the second machine coordinates projected onto the plane of the first machine coordinates; and obtaining the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates.
In one embodiment, the processor further implements the following steps when executing the computer program:
obtaining the perspective transformation matrix according to Qi = Pi·A, where i is the serial number of the specified position, Qi is the i-th third machine coordinate, Pi is the i-th calibration pixel coordinate, and A is the perspective transformation matrix.
In one embodiment, the processor further implements the following steps when executing the computer program:
obtaining the third machine coordinates Qi(xi, yi) by projecting the second machine coordinates into the rotation-0 plane about the first machine coordinates, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, vxi, vyi and vri are respectively the abscissa, ordinate and rotation coordinate of the i-th second machine coordinate, and x0 and y0 are respectively the abscissa and ordinate of the first machine coordinates.
In one embodiment, the processor further implements the following steps when executing the computer program:
establishing the perspective transform equations according to the third machine coordinates and the calibration pixel coordinates; and solving the perspective transformation matrix according to the perspective transform equations.
In one embodiment, the processor further implements the following steps when executing the computer program:
determining (xi·zi, yi·zi, zi) = (ui, wi, 1)·A as the transformation relationship between the third machine coordinates and the calibration pixel coordinates, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, zi satisfies zi = a13·ui + a23·wi + a33, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, A is the perspective transformation matrix, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively its elements;
obtaining from the transformation relationship the perspective transform equations xi = (a11·ui + a21·wi + a31)/(a13·ui + a23·wi + a33) and yi = (a12·ui + a22·wi + a32)/(a13·ui + a23·wi + a33), where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
In one embodiment, the processor further implements the following steps when executing the computer program:
obtaining the initial machine coordinates and the pixel coordinates of the target point; and obtaining the target machine coordinates according to the mapping relationship, where xf and yf are respectively the abscissa and ordinate of the target machine coordinates, ut and wt are respectively the abscissa and ordinate of the pixel coordinates of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and xt, yt and rt are respectively the abscissa, ordinate and rotation coordinate of the initial machine coordinates.
In one embodiment, the processor further implements the following steps when executing the computer program:
obtaining the perspective transformation matrix that minimizes ε, the residual of the perspective transform equations accumulated over the specified positions, where i is the serial number of the specified position, xi and yi are respectively the abscissa and ordinate of the i-th third machine coordinate, ui and wi are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, wherein the computer program implements the following steps when executed by a processor:
obtaining first machine coordinates, second machine coordinates and calibration pixel coordinates, wherein the first machine coordinates are the movement control coordinates of the robot when the end of the calibration jig is aligned with the feature point along a reference direction, the reference direction being the normal direction of the flange mounted on the robot; the second machine coordinates are the movement control coordinates of the robot when the feature point is located at a specified position in the camera field of view; and the calibration pixel coordinates are the position coordinates of the feature point in the camera field of view when the feature point moves with the robot to the specified position in the camera field of view;
obtaining a perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates;
obtaining, according to the perspective transformation matrix, the mapping relationship between the robot's movement control coordinates and position coordinates in the camera field of view, and converting the pixel coordinates of a target point into target machine coordinates according to the mapping relationship, wherein the pixel coordinates of the target point are the position coordinates of the target point in the camera, and the target machine coordinates are the movement control coordinates to which the robot moves corresponding to the target point.
In one embodiment, the computer program, when executed by the processor, further implements the following steps:
Obtaining an initial machine coordinate and the pixel coordinate of the target point, wherein the initial machine coordinate is the mobile control coordinate at which the camera field of view containing the target point is acquired; obtaining the target machine coordinate according to the initial machine coordinate, the pixel coordinate of the target point and the mapping relation.
In one embodiment, the computer program, when executed by the processor, further implements the following steps:
Obtaining third machine coordinates according to the first machine coordinates and the second machine coordinates, wherein the third machine coordinates are the coordinates of the second machine coordinates projected onto the plane in which the first machine coordinates lie; obtaining the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates.
In one embodiment, the computer program, when executed by the processor, further implements the following step:
Obtaining the perspective transformation matrix according to Q_i = P_i·A, wherein i is the serial number of the designated position, Q_i is the i-th third machine coordinate, P_i is the i-th calibration pixel coordinate, and A is the perspective transformation matrix.
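Cross-multiplying the two perspective transform equations turns Q_i = P_i·A into two equations per designated position that are linear in the nine matrix elements, so A can be estimated from four or more non-degenerate correspondences with a direct linear (DLT-style) solve. The sketch below is one illustrative way to do this; the SVD-based null-space solution and the sample correspondences are assumptions, not text from the patent.

```python
import numpy as np

def solve_perspective_matrix(uw, xy):
    """Estimate A in Q_i = P_i * A from pixel/machine correspondences.

    uw[i] = (u_i, w_i) calibration pixel coordinate, xy[i] = (x_i, y_i)
    third machine coordinate.  Each pair contributes two linear equations
    in a11..a33; the homogeneous system is solved up to scale via SVD.
    """
    rows = []
    for (u, w), (x, y) in zip(uw, xy):
        # a11*u + a21*w + a31 - x*(a13*u + a23*w + a33) = 0
        rows.append([u, 0, -x * u, w, 0, -x * w, 1, 0, -x])
        # a12*u + a22*w + a32 - y*(a13*u + a23*w + a33) = 0
        rows.append([0, u, -y * u, 0, w, -y * w, 0, 1, -y])
    M = np.asarray(rows, dtype=float)
    _, _, Vt = np.linalg.svd(M)
    A = Vt[-1].reshape(3, 3)      # null-space direction, ordered a11..a33
    return A / A[2, 2]            # normalise so that a33 = 1

# Four invented corner correspondences (pixel -> machine plane).
uw = np.array([[0., 0.], [640., 0.], [640., 480.], [0., 480.]])
xy = np.array([[0., 0.], [64., 0.], [64., 48.], [0., 48.]])
print(solve_perspective_matrix(uw, xy))
```

A matrix obtained this way can also serve as the starting point for the ε minimization described above.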
In one embodiment, the computer program, when executed by the processor, further implements the following step:
Obtaining the third machine coordinates Q_i(x_i, y_i) by projecting each second machine coordinate onto the plane of the first machine coordinates, wherein i is the serial number of the designated position, Q_i(x_i, y_i) is the i-th third machine coordinate, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, v_xi, v_yi and v_ri are respectively the abscissa, ordinate and rotational coordinate of the i-th second machine coordinate, and x_0 and y_0 are respectively the abscissa and ordinate of the first machine coordinates.
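The projection formula itself does not survive in this text extraction, so the sketch below is only one plausible reading, offered as a labelled assumption: it rotates the jig-tip offset recorded by the first machine coordinate (x_0, y_0) by the rotational coordinate v_ri of each second machine coordinate and subtracts it, so that all third machine coordinates lie in one common plane. The trigonometric form is hypothetical and should not be read as the patent's actual formula.

```python
import math

def project_to_plane(second_coords, x0, y0):
    """Hypothetical projection of second machine coordinates (v_xi, v_yi, v_ri)
    onto the plane of the first machine coordinate (x0, y0).

    Assumption (not from the patent): the offset (x0, y0) is rotated by the
    rotational coordinate v_ri before being subtracted, which removes the
    effect of the end-effector rotation from each pose.
    """
    third = []
    for v_x, v_y, v_r in second_coords:
        r = math.radians(v_r)
        x_i = v_x - (x0 * math.cos(r) - y0 * math.sin(r))
        y_i = v_y - (x0 * math.sin(r) + y0 * math.cos(r))
        third.append((x_i, y_i))
    return third

# Invented second machine coordinates (x, y, rotation in degrees) and first machine coordinate.
print(project_to_plane([(100.0, 50.0, 0.0), (120.0, 60.0, 90.0)], x0=5.0, y0=2.0))
```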
In one embodiment, the computer program, when executed by the processor, further implements the following steps:
Establishing perspective transform equations according to the third machine coordinates and the calibration pixel coordinates; solving the perspective transformation matrix according to the perspective transform equations.
In one embodiment, the computer program, when executed by the processor, further implements the following steps:
Determining (x_i·z_i, y_i·z_i, z_i) = (u_i, w_i, 1)·A as the transformation relation between the third machine coordinates and the calibration pixel coordinates, wherein i is the serial number of the designated position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, z_i satisfies z_i = a13·u_i + a23·w_i + a33, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, A = [[a11, a12, a13], [a21, a22, a23], [a31, a32, a33]] is the perspective transformation matrix, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix;
Obtaining, according to the transformation relation, the perspective transform equations x_i = (a11·u_i + a21·w_i + a31)/(a13·u_i + a23·w_i + a33) and y_i = (a12·u_i + a22·w_i + a32)/(a13·u_i + a23·w_i + a33), wherein i is the serial number of the designated position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
In one embodiment, the computer program, when executed by the processor, further implements the following steps:
Obtaining the initial machine coordinate and the pixel coordinate of the target point; obtaining the target machine coordinate by substituting the pixel coordinate of the target point and the initial machine coordinate into the mapping relation, wherein x_f and y_f are respectively the abscissa and ordinate of the target machine coordinate, u_t and w_t are respectively the abscissa and ordinate of the pixel coordinate of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and x_t, y_t and r_t are respectively the abscissa, ordinate and rotational coordinate of the initial machine coordinate.
In one embodiment, the computer program, when executed by the processor, further implements the following step:
Obtaining the perspective transformation matrix that minimizes ε = Σ_i [(x_i − (a11·u_i + a21·w_i + a31)/(a13·u_i + a23·w_i + a33))² + (y_i − (a12·u_i + a22·w_i + a32)/(a13·u_i + a23·w_i + a33))²], wherein i is the serial number of the designated position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
One of ordinary skill in the art will appreciate that all or part of the flows in the above embodiment methods may be completed by instructing the relevant hardware through a computer program; the computer program may be stored in a non-volatile computer-readable storage medium, and when executed, may include the flows of the embodiments of each of the above methods. Any reference to memory, storage, a database or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable ROM (EEPROM) or flash memory. Volatile memory may include random access memory (RAM) or external cache memory.
The technical features of the above embodiments may be combined arbitrarily. To keep the description concise, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features involves no contradiction, it should be considered to fall within the scope of this specification.
The above embodiments express only several implementations of the application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be pointed out that, for those of ordinary skill in the art, various modifications and improvements can be made without departing from the concept of the application, and these all fall within the protection scope of the application. Therefore, the protection scope of this application patent shall be determined by the appended claims.

Claims (13)

1. A robot hand-eye calibration method, characterized in that a camera and a calibration jig are mounted on the end of a robot and a feature point is provided on a fixed calibration object, the method comprising the following steps:
obtaining first machine coordinates, second machine coordinates and calibration pixel coordinates, wherein the first machine coordinates are the mobile control coordinates of the robot when the end of the calibration jig is aligned with the feature point along a reference direction, the reference direction being the normal direction of a flange mounted on the robot; the second machine coordinates are the mobile control coordinates of the robot when the feature point is located at a designated position in the camera field of view; and the calibration pixel coordinates are the position coordinates of the feature point in the camera field of view when the feature point, moving with the robot, reaches the designated position in the camera field of view;
obtaining a perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates;
obtaining, according to the perspective transformation matrix, a mapping relation between the mobile control coordinates of the robot and position coordinates in the camera field of view, and converting a pixel coordinate of a target point into a target machine coordinate according to the mapping relation, wherein the pixel coordinate of the target point comprises the position coordinate of the target point in the camera, and the target machine coordinate is the mobile control coordinate to which the robot moves corresponding to the target point.
2. The robot hand-eye calibration method according to claim 1, characterized in that the step of converting the pixel coordinate of the target point into the target machine coordinate according to the mapping relation comprises the following steps:
obtaining an initial machine coordinate and the pixel coordinate of the target point, wherein the initial machine coordinate is the mobile control coordinate at which the camera field of view containing the target point is acquired;
obtaining the target machine coordinate according to the initial machine coordinate, the pixel coordinate of the target point and the mapping relation.
3. The robot hand-eye calibration method according to claim 1, characterized in that the step of obtaining the perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates comprises the following steps:
obtaining third machine coordinates according to the first machine coordinates and the second machine coordinates, wherein the third machine coordinates are the coordinates of the second machine coordinates projected onto the plane in which the first machine coordinates lie;
obtaining the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates.
4. The robot hand-eye calibration method according to claim 3, characterized in that the step of obtaining the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates comprises the following step:
obtaining the perspective transformation matrix according to Q_i = P_i·A, wherein i is the serial number of the designated position, Q_i is the i-th third machine coordinate, P_i is the i-th calibration pixel coordinate, and A is the perspective transformation matrix.
5. The robot hand-eye calibration method according to claim 3, characterized in that the step of obtaining the third machine coordinates according to the first machine coordinates and the second machine coordinates comprises the following step:
obtaining the third machine coordinates Q_i(x_i, y_i) by projecting each second machine coordinate onto the plane of the first machine coordinates, wherein i is the serial number of the designated position, Q_i(x_i, y_i) is the i-th third machine coordinate, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, v_xi, v_yi and v_ri are respectively the abscissa, ordinate and rotational coordinate of the i-th second machine coordinate, and x_0 and y_0 are respectively the abscissa and ordinate of the first machine coordinates.
6. The robot hand-eye calibration method according to claim 3, characterized in that the step of obtaining the perspective transformation matrix according to the third machine coordinates and the calibration pixel coordinates comprises the following steps:
establishing perspective transform equations according to the third machine coordinates and the calibration pixel coordinates;
solving the perspective transformation matrix according to the perspective transform equations.
7. The robot hand-eye calibration method according to claim 6, characterized in that the step of establishing the perspective transform equations according to the third machine coordinates and the calibration pixel coordinates comprises the following steps:
determining (x_i·z_i, y_i·z_i, z_i) = (u_i, w_i, 1)·A as the transformation relation between the third machine coordinates and the calibration pixel coordinates, wherein i is the serial number of the designated position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, z_i satisfies z_i = a13·u_i + a23·w_i + a33, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, A = [[a11, a12, a13], [a21, a22, a23], [a31, a32, a33]] is the perspective transformation matrix, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix;
obtaining, according to the transformation relation, the perspective transform equations x_i = (a11·u_i + a21·w_i + a31)/(a13·u_i + a23·w_i + a33) and y_i = (a12·u_i + a22·w_i + a32)/(a13·u_i + a23·w_i + a33), wherein i is the serial number of the designated position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
8. The robot hand-eye calibration method according to claim 7, characterized in that the step of obtaining, according to the perspective transformation matrix, the mapping relation between the mobile control coordinates of the robot and the position coordinates in the camera field of view comprises the following step:
obtaining the mapping relation between the mobile control coordinates of the robot and the position coordinates in the camera field of view by combining the perspective transform equations with the initial machine coordinate, wherein x and y are respectively the abscissa and ordinate of the mobile control coordinate of the robot, u and w are respectively the abscissa and ordinate of the position coordinate in the camera field of view, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, x_t, y_t and r_t are respectively the abscissa, ordinate and rotational coordinate of the initial machine coordinate, and the initial machine coordinate is the mobile control coordinate at which the camera field of view containing the target point is acquired.
9. The robot hand-eye calibration method according to claim 8, characterized in that the step of converting the pixel coordinate of the target point into the target machine coordinate according to the mapping relation comprises the following steps:
obtaining the initial machine coordinate and the pixel coordinate of the target point;
obtaining the target machine coordinate by substituting the pixel coordinate of the target point and the initial machine coordinate into the mapping relation, wherein x_f and y_f are respectively the abscissa and ordinate of the target machine coordinate, u_t and w_t are respectively the abscissa and ordinate of the pixel coordinate of the target point, a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix, and x_t, y_t and r_t are respectively the abscissa, ordinate and rotational coordinate of the initial machine coordinate.
10. The robot hand-eye calibration method according to claim 7, characterized in that, after the step of solving the perspective transformation matrix according to the perspective transform equations, the method further comprises the following step:
obtaining the perspective transformation matrix that minimizes ε = Σ_i [(x_i − (a11·u_i + a21·w_i + a31)/(a13·u_i + a23·w_i + a33))² + (y_i − (a12·u_i + a22·w_i + a32)/(a13·u_i + a23·w_i + a33))²], wherein i is the serial number of the designated position, x_i and y_i are respectively the abscissa and ordinate of the i-th third machine coordinate, u_i and w_i are respectively the abscissa and ordinate of the i-th calibration pixel coordinate, and a11, a12, a13, a21, a22, a23, a31, a32 and a33 are respectively the elements of the perspective transformation matrix.
11. A robot hand-eye calibration system, characterized in that a camera and a calibration jig are mounted on the end of a robot and a feature point is provided on a fixed calibration object, the system comprising:
a coordinate acquisition module, configured to obtain first machine coordinates, second machine coordinates and calibration pixel coordinates, wherein the first machine coordinates are the mobile control coordinates of the robot when the end of the calibration jig is aligned with the feature point along a reference direction, the reference direction being the normal direction of a flange mounted on the robot; the second machine coordinates are the mobile control coordinates of the robot when the feature point is located at a designated position in the camera field of view; and the calibration pixel coordinates are the position coordinates of the feature point in the camera field of view when the feature point, moving with the robot, reaches the designated position in the camera field of view;
a perspective transformation matrix acquisition module, configured to obtain a perspective transformation matrix according to the first machine coordinates, the second machine coordinates and the calibration pixel coordinates;
a coordinate conversion module, configured to obtain, according to the perspective transformation matrix, a mapping relation between the mobile control coordinates of the robot and position coordinates in the camera field of view, and to convert a pixel coordinate of a target point into a target machine coordinate according to the mapping relation, wherein the pixel coordinate of the target point comprises the position coordinate of the target point in the camera, and the target machine coordinate is the mobile control coordinate to which the robot moves corresponding to the target point.
12. A computer device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the steps of the robot hand-eye calibration method according to any one of claims 1 to 10.
13. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the robot hand-eye calibration method according to any one of claims 1 to 10.
CN201810442460.8A 2018-05-10 2018-05-10 Robot eye calibration method and system Active CN108627178B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810442460.8A CN108627178B (en) 2018-05-10 2018-05-10 Robot eye calibration method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810442460.8A CN108627178B (en) 2018-05-10 2018-05-10 Robot eye calibration method and system

Publications (2)

Publication Number Publication Date
CN108627178A true CN108627178A (en) 2018-10-09
CN108627178B CN108627178B (en) 2020-10-13

Family

ID=63692516

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810442460.8A Active CN108627178B (en) 2018-05-10 2018-05-10 Robot eye calibration method and system

Country Status (1)

Country Link
CN (1) CN108627178B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9457470B2 (en) * 2013-04-05 2016-10-04 Abb Technology Ltd Robot system and method for calibration
CN106767393A (en) * 2015-11-20 2017-05-31 沈阳新松机器人自动化股份有限公司 The hand and eye calibrating apparatus and method of robot
CN108122257A (en) * 2016-11-28 2018-06-05 沈阳新松机器人自动化股份有限公司 A kind of Robotic Hand-Eye Calibration method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WEN-LONG LI et al.: "Hand–Eye Calibration in Visually-Guided", IEEE Transactions on Cybernetics *
Wan Xiaofeng: "Hand-Eye Calibration Based on a SCARA Manipulator", Electronics World *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109702738A (en) * 2018-11-06 2019-05-03 深圳大学 A kind of mechanical arm hand and eye calibrating method and device based on Three-dimension object recognition
CN109702738B (en) * 2018-11-06 2021-12-07 深圳大学 Mechanical arm hand-eye calibration method and device based on three-dimensional object recognition
CN109737871A (en) * 2018-12-29 2019-05-10 南方科技大学 A kind of scaling method of the relative position of three-dimension sensor and mechanical arm
CN109737871B (en) * 2018-12-29 2020-11-17 南方科技大学 Calibration method for relative position of three-dimensional sensor and mechanical arm
CN110930442A (en) * 2019-11-26 2020-03-27 广东技术师范大学 Method and device for determining positions of key points in robot hand-eye calibration based on calibration block
CN112308931A (en) * 2020-11-02 2021-02-02 深圳市泰沃德技术有限公司 Camera calibration method and device, computer equipment and storage medium
CN112308931B (en) * 2020-11-02 2021-09-17 深圳市泰沃德技术有限公司 Camera calibration method and device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN108627178B (en) 2020-10-13

Similar Documents

Publication Publication Date Title
CN108627178A (en) Robotic Hand-Eye Calibration method and system
CN108453701A (en) Control method, the method for teaching robot and the robot system of robot
CN109285190B (en) Object positioning method and device, electronic equipment and storage medium
CN110221276A (en) Scaling method, device, computer equipment and the storage medium of laser radar
CN110298888B (en) Camera calibration method based on single-axis high-precision displacement platform
CN110969662B (en) Method and device for calibrating internal parameters of fish-eye camera, calibration device controller and system
CN107598977B (en) Method and system for realizing automatic robot teaching by using vision and laser range finder
CN111070199A (en) Hand-eye calibration assessment method and robot
CN109493389B (en) Camera calibration method and system based on deep learning
JP2005300230A (en) Measuring instrument
CN109648568B (en) Robot control method, system and storage medium
CN105451461A (en) PCB board positioning method based on SCARA robot
CN109952176B (en) Robot calibration method and system, robot and storage medium
CN111152223A (en) Full-automatic robot hand-eye calibration method
CN112767493B (en) Machine vision calibration method for kinematic parameters of Stewart platform
CN112809668B (en) Method, system and terminal for automatic hand-eye calibration of mechanical arm
EP3693697B1 (en) Method for calibrating a 3d measurement arrangement and 3d measurement arrangement
US20210008724A1 (en) Method and apparatus for managing robot system
CN111604904B (en) Robot positioning calibration method and device and electronic equipment
CN112489133A (en) Calibration method, device and equipment of hand-eye system
CN112743546B (en) Robot hand-eye calibration pose selection method and device, robot system and medium
CN114677429B (en) Positioning method and device of manipulator, computer equipment and storage medium
CN110853101A (en) Camera position calibration method and device and computer readable storage medium
CN109389645B (en) Camera self-calibration method and system, camera, robot and cloud server
CN111145268B (en) Video registration method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20181009

Assignee: Guangdong Rongtong Financial Leasing Co.,Ltd.

Assignor: GUANGDONG TOPSTAR TECHNOLOGY Co.,Ltd.

Contract record no.: X2022980013974

Denomination of invention: Robot hand-eye calibration method and system

Granted publication date: 20201013

License type: Exclusive License

Record date: 20220902

PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Robot hand-eye calibration method and system

Effective date of registration: 20220906

Granted publication date: 20201013

Pledgee: Guangdong Rongtong Financial Leasing Co.,Ltd.

Pledgor: GUANGDONG TOPSTAR TECHNOLOGY Co.,Ltd.

Registration number: Y2022980014594

PC01 Cancellation of the registration of the contract for pledge of patent right

Date of cancellation: 20231101

Granted publication date: 20201013

Pledgee: Guangdong Rongtong Financial Leasing Co.,Ltd.

Pledgor: GUANGDONG TOPSTAR TECHNOLOGY Co.,Ltd.

Registration number: Y2022980014594

EC01 Cancellation of recordation of patent licensing contract

Assignee: Guangdong Rongtong Financial Leasing Co.,Ltd.

Assignor: GUANGDONG TOPSTAR TECHNOLOGY Co.,Ltd.

Contract record no.: X2022980013974

Date of cancellation: 20231124