CN111801198A - Hand-eye calibration method, system and computer storage medium - Google Patents
Hand-eye calibration method, system and computer storage medium
- Publication number
- CN111801198A (application CN201880088580.0A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- camera
- view
- hand
- robot
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Manipulator (AREA)
Abstract
The application provides a hand-eye calibration method, a hand-eye calibration system and a computer storage medium. The method comprises the following steps: determining a reference object coordinate system, an origin of a robot rotation axis coordinate system and an origin of a camera device coordinate system; acquiring a plurality of first views of the reference object along a first direction through the camera device, the first direction being parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system; acquiring a plurality of second views of the reference object along a second direction through the camera device, the second direction being non-parallel to that line; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the camera device. In this way, the camera device only needs to be controlled to move around the reference object to acquire the first views and second views, which reduces the workload and the calibration steps, keeps the calibration efficiency high and ensures the accuracy of the hand-eye calibration.
Description
The present application relates to the field of robotics, and in particular, to a hand-eye calibration method, system, and computer storage medium.
With the development of artificial intelligence technology and changing social demands, robots have come to be widely used across industries. In the field of industrial applications, a robot realizes various functions through its own actuation and control capability; a key component supporting this technology is the robot's vision system, which acquires images so that the robot can control an actuator to perform operations such as machining and installation.
Hand-eye calibration is the first step in any application in which a robot cooperates with machine vision: only when the robot and the vision system can each convert their own coordinate system into the same world coordinate system can they work together.
In the existing method, the robot and the vision system each calibrate the transformation of their own coordinate system relative to a world coordinate system. This requires an operator to manually and precisely move the robot arm to at least three specific positions for calibration, so the accuracy is unstable, and reaching those specific positions exactly makes the calibration operation complex.
[Summary of the invention]
The technical problem addressed by the present application is to provide a hand-eye calibration method, a hand-eye calibration system and a computer storage medium in which hand-eye calibration is performed by acquiring images of a reference object along different directions with a camera device carried on the robot, thereby greatly simplifying the calibration process and improving the calibration accuracy.
In order to solve the above problem, the present application provides a hand-eye calibration method comprising the following steps: determining a reference object coordinate system, an origin of a robot rotation axis coordinate system and an origin of a camera device coordinate system; acquiring a plurality of first views of the reference object along a first direction through the camera device, wherein the camera device is mounted on the robot rotation axis, its motion is controlled by the rotation axis, the camera device translates along the first direction, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction through the camera device, wherein the camera device translates along the second direction and the second direction is non-parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the camera device.
In order to solve the above problem, the present application further provides a hand-eye calibration system comprising: a camera device having a camera device coordinate system; a robot provided with a mechanical forearm, the mechanical forearm carrying a rotation axis to which the camera device is connected, the rotation axis controlling the camera device to perform rigid motion, the robot having a robot rotation axis coordinate system; a reference object having a reference object coordinate system; and a processing unit for acquiring a plurality of first views of the reference object along a first direction through the camera device, wherein the camera device translates along the first direction and captures images of the reference object, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction through the camera device, wherein the camera device translates along the second direction and captures images of the reference object, and the second direction is non-parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the camera device.
To solve the above problem, the present application further provides a computer storage medium having program data stored thereon, the program data, when executed by a processor, performing the following steps: determining a reference object coordinate system, an origin of a robot rotation axis coordinate system and an origin of a camera device coordinate system; acquiring a plurality of first views of the reference object along a first direction through the camera device, wherein the camera device is mounted on the robot rotation axis, its motion is controlled by the rotation axis, the camera device translates along the first direction, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction through the camera device, wherein the camera device translates along the second direction and the second direction is non-parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the camera device.
The beneficial effects of the present application are as follows: unlike the prior art, the present application does not need preset calibration points or precise movement of the camera device to preset positions. The camera device only needs to be controlled to move around the reference object to acquire a plurality of first views and second views, from which the conversion relations among the camera device coordinate system, the reference object coordinate system and the rotation axis coordinate system are obtained. This reduces the workload and simplifies the calibration process while keeping the calibration efficiency high and ensuring the accuracy of the hand-eye calibration.
FIG. 1 is a schematic flow chart diagram illustrating an embodiment of a hand-eye calibration method according to the present application;
FIG. 2 is a schematic diagram of coordinate systems in an embodiment of the hand-eye calibration method shown in FIG. 1;
FIG. 3 is a schematic view of the translation of the image capturing device in the hand-eye calibration method shown in FIG. 1;
FIG. 4 is a schematic view of the translation and rotation of the camera in the hand-eye calibration method shown in FIG. 1;
FIG. 5 is a schematic structural diagram of an embodiment of a hand-eye calibration system according to the present application;
FIG. 6 is a schematic structural diagram of an embodiment of a computer storage medium according to the present application.
Referring to fig. 1, fig. 1 is a schematic flowchart of an embodiment of a robot hand-eye calibration method according to the present application, where the robot hand-eye calibration method of the present embodiment includes the following steps:
101: determining the reference object coordinate system, the origin of the robot rotation axis coordinate system and the origin of the camera device coordinate system.
Fig. 2 is a schematic diagram of the coordinate systems in an embodiment of the hand-eye calibration method shown in fig. 1. In this embodiment, the robot is provided with a mechanical forearm, the mechanical forearm carries a rotation axis 203, the camera device 202 is mounted on the rotation axis 203, and the rotation axis 203 is used to control rigid motions of the camera device 202, such as translation and rotation. The reference object is a plane 201 containing a calibration coordinate system, i.e. the reference object coordinate system is a calibration coordinate system, which may serve as the world coordinate system. The plane 201 contains the X1 and Y1 axes of the reference object coordinate system, and the Z1 axis is preferably perpendicular to the plane 201. The optical center of the camera device 202 may be taken as the origin of the camera device coordinate system, and one of the joint points of the rotation axis 203 may be defined as the origin of the robot rotation axis coordinate system. For example, the intersection of the X2, Y2 and Z2 axes is the origin of the camera device coordinate system, and the intersection of the X3, Y3 and Z3 axes is the origin of the robot rotation axis coordinate system.
In this application, the rotation axis 203 of the robot drives the camera device 202 to move, the camera device 202 obtains an image of a reference object in the moving process, and the projection relationship between the coordinate U of the image in the camera device coordinate system and the coordinate P of the reference object coordinate system is as follows:
U≈K[R t]·P
where R is the rotation matrix between the camera device coordinate system and the reference object coordinate system; t is a motion vector (mainly a translation vector in this application); and K is the intrinsic parameter matrix of the camera device,

K = | fx  s   x0 |
    | 0   fy  y0 |
    | 0   0   1  |

where fx and fy are the scale factors along the X2 and Y2 axes of the camera device coordinate system, (x0, y0) is the intersection of the camera device's shooting plane with its optical axis, and s is a preset distortion factor.
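For illustration, the projection relationship above can be sketched in a few lines of Python with NumPy; all numeric values below (fx, fy, x0, y0, R, t and the point coordinates) are placeholder assumptions rather than values from this application.

```python
import numpy as np

# Placeholder intrinsics: fx, fy are the scale factors, (x0, y0) the principal point, s the skew
fx, fy, x0, y0, s = 800.0, 800.0, 320.0, 240.0, 0.0
K = np.array([[fx,  s,   x0],
              [0.0, fy,  y0],
              [0.0, 0.0, 1.0]])

# Placeholder extrinsics: rotation R and translation t of the reference frame in the camera frame
R = np.eye(3)
t = np.array([[0.1], [0.0], [1.0]])

# Homogeneous coordinate P of a point on the planar reference object (Z1 = 0 on the plane 201)
P = np.array([[0.05], [0.02], [0.0], [1.0]])

# U ~ K [R | t] P; normalizing by the third component gives the pixel coordinates
U = K @ np.hstack([R, t]) @ P
u, v = (U[:2] / U[2]).ravel()
print(u, v)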
102: acquiring a plurality of first views of the reference object along a first direction through the camera device; the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, the translation is a rigid motion, and at most 3 shooting planes of the first views are in the same plane.
Specifically, as shown in fig. 3, which is a schematic diagram of the translation of the camera device in the hand-eye calibration method shown in fig. 1, the camera device successively acquires 5 first views of the reference object 306 at 5 photographing points (301, 302, 303, 304 and 305) along the first direction; the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system. Unlike the prior art, the photographing points do not need to be preset; it is only required that at most 3 of the 5 first views have shooting planes lying in the same plane. In this way the coordinates of each photographing point can be expressed through the acquired first views, at most 3 photographing points share an x-axis or y-axis coordinate, the first views indicate the spatial position of the reference object more comprehensively, and the calibration accuracy is improved. To ensure that the reference object coordinates can be accurately expressed through the first views, at least 5 first views are captured in this application, and to improve the accuracy of the result, at most 3 of the shooting planes of those first views may lie in the same plane during the translation.
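The requirement that at most 3 of the first views share a shooting plane is stated only qualitatively above. As a rough sketch, assuming each shooting plane can be described by the camera's unit optical-axis vector n and the offset d = n·c of its optical center c, one way to check the condition could look like this:

```python
import numpy as np

def max_shared_shooting_plane(normals, centers, tol=1e-6):
    """Return the largest number of shooting points whose image planes coincide.

    normals: unit optical-axis vectors (one per shooting point)
    centers: optical-center positions (one per shooting point)
    """
    planes, counts = [], []
    for n, c in zip(normals, centers):
        n = n / np.linalg.norm(n)
        d = float(n @ c)
        for k, (nk, dk) in enumerate(planes):
            # same plane: parallel normals and equal offsets
            if np.linalg.norm(np.cross(n, nk)) < tol and abs(d - dk) < tol:
                counts[k] += 1
                break
        else:
            planes.append((n, d))
            counts.append(1)
    return max(counts)

# 5 placeholder shooting points translated along a direction that is not perpendicular
# to the optical axis, so no more than 3 first views share a shooting plane
normals = [np.array([0.0, 0.0, 1.0])] * 5
centers = [np.array([0.1 * i, 0.0, 1.0 + 0.05 * i]) for i in range(5)]
assert max_shared_shooting_plane(normals, centers) <= 3
```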
103: acquiring a plurality of second views of the reference object along a second direction through the camera device; the second direction is a direction intersecting the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, that is, a direction non-parallel to that line.
In the present embodiment, in order to accurately acquire the hand-eye relationship between the robot rotation axis coordinate system and the imaging apparatus coordinate system, at least two second views of the reference object need to be acquired by the imaging apparatus. The following description will take as an example two second views of the reference object acquired by the image pickup apparatus.
Specifically, as shown in fig. 4, which is a schematic diagram of the translation and rotation of the camera device in the hand-eye calibration method shown in fig. 1, the rotation axis drives the camera device to translate and rotate, and the camera device successively acquires 2 second views of the reference object 403 at 2 photographing points (401, 402) along the second direction. To reduce calculation errors, the rotation angle can be controlled within 30°-35°, that is, the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system is 30°-35°, the second direction being a direction that intersects that line.
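The 30°-35° range above is simply a constraint on the angle between the second direction and the line from the rotation-axis origin to the camera-device origin; a minimal check, with made-up direction vectors, might look as follows.

```python
import numpy as np

def angle_deg(u, v):
    """Angle in degrees between two 3-D direction vectors."""
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# Placeholder directions: axis-origin-to-camera-origin line and a candidate second direction
line_dir = np.array([0.0, 0.0, 1.0])
second_dir = np.array([np.sin(np.radians(32.0)), 0.0, np.cos(np.radians(32.0))])

assert 30.0 <= angle_deg(line_dir, second_dir) <= 35.0
```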
In transforming coordinates in three-dimensional space, the transformation relationship between the robot rotation axis coordinate system and the camera device coordinate system must first be determined. In this embodiment, to simplify the calculation, one of the joint points of the robot's rotation axis is taken as the origin of the robot rotation axis coordinate system. Let the hand-eye relationship between the robot rotation axis coordinate system and the camera device coordinate system be G, let A be the transformation between the rotation axis coordinate systems of two adjacent robot motions, and let B be the transformation between the camera device coordinate systems of two adjacent motions; then:
A·G=B·G
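The relation is written here as A·G = B·G; in the computer-vision literature the hand-eye problem is usually posed as A·X = X·B and solved with standard routines such as OpenCV's calibrateHandEye. The sketch below follows that conventional formulation rather than the exact derivation of this application, and all pose values are placeholders (in practice the robot poses would come from the controller and the target poses from the first and second views).

```python
import cv2
import numpy as np

def rot(axis, deg):
    """Rotation matrix about a 3-D axis by an angle in degrees."""
    axis = np.asarray(axis, dtype=float)
    R, _ = cv2.Rodrigues(axis / np.linalg.norm(axis) * np.radians(deg))
    return R

# Placeholder robot poses (rotation axis / end effector in the base frame), one per shooting point
R_gripper2base = [rot([0, 1, 0], 10.0 * i) for i in range(5)]
t_gripper2base = [np.array([[0.05 * i], [0.0], [0.10 * i]]) for i in range(5)]

# Placeholder reference-object poses in the camera frame (e.g. estimated from each view)
R_target2cam = [rot([0, 1, 0], -10.0 * i) for i in range(5)]
t_target2cam = [np.array([[0.0], [0.0], [1.0 - 0.05 * i]]) for i in range(5)]

# Conventional A·X = X·B solve; returns the camera pose relative to the rotation axis
R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
    R_gripper2base, t_gripper2base, R_target2cam, t_target2cam)
print(R_cam2gripper)
print(t_cam2gripper)
```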
Once the basic photographing positions required for the plurality of first views and second views have been obtained, more photographing positions can be added to increase the robustness and accuracy of the hand-eye calibration result.
104: obtaining the hand-eye calibration result based on the first views, the second views and the motion information of the camera device.
In this embodiment, a homography matrix is obtained from the first views; the scale factors of spatial points on the reference object along the X2 and Y2 axes of the camera device coordinate system and the intrinsic parameter matrix of the camera device are obtained from the second views, and a measurement matrix is constructed; the hand-eye calibration result is then obtained by combining the movement amount and the rotation amount of the camera device.
Specifically, a first view is selected and a spatial point S on the reference object in that view is chosen at random. The coordinate U of the point in the camera device coordinate system can be expressed through any 3 non-collinear spatial points in the first view, where a, b and c are the coordinates of the 3 non-collinear spatial points in the camera device coordinate system and U represents the position of the spatial point S on the two-dimensional image plane of the camera device. Referring again to fig. 2, the coordinate P of the spatial point S in the reference object coordinate system is expressed through x1 and y1, the unit vectors of the reference object coordinate system along its X1 and Y1 axes (the reference object lies in the X1-Y1 plane, so the Z1 component of S is zero).
Substituting U and P into the projection relationship U ≈ K[R t]·P and using the fact that the reference object is planar, the homography matrix H between the reference plane and the image plane can be obtained by rearranging the formula; in the usual planar form, H ≈ K·[r1 r2 t], where r1 and r2 are the first two columns of the rotation matrix R.
assuming that the motion ordinary differential equation of the image pickup apparatus when the second view is photographed is of order m × n, the coordinates (x) of the spatial point S in the second view0,y0) Can be expressed as: x is the number of0=m/2,y0N/2. Suppose that the distance from the optical center of the image pickup apparatus to the reference object is D1The length of the coordinate line segment on the reference object is D2The length of the pixel coordinate of the image pickup device image is D3The optical geometric relationship of the camera device is as follows:where f is the focal length of the imaging device and L is the length of the pixel in the image, such that X of the spatial point S in the coordinate system of the imaging device can be obtained2Axis and Y2On-axis scale factor fx=SxD1D2/D3And fy ═ SyD1D2/D3Wherein
The camera device intrinsic parameter matrices K1 and K2 are obtained by the above method.
In the corresponding projection equation, i denotes the i-th shooting point of the camera device, j denotes the j-th position of the robot rotation axis motion, λ is a scale factor, H is the homography, and Ri·r1j, Ri·r2j, Ri·r3j + t is the motion formula of the camera device.
Singular value decomposition (SVD) is then performed on this formula, the scale factor is calculated, and the measurement matrix M is obtained. Finally, combining these results with the conventional coordinate conversion formula yields the conversion relations among the robot rotation axis coordinate system, the camera device coordinate system and the reference object coordinate system (taken as the world coordinate system), completing the hand-eye calibration.
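The SVD step is only described at a high level above. As a generic sketch, homogeneous systems of the form M·x = 0 that appear in such calibration derivations are commonly solved by taking the right singular vector of M with the smallest singular value; the measurement matrix here is a random placeholder rather than one built from actual homographies and motion data.

```python
import numpy as np

def solve_homogeneous(M):
    """Unit vector x minimizing ||M x|| (right singular vector of the smallest singular value)."""
    _, _, Vt = np.linalg.svd(M)
    return Vt[-1]

# Placeholder measurement matrix; in the method it would be assembled from the homographies,
# the scale factors and the motion information of the camera device
rng = np.random.default_rng(0)
M = rng.standard_normal((10, 6))
x = solve_homogeneous(M)
print(x, float(np.linalg.norm(M @ x)))
```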
Compared with the prior art, in which hand-eye calibration requires specially set calibration points and precise movement of the camera device to those points, the present application needs neither preset calibration points nor precise movement of the camera device to preset positions. The camera device only needs to be moved around the reference object to acquire a plurality of first views and second views, from which the conversion relations among the camera device coordinate system, the reference object coordinate system and the rotation axis coordinate system are obtained. This reduces the workload and simplifies the calibration process while keeping both the calibration efficiency and the accuracy high.
In addition, calibration errors generally arise from noise, motion accuracy and visual calibration accuracy during robot operation; in particular, noise introduces large errors into the camera device parameters K, the translation vector t and similar quantities during hand-eye positioning, whereas such errors have only a small influence on the parameters K and the translation vector t obtained from the image information.
In the above embodiment, the image pickup apparatus may include: a viewfinder, an image analyzer and an image processor. The viewfinder is used to acquire an image of a reference object. The image analyzer acquires imaging apparatus coordinate information of a reference object. The image processor may be configured to perform conversion calculations between the robot rotation axis coordinate system, the camera coordinate system, and the reference object coordinate system based on the image of the reference object.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of a hand-eye calibration system according to the present application. In this embodiment, the hand-eye calibration system includes: an imaging apparatus 501, a reference object 504, a processing unit 505, and a robot (not shown).
The robot is provided with a mechanical forearm 502, the mechanical forearm 502 is provided with a rotation shaft 503, and the image pickup apparatus 501 is mounted on the rotation shaft 503. The image pickup apparatus 501 is electrically connected to the robot, and the rotation axis 503 is used to control the image pickup apparatus 501 to make rigid motions such as translation and rotation. Alternatively, the image pickup apparatus 501 is controlled to make rigid motion by the mechanical forearm 502 in cooperation with the rotation axis 503. The image capturing apparatus 501 has an image capturing apparatus coordinate system, and the origin of the image capturing apparatus coordinate system may be the optical center of the image capturing apparatus 501. The robot has a robot rotation axis coordinate system, and one joint point in the rotation axis 503 may be defined as the origin of the robot rotation axis coordinate system. The reference object 504 is a plane containing a calibration coordinate system, i.e. the reference object coordinate system is a calibration coordinate system, which may be a world coordinate system. The robot and the camera 501 cooperate to photograph the reference object 504 to realize the hand-eye calibration method. The method comprises the following specific steps:
in this application, the rotation axis 503 of the robot drives the camera device 501 to move, the camera device 501 obtains an image of the reference object 504 during the movement, and a projection relationship between a coordinate U of the image in the camera device coordinate system and a coordinate P of the reference object coordinate system is as follows:
U≈K[R t]·P
where R is the rotation matrix between the camera device coordinate system and the reference object coordinate system; t is a motion vector (mainly a translation vector in this application); and K is the intrinsic parameter matrix of the camera device,

K = | fx  s   x0 |
    | 0   fy  y0 |
    | 0   0   1  |

where fx and fy are the scale factors along the X2 and Y2 axes of the camera device coordinate system, (x0, y0) is the intersection of the camera device's shooting plane with its optical axis, and s is a preset distortion factor.
The processing unit 505 is communicatively connected to the camera device 501; in the present embodiment, the processing unit 505 is integrated with the camera device 501. In other embodiments, the processing unit 505 may be integrated with the robot, or may be independent of both the camera device 501 and the robot, as long as the processing unit 505 and the camera device 501 can exchange data; its position is not particularly limited. When the processing unit 505 is integrated with the camera device 501, it may specifically be integrated with the processor of the camera device 501; when it is integrated with the robot, it may specifically be integrated with the processor of the robot. The processing unit may simultaneously serve as the control unit of the robot rotation axis and of the camera device: it is communicatively linked to the robot and the camera device, controls the rotation axis to drive the camera device to move, and controls the camera device to take photographs. Alternatively, the processing unit may not serve as the control unit of the robot rotation axis and the camera device; in that case, the robot's own control unit controls the rotation axis according to a preset program to drive the camera device, the camera device's own control unit controls the photographing according to a preset program, and the processing unit only acquires the first views and second views captured by the camera device together with the camera device's motion information to complete the hand-eye calibration calculation.
The robot may drive the camera device 501 to translate within the reference object coordinate system of the reference object 504, and the processing unit 505 acquires a plurality of first views of the reference object 504 through the camera device 501 along a first direction; the camera device 501 translates along the first direction and captures images of the reference object, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, the translation is a rigid motion, and at most 3 of the first views have shooting planes lying in the same plane.
Specifically, the processing unit 505 successively acquires 5 first views of the reference object 504 through the camera device 501 at 5 photographing points along the first direction; the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system. Unlike the prior art, the photographing points do not need to be preset; it is only required that at most 3 of the 5 first views have shooting planes lying in the same plane. In this way the coordinates of each photographing point can be expressed through the acquired first views, at most 3 photographing points share an x-axis or y-axis coordinate, the first views indicate the spatial position of the reference object more comprehensively, and the calibration accuracy is improved. To ensure that the reference object coordinates can be accurately expressed through the first views, the processing unit 505 acquires at least 5 first views through the camera device 501 in this application, and to improve the accuracy of the result, at most 3 of the shooting planes of the first views acquired during the translation may lie in the same plane.
In order to accurately determine the hand-eye relationship between the robot rotation axis coordinate system and the camera device coordinate system, the processing unit 505 also acquires a plurality of second views of the reference object 504 through the camera device 501 along a second direction. The camera device 501 moves along the second direction and captures images of the reference object; the second direction is a direction intersecting the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, that is, a direction non-parallel to that line. In the present embodiment, at least two second views of the reference object need to be acquired by the camera device 501 to determine the hand-eye relationship accurately. The following description takes two second views of the reference object 504 acquired by the camera device 501 as an example.
The robot drives the camera device 501 to translate and rotate through the rotation axis 503, and the camera device 501 successively acquires 2 second views of the reference object 504 at 2 photographing points along the second direction. To reduce calculation errors, the rotation angle is controlled within 30°-35°, that is, the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system is 30°-35°, the second direction being a direction that intersects that line.
In transforming coordinates in three-dimensional space, the transformation relationship between the robot rotation axis coordinate system and the camera device coordinate system must first be determined. In this embodiment, to simplify the calculation, one of the joint points of the robot's rotation axis 503 is taken as the origin of the robot rotation axis coordinate system. Let the hand-eye relationship between the robot rotation axis coordinate system and the camera device coordinate system be G, let A be the transformation between the rotation axis coordinate systems of two adjacent robot motions, and let B be the transformation between the camera device coordinate systems of two adjacent motions; then:
A·G=B·G
Once the basic photographing positions required for the plurality of first views and second views have been obtained, more photographing positions can be added to increase the robustness and accuracy of the hand-eye calibration result.
The processing unit 505 can obtain the result of the hand-eye calibration based on the first view, the second view, and the motion information of the image capturing apparatus 501.
In a specific embodiment, the processing unit 505 obtains the homography matrix from the first views; obtains from the second views the scale factors of spatial points on the reference object 504 along the X axis and Y axis of the camera device coordinate system, together with the intrinsic parameter matrix of the camera device, and constructs the measurement matrix; and then obtains the hand-eye calibration result by combining the movement amount and rotation amount of the camera device 501. For the detailed calculation, reference is made to the method embodiment described above, which is not repeated here.
In the present embodiment, the processing unit 505 is integrated with the camera device 501, and the camera device 501 may include a viewfinder and an image analyzer. The viewfinder is communicatively coupled to the image analyzer, and the image analyzer is communicatively coupled to the processing unit. The viewfinder is used to acquire images containing the reference object, including the first views and the second views. The image analyzer is used to acquire the camera device coordinate information of the reference object. The processing unit is communicatively linked to the viewfinder and the image analyzer; it acquires the information obtained by the image analyzer and the images obtained by the viewfinder, and performs the conversion calculations among the robot rotation axis coordinate system, the camera device coordinate system and the reference object coordinate system.
Compared with the prior art, in which hand-eye calibration requires specially set calibration points and precise movement of the camera device to those points, the present application needs neither preset calibration points nor precise movement of the camera device to preset positions. The camera device only needs to be moved around the reference object to acquire a plurality of first views and second views, from which the conversion relations among the camera device coordinate system, the reference object coordinate system and the rotation axis coordinate system are obtained. This reduces the workload and simplifies the calibration process while keeping the calibration efficiency high and ensuring the accuracy of the hand-eye calibration.
In addition, calibration errors generally arise from noise, motion accuracy and visual calibration accuracy during robot operation; in particular, noise introduces large errors into the camera intrinsic parameter matrix K, the translation vector t and similar quantities during hand-eye positioning, whereas such errors have only a small influence on the intrinsic parameter matrix K and the translation vector t obtained from the image information.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a computer storage medium according to an embodiment of the present application. In the present embodiment, the computer storage medium 60 includes program data 61, and the hand-eye calibration method as described in the above embodiments is implemented when the program data 61 is executed.
With the hand-eye calibration method, hand-eye calibration system and computer storage medium described above, the only requirement is that the reference object be within the field of view of the camera device at each photographing position; the camera device does not need to be precisely positioned or moved to specified calibration points, which greatly simplifies the calibration process while achieving stable accuracy.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.
Claims (17)
- A hand-eye calibration method, characterized by comprising the following steps: determining a reference object coordinate system, an origin of a robot rotation axis coordinate system and an origin of a camera device coordinate system; acquiring a plurality of first views of the reference object along a first direction through the camera device, wherein the camera device is mounted on the robot rotation axis, the motion of the camera device is controlled by the robot rotation axis, the camera device moves along the first direction, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction through the camera device, wherein the camera device moves along the second direction, and the second direction is a direction intersecting the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the camera device.
- A hand-eye calibration method according to claim 1, wherein the number of said first views is at least 5.
- A hand-eye calibration method according to claim 1, wherein the number of said second views is at least 2.
- The hand-eye calibration method according to claim 1, wherein obtaining a result of hand-eye calibration based on the first views, the second views and the motion information of the camera device comprises: acquiring a homography matrix according to the first views; acquiring the scale factors of the spatial points on the reference object on the X axis and the Y axis in the camera device coordinate system and the intrinsic parameter matrix of the camera device according to the second views, and constructing a measurement matrix; and obtaining the result of the hand-eye calibration in combination with the motion information of the camera device, wherein the motion information of the camera device comprises the movement amount and the rotation amount of the camera device.
- A hand-eye calibration method as claimed in any one of claims 1-4, wherein the angle between the second direction and the line from the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system is 30°-35°.
- A hand-eye calibration system, characterized by comprising: a camera device having a camera device coordinate system; a robot provided with a mechanical forearm, the mechanical forearm being provided with a rotation axis, the camera device being connected to the rotation axis, the rotation axis controlling the camera device to perform rigid motion, the robot having a robot rotation axis coordinate system; a reference object having a reference object coordinate system; and a processing unit for acquiring a plurality of first views of the reference object along a first direction through the camera device, wherein the camera device moves along the first direction and captures images of the reference object, the first direction is a direction parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction through the camera device, wherein the camera device moves along the second direction and captures images of the reference object, and the second direction is a direction intersecting the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the camera device.
- A hand-eye calibration system according to claim 6, characterized in that the number of said first views is at least 5.
- A hand-eye calibration system according to claim 6, wherein the number of said second views is at least 2.
- A hand-eye calibration system according to claim 6, characterized in that, when obtaining a result of the hand-eye calibration based on the first views, the second views and the motion information of the camera device, the processing unit is specifically configured to: acquire a homography matrix according to the first views; acquire the scale factors of the spatial points on the reference object on the X axis and the Y axis in the camera device coordinate system and the intrinsic parameter matrix of the camera device according to the second views, and construct a measurement matrix; and obtain the result of the hand-eye calibration in combination with the motion information of the camera device, wherein the motion information of the camera device comprises the movement amount and the rotation amount of the camera device.
- A hand-eye calibration system according to any one of claims 6-9, wherein the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system is 30°-35°.
- A hand-eye calibration system according to claim 10, wherein the processing unit is integrated with the camera device, or integrated with the robot, or independent of the camera device and the robot.
- A hand-eye calibration system according to claim 11, wherein said processing unit is integral with said camera device, said camera device comprising a viewfinder and an image analyzer, said viewfinder being communicatively coupled to said image analyzer and said processing unit, said image analyzer being communicatively coupled to said processing unit; the viewfinder is used for acquiring an image containing the reference object, and the image comprises the first view and the second view; the image analyzer is used for acquiring the coordinate information of the camera equipment of the reference object.
- A computer storage medium having program data stored thereon, which, when executed by a processor, performs the following steps: determining a reference object coordinate system, an origin of a robot rotation axis coordinate system and an origin of a camera device coordinate system; acquiring a plurality of first views of the reference object along a first direction through the camera device, wherein the camera device is mounted on the robot rotation axis, the motion of the camera device is controlled by the robot rotation axis, the camera device moves along the first direction, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction through the camera device, wherein the camera device moves along the second direction, and the second direction is a direction intersecting the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the camera device.
- The computer storage medium of claim 13, wherein the first view is at least 5 in number.
- The computer storage medium of claim 13, wherein the second view is at least 2 in number.
- The computer storage medium of claim 13, wherein obtaining the result of the hand-eye calibration based on the first views, the second views and the motion information of the camera device comprises: acquiring a homography matrix according to the first views; acquiring the scale factors of the spatial points on the reference object on the X axis and the Y axis in the camera device coordinate system and the intrinsic parameter matrix of the camera device according to the second views, and constructing a measurement matrix; and obtaining the result of the hand-eye calibration in combination with the motion information of the camera device, wherein the motion information of the camera device comprises the movement amount and the rotation amount of the camera device.
- The computer storage medium of any of claims 13-16, wherein the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system is 30°-35°.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2018/098119 WO2020024178A1 (en) | 2018-08-01 | 2018-08-01 | Hand-eye calibration method and system, and computer storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111801198A true CN111801198A (en) | 2020-10-20 |
CN111801198B CN111801198B (en) | 2023-07-04 |
Family
ID=69232288
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201880088580.0A Active CN111801198B (en) | 2018-08-01 | 2018-08-01 | Hand-eye calibration method, system and computer storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111801198B (en) |
WO (1) | WO2020024178A1 (en) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113252066B (en) * | 2020-02-13 | 2024-04-09 | 纳恩博(北京)科技有限公司 | Calibration method and device for parameters of odometer equipment, storage medium and electronic device |
WO2021159489A1 (en) * | 2020-02-14 | 2021-08-19 | 西门子(中国)有限公司 | Coordinate system calibrating method, demonstrating board, and projecting member |
CN113676696B (en) * | 2020-05-14 | 2024-08-30 | 杭州萤石软件有限公司 | Target area monitoring method and system |
CN112001967B (en) * | 2020-08-14 | 2024-08-06 | 苏州华兴源创科技股份有限公司 | Method and device for guiding manipulator to carry object by camera |
CN112102419B (en) * | 2020-09-24 | 2024-01-26 | 烟台艾睿光电科技有限公司 | Dual-light imaging equipment calibration method and system and image registration method |
CN112802122B (en) * | 2021-01-21 | 2023-08-29 | 珠海市运泰利自动化设备有限公司 | Robot vision guiding assembly method |
CN113103238B (en) * | 2021-04-26 | 2024-10-25 | 福建(泉州)先进制造技术研究院 | Hand-eye calibration method based on data optimization |
CN113362396B (en) * | 2021-06-21 | 2024-03-26 | 上海仙工智能科技有限公司 | Mobile robot 3D hand-eye calibration method and device |
CN113843792B (en) * | 2021-09-23 | 2024-02-06 | 四川锋准机器人科技有限公司 | Hand-eye calibration method of surgical robot |
CN116061162A (en) * | 2021-10-29 | 2023-05-05 | 北京理工大学 | Method for acquiring and processing hand-eye calibration data for avoiding singularity |
CN114952910A (en) * | 2022-05-26 | 2022-08-30 | 杭州海康机器人技术有限公司 | Positioning method and device of movement mechanism |
CN114872048B (en) * | 2022-05-27 | 2024-01-05 | 河南职业技术学院 | Robot steering engine angle calibration method |
CN115741666A (en) * | 2022-08-31 | 2023-03-07 | 深圳前海瑞集科技有限公司 | Robot hand-eye calibration method, robot and robot operation method |
CN117103286B (en) * | 2023-10-25 | 2024-03-19 | 杭州汇萃智能科技有限公司 | Manipulator eye calibration method and system and readable storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050107920A1 (en) * | 2003-11-18 | 2005-05-19 | Fanuc Ltd | Teaching position correcting device |
CN1843710A (en) * | 2005-04-07 | 2006-10-11 | 精工爱普生株式会社 | Motion control apparatus and method, position instruction apparatus, position instruction method and control programme |
US20110280472A1 (en) * | 2010-05-14 | 2011-11-17 | Wallack Aaron S | System and method for robust calibration between a machine vision system and a robot |
CN106920261A (en) * | 2017-03-02 | 2017-07-04 | 江南大学 | A kind of Robot Hand-eye static demarcating method |
CN106940894A (en) * | 2017-04-12 | 2017-07-11 | 无锡职业技术学院 | A kind of hand-eye system self-calibrating method based on active vision |
CN107995885A (en) * | 2016-11-18 | 2018-05-04 | 深圳配天智能技术研究院有限公司 | A kind of coordinate system scaling method, system and device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105021139B (en) * | 2015-07-16 | 2017-09-12 | 北京理工大学 | A kind of hand and eye calibrating method of robot Vision Measuring System With Structured Light Stripe |
CN105451461B (en) * | 2015-11-25 | 2018-08-14 | 四川长虹电器股份有限公司 | Pcb board localization method based on SCARA robots |
CN107808400B (en) * | 2017-10-24 | 2021-11-26 | 上海交通大学 | Camera calibration system and calibration method thereof |
- 2018-08-01 WO PCT/CN2018/098119 patent/WO2020024178A1/en active Application Filing
- 2018-08-01 CN CN201880088580.0A patent/CN111801198B/en active Active
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113146633A (en) * | 2021-04-23 | 2021-07-23 | 无锡信捷电气股份有限公司 | High-precision hand-eye calibration method based on automatic box pasting system |
CN113146633B (en) * | 2021-04-23 | 2023-12-19 | 无锡信捷电气股份有限公司 | High-precision hand-eye calibration method based on automatic box pasting system |
CN116945195A (en) * | 2023-09-19 | 2023-10-27 | 成都飞机工业(集团)有限责任公司 | Omnidirectional measurement device system arrangement, registration method, electronic device and storage medium |
CN116945195B (en) * | 2023-09-19 | 2024-01-12 | 成都飞机工业(集团)有限责任公司 | Omnidirectional measurement device system arrangement, registration method, electronic device and storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2020024178A1 (en) | 2020-02-06 |
CN111801198B (en) | 2023-07-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111801198B (en) | Hand-eye calibration method, system and computer storage medium | |
WO2022120567A1 (en) | Automatic calibration system based on visual guidance | |
CN106408612B (en) | Machine vision system calibration | |
US20200061837A1 (en) | Method for industrial robot commissioning, industrial robot system and control system using the same | |
JP2020116734A (en) | System and method for automatic hand-eye calibration of vision system for robot motion | |
JP6180086B2 (en) | Information processing apparatus and information processing method | |
JP4825980B2 (en) | Calibration method for fisheye camera. | |
CN110136208A (en) | A kind of the joint automatic calibration method and device of Visual Servoing System | |
CN109658460A (en) | A kind of mechanical arm tail end camera hand and eye calibrating method and system | |
CN110728715A (en) | Camera angle self-adaptive adjusting method of intelligent inspection robot | |
JP4825971B2 (en) | Distance calculation device, distance calculation method, structure analysis device, and structure analysis method. | |
CN110717943A (en) | Method and system for calibrating eyes of on-hand manipulator for two-dimensional plane | |
CN113379849A (en) | Robot autonomous recognition intelligent grabbing method and system based on depth camera | |
CN110465946B (en) | Method for calibrating relation between pixel coordinate and robot coordinate | |
JP2005201824A (en) | Measuring device | |
CN102096923A (en) | Fisheye calibration method and device | |
WO2020063058A1 (en) | Calibration method for multi-degree-of-freedom movable vision system | |
US12128571B2 (en) | 3D computer-vision system with variable spatial resolution | |
Wang et al. | LF-VIO: A visual-inertial-odometry framework for large field-of-view cameras with negative plane | |
CN115588054A (en) | Camera calibration method and device without angle constraint, electronic equipment and storage medium | |
JP2015005093A (en) | Pattern matching device and pattern matching method | |
CN117340879A (en) | Industrial machine ginseng number identification method and system based on graph optimization model | |
CN114952832B (en) | Mechanical arm assembling method and device based on monocular six-degree-of-freedom object attitude estimation | |
Secil et al. | A robotic system for autonomous 3-D surface reconstruction of objects | |
CN112060083B (en) | Binocular stereoscopic vision system for mechanical arm and measuring method thereof |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |