CN111801198B - Hand-eye calibration method, system and computer storage medium - Google Patents

Hand-eye calibration method, system and computer storage medium

Info

Publication number
CN111801198B
CN111801198B (Application No. CN201880088580.0A)
Authority
CN
China
Prior art keywords
coordinate system
image pickup
origin
view
hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880088580.0A
Other languages
Chinese (zh)
Other versions
CN111801198A (en)
Inventor
王少飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen A&E Intelligent Technology Institute Co Ltd
Original Assignee
Shenzhen A&E Intelligent Technology Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen A&E Intelligent Technology Institute Co Ltd filed Critical Shenzhen A&E Intelligent Technology Institute Co Ltd
Publication of CN111801198A publication Critical patent/CN111801198A/en
Application granted granted Critical
Publication of CN111801198B publication Critical patent/CN111801198B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The application provides a hand-eye calibration method, a hand-eye calibration system and a computer storage medium. The hand-eye calibration method comprises the following steps: determining a reference object coordinate system, a robot rotation axis coordinate system origin and an image pickup apparatus coordinate system origin; acquiring a plurality of first views of a reference object along a first direction by an image pickup apparatus, the first direction being parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system; acquiring a plurality of second views of the reference object along a second direction by the image pickup apparatus, the second direction being not parallel to that line; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the image pickup apparatus. By means of the method, the image pickup apparatus is controlled to move around the reference object to obtain a plurality of first views and second views of the reference object; the workload and calibration procedure are reduced, the calibration efficiency is high, and the accuracy of hand-eye calibration is guaranteed.

Description

Hand-eye calibration method, system and computer storage medium
Technical Field
The present disclosure relates to the field of robotics, and in particular to a hand-eye calibration method, a hand-eye calibration system, and a computer storage medium.
Background
Robots have been widely used in a variety of industries with the development of artificial intelligence technology and changing social demands. In industrial applications, a robot can realize various functions through its own power and control capability; the vision system supporting this technology is an important part of the robot, and by using the vision system to acquire images, the robot can control an actuator to execute actions such as machining and installation.
Hand-eye calibration is the first step in any application where a robot cooperates with machine vision: the robot can work cooperatively with the vision system only when both can convert their own coordinate systems into the same world coordinate system.
Existing methods generally have the robot and the vision system separately calibrate the conversion relation of their own coordinate systems relative to the world coordinate system. This approach requires an operator to manually and accurately move the mechanical arm to at least three specific positions for calibration; the accuracy is unstable, and accurately reaching the specific positions makes the calibration operation complex.
Disclosure of Invention
The technical problem solved by this application is to provide a hand-eye calibration method, system and computer storage medium that perform hand-eye calibration from reference object images acquired along different directions by an image pickup apparatus carried on the robot, greatly simplifying the calibration process and improving the calibration accuracy.
In order to solve the above problems, the present application provides a hand-eye calibration method, which includes the following steps: determining a reference object coordinate system, a robot rotation axis coordinate system origin and an image pickup apparatus coordinate system origin; acquiring a plurality of first views of a reference object along a first direction by an image pickup apparatus, wherein the image pickup apparatus is mounted on a robot rotation shaft, the robot rotation shaft controls the image pickup apparatus to move, the image pickup apparatus moves in translation along the first direction, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction by the image pickup apparatus, wherein the image pickup apparatus moves in translation along the second direction, and the second direction is not parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the image pickup apparatus.
To solve the above problems, the present application further provides a hand-eye calibration system, including: an image pickup apparatus having an image pickup apparatus coordinate system; a robot provided with a mechanical forearm on which a rotation shaft is arranged, the image pickup apparatus being connected to the rotation shaft, the rotation shaft controlling the image pickup apparatus to perform rigid motion, the robot having a robot rotation axis coordinate system; a reference object having a reference object coordinate system; and a processing unit for acquiring a plurality of first views of the reference object along a first direction by the image pickup apparatus, wherein the image pickup apparatus moves in translation along the first direction and captures images of the reference object, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction by the image pickup apparatus, wherein the image pickup apparatus moves in translation along the second direction and captures images of the reference object, and the second direction is not parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the image pickup apparatus.
In order to solve the above-mentioned problems, the present application further provides a computer storage medium having program data stored thereon; when the program data is executed by a processor, the following steps are performed: determining a reference object coordinate system, a robot rotation axis coordinate system origin and an image pickup apparatus coordinate system origin; acquiring a plurality of first views of a reference object along a first direction by an image pickup apparatus, wherein the image pickup apparatus is mounted on a robot rotation shaft, the robot rotation shaft controls the image pickup apparatus to move, the image pickup apparatus moves in translation along the first direction, the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system, and at most 3 shooting planes of the first views are in the same plane; acquiring a plurality of second views of the reference object along a second direction by the image pickup apparatus, wherein the image pickup apparatus moves in translation along the second direction, and the second direction is not parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system; and obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the image pickup apparatus.
The beneficial effects of this application are as follows: compared with the prior art, the present application does not require calibration points to be preset, nor does it require the image pickup apparatus to be moved accurately to preset positions. The image pickup apparatus is only controlled to move around the reference object to obtain a plurality of first views and second views of the reference object, and the conversion relations among the image pickup apparatus coordinate system, the reference object coordinate system and the rotation axis coordinate system are obtained from the first views and second views. The workload and calibration procedure are thereby reduced, the calibration efficiency is high, and the accuracy of hand-eye calibration is guaranteed.
Drawings
FIG. 1 is a flow chart of an embodiment of a hand-eye calibration method of the present application;
FIG. 2 is a schematic diagram of coordinate systems in an embodiment of the hand-eye calibration method shown in FIG. 1;
FIG. 3 is a schematic view of translation of the image capturing apparatus in the hand-eye calibration method shown in FIG. 1;
FIG. 4 is a schematic view of translation and rotation of the image capturing apparatus in the hand-eye calibration method shown in FIG. 1;
FIG. 5 is a schematic diagram of an embodiment of a hand-eye calibration system of the present application;
FIG. 6 is a schematic diagram of an embodiment of a computer storage medium of the present application.
Detailed Description
Referring to fig. 1, fig. 1 is a flow chart of an embodiment of a robot hand-eye calibration method according to the present application. The robot hand-eye calibration method of this embodiment includes the following steps:
101: a reference object coordinate system, a robot rotation axis coordinate system origin, and an image pickup apparatus coordinate system origin are determined.
Fig. 2 is a schematic diagram of the respective coordinate systems in an embodiment of the hand-eye calibration method shown in fig. 1. A robot is provided with a mechanical forearm on which a rotation shaft 203 is arranged, the image capturing apparatus 202 is mounted on the rotation shaft 203, and the rotation shaft 203 is used to control rigid movement, such as translation and rotation, of the image capturing apparatus 202. The reference object is a plane 201 containing a calibration coordinate system, i.e. the reference object coordinate system is a calibration coordinate system, which may be a world coordinate system. The reference object coordinate system includes an X_1 axis and a Y_1 axis lying in the plane 201, and a Z_1 axis that is preferably perpendicular to the plane 201. The optical center of the image pickup apparatus 202 may be taken as the origin of the image pickup apparatus coordinate system, and one of the joint points of the rotation shaft 203 may be defined as the origin of the robot rotation axis coordinate system. For example, the intersection of the X_2, Y_2 and Z_2 axes is the origin of the image pickup apparatus coordinate system, and the intersection of the X_3, Y_3 and Z_3 axes is the origin of the robot rotation axis coordinate system.
In this application, the rotation shaft 203 of the robot drives the image pickup apparatus 202 to move, and the image pickup apparatus 202 acquires images of the reference object during the movement. The projection relationship between the coordinates U of an image point in the image pickup apparatus coordinate system and the coordinates P of the corresponding point in the reference object coordinate system is:
U≈K[R t]·P
wherein R is the rotation relation matrix between the image pickup apparatus coordinate system and the reference object coordinate system; t is a motion vector (mainly a translation vector in the present application); and K is the internal parameter matrix of the image pickup apparatus:

K = [ f_x   s   x_0 ]
    [  0   f_y  y_0 ]
    [  0    0    1  ]

where f_x and f_y are the scale factors of the reference object on the X_2 and Y_2 axes of the image pickup apparatus coordinate system, (x_0, y_0) is the intersection point of the shooting plane of the image pickup apparatus with its optical axis, and s is a preset distortion factor.
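As a reading aid, the following is a minimal numpy sketch of the projection relation U ≈ K[R t]·P defined above; the intrinsic values, the rigid motion and the reference point are illustrative assumptions and are not taken from the patent.

import numpy as np

# Internal parameter matrix K built from f_x, f_y, s, x_0, y_0 (illustrative values).
f_x, f_y, s, x_0, y_0 = 800.0, 800.0, 0.0, 320.0, 240.0
K = np.array([[f_x,   s, x_0],
              [0.0, f_y, y_0],
              [0.0, 0.0, 1.0]])

# Rigid motion of the image pickup apparatus relative to the reference object:
# rotation matrix R and translation vector t (identity rotation and a small shift here).
R = np.eye(3)
t = np.array([[0.02], [0.01], [0.50]])

# Homogeneous coordinates P of a point lying on the reference plane (Z_1 = 0).
P = np.array([[0.10], [0.05], [0.00], [1.00]])

# U ≈ K [R | t] P; normalising by the last component gives pixel coordinates.
U = K @ np.hstack([R, t]) @ P
u, v = (U[:2] / U[2]).ravel()
print(f"projected pixel: ({u:.1f}, {v:.1f})")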
102: acquiring a plurality of first views of the reference object along a first direction by the image pickup apparatus; wherein the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system, the translation along the first direction is a rigid motion, and at most 3 shooting planes of the first views are in the same plane.
Specifically, as shown in fig. 3, fig. 3 is a schematic diagram of the translation of the image capturing apparatus in the hand-eye calibration method shown in fig. 1. The image capturing apparatus sequentially acquires 5 first views of the reference object 306 at 5 photographing points (301, 302, 303, 304 and 305) along the first direction, where the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system. The photographing points do not need to be preset as in the prior art; it is only required that at most 3 of the 5 first views have their shooting planes in the same plane, so that the coordinates of the photographing points can be expressed through the acquired first views and at most 3 photographing points share repeated x-axis or y-axis coordinates. The first views can then indicate the spatial position of the reference object more comprehensively, which improves the calibration accuracy. In this embodiment, to ensure that the coordinates of the reference object can be accurately expressed through the first views, the number of first views captured by the image capturing apparatus is at least 5, and to improve the accuracy of the result, at most 3 of the first views acquired by the image capturing apparatus during the translation may have their shooting planes in the same plane.
103: acquiring a plurality of second views of the reference object along a second direction by the image capturing apparatus; wherein the second direction is a direction intersecting the line connecting the origin of the robot rotation axis coordinate system to the origin of the image capturing apparatus coordinate system, that is, a direction not parallel to that line.
In the present embodiment, in order to accurately acquire the hand-eye relationship between the robot rotation axis coordinate system and the image capturing apparatus coordinate system, at least two second views of the reference object need to be acquired by the image capturing apparatus. The following description will take, as an example, two second views of a reference object acquired by an image pickup apparatus.
Specifically, as shown in fig. 4, fig. 4 is a schematic diagram of the translation and rotation of the image capturing apparatus in the hand-eye calibration method shown in fig. 1. The rotation shaft drives the image capturing apparatus to translate and rotate, and the image capturing apparatus sequentially acquires 2 second views of the reference object 403 at 2 photographing points (401, 402) along the second direction. To reduce calculation errors, the rotation angle can be controlled between 30° and 35°, i.e. the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system is 30° to 35°, where the second direction is a direction intersecting that line.
In the coordinate transformation of three-dimensional space, the transformation relationship between the robot rotation axis coordinate system and the image pickup apparatus coordinate system must first be determined. In this embodiment, to simplify the calculation, one of the joint points of the robot rotation shaft is taken as the origin of the robot rotation axis coordinate system. Let the hand-eye relationship between the robot rotation axis coordinate system and the image pickup apparatus coordinate system be G, let A be the transformation relationship between the robot rotation axis coordinate systems at two adjacent motions, and let B be the transformation relationship between the image pickup apparatus coordinate systems at two adjacent motions; then:
A·G=G·B
After the above basic requirements on the photographing positions for the first views and second views are met, more photographing positions can be added to increase the robustness and accuracy of the hand-eye calibration result.
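As a hedged illustration only: a hand-eye relationship G defined by an equation of this form is usually recovered numerically from several pose pairs. The sketch below uses OpenCV's generic cv2.calibrateHandEye solver as a stand-in for the computation described later in this document, with poses synthesized from an assumed ground truth; none of the numeric values come from the patent. In this document the camera-side information is instead derived from the first and second views, as described in step 104 below.

import cv2
import numpy as np

def rt_to_T(R, t):
    # Pack a rotation matrix and translation vector into a 4x4 homogeneous transform.
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, np.ravel(t)
    return T

# Assumed ground truth (illustrative): hand-eye transform G (camera pose in the
# rotation-axis frame) and reference-object pose in the robot base frame.
G = rt_to_T(cv2.Rodrigues(np.array([0.05, -0.02, 0.10]))[0], [0.03, 0.00, 0.08])
T_base_target = rt_to_T(np.eye(3), [0.40, 0.00, 0.00])

R_g2b, t_g2b, R_t2c, t_t2c = [], [], [], []
for rvec, tvec in [([0.0, 0.0, 0.0],  [0.00, 0.00, 0.30]),
                   ([0.3, 0.0, 0.1],  [0.05, 0.00, 0.35]),
                   ([0.0, 0.4, -0.2], [0.00, 0.06, 0.32]),
                   ([-0.2, 0.1, 0.3], [0.04, 0.04, 0.28])]:
    T_base_gripper = rt_to_T(cv2.Rodrigues(np.array(rvec))[0], tvec)
    # What the camera would see: T_cam_target = G^-1 · T_base_gripper^-1 · T_base_target
    T_cam_target = np.linalg.inv(G) @ np.linalg.inv(T_base_gripper) @ T_base_target
    R_g2b.append(T_base_gripper[:3, :3]); t_g2b.append(T_base_gripper[:3, 3].reshape(3, 1))
    R_t2c.append(T_cam_target[:3, :3]);   t_t2c.append(T_cam_target[:3, 3].reshape(3, 1))

R_est, t_est = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c)
print("recovered hand-eye translation:", t_est.ravel())  # should match G[:3, 3]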
104: obtaining a hand-eye calibration result based on the first views, the second views and the motion information of the image pickup apparatus.
In this embodiment, a homography matrix is acquired from the first views; the scale factors of a spatial point on the reference object on the X_2 and Y_2 axes of the image pickup apparatus coordinate system and the internal parameter matrix of the image pickup apparatus are acquired from the second views, and a measurement matrix is constructed; the hand-eye calibration result is then obtained by combining the movement amount and rotation amount of the image pickup apparatus.
Specifically, a first view is selected, a spatial point S on the reference object in that first view is selected at random, and the coordinates U of the spatial point in the image pickup apparatus coordinate system are expressed in terms of any 3 non-collinear spatial points in the first view [equation image not reproduced], where a, b and c are the coordinates of the 3 non-collinear spatial points in the image pickup apparatus coordinate system, and the expression [equation image not reproduced] represents the position of the spatial point S on the two-dimensional image plane of the image capturing apparatus. Referring again to fig. 2, the coordinates P of the spatial point S in the reference object coordinate system are correspondingly [equation image not reproduced], where x_1 and y_1 are the coordinates of the spatial point S along the X_1 and Y_1 axes of the reference object coordinate system.
The projection relation can then be obtained as [equation image not reproduced], and the homography matrix H can be obtained by transforming the above formula: [equation image not reproduced].
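For orientation only, the sketch below shows one common way a planar homography of this kind can be estimated from point correspondences between the reference plane and one first view, using OpenCV; the point coordinates are synthetic stand-ins and the routine is not claimed to be the patent's own derivation.

import cv2
import numpy as np

# Planar coordinates of feature points on the reference object (Z_1 = 0) and their
# detected pixel positions in one first view. All values are synthetic examples.
ref_pts = np.array([[0.00, 0.00], [0.10, 0.00], [0.10, 0.10],
                    [0.00, 0.10], [0.05, 0.05]], dtype=np.float32)
img_pts = np.array([[320.0, 240.0], [480.0, 242.0], [478.0, 398.0],
                    [322.0, 396.0], [400.0, 319.0]], dtype=np.float32)

# H maps homogeneous reference-plane coordinates to homogeneous pixel coordinates,
# playing the role of the homography matrix H mentioned above.
H, _ = cv2.findHomography(ref_pts, img_pts, method=0)
print(np.round(H, 3))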
Assuming that the image captured by the image pickup apparatus when acquiring a second view has a size of m×n, the coordinates (x_0, y_0) can be expressed as x_0 = m/2 and y_0 = n/2. Let the distance from the optical center of the image pickup apparatus to the reference object be D_1, the length of a coordinate line segment on the reference object be D_2, and the corresponding pixel coordinate length in the image pickup apparatus be D_3. The optical geometric relationship of the image pickup apparatus is then [equation image not reproduced], where f is the focal length of the image capturing apparatus and L is the pixel length in the image. From this, the scale factors of the spatial point S on the X_2 and Y_2 axes of the image pickup apparatus coordinate system are obtained as f_x = S_x·D_1·D_2/D_3 and f_y = S_y·D_1·D_2/D_3, where S_x and S_y are given by [equation image not reproduced].
The internal parameter matrices K_1 and K_2 of the image pickup apparatus are obtained by the above method [equation image not reproduced], where i denotes the i-th photographing point of the image pickup apparatus, j denotes the j-th position of the robot rotation axis motion, the quantity [equation image not reproduced] is a scale factor, H is the homography matrix, and R_i·r_1j, R_i·r_2j, R_i·r_3j + t is a motion formula of the image pickup apparatus. Singular value decomposition (SVD) is performed on this formula to obtain [equation image not reproduced]. The scale factor is then calculated to obtain the measurement matrix M. Finally, the calculated scale factors, the measurement matrix, and the movement amount and rotation amount of the robot are combined, and the conversion relations among the robot rotation axis coordinate system, the image pickup apparatus coordinate system and the reference object coordinate system (serving as the world coordinate system) are obtained according to a conventional coordinate conversion formula, thereby completing the hand-eye calibration.
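The "conventional coordinate conversion formula" referred to above amounts to chaining homogeneous transforms; the following short sketch illustrates that final composition, with placeholder values standing in for the calibrated quantities (the numbers are assumptions, not results from the patent).

import numpy as np

def rt_to_T(R, t):
    # Pack a rotation matrix and a translation vector into a 4x4 homogeneous transform.
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, np.ravel(t)
    return T

# Assumed outputs of the preceding steps (illustrative values only):
# T_axis_cam -- hand-eye result G: pose of the image pickup apparatus in the
#               robot rotation axis coordinate system
# T_cam_ref  -- pose of the reference object (world) in the image pickup apparatus
#               coordinate system, recovered from the first and second views
T_axis_cam = rt_to_T(np.eye(3), [0.03, 0.00, 0.08])
T_cam_ref = rt_to_T(np.eye(3), [0.00, 0.00, 0.60])

# Chaining the two expresses the reference object coordinate system in the robot
# rotation axis coordinate system, i.e. the conversion relation sought by the
# hand-eye calibration.
T_axis_ref = T_axis_cam @ T_cam_ref
print(T_axis_ref)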
Compared with the condition that a calibration point is required to be specially set and the camera equipment is required to be accurately moved to the position of the calibration point when hand-eye calibration is carried out in the prior art, the method and the device do not need to preset the calibration point and control the camera equipment to be accurately moved to the preset position, only control the camera equipment to move around a reference object to obtain a plurality of first views and second views of the reference object, and obtain conversion relations among a coordinate system of the camera equipment, a coordinate system of the reference object and a coordinate system of a rotating shaft according to the first views and the second views, so that workload and calibration procedures are reduced, and calibration efficiency and accuracy are high.
In addition, noise, movement precision and visual calibration precision during robot operation generally cause calibration errors; in particular, noise can cause relatively large errors in the internal parameters K of the image pickup apparatus, the translation relation vector t and the like during hand-eye positioning, but these errors have a smaller influence on the internal parameters K and the translation relation vector t obtained from the image information.
In the above-described embodiment, the image pickup apparatus may include: viewfinder, image analyzer and image processor. The viewfinder is used for acquiring an image of the reference object. The image analyzer acquires image capturing apparatus coordinate information of a reference object. The image processor may be configured to perform conversion calculations between the robot rotation axis coordinate system, the image capturing apparatus coordinate system, and the reference object coordinate system based on the image of the reference object.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of the hand-eye calibration system of the present application. In this embodiment, the hand-eye calibration system includes: an imaging apparatus 501, a reference object 504, a processing unit 505, and a robot (not labeled in the figure).
The robot is provided with a mechanical forearm 502, a rotation shaft 503 is arranged on the mechanical forearm 502, and the imaging apparatus 501 is mounted on the rotation shaft 503. The image pickup apparatus 501 is electrically connected to the robot, and the rotation shaft 503 is used to control the image pickup apparatus 501 to perform rigid motions such as translation and rotation. Alternatively, the mechanical forearm 502 cooperates with the rotation shaft 503 to control the rigid motion of the imaging apparatus 501. The image capturing apparatus 501 has an image capturing apparatus coordinate system, and the optical center of the image capturing apparatus 501 may be taken as its origin. The robot has a robot rotation axis coordinate system, and one joint point of the rotation shaft 503 may be defined as its origin. The reference object 504 is a plane containing a calibration coordinate system, i.e. the reference object coordinate system is a calibration coordinate system, which may be a world coordinate system. The robot cooperates with the image pickup apparatus 501 to photograph the reference object 504 and thereby achieve hand-eye calibration, specifically as follows:
in this application, the rotation shaft 503 of the robot drives the imaging apparatus 501 to move, and the imaging apparatus 501 acquires images of the reference object 504 during the movement. The projection relationship between the coordinates U of an image point in the image pickup apparatus coordinate system and the coordinates P of the corresponding point in the reference object coordinate system is:
U≈K[R t]·P
wherein R is a rotation relation matrix between a coordinate system of the image pickup device and a coordinate system of a reference object; t is a motion vector (mainly a translation vector in the present application); k is an internal parameter matrix of the image pickup apparatus,
K = [ f_x   s   x_0 ]
    [  0   f_y  y_0 ]
    [  0    0    1  ]

where f_x and f_y are the scale factors of the reference object on the X_2 and Y_2 axes of the image pickup apparatus coordinate system, (x_0, y_0) is the intersection point of the shooting plane of the image pickup apparatus with its optical axis, and s is a preset distortion factor.
The processing unit 505 is communicatively connected to the image capturing apparatus 501, and in this embodiment the processing unit 505 is integrated with the image capturing apparatus 501. In other embodiments, the processing unit 505 may be integrated with the robot, or may be independent of both the imaging apparatus 501 and the robot; as long as the processing unit 505 and the imaging apparatus 501 can exchange data, its position is not particularly limited. When the processing unit 505 is integrated with the image capturing apparatus 501, it may specifically be integrated with the processor of the image capturing apparatus 501; when the processing unit 505 is integrated with the robot, it may specifically be integrated with the processor of the robot. The processing unit can serve as the control unit of the robot rotation shaft and the image pickup apparatus: it is communicatively connected to the robot and the image pickup apparatus, controls the rotation shaft to drive the image pickup apparatus to move, and controls the image pickup apparatus to shoot. Alternatively, the processing unit may not serve as the control unit of the robot rotation shaft and the image pickup apparatus; in that case the robot's own control unit controls the rotation shaft to move according to a preset program, thereby driving the image pickup apparatus to move, the image pickup apparatus's own control unit controls it to shoot according to a preset program, and the processing unit only acquires the first views and second views captured by the image pickup apparatus and the motion information of the image pickup apparatus to complete the hand-eye calibration calculation.
The robot may drive the image capturing apparatus 501 to translate relative to the reference object coordinate system of the reference object 504, and the processing unit 505 acquires a plurality of first views of the reference object 504 along a first direction through the image capturing apparatus 501. The image capturing apparatus 501 translates along the first direction and captures images of the reference object; the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system, the translation is a rigid motion, and at most 3 shooting planes of the first views are in the same plane.
Specifically, the processing unit 505 sequentially acquires 5 first views of the reference object 504 through 5 photographing points of the imaging apparatus 501 along the first direction, where the first direction is parallel to the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system. The photographing points do not need to be preset as in the prior art; it is only required that at most 3 of the 5 first views have their shooting planes in the same plane, so that the coordinates of the photographing points can be expressed through the acquired first views and at most 3 photographing points share repeated x-axis or y-axis coordinates. The first views can then indicate the spatial position of the reference object more comprehensively, which improves the calibration accuracy. In this embodiment, to ensure that the coordinates of the reference object can be accurately expressed through the first views, the number of first views acquired by the processing unit 505 through the image capturing apparatus 501 is at least 5, and to improve the accuracy of the results, at most 3 of the first views acquired by the image capturing apparatus 501 during the translation may have their shooting planes in the same plane.
To accurately acquire the hand-eye relationship between the robot rotation axis coordinate system and the imaging apparatus coordinate system, the processing unit 505 also needs to acquire a plurality of second views of the reference object 504 along a second direction through the imaging apparatus 501. The image capturing apparatus 501 moves in translation along the second direction and captures images of the reference object; the second direction is a direction intersecting the line connecting the origin of the robot rotation axis coordinate system to the origin of the image capturing apparatus coordinate system, that is, a direction not parallel to that line. In this embodiment, at least two second views of the reference object need to be acquired by the image capturing apparatus 501 to accurately determine the hand-eye relationship. The following description takes two second views of the reference object 504 acquired by the image capturing apparatus 501 as an example.
The robot translates and rotates the imaging apparatus 501 through the rotation shaft 503, and the imaging apparatus 501 sequentially acquires 2 second views of the reference object 504 at 2 photographing points along the second direction. To reduce calculation errors, the rotation angle is controlled between 30° and 35°, i.e. the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system is 30° to 35°, where the second direction is a direction intersecting that line.
In the coordinate transformation of three-dimensional space, the transformation relationship between the robot rotation axis coordinate system and the image capturing apparatus coordinate system must first be determined. In this embodiment, to simplify the calculation, one of the joint points of the robot rotation shaft 503 is taken as the origin of the robot rotation axis coordinate system. Let the hand-eye relationship between the robot rotation axis coordinate system and the image capturing apparatus coordinate system be G, let A be the transformation relationship between the robot rotation axis coordinate systems at two adjacent motions, and let B be the transformation relationship between the image capturing apparatus coordinate systems at two adjacent motions; then:
A·G=G·B
After the above basic requirements on the photographing positions for the first views and second views are met, more photographing positions can be added to increase the robustness and accuracy of the hand-eye calibration result.
The processing unit 505 can obtain the result of the hand-eye calibration based on the first view, the second view, and the motion information of the image capturing apparatus 501.
In a specific embodiment, the processing unit 505 acquires the homography matrix from the first views; acquires, from the second views, the scale factors of spatial points on the reference object 504 on the X_2 and Y_2 axes of the image pickup apparatus coordinate system and the internal parameter matrix of the image pickup apparatus, and constructs the measurement matrix; and combines the movement amount and rotation amount of the imaging apparatus 501 to obtain the hand-eye calibration result. For the more detailed calculation, refer to the description of the above method embodiment, which is not repeated here.
In the present embodiment, the processing unit 505 is integrated with the image capturing apparatus 501, and the image capturing apparatus 501 may include a viewfinder and an image analyzer. The viewfinder is communicatively connected to the image analyzer, and the image analyzer is communicatively connected to the processing unit. The viewfinder is used to acquire images containing the reference object, including the first views and the second views. The image analyzer is used to acquire the image capturing apparatus coordinate information of the reference object. The image processor is communicatively connected to the viewfinder and the image analyzer, and acquires the information obtained by them to complete the conversion calculations among the robot rotation axis coordinate system, the image pickup apparatus coordinate system and the reference object coordinate system.
Compared with the condition that a calibration point is required to be specially set and the imaging device is required to be moved to the accurate position of the calibration point in the hand-eye calibration in the prior art, the method and the device do not need to preset the calibration point and control the imaging device to be precisely moved to the preset position, only the imaging device is controlled to move around the reference object to obtain a plurality of first views and second views of the reference object, and the conversion relation among the coordinate system of the imaging device, the coordinate system of the reference object and the coordinate system of the rotating shaft is obtained according to the first views and the second views, so that the workload and the calibration procedure are reduced, the calibration efficiency is high, and the accuracy of hand-eye calibration is guaranteed.
In addition, noise, movement precision and visual calibration precision during robot operation generally cause calibration errors; in particular, noise can cause relatively large errors in the internal parameter matrix K of the image pickup apparatus, the translation relation vector t and the like during hand-eye positioning, but these errors have a smaller influence on the internal parameter matrix K and the translation relation vector t obtained from the image information.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an embodiment of a computer storage medium of the present application. In the present embodiment, the computer storage medium 60 includes program data 61, and the hand-eye calibration method described in the above embodiment is implemented when the program data 61 is executed.
With the hand-eye calibration method, the hand-eye calibration system and the computer storage medium of the present application, the reference object only needs to be within the field of view of the image pickup apparatus at each photographing position; the image pickup apparatus does not need to be accurately positioned and moved to designated calibration points, which greatly simplifies the calibration process while keeping the accuracy stable.
The foregoing description is only of embodiments of the present application, and is not intended to limit the scope of the patent application, and all equivalent structures or equivalent processes using the descriptions and the contents of the present application or other related technical fields are included in the scope of the patent application.

Claims (17)

1. A hand-eye calibration method, comprising:
determining a reference object coordinate system, a robot rotation axis coordinate system origin and an imaging device coordinate system origin;
acquiring a plurality of first views of a reference object along a first direction by an image pickup device; wherein the image pickup device is arranged on a robot rotating shaft, the robot rotating shaft controls the image pickup device to move, the image pickup device moves along the first direction, the first direction is parallel to the direction of the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup device coordinate system, and at most 3 shooting planes of the first views are in the same plane;
acquiring a plurality of second views of the reference object along a second direction by the image pickup device; wherein the image capturing apparatus moves in the second direction, the second direction being a direction intersecting a line connecting the origin of the robot rotation axis coordinate system to the origin of the image capturing apparatus coordinate system;
and obtaining a result of hand eye calibration based on the first view, the second view and the motion information of the image pickup device.
2. The hand-eye calibration method according to claim 1, wherein the number of first views is at least 5.
3. The hand-eye calibration method according to claim 1, wherein the number of second views is at least 2.
4. The hand-eye calibration method according to claim 1, wherein the obtaining a result of hand-eye calibration based on the first view, the second view, and the movement information of the image capturing apparatus includes:
acquiring a homography matrix according to the first view;
acquiring scale factors of space points on the reference object on an X axis and a Y axis in a coordinate system of the image pickup device and an internal parameter matrix of the image pickup device according to the second view, and constructing a measurement matrix;
and combining the motion information of the image pickup device to obtain a hand eye calibration result, wherein the motion information of the image pickup device comprises the moving amount and the rotating amount of the image pickup device.
5. The hand-eye calibration method according to any one of claims 1-4, wherein the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system is 30 ° -35 °.
6. A hand-eye calibration system, comprising:
an image pickup apparatus having an image pickup apparatus coordinate system;
the robot is provided with a mechanical forearm, a rotating shaft is arranged on the mechanical forearm, the camera equipment is connected with the rotating shaft, the rotating shaft controls the camera equipment to perform rigid movement, and the robot is provided with a robot rotating shaft coordinate system;
a reference object having a reference object coordinate system;
a processing unit for acquiring a plurality of first views of the reference object along a first direction by an image pickup apparatus; the camera moves along the first direction and shoots a reference object image, the first direction is parallel to the direction of connecting the origin of the coordinate system of the robot rotation axis to the origin of the coordinate system of the camera, and at most 3 shooting planes of the first view are in the same plane;
acquiring a plurality of second views of the reference object along a second direction by the image pickup device; wherein the image pickup apparatus moves in the second direction, which is a direction intersecting a line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup apparatus coordinate system, and captures a reference object image;
and obtaining a result of hand eye calibration based on the first view, the second view and the motion information of the image pickup device.
7. The hand-eye calibration system according to claim 6, wherein the number of first views is at least 5.
8. The hand-eye calibration system according to claim 6, wherein the number of second views is at least 2.
9. The hand-eye calibration system according to claim 6, wherein,
the processing unit is specifically configured to, when obtaining a result of hand eye calibration based on the first view, the second view, and the motion information of the image capturing apparatus:
acquiring a homography matrix according to the first view;
acquiring scale factors of space points on the reference object on an X axis and a Y axis in a coordinate system of the image pickup device and an internal parameter matrix of the image pickup device according to the second view, and constructing a measurement matrix;
and combining the motion information of the image pickup device to obtain a hand eye calibration result, wherein the motion information of the image pickup device comprises the moving amount and the rotating amount of the image pickup device.
10. The hand-eye calibration system according to any one of claims 6-9, wherein the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system is 30°-35°.
11. The hand-eye calibration system according to claim 10, wherein the processing unit is integral with the camera device, or integral with the robot, or independent of the camera device and the robot.
12. The hand-eye calibration system according to claim 11, wherein the processing unit is integral with the image capturing device, the image capturing device comprising a viewfinder and an image analyzer, the viewfinder being communicatively coupled to the image analyzer and the processing unit, the image analyzer being communicatively coupled to the processing unit; the viewfinder is used for acquiring an image containing the reference object, and the image comprises the first view and the second view; the image analyzer is used for acquiring the coordinate information of the image capturing device of the reference object.
13. A computer storage medium having program data stored thereon, which when executed by a processor, performs the steps of:
determining a reference object coordinate system, a robot rotation axis coordinate system origin and an imaging device coordinate system origin;
acquiring a plurality of first views of a reference object along a first direction by an image pickup device; wherein the image pickup device is arranged on the robot rotating shaft, the robot rotating shaft controls the image pickup device to move, the image pickup device moves along the first direction, the first direction is parallel to the direction of the line connecting the origin of the robot rotation axis coordinate system to the origin of the image pickup device coordinate system, and at most 3 shooting planes of the first views are in the same plane;
acquiring a plurality of second views of the reference object along a second direction by the image pickup device; wherein the image capturing apparatus moves in the second direction, the second direction being a direction intersecting a line connecting the origin of the robot rotation axis coordinate system to the origin of the image capturing apparatus coordinate system;
and obtaining a result of hand eye calibration based on the first view, the second view and the motion information of the image pickup device.
14. The computer storage medium of claim 13, wherein the number of first views is at least 5.
15. The computer storage medium of claim 13, wherein the number of second views is at least 2.
16. The computer storage medium of claim 13, wherein the obtaining a result of the hand-eye calibration based on the first view, the second view, and the motion information of the image capturing apparatus comprises:
acquiring a homography matrix according to the first view;
acquiring scale factors of space points on the reference object on an X axis and a Y axis in a coordinate system of the image pickup device and an internal parameter matrix of the image pickup device according to the second view, and constructing a measurement matrix;
and combining the motion information of the image pickup device to obtain a hand eye calibration result, wherein the motion information of the image pickup device comprises the moving amount and the rotating amount of the image pickup device.
17. The computer storage medium of any of claims 13-16, wherein the angle between the second direction and the line connecting the origin of the robot rotation axis coordinate system to the origin of the camera device coordinate system is 30°-35°.
CN201880088580.0A 2018-08-01 2018-08-01 Hand-eye calibration method, system and computer storage medium Active CN111801198B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/098119 WO2020024178A1 (en) 2018-08-01 2018-08-01 Hand-eye calibration method and system, and computer storage medium

Publications (2)

Publication Number Publication Date
CN111801198A CN111801198A (en) 2020-10-20
CN111801198B true CN111801198B (en) 2023-07-04

Family

ID=69232288

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880088580.0A Active CN111801198B (en) 2018-08-01 2018-08-01 Hand-eye calibration method, system and computer storage medium

Country Status (2)

Country Link
CN (1) CN111801198B (en)
WO (1) WO2020024178A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113252066B (en) * 2020-02-13 2024-04-09 纳恩博(北京)科技有限公司 Calibration method and device for parameters of odometer equipment, storage medium and electronic device
CN115136198A (en) * 2020-02-14 2022-09-30 西门子(中国)有限公司 Coordinate system calibration method, teaching board and protruding part
CN113676696B (en) * 2020-05-14 2024-08-30 杭州萤石软件有限公司 Target area monitoring method and system
CN112001967B (en) * 2020-08-14 2024-08-06 苏州华兴源创科技股份有限公司 Method and device for guiding manipulator to carry object by camera
CN112102419B (en) * 2020-09-24 2024-01-26 烟台艾睿光电科技有限公司 Dual-light imaging equipment calibration method and system and image registration method
CN112802122B (en) * 2021-01-21 2023-08-29 珠海市运泰利自动化设备有限公司 Robot vision guiding assembly method
CN113146633B (en) * 2021-04-23 2023-12-19 无锡信捷电气股份有限公司 High-precision hand-eye calibration method based on automatic box pasting system
CN113103238A (en) * 2021-04-26 2021-07-13 福建(泉州)哈工大工程技术研究院 Hand-eye calibration method based on data optimization
CN113362396B (en) * 2021-06-21 2024-03-26 上海仙工智能科技有限公司 Mobile robot 3D hand-eye calibration method and device
CN113843792B (en) * 2021-09-23 2024-02-06 四川锋准机器人科技有限公司 Hand-eye calibration method of surgical robot
CN116061162A (en) * 2021-10-29 2023-05-05 北京理工大学 Method for acquiring and processing hand-eye calibration data for avoiding singularity
CN114872048B (en) * 2022-05-27 2024-01-05 河南职业技术学院 Robot steering engine angle calibration method
CN115741666A (en) * 2022-08-31 2023-03-07 深圳前海瑞集科技有限公司 Robot hand-eye calibration method, robot and robot operation method
CN116945195B (en) * 2023-09-19 2024-01-12 成都飞机工业(集团)有限责任公司 Omnidirectional measurement device system arrangement, registration method, electronic device and storage medium
CN117103286B (en) * 2023-10-25 2024-03-19 杭州汇萃智能科技有限公司 Manipulator eye calibration method and system and readable storage medium

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3733364B2 (en) * 2003-11-18 2006-01-11 ファナック株式会社 Teaching position correction method
JP2006289531A (en) * 2005-04-07 2006-10-26 Seiko Epson Corp Movement control device for teaching robot position, teaching device of robot position, movement control method for teaching robot position, teaching method for robot position, and movement control program for teaching robot position
US9393694B2 (en) * 2010-05-14 2016-07-19 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
CN105021139B (en) * 2015-07-16 2017-09-12 北京理工大学 A kind of hand and eye calibrating method of robot Vision Measuring System With Structured Light Stripe
CN105451461B (en) * 2015-11-25 2018-08-14 四川长虹电器股份有限公司 Pcb board localization method based on SCARA robots
WO2018090323A1 (en) * 2016-11-18 2018-05-24 深圳配天智能技术研究院有限公司 Method, system, and device for calibrating coordinate system
CN106920261B (en) * 2017-03-02 2019-09-03 江南大学 A kind of Robot Hand-eye static demarcating method
CN106940894A (en) * 2017-04-12 2017-07-11 无锡职业技术学院 A kind of hand-eye system self-calibrating method based on active vision
CN107808400B (en) * 2017-10-24 2021-11-26 上海交通大学 Camera calibration system and calibration method thereof

Also Published As

Publication number Publication date
WO2020024178A1 (en) 2020-02-06
CN111801198A (en) 2020-10-20

Similar Documents

Publication Publication Date Title
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
JP6966582B2 (en) Systems and methods for automatic hand-eye calibration of vision systems for robot motion
CN110421562B (en) Mechanical arm calibration system and calibration method based on four-eye stereoscopic vision
CN106426172B (en) A kind of scaling method and system of industrial robot tool coordinates system
CN109658460A (en) A kind of mechanical arm tail end camera hand and eye calibrating method and system
CN111775146A (en) Visual alignment method under industrial mechanical arm multi-station operation
JP6180086B2 (en) Information processing apparatus and information processing method
CN113379849B (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN110465946B (en) Method for calibrating relation between pixel coordinate and robot coordinate
CN106003020A (en) Robot, robot control device, and robotic system
CN112907682B (en) Hand-eye calibration method and device for five-axis motion platform and related equipment
CN109360243B (en) Calibration method of multi-degree-of-freedom movable vision system
CN102842117A (en) Method for correcting kinematic errors in microscopic vision system
Gratal et al. Visual servoing on unknown objects
US20220395981A1 (en) System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system
CN116026252A (en) Point cloud measurement method and system
CN115446847A (en) System and method for improving 3D eye-hand coordination accuracy of a robotic system
Das et al. Calibration of a dynamic camera cluster for multi-camera visual SLAM
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
CN115619877A (en) Method for calibrating position relation between monocular laser sensor and two-axis machine tool system
CN215701709U (en) Configurable hand-eye calibration device
Pajor et al. Stereovision system for motion tracking and position error compensation of loading crane
CN111823222B (en) Monocular camera multi-view visual guidance device and method
CN114589682A (en) Iteration method for automatic calibration of robot hand and eye
CN112584041A (en) Image identification dynamic deviation rectifying method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant