CN114643577A - Universal robot vision automatic calibration device and method - Google Patents

Universal robot vision automatic calibration device and method

Info

Publication number
CN114643577A
CN114643577A (application CN202011501308.6A)
Authority
CN
China
Prior art keywords
robot
calibration
controller
camera
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011501308.6A
Other languages
Chinese (zh)
Other versions
CN114643577B (en)
Inventor
秦勇
高一佳
张宏宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Siasun Robot and Automation Co Ltd
Original Assignee
Shenyang Siasun Robot and Automation Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Siasun Robot and Automation Co Ltd filed Critical Shenyang Siasun Robot and Automation Co Ltd
Priority to CN202011501308.6A
Publication of CN114643577A
Application granted
Publication of CN114643577B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1602 - Programme controls characterised by the control system, structure, architecture
    • B25J9/161 - Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1628 - Programme controls characterised by the control loop
    • B25J9/1653 - Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G06T1/0014 - Image feed-back for automatic industrial control, e.g. robot with camera
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30164 - Workpiece; Machine component
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention discloses a universal robot vision automatic calibration device and method. The device comprises a robot assembly (115), a camera (130), a range finder (135), a calibration board (140), a robot controller (150), a camera controller (160), and a calibration and position calculation controller (170). The method integrates robot motion control, parameter setting, camera calibration conversion, grab-coordinate calculation and related functions into an external, independent controller that does not depend on third-party software or hardware, exchanging only the necessary result data with the robot and the camera through a standard communication interface. The hand-eye calibration process is thereby made fully automatic without dependence on third-party software and hardware, which greatly improves the universality of the hand-eye calibration device and method, increases calibration speed and efficiency, and reduces the technical requirements on field operators.

Description

Universal robot vision automatic calibration device and method
Technical Field
The invention belongs to the technical field of robot vision calibration, and particularly relates to a universal robot vision automatic calibration device and method.
Background
Industrial robots typically include a controller and a robotic arm. The controller corresponds to the "brain" of the robot, and the robot arm corresponds to the "hand" of the robot. In order to enhance the perception of the robot to the surrounding environment, the robot is usually equipped with a pair of "eyes", i.e. an industrial camera. This poses a new problem and there is a need for an apparatus and method for converting the pixel coordinates of a camera to the spatial coordinates of a robot. The process of establishing coordinate transformation between the robot and the camera is commonly called "hand-eye calibration".
Many studies have been made in this area at home and abroad. For example, Cognex Corporation (COGNEX) proposed an automatic hand-eye calibration method with minimal human intervention in a patent entitled "Automatic hand-eye calibration system and method for robot motion vision system" (patent No. CN111482959A) filed in 2020. A patent entitled "Hand-eye camera calibration method and device" (patent No. CN109671122A), filed by Sichuan Changhong Electric in 2018, proposed a method for quickly and conveniently calibrating a camera using a low-precision calibration plate. A patent entitled "A camera calibration method, apparatus and device" (patent No. CN111445535A), filed in 2020 by Zhejiang Kolan Information Technology Co., Ltd., provided an efficient calibration method that directly optimizes the real-valued parameters of an error function using pre-selected feature points, without manual adjustment of the camera.
These calibration methods have obvious disadvantages. First, the automatic calibration equipment and methods lack universality: when the brand or model of the robot or camera involved in calibration changes, they cannot be used normally, or only after substantial modification. Second, fully automatic calibration is not achieved: much intervention and assistance work of some professional difficulty must still be performed manually during the calibration process.
Disclosure of Invention
In view of the above technical problems, it is an object of the present invention to provide a universal automatic hand-eye calibration apparatus and method that are independent of the brand and model of the robot and camera being calibrated and require no manual intervention during the calibration process.
The technical scheme adopted by the invention is as follows: a universal robot vision automatic calibration device comprises:
the robot comprises a robot component, a camera, a range finder, a calibration plate, a robot controller, a camera controller and a calibration and position calculation controller;
the robot assembly is fixed in the workspace, the camera and the range finder are fixed at the end of the robot assembly, and the calibration plate is arranged within the visual range of the camera;
the robot assembly is electrically connected with the robot controller, the camera is electrically connected with the camera controller, the range finder is electrically connected with the camera controller, the robot controller is electrically connected with the calibration and position calculation controller, and the camera controller is electrically connected with the calibration and position calculation controller;
the robot assembly comprises a robot mounting seat, a mechanical arm and a gripper; the mechanical arm is fixed on the robot mounting seat, and the gripper is fixed at the end of the mechanical arm.
The robot controller comprises a processor, an input/output port and a memory, wherein a robot side communication interface is installed in the memory.
The camera controller comprises a processor, an input/output port and a memory, wherein a camera side communication interface is installed in the memory.
The calibration and position calculation controller comprises a processor, an input/output port and a memory, wherein a robot control module, a calibration conversion module and a workpiece position calculation module are installed in the memory.
The surface of the calibration plate is provided with characteristic points.
A universal robot vision automatic calibration method comprises the following steps:
step 1: the calibration and position calculation controller establishes a robot user coordinate system;
step 2: the robot searches for and determines the position of the calibration plate;
step 3: the robot moves the camera to the center of the calibration plate;
step 4: the robot translates and rotates several times, taking a picture after each move to measure the pixel coordinates of the feature points on the calibration plate;
step 5: the calibration and position calculation controller calculates the homography matrix from the camera to the robot;
step 6: the mechanical arm is controlled to move so that the laser range finder reaches a point on the working surface;
step 7: the photographing height is adjusted according to the laser ranging value;
step 8: the calibration and position calculation controller calculates the grabbing position of the gripper;
step 9: the gripper is moved to grab the workpiece.
The step 1 specifically comprises:
step 11: the robot controller outputs instructions to move the robot arm so that the range finder is located in turn at three points Puo, Pux and Puy on the calibration plane, wherein Puo is used to determine the origin of the user coordinate system, Pux is used to determine the X axis of the user coordinate system, and Puy is used to determine the Y axis of the user coordinate system;
step 12: the distance measuring device measures the Z coordinates of the three points;
step 13: the robot controller sends X, Y and Z coordinate values of the three points to the calibration and position calculation controller;
step 14: the calibration and position calculation controller calculates the translation and rotation values [Tx, Ty, Tz, Rx, Ry, Rz] of the robot user coordinate system relative to the base coordinate system according to the coordinate values of the three points, using:

    Pw = R·Pu + t

where Pu is the coordinate value of a point in the robot user coordinate system recorded by the robot controller in steps 11 and 12, Pw is the base coordinate value corresponding to that point, and R is the rotation matrix formed by the rotation values Rx, Ry and Rz; the base coordinate system is rotated by α, β and γ about the x, y and z axes to obtain the user coordinate system, with sα = sin α, cα = cos α, and so on:

    R = | cβ·cγ   sα·sβ·cγ - cα·sγ   cα·sβ·cγ + sα·sγ |
        | cβ·sγ   sα·sβ·sγ + cα·cγ   cα·sβ·sγ - sα·cγ |
        | -sβ     sα·cβ              cα·cβ            |

t is the column vector consisting of the translation values Tx, Ty, Tz:

    t = [Tx, Ty, Tz]^T
step 15: the calibration and position calculation controller transmits the calculated translation and rotation values [Tx, Ty, Tz, Rx, Ry, Rz] to the robot controller, thereby establishing the user coordinate system of the robot.
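For illustration only, the computation of steps 11 to 15 can be sketched in Python/NumPy as follows; this is not code from the patent, and all function and variable names are hypothetical. The sketch builds the user frame from the three measured points and extracts [Tx, Ty, Tz, Rx, Ry, Rz] under the fixed-axis x-y-z rotation convention used above:

    import numpy as np

    def user_frame_from_three_points(puo, pux, puy):
        # puo: origin of the user frame; pux: a point on its X axis;
        # puy: a point in its X-Y plane (all in base coordinates).
        puo, pux, puy = (np.asarray(p, float) for p in (puo, pux, puy))
        x = pux - puo
        x /= np.linalg.norm(x)              # unit X axis
        z = np.cross(x, puy - puo)
        z /= np.linalg.norm(z)              # unit Z axis, normal to the calibration plane
        y = np.cross(z, x)                  # unit Y axis completes the right-handed frame
        R = np.column_stack((x, y, z))      # columns are the user axes in base coordinates
        t = puo                             # translation [Tx, Ty, Tz]
        # Euler angles for R = Rz(gamma)·Ry(beta)·Rx(alpha)
        ry = np.arcsin(-R[2, 0])
        rx = np.arctan2(R[2, 1], R[2, 2])
        rz = np.arctan2(R[1, 0], R[0, 0])
        return R, t, np.array([rx, ry, rz])

    # Pw = R @ Pu + t maps user coordinates to base coordinates:
    R, t, rxyz = user_frame_from_three_points([100, 50, 20], [200, 50, 20], [100, 150, 25])
    pw = R @ np.array([10.0, 0.0, 0.0]) + t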
The step 5 specifically comprises the following steps:
step 51: obtaining pixel coordinates of N points according to the step 4;
step 52: calculating the homography matrix H according to the following conversion formula from the camera pixel coordinate system to the robot coordinate system:

    Zc·[u, v, 1]^T = H·[X, Y, Z, 1]^T

in the formula: Zc is the coordinate along the optical axis in the camera coordinate system; 1/dx is the number of pixels per millimeter in the x direction; 1/dy is the number of pixels per millimeter in the y direction; f is the focal length; (X, Y, Z) are the three-dimensional coordinates in the robot coordinate system; (u, v) are the two-dimensional pixel coordinates of the corresponding three-dimensional point on the image; (u0, v0) is the pixel position of the image origin; and H is the homography matrix from the camera pixel coordinate system to the robot coordinate system;

let

    H = | h1  h2   h3   h4  |
        | h5  h6   h7   h8  |
        | h9  h10  h11  h12 |

then the above equation can be simplified to two equations:

    u = (h1·X + h2·Y + h3·Z + h4) / (h9·X + h10·Y + h11·Z + h12)
    v = (h5·X + h6·Y + h7·Z + h8) / (h9·X + h10·Y + h11·Z + h12)

usually the robot user coordinate system is attached to the calibration plane, i.e. Z = 0, so h3 = h7 = h11 = 0 and h12 = 1; each pair of points yields 2 equations for the remaining 8 unknowns, so at least 4 pairs of points are required to solve the homography matrix H.
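As a minimal sketch of how the 8 remaining unknowns can be recovered from N >= 4 point pairs (illustrative Python/NumPy, not the patent's implementation; a plain linear least-squares solve is assumed here):

    import numpy as np

    def solve_planar_homography(pixels, points):
        # pixels: (N, 2) measured pixel coordinates (u, v)
        # points: (N, 2) corresponding robot-plane coordinates (X, Y), Z = 0
        # Solves Zc·[u, v, 1]^T = H·[X, Y, 1]^T with h12 normalized to 1.
        A, b = [], []
        for (u, v), (X, Y) in zip(pixels, points):
            # u·(h9·X + h10·Y + 1) = h1·X + h2·Y + h4
            A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y]); b.append(u)
            # v·(h9·X + h10·Y + 1) = h5·X + h6·Y + h8
            A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y]); b.append(v)
        h1, h2, h4, h5, h6, h8, h9, h10 = np.linalg.lstsq(
            np.asarray(A, float), np.asarray(b, float), rcond=None)[0]
        return np.array([[h1, h2, h4], [h5, h6, h8], [h9, h10, 1.0]])

With exactly 4 point pairs the system is determined; with more pairs the least-squares solution averages out measurement noise.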
The step 8 specifically includes:
step 81: the camera takes pictures to measure the pixel coordinate values Pc1 and Pc2 of two feature points of the workpiece;
step 82: the calibration and position calculation controller converts the pixel coordinate values into user coordinate values Pw1 and Pw2 of the robot according to the homography matrix H;
step 83: the calibration and position calculation controller calculates the workpiece offset and rotation angle [Tx, Ty, Rz] by comparing Pw1 and Pw2 with the corresponding reference coordinates Pw1' and Pw2' recorded at the taught position:

    Rz = atan2(Pw2y - Pw1y, Pw2x - Pw1x) - atan2(Pw2'y - Pw1'y, Pw2'x - Pw1'x)
    [Tx, Ty]^T = Pw1 - R(Rz)·Pw1'

where R(Rz) is the 2x2 rotation matrix of angle Rz;
step 84: the calibration and position calculation controller calculates the grabbing pose Pick_act of the workpiece, the formula being as follows:

    Pick_act = R(Rz)·Pick_cad + [Tx, Ty]^T

wherein Pick_cad is the known home position of the grab point.
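A compact sketch of steps 81 to 84 (illustrative Python/NumPy; the reference coordinates pw1_ref and pw2_ref, like all other names here, are assumptions made for the example under the reconstruction above):

    import numpy as np

    def grab_pose(pw1, pw2, pw1_ref, pw2_ref, pick_cad):
        # pw1, pw2:         measured feature points in robot user coordinates
        # pw1_ref, pw2_ref: the same feature points at the taught position
        # pick_cad:         known home position (x, y) of the grab point
        pw1, pw2, pw1_ref, pw2_ref, pick_cad = (
            np.asarray(p, float) for p in (pw1, pw2, pw1_ref, pw2_ref, pick_cad))
        # Rz: change in direction of the line through the two feature points
        rz = (np.arctan2(pw2[1] - pw1[1], pw2[0] - pw1[0])
              - np.arctan2(pw2_ref[1] - pw1_ref[1], pw2_ref[0] - pw1_ref[0]))
        c, s = np.cos(rz), np.sin(rz)
        Rz = np.array([[c, -s], [s, c]])
        tx, ty = pw1 - Rz @ pw1_ref          # translation left after the rotation
        pick_act = Rz @ pick_cad + np.array([tx, ty])
        return tx, ty, rz, pick_act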
Compared with the prior art, the invention has the following beneficial effects:
the robot eye calibration process is completely automatic independent of third-party software and hardware by integrating functions of robot motion control, parameter setting, camera calibration conversion, new coordinate capturing calculation and the like into an external independent mode independent of third-party software and hardware and only by a mode of transmitting necessary result data through a standard communication interface, the robot and the camera, the universality of the eye calibration device and the eye calibration method is greatly improved, the calibration speed and efficiency are improved, and the technical requirements on field operators are reduced.
Drawings
FIG. 1 is a schematic structural diagram of a universal robot vision automatic calibration device of the present invention;
FIG. 2 is a flow chart of the calibration process of the automatic vision calibration method for a universal robot of the present invention;
FIG. 3 is a flowchart of the operation process of the universal robot vision automatic calibration method of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention more comprehensible, embodiments of the present invention are described in detail below with reference to the accompanying drawings. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. The invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments fall within the spirit and scope of the invention as defined by the appended claims.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Referring to fig. 1, one embodiment of the present invention is shown. Fig. 1 is a schematic structural diagram of the universal robot vision automatic calibration device of the present invention. The device includes a robot assembly 115, a camera 130, a range finder 135, a calibration board 140, a robot controller 150, a camera controller 160, and a calibration and position calculation controller 170.
The robot assembly 115 in fig. 1 includes a robot mount 110, a robot arm 120, and a gripper 125. The robot arm 120 is fixed to the robot mount 110, and the robot mount 110 is fixed in the space 100. The robot arm 120 and the robot controller 150 are electrically connected through an input/output port 152. The gripper 125 is fixed to the end of the robot arm 120. In this embodiment, a robot is used as the positioning element that carries the camera and the gripper, performing the translational and rotational movements required for camera calibration and grasping the workpiece. Besides the robot in fig. 1, various other arrangements may be used, including a servo cylinder platform, a single-axis robot platform, and the like.
The camera 130 in fig. 1 includes an image sensor, a lens, and a light source. The camera 130 is fixed to the end of the robot arm 120 and is electrically connected to the camera controller 160 through an input/output port 162. In this example, an area-array scanning camera is used to measure the pixel coordinates of the feature point 141 on the calibration plate 140 and of the feature points on the workpiece. In other embodiments, the camera may comprise a two-dimensional CCD sensor, a two-dimensional CMOS sensor, or any other type of area scan sensor for generating an image.
The distance measuring device 135 in fig. 1 may be a laser distance measuring device, or may be other types of measuring sensors capable of measuring distance, including a mechanical distance measuring device, an ultrasonic distance measuring device, and the like.
The calibration board 140 in fig. 1 may be a calibration board with feature points 141. Other types of calibration patterns are also possible; exemplary patterns include, but are not limited to, a grid of dots, a grid of lines, crosses or cells, a triangular checkerboard, and the like.
The robot controller 150 in fig. 1 includes a processor 151, an input/output port 152, and a memory 153. The memory has installed in it a robot control system and a robot-side communication interface 154. The robot-side communication interface 154 handles the exchange of data between the calibration and position calculation controller 170 and the robot controller 150: the calibration and position calculation controller sends instructions to the robot controller for modifying the robot user coordinate system and moving the robot arm, and the robot controller sends the current coordinate position of the robot back to the calibration and position calculation controller. The robot-side communication interface 154 may be embedded in the robot controller 150 in the form of programmable read-only memory (PROM), non-volatile random access memory (NVRAM), or the like. In other embodiments, the robot-side communication interface 154 may also be hardwired without software.
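The data exchange over such a communication interface can be pictured with a minimal sketch. The patent does not specify a wire protocol, so the TCP transport, the JSON message layout, and every name in the following Python fragment are assumptions made purely for illustration:

    import json
    import socket

    def send_user_frame(host, port, frame):
        # Send a user-coordinate-system update [Tx, Ty, Tz, Rx, Ry, Rz] and
        # read back the robot's current position (newline-delimited JSON assumed).
        with socket.create_connection((host, port), timeout=5.0) as sock:
            msg = json.dumps({"cmd": "set_user_frame", "frame": frame}) + "\n"
            sock.sendall(msg.encode())
            return json.loads(sock.makefile().readline())

    # Hypothetical usage against a robot-side interface listening on port 9000:
    # reply = send_user_frame("192.168.0.10", 9000, [100.0, 50.0, 20.0, 0.0, 0.0, 0.1])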
The camera controller 160 in fig. 1 includes a processor 161, an input/output port 162, and a memory 163. The memory has installed in it a camera control system and a camera-side communication interface 164. The camera-side communication interface 164 handles the exchange of data between the calibration and position calculation controller 170 and the camera controller 160. The camera controller 160 sends the pixel coordinate values of the feature point 141 to the calibration and position calculation controller 170, which converts them into robot target coordinates, transmits those coordinates to the robot through the robot-side communication interface 154, and controls the robot to move to the position. The camera-side communication interface 164 may be embedded in the camera controller 160 in the form of programmable read-only memory (PROM), non-volatile random access memory (NVRAM), or the like. In other embodiments, the camera-side communication interface 164 may also be hardwired without software.
The calibration and position calculation controller 170 in fig. 1 includes a processor 171, an input/output port 172, and a memory 173. The memory has installed in it a robot control module 174, a calibration conversion module 175, and a workpiece position calculation module 176. The robot control module 174 sends motion commands to the robot through the input/output port 172, driving the robot arm 120 to perform the translation and rotation actions required by camera hand-eye calibration or by a grabbing operation. The calibration conversion module 175 calculates the homography matrix from the pixel coordinates sent by the camera controller 160, or converts received pixel coordinates into coordinate values in the robot user coordinate system through the homography. The workpiece position calculation module 176 calculates the translation and rotation of the workpiece from the current feature-point coordinates computed by the calibration conversion module 175, and from these calculates the grasping position of the gripper 125.
When performing the camera calibration operation, the calibration and position calculation controller 170 first moves the range finder 135, via the robot-side communication interface 154 of the robot controller 150, to measure the height values of any three points on the calibration plane, and calculates a robot user coordinate system parallel to the calibration plane. The calibration and position calculation controller 170 then moves the camera 130 in the user coordinate system to search for the position of the calibration plate and center the calibration plate in the camera field of view. Finally, the calibration and position calculation controller 170 controls the robot to translate and rotate several times around the central point, taking a picture after each move to measure the pixel coordinates of the feature point, and calculates the homography matrix from the camera coordinate system to the robot coordinate system from the several groups of photographed pixel coordinates and the corresponding robot coordinate point pairs, thereby completing the calibration operation.
When executing a grabbing operation, the robot control module 174 in the calibration and position calculation controller 170 sends a command to the robot controller 150 to move the range finder 135 to any point on the working surface and measure its height, then adjusts the photographing position of the robot according to the measured height value and photographs the workpiece to measure the pixel coordinates of its feature points. The calibration conversion module 175 in the calibration and position calculation controller 170 then converts the pixel coordinate values into user coordinate values. Finally, the workpiece position calculation module 176 in the calibration and position calculation controller 170 calculates the grabbing coordinate values of the workpiece and transmits them to the robot controller 150, so that the robot moves the gripper 125 to perform the corresponding operation on the workpiece.
Referring to fig. 2, another embodiment of the present invention is shown. Fig. 2 is a flow chart of a calibration process of the universal robot vision automatic calibration method of the present invention, which includes the following steps:
step 205: starting calibration;
step 210: the robot moves the laser range finder 135 to any three points on the calibration plane. Under the robot base coordinate system, the laser range finder 135 is moved to three non-collinear points Puo, Pux and Puy on the calibration plane, where Puo determines the origin of the user coordinate system, Pux determines the X axis of the user coordinate system, and Puy determines the X-Y plane of the user coordinate system. The robot controller 150 records the X and Y coordinate values of these three points under the base coordinate system, and the laser range finder 135 measures their Z coordinate values. Finally, the robot controller 150 sends the X-Y-Z spatial coordinate values of the three points to the calibration and position calculation controller 170;
step 215: the calibration and position calculation controller 170 calculates the user coordinate system of the robot. According to the principle that three non-collinear points determine a plane, the translation and rotation values [Tx, Ty, Tz, Rx, Ry, Rz] of the robot user coordinate system relative to the base coordinate system can be calculated from:

    Pw = R·Pu + t

where Pu is the coordinate value of a point in the user coordinate system corresponding to the point recorded in step 210, Pw is the base coordinate value corresponding to that point, and R is the rotation matrix formed by the rotation values Rx, Ry and Rz; the base coordinate system is rotated by α, β and γ about the x, y and z axes to obtain the user coordinate system, with sα = sin α, cα = cos α, and so on:

    R = | cβ·cγ   sα·sβ·cγ - cα·sγ   cα·sβ·cγ + sα·sγ |
        | cβ·sγ   sα·sβ·sγ + cα·cγ   cα·sβ·sγ - sα·cγ |
        | -sβ     sα·cβ              cα·cβ            |

t is the column vector consisting of the translation values:

    t = [Tx, Ty, Tz]^T
step 220: the robot searches for and determines the position of the calibration plate 140 in the robot user coordinate system. The robot arm 120 moves the camera 130 to several positions within the working range to take pictures, checking each picture for the presence of the calibration plate. Once the calibration plate is found, the search stops;
step 225: the robot moves the robot arm 120 to bring the camera to the center of the calibration plate. Centering the calibration plate in the camera field of view prepares for the several translational and rotational movements to be performed next; if the calibration plate were too close to the edge of the field of view, the feature points might leave the field of view during the translation and rotation movements;
step 230: the robot translates and rotates several times, taking a picture after each move to measure the pixel coordinates of the feature point. The robot performs translational and rotational movements about the central point with offsets [-dx, -dy, 0], [0, -dy, 0], [dx, -dy, 0], [-dx, 0, 0], [0, 0, 0], [dx, 0, 0], [-dx, dy, 0], [0, dy, 0], [dx, dy, 0], [0, 0, -2rz], [0, 0, -rz], [0, 0, 0], [0, 0, rz], [0, 0, 2rz]. In this example, 9 translations plus 5 rotations are implemented, but in other examples fewer or greater numbers of translations and rotations may be used. After each translation or rotation movement, the camera takes a picture to measure the pixel coordinate value of a given feature point;
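The 14 calibration offsets of step 230 can be generated mechanically. A small illustrative Python sketch (names and step sizes are assumptions, not values from the patent):

    def calibration_offsets(dx, dy, rz):
        # 9 translational offsets [ox, oy, 0] on a 3x3 grid around the center,
        # followed by 5 rotational offsets [0, 0, k·rz]; the center pose
        # appears once in each group, giving 14 photographing poses in total.
        translations = [[sx * dx, sy * dy, 0.0]
                        for sy in (-1, 0, 1) for sx in (-1, 0, 1)]
        rotations = [[0.0, 0.0, k * rz] for k in (-2, -1, 0, 1, 2)]
        return translations + rotations

    poses = calibration_offsets(10.0, 10.0, 5.0)   # e.g. 10 mm and 5 degree steps
    assert len(poses) == 14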
step 235: the calibration and position calculation controller 170 calculates the homography matrix from the camera to the robot. Step 230 yields 14 sets of pixel coordinates. For each set of pixel coordinates, the corresponding theoretical coordinate value is assigned according to the desired origin position and coordinate-axis orientation of the camera coordinate system. The homography matrix H is calculated according to the following conversion formula from the camera pixel coordinate system to the robot coordinate system:

    Zc·[u, v, 1]^T = H·[X, Y, Z, 1]^T

in the formula: Zc is the coordinate along the optical axis in the camera coordinate system; 1/dx is the number of pixels per millimeter in the x direction; 1/dy is the number of pixels per millimeter in the y direction; f is the focal length; (X, Y, Z) are the three-dimensional coordinates in the robot coordinate system; (u, v) are the two-dimensional pixel coordinates of the corresponding three-dimensional point on the image; (u0, v0) is the pixel position of the image origin; and H is the homography matrix from the camera pixel coordinate system to the robot coordinate system.

Let

    H = | h1  h2   h3   h4  |
        | h5  h6   h7   h8  |
        | h9  h10  h11  h12 |

Then the above equation can be simplified to two equations:

    u = (h1·X + h2·Y + h3·Z + h4) / (h9·X + h10·Y + h11·Z + h12)
    v = (h5·X + h6·Y + h7·Z + h8) / (h9·X + h10·Y + h11·Z + h12)

Usually the robot user coordinate system is attached to the calibration plane, i.e. Z = 0, so h3 = h7 = h11 = 0 and h12 = 1. The equation set then has 8 unknowns, and each pair of points (a pixel and its corresponding three-dimensional point) yields 2 equations, which shows that at least 4 pairs of points are needed to solve the homography matrix H.
In order to eliminate the influence of normally distributed random errors in the data acquisition process and to improve the accuracy and stability of the solution, 14 point pairs are acquired in this embodiment, a nonlinear equation set containing 28 equations is constructed, and the homography matrix H is solved with a Levenberg-Marquardt nonlinear optimization iterative algorithm based on the least-squares principle.
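A sketch of this refinement step using SciPy's Levenberg-Marquardt implementation (illustrative only; the residual layout and all names are assumptions, and a linear solution such as the one sketched earlier would serve as the starting point H0):

    import numpy as np
    from scipy.optimize import least_squares

    def refine_homography(H0, pixels, points):
        # Refine the 3x3 planar homography (h12 fixed to 1) over the N point
        # pairs by minimizing pixel reprojection error with Levenberg-Marquardt.
        pixels = np.asarray(pixels, float)             # (N, 2) measured (u, v)
        XY1 = np.column_stack([np.asarray(points, float),
                               np.ones(len(points))])  # (N, 3) rows [X, Y, 1]

        def residuals(h):
            H = np.append(h, 1.0).reshape(3, 3)
            proj = XY1 @ H.T                           # (N, 3) un-normalized projections
            uv = proj[:, :2] / proj[:, 2:3]            # divide by Zc
            return (uv - pixels).ravel()               # 2N residuals (28 for 14 pairs)

        fit = least_squares(residuals, H0.ravel()[:8], method="lm")
        return np.append(fit.x, 1.0).reshape(3, 3)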
step 240: calibration is completed.
Referring to fig. 3, another embodiment of the present invention is shown. Fig. 3 is a flowchart of an operation process of the universal robot vision automatic calibration method of the present invention, which includes the following steps:
step 305: starting to operate;
step 310: the robot moves the laser range finder 135 to any point on the working surface. The robot first needs to check whether the distance from the feature point to be measured to the camera 130 is the same as the distance between the calibration plate 140 and the camera 130 during calibration; if it differs, the photographing height must be adjusted so as to maintain the same measurement precision;
step 315: the robot adjusts the photographing height according to the laser ranging value. After measuring the height, the laser range finder 135 feeds the value back to the calibration and position calculation controller 170, which calculates the deviation from the distance at calibration time and controls the robot to rise or fall by the corresponding distance;
step 320: the camera 130 photographs and measures the pixel coordinate values of the feature points of the workpiece, recording the pixel coordinates [u; v] of the feature points obtained by the photographing measurement;
step 325: the calibration and position calculation controller 170 converts the pixel coordinate values into robot coordinate values. The robot coordinate value [X; Y; Z] of the point is calculated from the homography matrix H obtained in the calibration process; with Z = 0 on the working plane, and writing H' for the 3x3 matrix obtained from H by deleting its Z column, the calculation formula is:

    [X, Y, 1]^T = s·H'^(-1)·[u, v, 1]^T

where the scale factor s normalizes the third component to 1;
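Step 325 amounts to inverting the 3x3 planar homography; a minimal illustrative Python/NumPy sketch (names assumed):

    import numpy as np

    def pixel_to_robot(H, u, v):
        # Invert Zc·[u, v, 1]^T = H·[X, Y, 1]^T for a point on the Z = 0 plane.
        p = np.linalg.solve(H, np.array([u, v, 1.0]))
        return p[0] / p[2], p[1] / p[2]    # (X, Y) after normalizing the scale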
step 330: the calibration and position calculation controller 170 calculates the workpiece offset and rotation angle. Based on the robot coordinate values Pw1 and Pw2 of the two feature points calculated in step 325 and their reference coordinates Pw1' and Pw2' recorded at the taught position, the workpiece offset and rotation angle [Tx, Ty, Rz] are calculated as follows:

    Rz = atan2(Pw2y - Pw1y, Pw2x - Pw1x) - atan2(Pw2'y - Pw1'y, Pw2'x - Pw1'x)
    [Tx, Ty]^T = Pw1 - R(Rz)·Pw1'

where R(Rz) is the 2x2 rotation matrix of angle Rz;
step 335: the calibration and position calculation controller 170 calculates the gripper grasping pose. From the workpiece offset and rotation angle [Tx, Ty, Rz] calculated in step 330 and the theoretical gripping coordinates Pick_cad of the gripper 125 measured from the CAD model, the grasping coordinate value Pick_act of the gripper 125 is calculated as follows:

    Pick_act = R(Rz)·Pick_cad + [Tx, Ty]^T
step 340: the operation is completed.
The embodiments described above will assist those skilled in the art in further understanding the invention, but do not limit the invention in any way. It should be noted that various changes and modifications, obvious to those skilled in the art, can be made without departing from the spirit of the invention; all such changes and modifications fall within the scope of the present invention.

Claims (10)

1. A universal robot vision automatic calibration device, characterized in that it comprises:
a robot assembly (115), a camera (130), a range finder (135), a calibration board (140), a robot controller (150), a camera controller (160), and a calibration and position calculation controller (170);
the robot assembly (115) is fixed in the space (100), the camera (130) and the range finder (135) are fixed at the tail end of the robot assembly (115), and the calibration board (140) is placed in the visual range of the camera (130);
the robot component (115) is electrically connected with the robot controller (150), the camera (130) is electrically connected with the camera controller (160), the range finder (135) is electrically connected with the camera controller (160), the robot controller (150) is electrically connected with the calibration and position calculation controller (170), and the camera controller (160) is electrically connected with the calibration and position calculation controller (170).
2. The universal robot vision automatic calibration device according to claim 1, wherein the robot assembly (115) comprises a robot mounting seat (110), a mechanical arm (120) and a gripper (125); the mechanical arm (120) is fixed on the robot mounting seat (110), and the gripper (125) is fixed at the end of the mechanical arm (120).
3. The universal robot vision automatic calibration device according to claim 1, wherein the robot controller (150) comprises a processor (151), an input/output port (152) and a memory (153), and a robot-side communication interface (154) is installed in the memory (153).
4. The universal robot vision automatic calibration device according to claim 1, wherein the camera controller (160) comprises a processor (161), an input/output port (162) and a memory (163), and a camera-side communication interface (164) is installed in the memory (163).
5. The universal robot vision automatic calibration device according to claim 1, wherein the calibration and position calculation controller (170) comprises a processor (171), an input/output port (172) and a memory (173), and a robot control module (174), a calibration conversion module (175) and a workpiece position calculation module (176) are installed in the memory (173).
6. The universal robot vision automatic calibration device according to claim 1, wherein the surface of the calibration plate (140) is provided with feature points (141).
7. A universal robot vision automatic calibration method, characterized by comprising the following steps:
step 1: the calibration and position calculation controller (170) establishes a robot user coordinate system;
step 2: the robot searches for and determines the position of the calibration plate;
step 3: the robot moves the camera to the center of the calibration plate;
step 4: the robot translates and rotates several times, taking a picture after each move to measure the pixel coordinates of the feature points on the calibration plate (140);
step 5: the calibration and position calculation controller (170) calculates the homography matrix from the camera to the robot;
step 6: the mechanical arm (120) is controlled to move so that the laser range finder (135) reaches a point on the working surface;
step 7: the photographing height is adjusted according to the laser ranging value;
step 8: the calibration and position calculation controller (170) calculates the grabbing position of the gripper (125);
step 9: the gripper (125) is moved to grab the workpiece.
8. The method for automatic calibration of the vision of a universal robot according to claim 7, wherein the step 1 specifically comprises:
step 11: the robot controller (150) outputs instructions to move the robot arm (120) so that the range finder (135) is located in turn at three points Puo, Pux and Puy on the calibration plane, wherein Puo is used to determine the origin of the user coordinate system, Pux is used to determine the X axis of the user coordinate system, and Puy is used to determine the Y axis of the user coordinate system;
step 12: a range finder (135) measures the Z coordinates of the three points;
step 13: the robot controller (150) sends X, Y and Z coordinate values of the three points to a calibration and position calculation controller (170);
step 14: the calibration and position calculation controller (170) calculates the translation and rotation values [Tx, Ty, Tz, Rx, Ry, Rz] of the robot user coordinate system relative to the base coordinate system according to the coordinate values of the three points, using:

    Pw = R·Pu + t

where Pu is the coordinate value of a point in the robot user coordinate system recorded by the robot controller (150) in steps 11 and 12, Pw is the base coordinate value corresponding to that point, and R is the rotation matrix formed by the rotation values Rx, Ry and Rz; the base coordinate system is rotated by α, β and γ about the x, y and z axes to obtain the user coordinate system, with sα = sin α, cα = cos α, and so on:

    R = | cβ·cγ   sα·sβ·cγ - cα·sγ   cα·sβ·cγ + sα·sγ |
        | cβ·sγ   sα·sβ·sγ + cα·cγ   cα·sβ·sγ - sα·cγ |
        | -sβ     sα·cβ              cα·cβ            |

t is the column vector consisting of the translation values:

    t = [Tx, Ty, Tz]^T
step 15: the calibration and position calculation controller (170) transmits the calculated translation and rotation values [ Tx, Ty, Tz, Rx, Ry, Rz ] to the robot controller (150) to establish a user coordinate system of the robot.
9. The method for automatic calibration of the vision of a universal robot as claimed in claim 7, wherein the step 5 specifically comprises:
step 51: obtaining pixel coordinates of N points according to the step 4;
step 52: calculating the homography matrix H according to the following conversion formula from the camera pixel coordinate system to the robot coordinate system:

    Zc·[u, v, 1]^T = H·[X, Y, Z, 1]^T

in the formula: Zc is the coordinate along the optical axis in the camera coordinate system; 1/dx is the number of pixels per millimeter in the x direction; 1/dy is the number of pixels per millimeter in the y direction; f is the focal length; (X, Y, Z) are the three-dimensional coordinates in the robot coordinate system; (u, v) are the two-dimensional pixel coordinates of the corresponding three-dimensional point on the image; (u0, v0) is the pixel position of the image origin; and H is the homography matrix from the camera pixel coordinate system to the robot coordinate system;

let

    H = | h1  h2   h3   h4  |
        | h5  h6   h7   h8  |
        | h9  h10  h11  h12 |

then the above equation can be simplified to two equations:

    u = (h1·X + h2·Y + h3·Z + h4) / (h9·X + h10·Y + h11·Z + h12)
    v = (h5·X + h6·Y + h7·Z + h8) / (h9·X + h10·Y + h11·Z + h12)

usually the robot user coordinate system is attached to the calibration plane, i.e. Z = 0, so h3 = h7 = h11 = 0 and h12 = 1, and at least 4 pairs of points are required to solve the homography matrix H.
10. The method for automatic calibration of the vision of a universal robot as claimed in claim 7, wherein the step 8 specifically comprises:
step 81: the camera takes pictures to measure the pixel coordinate values Pc1 and Pc2 of two feature points of the workpiece;
step 82: the calibration and position calculation controller (170) converts the pixel coordinate values into user coordinate values Pw1 and Pw2 of the robot according to the homography matrix H;
step 83: the calibration and position calculation controller (170) calculates the workpiece offset and rotation angle [Tx, Ty, Rz] by comparing Pw1 and Pw2 with the corresponding reference coordinates Pw1' and Pw2' recorded at the taught position:

    Rz = atan2(Pw2y - Pw1y, Pw2x - Pw1x) - atan2(Pw2'y - Pw1'y, Pw2'x - Pw1'x)
    [Tx, Ty]^T = Pw1 - R(Rz)·Pw1'

where R(Rz) is the 2x2 rotation matrix of angle Rz;
step 84: the calibration and position calculation controller calculates the grasping pose Pick_act of the workpiece, the formula being as follows:

    Pick_act = R(Rz)·Pick_cad + [Tx, Ty]^T

wherein Pick_cad is the known original position of the grasping point.
CN202011501308.6A 2020-12-18 2020-12-18 Universal robot vision automatic calibration device and method Active CN114643577B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011501308.6A CN114643577B (en) 2020-12-18 2020-12-18 Universal robot vision automatic calibration device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011501308.6A CN114643577B (en) 2020-12-18 2020-12-18 Universal robot vision automatic calibration device and method

Publications (2)

Publication Number Publication Date
CN114643577A true CN114643577A (en) 2022-06-21
CN114643577B CN114643577B (en) 2023-07-14

Family

ID=81991478

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011501308.6A Active CN114643577B (en) 2020-12-18 2020-12-18 Universal robot vision automatic calibration device and method

Country Status (1)

Country Link
CN (1) CN114643577B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116160454A (en) * 2023-03-28 2023-05-26 重庆智能机器人研究院 Robot tail end plane vision hand-eye calibration algorithm model

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2682711A1 (en) * 2012-07-03 2014-01-08 Canon Kabushiki Kaisha Apparatus and method for three-dimensional measurement and robot system comprising said apparatus
CN105014678A (en) * 2015-07-16 2015-11-04 深圳市得意自动化科技有限公司 Robot hand-eye calibration method based on laser range finding
JP2018051758A (en) * 2016-09-28 2018-04-05 コグネックス・コーポレイション Simultaneous kinematics and hand eye calibration
CN111571082A (en) * 2020-06-02 2020-08-25 深圳市超准视觉科技有限公司 Automatic welding method and device, mobile terminal and readable storage medium
CN111751136A (en) * 2020-06-29 2020-10-09 伯肯森自动化技术(上海)有限公司 POS machine test system based on binocular vision subassembly
CN111775154A (en) * 2020-07-20 2020-10-16 广东拓斯达科技股份有限公司 Robot vision system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2682711A1 (en) * 2012-07-03 2014-01-08 Canon Kabushiki Kaisha Apparatus and method for three-dimensional measurement and robot system comprising said apparatus
CN105014678A (en) * 2015-07-16 2015-11-04 深圳市得意自动化科技有限公司 Robot hand-eye calibration method based on laser range finding
JP2018051758A (en) * 2016-09-28 2018-04-05 コグネックス・コーポレイション Simultaneous kinematics and hand eye calibration
CN111571082A (en) * 2020-06-02 2020-08-25 深圳市超准视觉科技有限公司 Automatic welding method and device, mobile terminal and readable storage medium
CN111751136A (en) * 2020-06-29 2020-10-09 伯肯森自动化技术(上海)有限公司 POS machine test system based on binocular vision subassembly
CN111775154A (en) * 2020-07-20 2020-10-16 广东拓斯达科技股份有限公司 Robot vision system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI Weiguo (ed.): "Fundamentals of Industrial Robots" (工业机器人基础), Beijing: Beijing Institute of Technology Press *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116160454A (en) * 2023-03-28 2023-05-26 重庆智能机器人研究院 Robot tail end plane vision hand-eye calibration algorithm model

Also Published As

Publication number Publication date
CN114643577B (en) 2023-07-14

Similar Documents

Publication Publication Date Title
JP6966582B2 (en) Systems and methods for automatic hand-eye calibration of vision systems for robot motion
CN110842928B (en) Visual guiding and positioning method for compound robot
KR102280663B1 (en) Calibration method for robot using vision technology
JP4021413B2 (en) Measuring device
CN109658460A (en) A kind of mechanical arm tail end camera hand and eye calibrating method and system
CN110276799B (en) Coordinate calibration method, calibration system and mechanical arm
JP2015042437A (en) Robot system and calibration method of robot system
JP2014151427A (en) Robot system and control method therefor
CN113601158B (en) Bolt feeding pre-tightening system based on visual positioning and control method
CN114643578B (en) Calibration device and method for improving robot vision guiding precision
TWI699264B (en) Correction method of vision guided robotic arm
CN110202560A (en) A kind of hand and eye calibrating method based on single feature point
WO2023193362A1 (en) Hybrid robot and three-dimensional vision based large-scale structural part automatic welding system and method
EP4101604A1 (en) System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system
Hvilshøj et al. Calibration techniques for industrial mobile manipulators: Theoretical configurations and best practices
CN114643577B (en) Universal robot vision automatic calibration device and method
CN208246822U (en) A kind of 3D vision positioning robot
JP2010207990A (en) Measuring system
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
JP6912529B2 (en) How to correct the visual guidance robot arm
CN112381881A (en) Monocular vision-based automatic butt joint method for large rigid body component
WO2023013740A1 (en) Robot control device, robot control system, and robot control method
CN111028298B (en) Convergent binocular system for rigid coordinate system space transformation calibration
Abbott et al. University of Illinois active vision system
CN215701709U (en) Configurable hand-eye calibration device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant