CN114643577B - Universal robot vision automatic calibration device and method - Google Patents


Info

Publication number
CN114643577B
Authority
CN
China
Prior art keywords
robot
calibration
camera
controller
coordinate system
Prior art date
Legal status
Active
Application number
CN202011501308.6A
Other languages
Chinese (zh)
Other versions
CN114643577A (en)
Inventor
秦勇
高一佳
张宏宇
Current Assignee
Shenyang Siasun Robot and Automation Co Ltd
Original Assignee
Shenyang Siasun Robot and Automation Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenyang Siasun Robot and Automation Co Ltd filed Critical Shenyang Siasun Robot and Automation Co Ltd
Priority to CN202011501308.6A priority Critical patent/CN114643577B/en
Publication of CN114643577A publication Critical patent/CN114643577A/en
Application granted granted Critical
Publication of CN114643577B publication Critical patent/CN114643577B/en

Classifications

    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T1/0014: Image feed-back for automatic industrial control, e.g. robot with camera
    • G06T2207/30108: Industrial image inspection
    • G06T2207/30164: Workpiece; Machine component
    • B25J9/1602: Programme controls characterised by the control system, structure, architecture
    • B25J9/161: Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1653: Programme controls characterised by the control loop; parameters identification, estimation, stiffness, accuracy, error analysis
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]


Abstract

The invention discloses a universal robot vision automatic calibration device and method. Robot motion control, parameter setting, camera calibration conversion, grasp coordinate calculation, and related functions are integrated into an external module that is independent of third-party software and hardware, and only the necessary result data are exchanged with the robot and the camera through standard communication interfaces. The robot hand-eye calibration process is thereby fully automated without depending on third-party software or hardware, which greatly improves the universality of the hand-eye calibration device and method, increases calibration speed and efficiency, and reduces the technical requirements on field operators.

Description

Universal robot vision automatic calibration device and method
Technical Field
The invention belongs to the technical field of robot vision calibration, and particularly relates to a universal robot vision automatic calibration device and method.
Background
Industrial robots typically include a controller and a mechanical arm. The controller corresponds to the robot's "brain", and the mechanical arm to its "hand". To enhance a robot's perception of its surroundings, it is usually also given a pair of "eyes", in practice an industrial camera. This raises a new problem: a device and method are needed to convert camera pixel coordinates into robot spatial coordinates. The process of establishing the coordinate transformation between the robot and the camera is known as "hand-eye calibration".
Much research has been done on this problem at home and abroad. For example, a patent filed by Cognex Corporation entitled "Automatic hand-eye calibration system and method for a robot motion vision system" (publication No. CN111482959A) proposes an automatic hand-eye calibration method with minimal human intervention. In 2018, Sichuan Changhong Electric Co. filed a patent entitled "Method and device for calibrating a hand-eye camera" (publication No. CN109671122A), which proposes a fast and convenient hand-eye calibration method using a low-precision calibration plate. A 2020 patent of Zhejiang Kola Information Technology Co., Ltd. entitled "Method, apparatus and device for calibrating a camera" (publication No. CN111445535A) proposes an efficient calibration method in which the parameters of the error function are optimized directly using pre-selected feature points, without manually adjusting the camera.
These calibration methods have two notable shortcomings. First, the automatic calibration equipment and methods lack universality: when the brand or model of the robot or camera involved in calibration changes, they either cannot be used at all or require substantial modification before use. Second, fully automatic calibration is not achieved: the calibration process still requires considerable manual intervention and assistance of some professional difficulty.
Disclosure of Invention
In view of these technical problems, the invention aims to provide a universal automatic hand-eye calibration device and method that are independent of the brand and model of the calibrated robot and camera and require no manual intervention during the calibration process.
The technical scheme adopted by the invention is as follows: a universal robot vision automatic calibration device, comprising:
a robot assembly, a camera, a rangefinder, a calibration plate, a robot controller, a camera controller, and a calibration and position calculation controller;
the robot assembly is fixed in space, the camera and the rangefinder are fixed at the end of the robot assembly, and the calibration plate is arranged within the field of view of the camera;
the robot assembly is electrically connected with the robot controller, the camera is electrically connected with the camera controller, the rangefinder is electrically connected with the camera controller, the robot controller is electrically connected with the calibration and position calculation controller, and the camera controller is electrically connected with the calibration and position calculation controller;
the robot assembly comprises a robot mounting seat, a mechanical arm, and a gripper, wherein the mechanical arm is fixed on the robot mounting seat and the gripper is fixed at the end of the mechanical arm.
The robot controller comprises a processor, an input/output port and a memory, wherein a robot side communication interface is arranged in the memory.
The camera controller comprises a processor, an input/output port and a memory, wherein a camera side communication interface is installed in the memory.
The calibration and position calculation controller comprises a processor, an input/output port and a memory, wherein a robot control module, a calibration conversion module and a workpiece position calculation module are installed in the memory.
Feature points are arranged on the surface of the calibration plate.
A general robot vision automatic calibration method comprises the following steps:
step 1: the calibration and position calculation controller establishes a robot user coordinate system;
step 2: the robot searches to determine the position of the calibration plate;
step 3: the robot moves the camera to the center of the calibration plate;
step 4: the robot translates and rotates several times, photographing and measuring the pixel coordinates of the feature points on the calibration plate;
step 5: the calibration and position calculation controller calculates the homography matrix from the camera coordinate system to the robot coordinate system;
step 6: the mechanical arm is moved so that the laser rangefinder reaches a point on the working surface;
step 7: the photographing height is adjusted according to the laser ranging value;
step 8: the calibration and position calculation controller calculates the grasping position of the gripper;
step 9: the gripper is moved to grasp the workpiece.
The step 1 specifically includes:
step 11: the robot controller outputs instructions to move the mechanical arm so that the rangefinder reaches three points Puo, Pux and Puy on the calibration plane, wherein Puo is used to determine the origin of the user coordinate system, Pux is used to determine the X axis of the user coordinate system, and Puy is used to determine the Y axis of the user coordinate system;
step 12: the rangefinder measures the Z coordinates of the three points;
step 13: the robot controller sends the X, Y and Z coordinate values of the three points to the calibration and position calculation controller;
step 14: the calibration and position calculation controller calculates the translation and rotation values [Tx, Ty, Tz, Rx, Ry, Rz] of the robot user coordinate system relative to the base coordinate system from the three point coordinate values, using the formula:
Pw=R·Pu+t
wherein Pu is the coordinate value of a point in the robot user coordinate system recorded by the robot controller in steps 11 and 12, Pw is the corresponding coordinate value in the base coordinate system, and R is the rotation matrix formed from the rotation values Rx, Ry and Rz. Writing α = Rx, β = Ry, γ = Rz with sα = sin α, cα = cos α, etc., and obtaining the user coordinate system by rotating the base coordinate system about its x, y and z axes in turn:

$$R = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha) = \begin{bmatrix} c\gamma\,c\beta & c\gamma\,s\beta\,s\alpha - s\gamma\,c\alpha & c\gamma\,s\beta\,c\alpha + s\gamma\,s\alpha \\ s\gamma\,c\beta & s\gamma\,s\beta\,s\alpha + c\gamma\,c\alpha & s\gamma\,s\beta\,c\alpha - c\gamma\,s\alpha \\ -s\beta & c\beta\,s\alpha & c\beta\,c\alpha \end{bmatrix}$$
t is the column vector of the translation values: $t = [T_x, T_y, T_z]^T$.
step 15: the calibration and position calculation controller transmits the calculated translation and rotation values [Tx, Ty, Tz, Rx, Ry, Rz] to the robot controller, which establishes the user coordinate system of the robot.
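As an illustration of steps 11 to 15, the following Python sketch fits a user frame to the three measured points and extracts [Rx, Ry, Rz] under the x-y-z rotation order used above; the function names and the NumPy-based implementation are assumptions for illustration, not part of the patent.

```python
# A minimal sketch, assuming NumPy and the Rz*Ry*Rx rotation order stated above.
import numpy as np

def user_frame_from_three_points(puo, pux, puy):
    """Return (t, R) of the user frame relative to the base frame, so that
    Pw = R @ Pu + t, from three non-collinear measured points."""
    puo, pux, puy = (np.asarray(p, dtype=float) for p in (puo, pux, puy))
    x_axis = pux - puo
    x_axis /= np.linalg.norm(x_axis)            # X axis points from Puo to Pux
    z_axis = np.cross(x_axis, puy - puo)
    z_axis /= np.linalg.norm(z_axis)            # Z axis is the plane normal
    y_axis = np.cross(z_axis, x_axis)           # Y axis completes a right-handed frame
    R = np.column_stack([x_axis, y_axis, z_axis])
    return puo, R                               # translation t is the measured origin Puo

def rpy_from_matrix(R):
    """Extract [Rx, Ry, Rz] from R = Rz(Rz) @ Ry(Ry) @ Rx(Rx)."""
    ry = -np.arcsin(R[2, 0])
    rx = np.arctan2(R[2, 1], R[2, 2])
    rz = np.arctan2(R[1, 0], R[0, 0])
    return rx, ry, rz

# Example: a calibration plane roughly 100 mm above the base, slightly tilted.
t, R = user_frame_from_three_points([0, 0, 100], [50, 0, 101], [0, 50, 99])
print(t, rpy_from_matrix(R))
```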
The step 5 specifically comprises the following steps:
step 51: obtaining pixel coordinates of N points according to the step 4;
step 52: the homography matrix H is calculated from the following conversion between the camera pixel coordinate system and the robot coordinate system:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{dx} & 0 & u_0 \\ 0 & \frac{1}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t \\ 0^{T} & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = H \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

where Z_c is the depth of the point in the camera coordinate system, 1/dx and 1/dy are the numbers of pixels per millimeter in the x and y directions, f is the focal length, (X, Y, Z) is the three-dimensional coordinate in the robot coordinate system, (u, v) is the two-dimensional pixel coordinate of the corresponding three-dimensional point on the image, (u_0, v_0) is the pixel position of the principal point, and H is the homography matrix relating the camera pixel coordinate system and the robot coordinate system.

Further, writing

$$H = \begin{bmatrix} h_1 & h_2 & h_3 & h_4 \\ h_5 & h_6 & h_7 & h_8 \\ h_9 & h_{10} & h_{11} & h_{12} \end{bmatrix}$$

the relation above reduces to two equations:

$$u = \frac{h_1 X + h_2 Y + h_3 Z + h_4}{h_9 X + h_{10} Y + h_{11} Z + h_{12}}, \quad v = \frac{h_5 X + h_6 Y + h_7 Z + h_8}{h_9 X + h_{10} Y + h_{11} Z + h_{12}}$$

The robot user coordinate system is attached to the calibration plane, i.e. Z = 0, so h_3 = h_7 = h_11 = 0 and h_12 can be normalized to 1; each pair of points contributes 2 equations for the remaining 8 unknowns, so at least 4 pairs of points are needed to solve the homography matrix H.
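For illustration, the reduced system is linear in the eight remaining entries, so a direct least-squares solve is possible; the sketch below assumes NumPy, and the function name and data layout are illustrative rather than taken from the patent.

```python
# A minimal sketch, assuming Z = 0 and h12 = 1 as stated above.
import numpy as np

def solve_homography(pixels, points):
    """pixels: N x 2 array of (u, v); points: N x 2 array of (X, Y) on the
    calibration plane, N >= 4. Returns the reduced 3x3 homography."""
    A, b = [], []
    for (u, v), (X, Y) in zip(pixels, points):
        # u * (h9*X + h10*Y + 1) = h1*X + h2*Y + h4
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y]); b.append(u)
        # v * (h9*X + h10*Y + 1) = h5*X + h6*Y + h8
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y]); b.append(v)
    h = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)[0]
    h1, h2, h4, h5, h6, h8, h9, h10 = h
    return np.array([[h1, h2, h4], [h5, h6, h8], [h9, h10, 1.0]])
```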
The step 8 specifically includes:
step 81: the camera photographs and measures the pixel coordinate values Pc1 and Pc2 of two feature points of the workpiece;
step 82: the calibration and position calculation controller converts the pixel coordinate values into robot user coordinate values Pw1 and Pw2 using the homography matrix H;
step 83: the calibration and position calculation controller calculates the offset and rotation angle [Tx, Ty, Rz] of the workpiece. With Pr1 and Pr2 denoting the reference coordinates of the same two feature points recorded at calibration time (the source renders this formula only as an image; the two-point alignment implied by the surrounding text is):

$$R_z = \operatorname{atan2}(P_{w2,y}-P_{w1,y},\ P_{w2,x}-P_{w1,x}) - \operatorname{atan2}(P_{r2,y}-P_{r1,y},\ P_{r2,x}-P_{r1,x})$$

$$\begin{bmatrix} T_x \\ T_y \end{bmatrix} = \begin{bmatrix} P_{w1,x} \\ P_{w1,y} \end{bmatrix} - \begin{bmatrix} \cos R_z & -\sin R_z \\ \sin R_z & \cos R_z \end{bmatrix} \begin{bmatrix} P_{r1,x} \\ P_{r1,y} \end{bmatrix}$$
step 84: the calibration and position calculation controller calculates the grasp pose Pick_act of the workpiece (the source renders this formula only as an image; the planar rigid transform implied by [Tx, Ty, Rz] is):

$$\mathrm{Pick}_{act} = \begin{bmatrix} \cos R_z & -\sin R_z & T_x \\ \sin R_z & \cos R_z & T_y \\ 0 & 0 & 1 \end{bmatrix} \mathrm{Pick}_{cad}$$

wherein Pick_cad is the known theoretical grasp point, expressed in homogeneous planar coordinates.
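A minimal sketch of steps 82 to 84 follows; the reference coordinates Pr1, Pr2 and all function names are illustrative assumptions, not identifiers from the patent.

```python
# A minimal sketch of the offset and grasp-pose calculation (illustrative names).
import numpy as np

def workpiece_offset(pw1, pw2, pr1, pr2):
    """Offset [Tx, Ty] and rotation Rz from two current feature points
    (pw1, pw2) and their taught references (pr1, pr2), all 2-vectors."""
    d_cur, d_ref = pw2 - pw1, pr2 - pr1
    rz = np.arctan2(d_cur[1], d_cur[0]) - np.arctan2(d_ref[1], d_ref[0])
    c, s = np.cos(rz), np.sin(rz)
    t = pw1 - np.array([[c, -s], [s, c]]) @ pr1  # residual translation of point 1
    return t[0], t[1], rz

def grasp_pose(tx, ty, rz, pick_cad):
    """Apply the planar rigid transform [Tx, Ty, Rz] to the nominal grasp point."""
    c, s = np.cos(rz), np.sin(rz)
    T = np.array([[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]])
    return (T @ np.array([pick_cad[0], pick_cad[1], 1.0]))[:2]

# Example with made-up numbers: workpiece shifted by (5, -3) mm, rotated 10 deg.
a = np.deg2rad(10.0)
rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
pr1, pr2 = np.array([0.0, 0.0]), np.array([100.0, 0.0])
pw1, pw2 = rot @ pr1 + [5.0, -3.0], rot @ pr2 + [5.0, -3.0]
tx, ty, rz = workpiece_offset(pw1, pw2, pr1, pr2)
print(tx, ty, np.rad2deg(rz), grasp_pose(tx, ty, rz, [50.0, 20.0]))
```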
Compared with the prior art, the invention has the following beneficial effects:
the functions of robot motion control, parameter setting, camera calibration conversion, new coordinate grabbing calculation and the like are integrated into an external independent mode independent of software and hardware of a third party, and necessary result data are transmitted only through a standard communication interface, the robot and the camera, so that the robot hand-eye calibration process is completely automated independent of the software and hardware of the third party, the universality of the hand-eye calibration device and method is greatly improved, the calibration speed and efficiency are improved, and meanwhile, the technical requirements on field operators are reduced.
Drawings
FIG. 1 is a schematic diagram of a general-purpose robot vision automatic calibration device according to the present invention;
FIG. 2 is a flow chart of a calibration process of the universal robot vision automatic calibration method of the present invention;
FIG. 3 is a flow chart of the operation process of the universal robot vision automatic calibration method of the present invention.
Detailed Description
In order that the above objects, features and advantages of the invention may be readily understood, a more particular description of the invention is given below with reference to the accompanying drawings. In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention. The invention may, however, be embodied in many forms other than those described herein, and those skilled in the art may make similar modifications without departing from its spirit; the invention is therefore not limited to the specific embodiments disclosed below.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Referring to FIG. 1, one embodiment of the present invention is illustrated. FIG. 1 is a schematic diagram of the universal robot vision automatic calibration device of the present invention. The device includes a robot assembly 115, a camera 130, a rangefinder 135, a calibration plate 140, a robot controller 150, a camera controller 160, and a calibration and position calculation controller 170.
The robot assembly 115 of FIG. 1 includes a robot mount 110, a mechanical arm 120, and a gripper 125. The mechanical arm 120 is fixed to the robot mount 110, and the robot mount 110 is fixed in the space 100. The mechanical arm 120 and the robot controller 150 are electrically connected through an input/output port 152. The gripper 125 is fixed at the end of the arm 120. In this embodiment, the robot serves as the positioning element that carries the camera and the gripper, performing the translational and rotational motions required for camera calibration and grasping the workpiece. Besides the articulated robot of FIG. 1, other positioning mechanisms may be used, including a servo-cylinder platform, a single-axis robot platform, and the like.
The camera 130 in FIG. 1 includes an image sensor, a lens, and a light source. The camera 130 is fixed at the end of the mechanical arm 120 and is electrically connected to the camera controller 160 through the input/output port 162. In this example, an area-array scanning camera measures the pixel coordinates of the feature points 141 on the calibration plate 140 and of the feature points on the workpiece. In other embodiments, the camera may include a two-dimensional CCD sensor, a two-dimensional CMOS sensor, or any other type of area-scanning sensor for generating images.
The rangefinder 135 in FIG. 1 may be a laser rangefinder, or another type of distance-measuring sensor, including a mechanical rangefinder, an ultrasonic rangefinder, and the like.
The calibration plate 140 in FIG. 1 may be a calibration plate with feature points 141. Calibration plates with other calibration patterns are also possible; some exemplary patterns include, but are not limited to, dot grids, line grids, cross or honeycomb patterns, triangular checkerboards, and the like.
The robot controller 150 in FIG. 1 includes a processor 151, an input/output port 152, and a memory 153. The memory holds a robot control system and a robot-side communication interface 154. The robot-side communication interface 154 handles the exchange of data between the calibration and position calculation controller 170 and the robot controller 150: the calibration and position calculation controller sends the robot controller instructions to modify the robot user coordinate system and to move the mechanical arm, and the robot controller returns the robot's current coordinate position. The robot-side communication interface 154 may be embedded in the robot controller 150 in programmable read-only memory (PROM), non-volatile random-access memory (NVRAM), or a similar format. In other embodiments, the robot-side communication interface 154 may also be hardwired without software.
The camera controller 160 in FIG. 1 includes a processor 161, an input/output port 162, and a memory 163. The memory holds a camera control system and a camera-side communication interface 164. The camera-side communication interface 164 handles the exchange of data between the calibration and position calculation controller 170 and the camera controller 160: the camera controller 160 transmits the pixel coordinate values of the feature points 141 to the calibration and position calculation controller 170, which converts them into robot target coordinates, sends these to the robot through the robot-side communication interface 154, and controls the robot to move to the position. The camera-side communication interface 164 may be embedded in the camera controller 160 in programmable read-only memory (PROM), non-volatile random-access memory (NVRAM), or a similar format. In other embodiments, the camera-side communication interface 164 may also be hardwired without software.
The calibration and position calculation controller 170 in FIG. 1 includes a processor 171, an input/output port 172, and a memory 173. The memory holds a robot control module 174, a calibration conversion module 175, and a workpiece position calculation module 176. The robot control module 174 sends motion commands to the robot via the input/output port 172, driving the mechanical arm 120 through the translations and rotations required for camera hand-eye calibration or for a grasping operation. The calibration conversion module 175 calculates the homography matrix from the pixel coordinates sent by the camera controller 160, or converts received pixel coordinates into coordinate values in the robot user coordinate system through the homography. The workpiece position calculation module 176 calculates the translation and rotation of the workpiece from the current feature-point coordinates computed by the calibration conversion module 175, and from these calculates the grasping position of the gripper 125.
During a camera calibration operation, the calibration and position calculation controller 170 first moves the rangefinder 135, through the robot-side communication interface 154 of the robot controller 150, to measure the height values of any three points on the calibration plane, and calculates a robot user coordinate system parallel to that plane. The calibration and position calculation controller 170 then moves the camera 130 in the user coordinate system to search for the calibration plate and center it in the camera's field of view. Finally, the calibration and position calculation controller 170 commands the robot to translate and rotate about that point several times while photographing and measuring the pixel coordinates of the feature points, and calculates the homography matrix from the camera coordinate system to the robot coordinate system from the several groups of measured pixel coordinates and the corresponding robot coordinate point pairs, completing the calibration.
During operation, the robot control module 174 in the calibration and position calculation controller 170 sends an instruction to the robot controller 150 to move the rangefinder 135 to any point on the working surface and measure its height, then adjusts the robot's photographing position according to the measured height and photographs to measure the pixel coordinates of the feature points on the workpiece. The calibration conversion module 175 in the calibration and position calculation controller 170 then converts the pixel coordinate values into coordinate values in the robot user coordinate system. Finally, the workpiece position calculation module 176 in the calibration and position calculation controller 170 calculates the grasping coordinate values of the workpiece and transmits them to the robot controller 150, and the robot moves the gripper 125 to perform the corresponding operation on the workpiece.
Referring to fig. 2, another embodiment of the present invention is shown. Fig. 2 is a flowchart of a calibration process of the universal robot vision automatic calibration method of the present invention, comprising the steps of:
step 205: starting calibration;
step 210: the robot moves the laser rangefinder 135 to any three points on the calibration plane. In the robot base coordinate system, the laser rangefinder 135 is moved to three non-collinear points Puo, Pux and Puy on the calibration plane, where Puo determines the origin of the user coordinate system, Pux determines its X axis, and Puy determines its X-Y plane. The robot controller 150 records the X and Y coordinate values of the three points in base coordinates, and the laser rangefinder 135 measures their Z coordinate values. Finally, the robot controller 150 sends the X-Y-Z space coordinate values of the three points to the calibration and position calculation controller 170;
step 215: the calibration and position calculation controller 170 calculates the user coordinate system of the robot. Since three non-collinear points determine a plane, the translation and rotation values [Tx, Ty, Tz, Rx, Ry, Rz] of the robot user coordinate system relative to the base coordinate system can be calculated as follows:
Pw = R·Pu + t

where Pu is the coordinate value of the point recorded in step 210, Pw is the corresponding coordinate value in the base coordinate system, and R is the rotation matrix formed from the rotation values Rx, Ry and Rz. Writing α = Rx, β = Ry, γ = Rz with sα = sin α, cα = cos α, etc., and obtaining the user coordinate system by rotating the base coordinate system about its x, y and z axes in turn:

$$R = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha) = \begin{bmatrix} c\gamma\,c\beta & c\gamma\,s\beta\,s\alpha - s\gamma\,c\alpha & c\gamma\,s\beta\,c\alpha + s\gamma\,s\alpha \\ s\gamma\,c\beta & s\gamma\,s\beta\,s\alpha + c\gamma\,c\alpha & s\gamma\,s\beta\,c\alpha - c\gamma\,s\alpha \\ -s\beta & c\beta\,s\alpha & c\beta\,c\alpha \end{bmatrix}$$

t is the column vector of the translation values: $t = [T_x, T_y, T_z]^T$;
step 220: the robot searches for the position of the calibration plate 140 in the robot user coordinate system. The mechanical arm 120 moves the camera 130 to several positions within the working range, photographing at each to check whether the calibration plate is present; once it is found, the search stops;
step 225: the robot moves the mechanical arm 120 to bring the camera over the center of the calibration plate. Centering the calibration plate in the camera's field of view prepares for the several translational and rotational movements that follow; if the plate were too close to the edge of the field of view, the feature points might leave the field of view during those movements;
step 230: the robot translates and rotates several times, photographing and measuring the pixel coordinates of the feature points. The robot performs movements with the offsets [-dx, -dy, 0], [0, -dy, 0], [dx, -dy, 0], [-dx, 0, 0], [0, 0, 0], [dx, 0, 0], [-dx, dy, 0], [0, dy, 0], [dx, dy, 0], [0, 0, -2·rz], [0, 0, -rz], [0, 0, 0], [0, 0, rz], [0, 0, 2·rz], where the first two components are translations along x and y and the third is a rotation about z. In this example, 9 translations plus 5 rotations are performed, but in other examples fewer or more translations and rotations may be performed. After each translation or rotation, the camera photographs and measures the pixel coordinate value of a given feature point (a sketch of generating these offsets follows this step);
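A minimal sketch of generating the 14 offsets; the step sizes and the wrapper calls named in the comments are illustrative assumptions.

```python
# A minimal sketch of the 9-translation + 5-rotation offset pattern of step 230.
dx, dy, rz = 10.0, 10.0, 5.0  # example step sizes (mm, mm, degrees); assumptions

offsets = [(i * dx, j * dy, 0.0) for j in (-1, 0, 1) for i in (-1, 0, 1)]
offsets += [(0.0, 0.0, k * rz) for k in (-2, -1, 0, 1, 2)]

for tx, ty, a in offsets:
    # In the device, each offset would be sent to the robot controller and the
    # camera would then measure the feature point, e.g. (hypothetical wrappers):
    #   move_relative(tx, ty, a); uv = measure_feature_pixel()
    print(f"offset: x={tx:+.1f} y={ty:+.1f} rz={a:+.1f}")
```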
step 235: the calibration and position calculation controller 170 calculates the homography matrix from the camera coordinate system to the robot coordinate system. Step 230 yields 14 sets of pixel coordinates. For each set, the corresponding theoretical coordinate value is specified according to the desired origin position and coordinate-axis orientation of the camera coordinate system. The homography matrix H is then calculated from the following conversion between the camera pixel coordinate system and the robot coordinate system:
$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{dx} & 0 & u_0 \\ 0 & \frac{1}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t \\ 0^{T} & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = H \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

where Z_c is the depth of the point in the camera coordinate system, 1/dx and 1/dy are the numbers of pixels per millimeter in the x and y directions, f is the focal length, (X, Y, Z) is the three-dimensional coordinate in the robot coordinate system, (u, v) is the two-dimensional pixel coordinate of the corresponding three-dimensional point on the image, (u_0, v_0) is the pixel position of the principal point, and H is the homography matrix relating the camera pixel coordinate system and the robot coordinate system.

Further, writing

$$H = \begin{bmatrix} h_1 & h_2 & h_3 & h_4 \\ h_5 & h_6 & h_7 & h_8 \\ h_9 & h_{10} & h_{11} & h_{12} \end{bmatrix}$$

the relation above reduces to two equations:

$$u = \frac{h_1 X + h_2 Y + h_3 Z + h_4}{h_9 X + h_{10} Y + h_{11} Z + h_{12}}, \quad v = \frac{h_5 X + h_6 Y + h_7 Z + h_8}{h_9 X + h_{10} Y + h_{11} Z + h_{12}}$$

The robot user coordinate system is attached to the calibration plane, i.e. Z = 0, so h_3 = h_7 = h_11 = 0 and h_12 can be normalized to 1. The resulting system has 8 unknowns, and each pair of points (a pixel and its corresponding three-dimensional point) contributes 2 equations, so at least 4 pairs of points are needed to solve for the homography matrix H.
To eliminate the influence of random, normally distributed errors in data acquisition and to improve the accuracy and stability of the solution, 14 point pairs are collected, a nonlinear system of 28 equations is constructed, and the homography matrix H is solved with the Levenberg-Marquardt nonlinear optimization iterative algorithm based on the least-squares principle, as sketched below.
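One possible realization of this solve uses SciPy's Levenberg-Marquardt interface; the use of SciPy and all names below are assumptions for illustration, not requirements of the patent.

```python
# A minimal sketch: refine h = [h1, h2, h4, h5, h6, h8, h9, h10] from 14 point
# pairs (28 residual equations) with Levenberg-Marquardt.
import numpy as np
from scipy.optimize import least_squares

def residuals(h, pixels, points):
    """Stacked residuals of the two reduced projection equations for all pairs."""
    h1, h2, h4, h5, h6, h8, h9, h10 = h
    X, Y = points[:, 0], points[:, 1]
    w = h9 * X + h10 * Y + 1.0
    u_pred = (h1 * X + h2 * Y + h4) / w
    v_pred = (h5 * X + h6 * Y + h8) / w
    return np.concatenate([u_pred - pixels[:, 0], v_pred - pixels[:, 1]])

def refine_homography(pixels, points, h0):
    """h0: initial estimate (e.g. from a linear solve); returns the 3x3 matrix."""
    res = least_squares(residuals, h0, args=(pixels, points), method="lm")
    h1, h2, h4, h5, h6, h8, h9, h10 = res.x
    return np.array([[h1, h2, h4], [h5, h6, h8], [h9, h10, 1.0]])
```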
Step 240: calibration is complete.
Referring to fig. 3, another embodiment of the present invention is shown. FIG. 3 is a flow chart of the operation process of the universal robot vision automatic calibration method of the invention, comprising the following steps:
step 305: starting to operate;
step 310: the robot moves the laser rangefinder 135 to any point on the working surface. The robot first checks whether the distance between the feature point to be measured and the camera 130 equals the distance between the calibration plate 140 and the camera 130 during calibration; if not, the photographing height must be adjusted to maintain the same measurement accuracy;
step 315: the robot adjusts the photographing height according to the laser ranging value. The laser rangefinder 135 measures the height and feeds it back to the calibration and position calculation controller 170, which calculates the deviation from the calibration-time distance and commands the robot to rise or descend by the corresponding amount;
step 320: the camera 130 photographs and measures the pixel coordinate values of the workpiece feature points; denote a measured feature-point pixel coordinate by [u; v];
step 325: the calibration and position calculation controller 170 converts the pixel coordinate values into robot coordinate values. Using the homography matrix H obtained in the calibration process, the robot coordinate value [X; Y; Z] of the point is calculated; with the reduced 3×3 matrix of step 235 (Z = 0, h_12 = 1), the plane coordinates follow by inverting the projective relation:

$$\begin{bmatrix} X \\ Y \\ 1 \end{bmatrix} \sim \begin{bmatrix} h_1 & h_2 & h_4 \\ h_5 & h_6 & h_8 \\ h_9 & h_{10} & 1 \end{bmatrix}^{-1} \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}$$

where the result is normalized so that its third component equals 1, and Z is the height of the calibration plane; a minimal code sketch of this inversion follows;
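A minimal sketch of this inversion, with illustrative names:

```python
# A minimal sketch: map a measured pixel back to calibration-plane coordinates.
import numpy as np

def pixel_to_robot(H_reduced, u, v, z_plane=0.0):
    """H_reduced: the 3x3 homography from calibration. Returns (X, Y, Z) in the
    robot user coordinate system; Z is the calibration-plane height."""
    w = np.linalg.inv(H_reduced) @ np.array([u, v, 1.0])
    return w[0] / w[2], w[1] / w[2], z_plane
```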
step 330: the calibration and position calculation controller 170 calculates the workpiece offset and rotation angle. From the robot coordinate values Pw1 and Pw2 of the two feature points obtained in step 325, the workpiece offset and rotation angle [Tx, Ty, Rz] are calculated; with Pr1 and Pr2 denoting the reference coordinates of the same two feature points recorded at calibration time (the source renders this formula only as an image; the two-point alignment implied by the surrounding text is):

$$R_z = \operatorname{atan2}(P_{w2,y}-P_{w1,y},\ P_{w2,x}-P_{w1,x}) - \operatorname{atan2}(P_{r2,y}-P_{r1,y},\ P_{r2,x}-P_{r1,x})$$

$$\begin{bmatrix} T_x \\ T_y \end{bmatrix} = \begin{bmatrix} P_{w1,x} \\ P_{w1,y} \end{bmatrix} - \begin{bmatrix} \cos R_z & -\sin R_z \\ \sin R_z & \cos R_z \end{bmatrix} \begin{bmatrix} P_{r1,x} \\ P_{r1,y} \end{bmatrix}$$
step 335: the calibration and position calculation controller 170 calculates the gripper grasp pose. From the workpiece offset and rotation angle [Tx, Ty, Rz] calculated in step 330 and the theoretical grasp coordinates Pick_cad of the gripper 125 measured from the CAD model, the grasp coordinate value Pick_act of the gripper 125 is calculated (the source renders this formula only as an image; the planar rigid transform implied by [Tx, Ty, Rz] is):

$$\mathrm{Pick}_{act} = \begin{bmatrix} \cos R_z & -\sin R_z & T_x \\ \sin R_z & \cos R_z & T_y \\ 0 & 0 & 1 \end{bmatrix} \mathrm{Pick}_{cad}$$
step 340: the operation is complete.
The embodiments described above are intended to help those skilled in the art further understand the invention, and in no way limit it. Several variations and improvements can be made by those skilled in the art without departing from the inventive concept; all of these fall within the scope of the present invention.

Claims (3)

1. A universal robot vision automatic calibration method, implemented with a universal robot vision automatic calibration device, the device comprising:
a robot assembly (115), a camera (130), a rangefinder (135), a calibration plate (140), a robot controller (150), a camera controller (160), and a calibration and position calculation controller (170);
the robot assembly (115) is fixed in a space (100), the camera (130) and the rangefinder (135) are fixed at the end of the robot assembly (115), and the calibration plate (140) is arranged within the field of view of the camera (130);
the robot assembly (115) is electrically connected with the robot controller (150), the camera (130) is electrically connected with the camera controller (160), the rangefinder (135) is electrically connected with the camera controller (160), the robot controller (150) is electrically connected with the calibration and position calculation controller (170), and the camera controller (160) is electrically connected with the calibration and position calculation controller (170);
the method is characterized by comprising the following steps:
step 1: the calibration and position calculation controller (170) establishes a robot user coordinate system;
step 2: the robot searches to determine the position of the calibration plate;
step 3: the robot moves the camera to the center of the calibration plate;
step 4: the robot translates and rotates several times, photographing and measuring the pixel coordinates of the feature points on the calibration plate (140);
step 5: the calibration and position calculation controller (170) calculates the homography matrix from the camera coordinate system to the robot coordinate system, which specifically comprises:
step 51: obtaining pixel coordinates of N points according to the step 4;
step 52: the homography matrix H is calculated from the following conversion between the camera pixel coordinate system and the robot coordinate system:

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{dx} & 0 & u_0 \\ 0 & \frac{1}{dy} & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t \\ 0^{T} & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = H \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

where Z_c is the depth of the point in the camera coordinate system, 1/dx and 1/dy are the numbers of pixels per millimeter in the x and y directions, f is the focal length, (X, Y, Z) is the three-dimensional coordinate in the robot coordinate system, (u, v) is the two-dimensional pixel coordinate of the corresponding three-dimensional point on the image, (u_0, v_0) is the pixel position of the principal point, and H is the homography matrix relating the camera pixel coordinate system and the robot coordinate system;

further, writing

$$H = \begin{bmatrix} h_1 & h_2 & h_3 & h_4 \\ h_5 & h_6 & h_7 & h_8 \\ h_9 & h_{10} & h_{11} & h_{12} \end{bmatrix}$$

the relation above reduces to two equations:

$$u = \frac{h_1 X + h_2 Y + h_3 Z + h_4}{h_9 X + h_{10} Y + h_{11} Z + h_{12}}, \quad v = \frac{h_5 X + h_6 Y + h_7 Z + h_8}{h_9 X + h_{10} Y + h_{11} Z + h_{12}}$$

the robot user coordinate system is attached to the calibration plane, i.e. Z = 0, so h_3 = h_7 = h_11 = 0 and h_12 can be normalized to 1, and at least 4 pairs of points are needed to solve the homography matrix H;
step 6: controlling the mechanical arm (120) to move so that the laser rangefinder (135) reaches a point on the working surface;
step 7: adjusting the photographing height according to the laser ranging value;
step 8: the calibration and position calculation controller (170) calculates the grasping position of the gripper (125);
step 9: the gripper (125) is moved to grasp the workpiece.
2. The method for automatically calibrating the vision of the universal robot according to claim 1, wherein the step 1 specifically comprises:
step 11: the robot controller (150) outputs instructions to move the mechanical arm (120) so that the rangefinder (135) reaches three points Puo, Pux and Puy on the calibration plane, wherein Puo is used to determine the origin of the user coordinate system, Pux is used to determine the X axis of the user coordinate system, and Puy is used to determine the Y axis of the user coordinate system;
step 12: a range finder (135) measures the Z coordinates of the three points;
step 13: the robot controller (150) sends X, Y and Z coordinate values of the three points to a calibration and position calculation controller (170);
step 14: the calibration and position calculation controller (170) calculates translational and rotational values [ Tx, ty, tz, rx, ry, rz ] of the robot user coordinate system relative to the base coordinate system according to the three-point coordinate values, and the calculation formula is as follows:
Pw=R·Pu+t
wherein Pu is the coordinate value of a point in the robot user coordinate system recorded by the robot controller (150) in steps 11 and 12, Pw is the corresponding coordinate value in the base coordinate system, and R is the rotation matrix formed from the rotation values Rx, Ry and Rz; writing α = Rx, β = Ry, γ = Rz with sα = sin α, cα = cos α, etc., and obtaining the user coordinate system by rotating the base coordinate system about its x, y and z axes in turn:

$$R = R_z(\gamma)\,R_y(\beta)\,R_x(\alpha) = \begin{bmatrix} c\gamma\,c\beta & c\gamma\,s\beta\,s\alpha - s\gamma\,c\alpha & c\gamma\,s\beta\,c\alpha + s\gamma\,s\alpha \\ s\gamma\,c\beta & s\gamma\,s\beta\,s\alpha + c\gamma\,c\alpha & s\gamma\,s\beta\,c\alpha - c\gamma\,s\alpha \\ -s\beta & c\beta\,s\alpha & c\beta\,c\alpha \end{bmatrix}$$

t is the column vector of the translation values: $t = [T_x, T_y, T_z]^T$;
step 15: the calibration and position calculation controller (170) transmits the calculated translation and rotation values [ Tx, ty, tz, rx, ry, rz ] to the robot controller (150) to establish a user coordinate system of the robot.
3. The method for automatically calibrating the vision of the universal robot according to claim 1, wherein the step 8 specifically comprises:
step 81: the camera photographs and measures the pixel coordinate values Pc1 and Pc2 of two feature points of the workpiece;
step 82: the calibration and position calculation controller (170) converts pixel coordinate values into user coordinate values Pw1 and Pw2 of the robot according to the homography matrix H;
step 83: the calibration and position calculation controller (170) calculates the workpiece offset and rotation angle [Tx, Ty, Rz]; with Pr1 and Pr2 denoting the reference coordinates of the same two feature points recorded at calibration time (the source renders this formula only as an image; the two-point alignment implied by the surrounding text is):

$$R_z = \operatorname{atan2}(P_{w2,y}-P_{w1,y},\ P_{w2,x}-P_{w1,x}) - \operatorname{atan2}(P_{r2,y}-P_{r1,y},\ P_{r2,x}-P_{r1,x})$$

$$\begin{bmatrix} T_x \\ T_y \end{bmatrix} = \begin{bmatrix} P_{w1,x} \\ P_{w1,y} \end{bmatrix} - \begin{bmatrix} \cos R_z & -\sin R_z \\ \sin R_z & \cos R_z \end{bmatrix} \begin{bmatrix} P_{r1,x} \\ P_{r1,y} \end{bmatrix}$$
step 84: the calibration and position calculation controller calculates the grasp pose Pick_act of the workpiece (the source renders this formula only as an image; the planar rigid transform implied by [Tx, Ty, Rz] is):

$$\mathrm{Pick}_{act} = \begin{bmatrix} \cos R_z & -\sin R_z & T_x \\ \sin R_z & \cos R_z & T_y \\ 0 & 0 & 1 \end{bmatrix} \mathrm{Pick}_{cad}$$

wherein Pick_cad is the known theoretical grasp point, expressed in homogeneous planar coordinates.
CN202011501308.6A 2020-12-18 2020-12-18 Universal robot vision automatic calibration device and method Active CN114643577B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011501308.6A CN114643577B (en) 2020-12-18 2020-12-18 Universal robot vision automatic calibration device and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011501308.6A CN114643577B (en) 2020-12-18 2020-12-18 Universal robot vision automatic calibration device and method

Publications (2)

Publication Number Publication Date
CN114643577A CN114643577A (en) 2022-06-21
CN114643577B true CN114643577B (en) 2023-07-14

Family

ID=81991478

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011501308.6A Active CN114643577B (en) 2020-12-18 2020-12-18 Universal robot vision automatic calibration device and method

Country Status (1)

Country Link
CN (1) CN114643577B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116160454A (en) * 2023-03-28 2023-05-26 重庆智能机器人研究院 Robot tail end plane vision hand-eye calibration algorithm model

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6222898B2 (en) * 2012-07-03 2017-11-01 キヤノン株式会社 Three-dimensional measuring device and robot device
CN105014678A (en) * 2015-07-16 2015-11-04 深圳市得意自动化科技有限公司 Robot hand-eye calibration method based on laser range finding
US10076842B2 (en) * 2016-09-28 2018-09-18 Cognex Corporation Simultaneous kinematic and hand-eye calibration
CN111571082B (en) * 2020-06-02 2022-08-16 深圳市超准视觉科技有限公司 Automatic welding method and device, mobile terminal and readable storage medium
CN111751136A (en) * 2020-06-29 2020-10-09 伯肯森自动化技术(上海)有限公司 POS machine test system based on binocular vision subassembly
CN111775154B (en) * 2020-07-20 2021-09-03 广东拓斯达科技股份有限公司 Robot vision system

Also Published As

Publication number Publication date
CN114643577A (en) 2022-06-21

Similar Documents

Publication Publication Date Title
JP6966582B2 (en) Systems and methods for automatic hand-eye calibration of vision systems for robot motion
JP7237483B2 (en) Robot system control method, control program, recording medium, control device, robot system, article manufacturing method
CN110842928B (en) Visual guiding and positioning method for compound robot
KR102280663B1 (en) Calibration method for robot using vision technology
WO2023193362A1 (en) Hybrid robot and three-dimensional vision based large-scale structural part automatic welding system and method
CN111127568B (en) Camera pose calibration method based on spatial point location information
JP4021413B2 (en) Measuring device
CN109658460A (en) A kind of mechanical arm tail end camera hand and eye calibrating method and system
EP3011362B1 (en) Systems and methods for tracking location of movable target object
CN111781894B (en) Method for carrying out space positioning and attitude navigation on assembly tool by utilizing machine vision
JP2014151427A (en) Robot system and control method therefor
JP2015042437A (en) Robot system and calibration method of robot system
JPWO2018043525A1 (en) Robot system, robot system control apparatus, and robot system control method
CN114643578B (en) Calibration device and method for improving robot vision guiding precision
CN110202560A (en) A kind of hand and eye calibrating method based on single feature point
CN111590593A (en) Calibration method, device and system of mechanical arm and storage medium
CN115042175A (en) Method for adjusting tail end posture of mechanical arm of robot
CN114643577B (en) Universal robot vision automatic calibration device and method
CN115284292A (en) Mechanical arm hand-eye calibration method and device based on laser camera
CN115972192A (en) 3D computer vision system with variable spatial resolution
CN112958960A (en) Robot hand-eye calibration device based on optical target
CN110568866A (en) Three-dimensional curved surface vision guiding alignment system and alignment method
CN112598752B (en) Calibration method and operation method based on visual recognition
CN111028298B (en) Convergent binocular system for rigid coordinate system space transformation calibration
CN116652970A (en) Four-axis mechanical arm 2D hand-eye calibration method and system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant