CN114519738A - Hand-eye calibration error correction method based on ICP algorithm - Google Patents


Info

Publication number
CN114519738A
CN114519738A
Authority
CN
China
Prior art keywords
robot
camera
coordinate system
hand
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210078209.4A
Other languages
Chinese (zh)
Inventor
侯坤 (Hou Kun)
乔大勇 (Qiao Dayong)
李萌新 (Li Mengxin)
汪佳静 (Wang Jiajing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Research Institute of Northwestern Polytechnical University
Original Assignee
Ningbo Research Institute of Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Research Institute of Northwestern Polytechnical University filed Critical Ningbo Research Institute of Northwestern Polytechnical University
Priority to CN202210078209.4A priority Critical patent/CN114519738A/en
Publication of CN114519738A publication Critical patent/CN114519738A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to a hand-eye calibration error correction method based on the ICP (Iterative Closest Point) algorithm, where the hand-eye calibration setup comprises a calibration object, a robot and a camera. The point cloud coordinates in the camera are converted through a matrix transformation relation X into converted coordinates in the robot coordinate system; the actual coordinates of the same points in the robot coordinate system are collected with a robot TCP (Tool Center Point) tool; the two point sets, converted and actual, are matched by the ICP algorithm to obtain a hand-eye calibration error matrix of the translation and rotation between them, and a corrected hand-eye calibration matrix is obtained.

Description

Hand-eye calibration error correction method based on ICP algorithm
Technical Field
The invention relates to the technical field of hand-eye calibration of robots and cameras, and in particular to a hand-eye calibration error correction method based on the ICP (Iterative Closest Point) algorithm.
Background
In the field of three-dimensional vision and robot applications, hand-eye calibration establishes the matrix relation between the robot coordinate system and the camera coordinate system, so that coordinates acquired by the camera can be converted into the robot coordinate system and the robot can carry out accurate operation tasks.

The positional relation between robot and camera falls into two mounting modes, eye-to-hand and eye-in-hand. In the eye-to-hand mode the camera is fixed at a position outside the robot; in the eye-in-hand mode the camera is fixed on the robot's end flange and moves with the robot. In the eye-to-hand mode the camera generally acquires image data of a fixed scene, and hand-eye calibration relates the camera coordinate system to the robot base coordinate system. In the eye-in-hand mode the camera moves with the robot and can acquire image data from different views, which ensures data completeness; here hand-eye calibration relates the camera coordinate system to the coordinate system of the robot's end flange.

At present, hand-eye calibration is mainly performed by having the camera photograph a calibration object to identify its pose, acquiring the robot coordinates, and then solving the hand-eye calibration equation. The accuracy obtained this way is easily affected by the precision of the camera and the robot, the pose identification accuracy, and the accuracy of the equation-solving method; these effects accumulate into an error, so that the robot cannot move accurately after the camera locates a target point.
Hand-eye calibration accuracy directly affects the robot's operating accuracy, and improving it is an urgent task for production processes with high precision requirements such as welding and assembly.
Existing methods for improving hand-eye calibration accuracy can only optimize the pose identification of the calibration object and the algorithms for solving the hand-eye calibration equation; the computed hand-eye calibration matrix still deviates from the true value, so high-precision task requirements cannot be met and the accuracy improvement is limited.
Disclosure of Invention
The invention addresses the problem of how to correct hand-eye calibration errors and thereby improve hand-eye calibration accuracy.
In order to solve the above problems, the present invention provides a hand-eye calibration error correction method based on the ICP algorithm, where the hand-eye calibration setup comprises a calibration object, a robot and a camera, and the method comprises the following steps:
step 1, moving the calibration object and the camera relative to each other under robot drive to obtain a matrix transformation relation X between the camera coordinate system and the robot coordinate system;
step 2, collecting a point cloud image of the calibration object with the camera, obtaining N point cloud coordinates from the point cloud image, and converting the N point cloud coordinates acquired in the camera frame into converted coordinates of the robot through the matrix transformation relation X;
step 3, using a robot TCP (Tool Center Point) tool, moving the robot to the N point cloud coordinates and recording them as the actual coordinates of the robot;
and step 4, computing the hand-eye calibration error matrix between the N actual coordinates and the N converted coordinates with the ICP (Iterative Closest Point) algorithm to obtain an error-corrected hand-eye calibration matrix.
The beneficial effects of the invention are: the point cloud coordinates in the camera are converted through the matrix transformation relation X into converted coordinates in the robot coordinate system; the actual coordinates of the same points in the robot coordinate system are collected with a robot TCP tool; the two point sets, converted and actual, are matched by the ICP algorithm, yielding a hand-eye calibration error matrix of the translation and rotation between them and thus a corrected hand-eye calibration matrix.
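Concretely, because the converted and actual point sets correspond one to one, the ICP matching step reduces to a single closed-form rigid alignment between two corresponding point sets. A minimal sketch of that alignment (the Kabsch/SVD least-squares solution, written in Python with NumPy; the function name and the synthetic data are illustrative, not from the patent) might look like:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q.

    P, Q: (N, 3) arrays of corresponding 3-D points (Kabsch/SVD solution)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic check: rotate and translate a point set, then recover the motion.
rng = np.random.default_rng(0)
P = rng.random((10, 3))                       # stand-in for converted coordinates
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.10, -0.20, 0.05])
Q = P @ R_true.T + t_true                     # stand-in for actual coordinates
R_est, t_est = rigid_transform(P, Q)
```

With real data, P would hold the converted coordinates and Q the actual coordinates collected with the TCP tool; the recovered rotation and translation would then form the hand-eye calibration error matrix X0.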
Preferably, in step 1 the relative motion between the calibration object and the camera under robot drive is specifically: the calibration object is fixed on the robot's end flange, and the camera is mounted at a fixed position relative to the robot base coordinate system, so that the positional relation between robot and camera is the eye-to-hand mode.
Preferably, obtaining the matrix transformation relation X between the camera coordinate system and the robot coordinate system in step 1 specifically comprises:
step 101A, the robot drives the calibration object to move, and the pose data of the robot's end flange shown in the teach pendant are recorded; at the same time, the camera acquires raw image data for each movement of the calibration object;
step 102A, pose identification is performed on the raw image data using homography transformation in OpenCV, giving the pose of the calibration object in the camera coordinate system for each raw image;
and step 103A, with the end-flange pose data from the teach pendant and the pose data in the camera coordinate system, the solution X of the equation AX = XB is computed through the OpenCV hand-eye calibration function, giving the matrix transformation relation X between the camera coordinate system and the robot base coordinate system.
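In OpenCV this equation is solved by the hand-eye calibration function (`cv2.calibrateHandEye`). To make the structure of an AX = XB solver concrete, the following is a minimal self-contained NumPy sketch in the style of the Park and Martin method; it is an illustration under simplifying assumptions (exact, noise-free motion pairs), not the patent's implementation:

```python
import numpy as np

def rotvec(R):
    """Rotation vector (axis * angle) of a rotation matrix, 0 < angle < pi."""
    theta = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2 * np.sin(theta))
    return axis * theta

def rotmat(axis, theta):
    """Rodrigues' formula: rotation matrix from axis and angle."""
    a = np.asarray(axis, float); a = a / np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def solve_ax_xb(As, Bs):
    """Least-squares X solving A_i X = X B_i for 4x4 motion pairs (As, Bs)."""
    alphas = np.array([rotvec(A[:3, :3]) for A in As])
    betas = np.array([rotvec(B[:3, :3]) for B in Bs])
    # Rotation: alpha_i = R_X beta_i, solved by Kabsch without centering.
    H = betas.T @ alphas
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R_X = Vt.T @ D @ U.T
    # Translation: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked least squares.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    v = np.concatenate([R_X @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    t_X, *_ = np.linalg.lstsq(M, v, rcond=None)
    X = np.eye(4); X[:3, :3] = R_X; X[:3, 3] = t_X
    return X

def make_T(R, t):
    T = np.eye(4); T[:3, :3] = R; T[:3, 3] = t
    return T

# Synthetic check: pick X, build motion pairs A_i = X B_i X^-1, recover X.
X_true = make_T(rotmat([1, 2, 3], 0.7), np.array([0.1, 0.2, -0.3]))
Bs = [make_T(rotmat([0, 0, 1], 0.9), np.array([0.3, 0.0, 0.1])),
      make_T(rotmat([0, 1, 0], 1.1), np.array([0.0, 0.2, 0.0])),
      make_T(rotmat([1, 0, 0], 0.5), np.array([0.1, 0.1, 0.1]))]
As = [X_true @ B @ np.linalg.inv(X_true) for B in Bs]
X_est = solve_ax_xb(As, Bs)
```

At least two motions with non-parallel rotation axes are needed for a unique solution, which is why hand-eye calibration routines collect several robot poses.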
Preferably, in step 2 the N point cloud coordinates acquired by the camera are converted through the matrix transformation relation X into converted coordinates in the robot base coordinate system;
in step 3, a robot TCP tool is used to move the robot to the N point cloud coordinates, which are recorded as the actual coordinates in the robot base coordinate system;
the hand-eye calibration matrix obtained after error correction in the step 4 specifically includes:
Figure BDA0003484969760000031
in the formula, Tcane blazing aFor collected point cloud coordinates, T i2, converting the point cloud coordinate into a conversion coordinate under a robot base coordinate system, wherein i is more than or equal to 1 and less than or equal to N; t isjJ is more than or equal to 1 and less than or equal to N for the actual coordinate of the robot base coordinate system obtained in the step 3; x0Hand-eye calibration error matrix, X, for calculating rigid transformations between coordinates in ICP algorithm0And X is a corrected hand-eye calibration matrix.
Preferably, in step 1 the relative motion between the calibration object and the camera under robot drive is specifically: the camera is fixed on the robot's end flange, and the calibration object is mounted at a fixed position relative to the robot base coordinate system, so that the positional relation between robot and camera is the eye-in-hand mode.
Preferably, obtaining the matrix transformation relation X between the camera coordinate system and the robot coordinate system in step 1 specifically comprises:
step 101B, the robot drives the camera to move, and the pose data of the robot's end flange shown in the teach pendant are recorded; at the same time, the camera acquires raw image data of the calibration object for each movement;
step 102B, pose identification is performed on the raw image data using homography transformation in OpenCV, giving the pose of the calibration object in the camera coordinate system for each raw image;
and step 103B, with the end-flange pose data from the teach pendant and the pose data in the camera coordinate system, the solution X of the equation AX = XB is computed through the OpenCV hand-eye calibration function, giving the matrix transformation relation X between the camera coordinate system and the coordinate system of the robot's end flange.
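The homography-based pose identification used in steps 102A/102B can be sketched as follows. For a planar calibration target at Z = 0, the homography factors as H ~ K [r1 r2 t] (Zhang's method), where K is the camera intrinsic matrix. A minimal NumPy reconstruction under noise-free assumptions (all names and values illustrative) is:

```python
import numpy as np

def pose_from_homography(H, K):
    """Recover target pose (R, t) in the camera frame from the plane
    homography H (target plane Z=0 to image), given intrinsics K."""
    A = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(A[:, 0])       # homography is defined up to scale
    if A[2, 2] < 0:                           # enforce target in front of camera
        lam = -lam
    r1, r2, t = lam * A[:, 0], lam * A[:, 1], lam * A[:, 2]
    r3 = np.cross(r1, r2)
    R = np.column_stack([r1, r2, r3])
    U, _, Vt = np.linalg.svd(R)               # re-orthonormalize against noise
    return U @ Vt, t

# Synthetic check: build H from a known pose and recover that pose.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
theta = 0.2
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.05, -0.02, 0.6])
H = K @ np.column_stack([R_true[:, 0], R_true[:, 1], t_true])
R_est, t_est = pose_from_homography(H, K)
```

In practice OpenCV computes H from detected target features and handles the noisy case; the sketch only shows the geometric decomposition behind the pose identification.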
Preferably, in step 2 the N point cloud coordinates acquired by the camera are converted through the matrix transformation relation X into converted coordinates in the coordinate system of the robot's end flange;
in step 3, a robot TCP tool is used to move the robot to the N point cloud coordinates, the coordinates in the robot base coordinate system are obtained, and these are converted into the actual coordinates of the robot's end flange;
the hand-eye calibration matrix obtained after error correction in the step 4 specifically includes:
Figure BDA0003484969760000041
in the formula, Tcane blazing aFor collected point cloud coordinates, TnConverting the point cloud coordinate into a conversion coordinate of a flange plate coordinate system at the tail end of the robot shaft in the step 2, wherein N is more than or equal to 1 and less than or equal to N; t isnN is more than or equal to 1 and less than or equal to N for the actual coordinates of the flange plate at the tail end of the robot shaft obtained in the step 3; x0Hand-eye calibration error matrix, X, for calculating rigid transformations between coordinates in ICP algorithm0X is the corrected hand-eye calibration matrix.
Drawings
FIG. 1 is a schematic diagram of the hand-eye calibration setup in embodiment 1 of the present invention;
fig. 2 is a schematic diagram of the hand-eye calibration setup in embodiment 2 of the present invention.
Description of reference numerals:
1. robot; 2. camera; 3. calibration object.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Embodiment 1
A hand-eye calibration error correction method based on the ICP algorithm, where the hand-eye calibration setup comprises a calibration object, a robot and a camera, comprising the following steps:
step 1, as shown in fig. 1, the calibration object is fixed on the robot's end flange and the camera is mounted at a fixed position relative to the robot base coordinate system, so that the positional relation between robot and camera is the eye-to-hand mode;
step 101A, the robot drives the calibration object to move; during the motion the relative position of the calibration object and the robot's end flange stays unchanged, and at each movement the pose of the end flange shown in the teach pendant is recorded; at the same time, the camera acquires raw image data for each movement of the calibration object;
step 102A, using the camera's known intrinsic parameters, pose identification is performed on the raw image data via homography transformation in OpenCV, giving the pose of the calibration object in the camera coordinate system for each raw image;
step 103A, with the end-flange pose data from the teach pendant and the pose data in the camera coordinate system, the solution X of the equation AX = XB is computed through the OpenCV hand-eye calibration function, giving the matrix transformation relation X between the camera coordinate system and the robot base coordinate system; in this embodiment, the hand-eye calibration equation AX = XB and the matrix transformation relation X are prior art and are not described in detail here;
step 2, a point cloud image of the calibration object is collected with the camera, N point cloud coordinates are obtained from it, and the N point cloud coordinates acquired in the camera frame are converted through the matrix transformation relation X into converted coordinates in the robot base coordinate system, recorded as T_i;
step 3, using a robot TCP tool, the robot is moved to the N point cloud coordinates, which are recorded as the actual coordinates in the robot base coordinate system. Specifically: a touch probe is fixed at the robot end, and the robot drives the probe tip to touch the N point cloud positions in sequence, obtaining the actual coordinates in the robot base coordinate system, recorded as T_j. In this embodiment, the points of the converted coordinates T_i and the actual coordinates T_j correspond one to one;
step 4, the hand-eye calibration error matrix between the N actual coordinates and the N converted coordinates is computed with the ICP algorithm; the error-corrected hand-eye calibration matrix is specifically (the equation appears only as an image in the original; in substance it is the ICP objective):

X0 = argmin over rigid transforms Z of Σ || T_j - Z · T_i ||² (summed over the N corresponding pairs), and the corrected hand-eye calibration matrix is X0 · X,

where T_camera denotes the collected point cloud coordinates, with T_i = X · T_camera,i; T_i (1 ≤ i ≤ N) are the converted coordinates in the robot base coordinate system from step 2; T_j (1 ≤ j ≤ N) are the actual coordinates in the robot base coordinate system obtained in step 3; and X0 is the hand-eye calibration error matrix, the rigid transformation between the two coordinate sets computed by the ICP algorithm.
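Note that the corrected matrix composes the error matrix on the left: applying X first and then X0 is the same as applying X0 · X once, and the order of composition matters. A tiny NumPy illustration with hypothetical values (not from the patent):

```python
import numpy as np

# Hypothetical uncorrected hand-eye matrix X and a small residual error X0.
eps = 0.01                                  # ~0.57 degree residual rotation
X = np.eye(4)
X[:3, 3] = [0.5, 0.2, 0.8]                  # illustrative translation only
X0 = np.array([[np.cos(eps), -np.sin(eps), 0.0,  0.002],
               [np.sin(eps),  np.cos(eps), 0.0, -0.001],
               [0.0, 0.0, 1.0, 0.003],
               [0.0, 0.0, 0.0, 1.0]])

X_corrected = X0 @ X                        # corrected hand-eye calibration matrix

p = np.array([0.1, 0.2, 0.3, 1.0])          # homogeneous camera-frame point
same = np.allclose(X_corrected @ p, X0 @ (X @ p))   # X0 after X equals X0·X
swapped = np.allclose(X0 @ X, X @ X0)               # composing the other way differs
```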
Embodiment 2
A hand-eye calibration error correction method based on the ICP algorithm, where the hand-eye calibration setup comprises a calibration object, a robot and a camera, comprising the following steps:
step 1, as shown in fig. 2, the camera is fixed on the robot's end flange and the calibration object is mounted at a fixed position relative to the robot base coordinate system, so that the positional relation between robot and camera is the eye-in-hand mode;
step 101B, the robot drives the camera to move; during the motion the relative position of the camera and the robot's end flange stays unchanged, and the pose of the end flange shown in the teach pendant is recorded; at the same time, the camera acquires raw image data of the calibration object for each movement;
step 102B, pose identification is performed on the raw image data using homography transformation in OpenCV, giving the pose of the calibration object in the camera coordinate system for each raw image;
step 103B, with the end-flange pose data from the teach pendant and the pose data in the camera coordinate system, the solution X of the equation AX = XB is computed through the OpenCV hand-eye calibration function, giving the matrix transformation relation X between the camera coordinate system and the coordinate system of the robot's end flange;
step 2, a point cloud image of the calibration object is collected with the camera, N point cloud coordinates are obtained from it, and the N point cloud coordinates acquired in the camera frame are converted through the matrix transformation relation X into converted coordinates in the end-flange coordinate system, recorded as T_m;
step 3, using a robot TCP tool, the robot is moved to the N point cloud coordinates to obtain the coordinates in the robot base coordinate system, which are then converted into the actual coordinates of the robot's end flange. Specifically: a touch probe is fixed at the robot end and its tip is calibrated with the four-point TCP method (the same point is touched from four or more different orientations, after which the robot can automatically compute the probe-tip coordinates in the robot base coordinate system). The actual coordinates T_n correspond one to one with the converted coordinates T_m;
step 4, the hand-eye calibration error matrix between the N actual coordinates and the N converted coordinates is computed with the ICP algorithm; the error-corrected hand-eye calibration matrix is specifically (the equation appears only as an image in the original; in substance it is the ICP objective):

X0 = argmin over rigid transforms Z of Σ || T_n - Z · T_m ||² (summed over the N corresponding pairs), and the corrected hand-eye calibration matrix is X0 · X,

where T_camera denotes the collected point cloud coordinates, with T_m = X · T_camera,m; T_m (1 ≤ m ≤ N) are the converted coordinates in the end-flange coordinate system from step 2; T_n (1 ≤ n ≤ N) are the actual end-flange coordinates obtained in step 3; and X0 is the hand-eye calibration error matrix, the rigid transformation between the two coordinate sets computed by the ICP algorithm.
In addition, the calibration object in embodiments 1 and 2 is a checkerboard calibration board or a calibration sphere, both of which are easy to use for pose identification, and the camera is a depth camera.
Although the present disclosure has been described above, its scope is not limited to the foregoing. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the disclosure, and such changes and modifications fall within the scope of the invention.

Claims (7)

1. A hand-eye calibration error correction method based on the ICP algorithm, where the hand-eye calibration setup comprises a calibration object, a robot and a camera, characterized by comprising the following steps:
step 1, moving the calibration object and the camera relative to each other under robot drive to obtain a matrix transformation relation X between the camera coordinate system and the robot coordinate system;
step 2, collecting a point cloud image of the calibration object with the camera, obtaining N point cloud coordinates from the point cloud image, and converting the N point cloud coordinates acquired in the camera frame into converted coordinates of the robot through the matrix transformation relation X;
step 3, using a robot TCP tool, moving the robot to the N point cloud coordinates and recording them as the actual coordinates of the robot;
and step 4, computing a hand-eye calibration error matrix between the N actual coordinates and the N converted coordinates with the ICP algorithm to obtain an error-corrected hand-eye calibration matrix.
2. The ICP-algorithm-based hand-eye calibration error correction method according to claim 1, characterized in that the relative motion between the calibration object and the camera under robot drive in step 1 is specifically: the calibration object is fixed on the robot's end flange, and the camera is mounted at a fixed position relative to the robot base coordinate system.
3. The ICP-algorithm-based hand-eye calibration error correction method according to claim 2, characterized in that obtaining the matrix transformation relation X between the camera coordinate system and the robot coordinate system in step 1 specifically comprises:
step 101A, the robot drives the calibration object to move, and the pose data of the robot's end flange shown in the teach pendant are recorded; at the same time, the camera acquires raw image data for each movement of the calibration object;
step 102A, pose identification is performed on the raw image data using homography transformation in OpenCV, giving the pose of the calibration object in the camera coordinate system for each raw image;
and step 103A, with the end-flange pose data from the teach pendant and the pose data in the camera coordinate system, the solution X of the equation AX = XB is computed through the OpenCV hand-eye calibration function, giving the matrix transformation relation X between the camera coordinate system and the robot base coordinate system.
4. The ICP-algorithm-based hand-eye calibration error correction method according to claim 3, characterized in that in step 2 the N point cloud coordinates acquired by the camera are converted through the matrix transformation relation X into converted coordinates in the robot base coordinate system;
in step 3, a robot TCP tool is used to move the robot to the N point cloud coordinates, which are recorded as the actual coordinates in the robot base coordinate system;
the error-corrected hand-eye calibration matrix obtained in step 4 is specifically:
(the equation appears only as an image in the original; in substance it is the ICP objective)

X0 = argmin over rigid transforms Z of Σ || T_j - Z · T_i ||² (summed over the N corresponding pairs), and the corrected hand-eye calibration matrix is X0 · X,

where T_camera denotes the collected point cloud coordinates, with T_i = X · T_camera,i; T_i (1 ≤ i ≤ N) are the converted coordinates in the robot base coordinate system from step 2; T_j (1 ≤ j ≤ N) are the actual coordinates in the robot base coordinate system obtained in step 3; and X0 is the hand-eye calibration error matrix, the rigid transformation between the coordinates computed by the ICP algorithm.
5. The ICP-algorithm-based hand-eye calibration error correction method according to claim 1, characterized in that the relative motion between the calibration object and the camera under robot drive in step 1 is specifically: the camera is fixed on the robot's end flange, and the calibration object is mounted at a fixed position relative to the robot base coordinate system.
6. The ICP-algorithm-based hand-eye calibration error correction method according to claim 5, characterized in that obtaining the matrix transformation relation X between the camera coordinate system and the robot coordinate system in step 1 specifically comprises:
step 101B, the robot drives the camera to move, and the pose data of the robot's end flange shown in the teach pendant are recorded; at the same time, the camera acquires raw image data of the calibration object for each movement;
step 102B, pose identification is performed on the raw image data using homography transformation in OpenCV, giving the pose of the calibration object in the camera coordinate system for each raw image;
and step 103B, with the end-flange pose data from the teach pendant and the pose data in the camera coordinate system, the solution X of the equation AX = XB is computed through the OpenCV hand-eye calibration function, giving the matrix transformation relation X between the camera coordinate system and the coordinate system of the robot's end flange.
7. The ICP-algorithm-based hand-eye calibration error correction method according to claim 5, characterized in that in step 2 the N point cloud coordinates acquired by the camera are converted through the matrix transformation relation X into converted coordinates in the coordinate system of the robot's end flange;
in step 3, a robot TCP tool is used to move the robot to the N point cloud coordinates, the coordinates in the robot base coordinate system are obtained, and these are converted into the actual coordinates of the robot's end flange;
the error-corrected hand-eye calibration matrix obtained in step 4 is specifically:
(the equation appears only as an image in the original; in substance it is the ICP objective)

X0 = argmin over rigid transforms Z of Σ || T_n - Z · T_m ||² (summed over the N corresponding pairs), and the corrected hand-eye calibration matrix is X0 · X,

where T_camera denotes the collected point cloud coordinates, with T_m = X · T_camera,m; T_m (1 ≤ m ≤ N) are the converted coordinates in the end-flange coordinate system from step 2; T_n (1 ≤ n ≤ N) are the actual end-flange coordinates obtained in step 3; and X0 is the hand-eye calibration error matrix, the rigid transformation between the coordinates computed by the ICP algorithm.
CN202210078209.4A 2022-01-24 2022-01-24 Hand-eye calibration error correction method based on ICP algorithm Pending CN114519738A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210078209.4A CN114519738A (en) 2022-01-24 2022-01-24 Hand-eye calibration error correction method based on ICP algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210078209.4A CN114519738A (en) 2022-01-24 2022-01-24 Hand-eye calibration error correction method based on ICP algorithm

Publications (1)

Publication Number Publication Date
CN114519738A true CN114519738A (en) 2022-05-20

Family

ID=81596095

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210078209.4A Pending CN114519738A (en) 2022-01-24 2022-01-24 Hand-eye calibration error correction method based on ICP algorithm

Country Status (1)

Country Link
CN (1) CN114519738A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115026814A (en) * 2022-06-01 2022-09-09 中科苏州智能计算技术研究院 Camera automatic calibration method for mechanical arm motion space reconstruction
CN115026814B (en) * 2022-06-01 2024-04-12 中科苏州智能计算技术研究院 Camera automatic calibration method for mechanical arm movement space reconstruction
CN115139283A (en) * 2022-07-18 2022-10-04 中船重工鹏力(南京)智能装备系统有限公司 Robot hand-eye calibration method based on random mark dot matrix
CN115139283B (en) * 2022-07-18 2023-10-24 中船重工鹏力(南京)智能装备系统有限公司 Robot hand-eye calibration method based on random mark dot matrix
CN115345943A (en) * 2022-08-08 2022-11-15 恩纳基智能科技无锡有限公司 Calibration method based on differential mode concept
CN115345943B (en) * 2022-08-08 2024-04-16 恩纳基智能装备(无锡)股份有限公司 Calibration method based on differential mode concept
CN115741666A (en) * 2022-08-31 2023-03-07 深圳前海瑞集科技有限公司 Robot hand-eye calibration method, robot and robot operation method
CN115229805A (en) * 2022-09-21 2022-10-25 北京壹点灵动科技有限公司 Hand-eye calibration method and device for surgical robot, storage medium and processor
CN115229805B (en) * 2022-09-21 2022-12-09 北京壹点灵动科技有限公司 Hand-eye calibration method and device for surgical robot, storage medium and processor
WO2024109403A1 (en) * 2022-11-24 2024-05-30 梅卡曼德(北京)机器人科技有限公司 3d camera calibration method, point cloud image acquisition method, and camera calibration system
CN117021137A (en) * 2023-09-01 2023-11-10 无锡斯帝尔科技有限公司 Visual teaching device adapting to various polishing tools

Similar Documents

Publication Publication Date Title
CN114519738A (en) Hand-eye calibration error correction method based on ICP algorithm
JP6966582B2 (en) Systems and methods for automatic hand-eye calibration of vision systems for robot motion
CN111300422B (en) Robot workpiece grabbing pose error compensation method based on visual image
CN110238849B (en) Robot hand-eye calibration method and device
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
CN108818535B (en) Robot 3D vision hand-eye calibration method
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
US20190184570A1 (en) Intelligent robots
JP6429473B2 (en) Robot system, robot system calibration method, program, and computer-readable recording medium
US8244402B2 (en) Visual perception system and method for a humanoid robot
CN113379849B (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN109794963B (en) Robot rapid positioning method facing curved surface component
CN110717943A (en) Method and system for calibrating eyes of on-hand manipulator for two-dimensional plane
JP2013036988A (en) Information processing apparatus and information processing method
CN109940626B (en) Control method of eyebrow drawing robot system based on robot vision
CN113172659B (en) Flexible robot arm shape measuring method and system based on equivalent center point identification
CN112958960B (en) Robot hand-eye calibration device based on optical target
US20220395981A1 (en) System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system
EP4101604A1 (en) System and method for improving accuracy of 3d eye-to-hand coordination of a robotic system
CN109900251A (en) A kind of robotic positioning device and method of view-based access control model technology
CN113172632A (en) Simplified robot vision servo control method based on images
CN113102882B (en) Geometric error compensation model training method and geometric error compensation method
JP2021024053A (en) Correction method of visual guidance robot arm
CN113664826A (en) Robot grabbing method and system in unknown environment
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination