CN113362396B - Mobile robot 3D hand-eye calibration method and device


Info

Publication number
CN113362396B
Authority
CN
China
Prior art keywords
robot
hand
coordinate system
calibration
camera
Prior art date
Legal status
Active
Application number
CN202110689530.1A
Other languages
Chinese (zh)
Other versions
CN113362396A (en)
Inventor
邓辉
李华伟
王益亮
陈忠伟
石岩
Current Assignee
Shanghai Xiangong Intelligent Technology Co ltd
Original Assignee
Shanghai Xiangong Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Xiangong Intelligent Technology Co ltd filed Critical Shanghai Xiangong Intelligent Technology Co ltd
Priority to CN202110689530.1A priority Critical patent/CN113362396B/en
Publication of CN113362396A publication Critical patent/CN113362396A/en
Application granted granted Critical
Publication of CN113362396B publication Critical patent/CN113362396B/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0219 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/11 Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
    • G06F17/12 Simultaneous equations, e.g. systems of linear equations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/16 Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Abstract

The invention provides a mobile robot 3D hand-eye calibration method and device. The method comprises the following steps: S1, calibrating the camera viewing range so that it covers all degrees of freedom of the mobile robot; S2, recording the point cloud data of a calibration sphere held by the mobile robot in the camera coordinate system, together with the corresponding robot pose in the current robot coordinate system, forming n groups of point cloud data; S3, extracting the calibration-sphere point cloud and determining, by a least-squares fitting algorithm, the coordinate value of the sphere center point in the camera coordinate system for each group of point cloud data; S4, solving an offset and a hand-eye calibration matrix from the robot poses and the corresponding sphere-center coordinates according to a first algorithm; S5, converting the hand-eye calibration matrix into Euler-angle form and taking the hand-eye calibration matrix result as the initial value of a nonlinear optimization, then calculating, according to a second algorithm, the differences between the hand-eye calibration matrix and the offset and their true values, and correcting the hand-eye calibration matrix with these differences. The adaptation requirement on the robot's degrees of freedom is thereby reduced and the universality of the method improved.

Description

Mobile robot 3D hand-eye calibration method and device
Technical Field
The invention relates to the technical field of machine vision, and in particular to a 3D hand-eye calibration method and device suitable for robots with as few as one degree of freedom of motion.
Background
As robot technology penetrates ever deeper into industrial and everyday scenes, single-robot technologies such as robotic arms, AGV carts and AMRs have matured to fit the needs of different scenes. A single robot serving as an actuator can replace people in high-risk, highly repetitive and physically demanding work such as container transport, logistics sorting and mechanical assembly. As an actuator, however, a robot usually has no perception capability of its own. A 3D camera can acquire three-dimensional information of an object and so perceive its spatial position and attitude. In most working scenarios the robot therefore needs 3D vision to provide guidance so that the two can work intelligently together. 3D vision and the robot are independent functional modules, and hand-eye calibration establishes the connection between the two: it converts the object position information recognized by 3D vision into coordinate values the robot can understand, guiding the robot to the target position to complete the corresponding work.
Hand-eye calibration mainly establishes the transformation between the camera coordinate system and the robot coordinate system. The main research effort of current hand-eye calibration methods concerns calibrating a robotic arm against a 3D camera, for guiding the arm in work such as part loading and unloading or logistics palletizing and depalletizing. Such arm-camera methods use a dedicated calibration plate or calibration sphere as the medium, pairing the medium's coordinates in the camera coordinate system, acquired by the 3D camera, with its coordinates in the robot coordinate system at the same moment; from the coordinates of the same object in the two coordinate systems, the transformation between them is solved. Since the medium is an object with spatial extent rather than a point, it is often difficult in practice to obtain the coordinates of a fixed point on the medium in both the camera coordinate system and the robot coordinate system. Mainstream hand-eye calibration methods therefore solve the transformation either with manual assistance or by stipulating that the robot move in a particular pattern; manual assistance, however, has a low degree of automation and cannot be applied at scale.
To this end, the prior art proposes a standard-sphere-based robot hand-eye calibration method and device (CN108582076A), the method comprising: acquiring point cloud information of a standard sphere in the camera coordinate system and position information of the robot TCP in the robot base coordinate system; locating the spherical surface of the standard sphere from the point cloud information, determining the position of its center, and determining the three-dimensional coordinate value of the sphere center in the camera coordinate system with a maximum-likelihood estimation algorithm; and establishing an overdetermined transformation-matrix equation for the three-dimensional coordinate values from the camera coordinate system to the robot base coordinate system, then determining the homogeneous transformation matrix of the camera coordinate system in the robot base coordinate system by a least-squares optimization algorithm, thereby achieving robot hand-eye calibration.
However, this prior art is aimed mainly at six-degree-of-freedom robotic arms and therefore requires the robot to perform spatial rotational and translational movements in a specific manner. Although its calibration process is fully automated, it requires the robot to possess three rotational degrees of freedom in space, whereas mobile robots such as AGV carts and AMRs have only one rotational degree of freedom, so the prior art cannot be applied to them. To fill this gap, a new solution is needed in the art.
Disclosure of Invention
The invention mainly aims to provide a mobile robot 3D hand-eye calibration method and device, so as to reduce the adaptation requirement on the degree of freedom of a robot and improve the universality.
In order to achieve the above purpose, the present invention provides a mobile robot 3D hand-eye calibration method comprising the steps of:
S1, calibrating the camera viewing range to cover all degrees of freedom of the mobile robot;
S2, recording the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, together with the corresponding robot pose rP_i in the current robot coordinate system, forming n groups of point cloud data;
S3, extracting the calibration-sphere point cloud and, for each group of point cloud data, determining the coordinate value cT_i of the sphere center point in the camera coordinate system by a least-squares fitting algorithm;
S4, solving the offset Δt and the hand-eye calibration matrix H from the robot poses rP_i and the corresponding calibration-sphere center coordinates cT_i according to a first algorithm;
S5, converting the hand-eye calibration matrix H into Euler-angle form and taking the hand-eye calibration matrix result H as the initial value of a nonlinear optimization; then calculating, according to a second algorithm, the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their respective true values, and correcting the hand-eye calibration matrix according to dH and dΔt.
In a possibly preferred embodiment, the first algorithm comprises the steps:
S41, from each robot pose rP_i = (rT_i, rR_i), with position rT_i = [x_i, y_i, z_i]^T and attitude given by the Euler angles rR_i = (Rx_i, Ry_i, Rz_i), obtaining the rotation matrix R_i via the Euler-angle-to-rotation-matrix conversion;
S42, setting the offset to Δt = [Δx, Δy, Δz]^T, so that the coordinate value of the calibration sphere's center point in the robot coordinate system at the corresponding moment is

rT_i' = R_i·Δt + rT_i;

S43, setting the hand-eye calibration matrix from the camera coordinate system to the robot coordinate system as H, so that the sphere-center coordinate in the camera coordinate system transfers into the robot coordinate system as H·[cT_i; 1]; the robot pose and the sphere-center coordinate in the camera coordinate system at the same moment then establish the equation

R_i·Δt + rT_i = H·[cT_i; 1],

wherein the offset Δt and the hand-eye calibration matrix H are the unknown quantity X to be solved;
S44, establishing the equation of step S43 for the n groups of corresponding robot poses and calibration-sphere center coordinates, and stacking the equations into the matrix form A·X = b;
S45, solving the linear equation by singular value decomposition to obtain the unknown quantity X, which contains the sought offset Δt and hand-eye calibration matrix H, as sketched below.
In a possibly preferred embodiment, the second algorithm comprises the steps:
S51, converting the hand-eye calibration matrix H into the Euler-angle form H = [α, β, γ, t_x, t_y, t_z], wherein the rotation matrix R_H and the Euler angles r = [α, β, γ] are related by

R(α, β, γ) = Rz(γ)·Ry(β)·Rx(α) =
[ cosβ·cosγ   sinα·sinβ·cosγ - cosα·sinγ   cosα·sinβ·cosγ + sinα·sinγ ]
[ cosβ·sinγ   sinα·sinβ·sinγ + cosα·cosγ   cosα·sinβ·sinγ - sinα·cosγ ]
[ -sinβ       sinα·cosβ                    cosα·cosβ                  ];

S52, setting the differences between the hand-eye calibration matrix H and the offset Δt and their true values as dH and dΔt respectively, expressed in Euler-angle form as dH = [dα, dβ, dγ, dt_x, dt_y, dt_z] and dΔt = [dΔx, dΔy, dΔz], wherein all values in dH and dΔt tend to 0; when dα, dβ, dγ ≈ 0, one may approximate sin dα ≈ dα and cos dα ≈ 1, so that the rotation increment takes the matrix form

dR ≈
[  1    -dγ    dβ ]
[  dγ    1    -dα ]
[ -dβ    dα    1  ];

S53, substituting H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, it can be solved by the linear least-squares method to obtain the compensations dH and dΔt, thereby obtaining a more accurate hand-eye calibration matrix, as sketched after this list.
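A sketch of the second algorithm in the same vein (hypothetical names; it reuses euler_to_matrix and the data layout from the previous sketch, and the small-angle form dR ≈ I + skew(dr) that follows from sin dθ ≈ dθ, cos dθ ≈ 1):

```python
import numpy as np

def skew(v):
    """Cross-product matrix: skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def refine_handeye(robot_poses, sphere_centers_cam, H, dt):
    """One linearized refinement step (S51-S53): solve for the small corrections
    y = [dr (3), d_tH (3), d_dt (3)] to H's rotation, H's translation and the
    offset dt, by linear least squares about the first-pass solution."""
    RH, tH = H[:3, :3], H[:3, 3]
    rows, rhs = [], []
    for (rT, rR), cT in zip(robot_poses, sphere_centers_cam):
        Ri = euler_to_matrix(*rR)   # from the first-algorithm sketch
        v = RH @ cT                 # sphere center rotated into the robot frame
        J = np.zeros((3, 9))
        J[:, 0:3] = -skew(v)        # small rotation dr: skew(dr) @ v == -skew(v) @ dr
        J[:, 3:6] = np.eye(3)       # translation correction of H
        J[:, 6:9] = -Ri             # offset correction, moved to the left-hand side
        rows.append(J)
        rhs.append((Ri @ dt + rT) - (RH @ cT + tH))   # current model error
    y = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)[0]
    dr, d_tH, d_dt = y[:3], y[3:6], y[6:9]
    H_new = np.eye(4)
    H_new[:3, :3] = (np.eye(3) + skew(dr)) @ RH   # apply the small rotation
    H_new[:3, 3] = tH + d_tH
    return H_new, dt + d_dt
```

Because the entries of dH and dΔt tend to zero, a single linear step is normally enough; the update can be iterated if the first-pass solution is poor.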
In a possibly preferred embodiment, the step of recording the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, and the corresponding robot pose rP_i in the current robot coordinate system, comprises:
S21, moving the robot into the camera viewing range and letting it stand still, then triggering the camera to shoot the first group of calibration-sphere point cloud data cP_1, while recording the pose rP_1 of the robot in the robot base coordinate system at that moment; the pose rP_1 comprises the coordinate value rT_1 = [x_1, y_1, z_1]^T of the robot end effector in the robot coordinate system and its attitude rR_1 = (Rx_1, Ry_1, Rz_1), the pose information being represented by Euler angles, Rx_1, Ry_1, Rz_1 being the rotation angles about the X, Y, Z axes respectively;
S22, changing the position and attitude of the robot within the camera viewing range and recording the new robot pose information rP_2, while the camera shoots the calibration-sphere point cloud data cP_2 at that moment;
S23, repeating the above steps n times so as to cover the whole viewing range of the camera and the spatial degrees of freedom of the robot, obtaining n groups of calibration point cloud data (a sketch of this capture loop follows).
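A sketch of the capture loop may make the data layout concrete. The robot and camera objects and their methods (move_to, wait_until_still, read_pose, capture_cloud) are hypothetical driver interfaces assumed for illustration only:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CalibrationSample:
    cloud: np.ndarray     # cP_i: (m, 3) calibration-sphere point cloud
    position: np.ndarray  # rT_i = [x_i, y_i, z_i], end-effector position
    euler: np.ndarray     # rR_i = (Rx_i, Ry_i, Rz_i), attitude in radians

def collect_samples(robot, camera, target_poses):
    """Capture loop of steps S21-S23: one pose record per triggered shot."""
    samples = []
    for pose in target_poses:       # n >= 5 poses covering the workspace and DOF
        robot.move_to(pose)
        robot.wait_until_still()    # the camera must shoot while the robot is static
        cloud = camera.capture_cloud()
        rT, rR = robot.read_pose()  # pose at the same moment as the shot
        samples.append(CalibrationSample(cloud, np.asarray(rT), np.asarray(rR)))
    return samples
```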
In a possibly preferred embodiment, the step of extracting the calibration-sphere point cloud in step S3 comprises: for any group of scanned point cloud data cP_i, locating the points lying on the calibration sphere within the point cloud data by means of a random sample consensus (RANSAC) algorithm.
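A compact RANSAC sketch for this extraction step. Since the calibration sphere's radius is known beforehand, each four-point hypothesis can be accepted or rejected against that radius; the iteration count and inlier tolerance below are assumed tuning values:

```python
import numpy as np

def sphere_from_4pts(P):
    """Center of the sphere through 4 points (rows of P), or None if degenerate."""
    A = 2.0 * (P[1:] - P[0])                       # from |p - c|^2 = r^2, pairwise subtracted
    b = np.sum(P[1:] ** 2, axis=1) - np.sum(P[0] ** 2)
    try:
        return np.linalg.solve(A, b)
    except np.linalg.LinAlgError:
        return None

def ransac_sphere(points, radius, iters=1000, tol=0.002, seed=0):
    """RANSAC segmentation of the calibration sphere; returns the inlier points."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        c = sphere_from_4pts(points[rng.choice(len(points), 4, replace=False)])
        if c is None:
            continue
        d = np.linalg.norm(points - c, axis=1)
        inliers = np.abs(d - radius) < tol         # known radius constrains the model
        if inliers.sum() > best.sum():
            best = inliers
    return points[best]
```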
The invention also provides a mobile robot 3D hand-eye calibration device, comprising: a three-dimensional (3D) calibration camera, a main controller, a processor and a calibration sphere, wherein the viewing range of the 3D calibration camera covers all degrees of freedom of the mobile robot; the main controller is connected with the 3D calibration camera to record the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, together with the corresponding robot pose rP_i in the current robot coordinate system, forming n groups of point cloud data transmitted to the processor; the processor extracts the calibration-sphere point cloud according to a preset calibration algorithm and determines, by a least-squares fitting algorithm, the coordinate value cT_i of the sphere center point in the camera coordinate system for each group of point cloud data; at the same time it solves the offset Δt and the hand-eye calibration matrix H from the robot poses rP_i and the corresponding calibration-sphere center coordinates cT_i according to a first algorithm, converts the hand-eye calibration matrix H into Euler-angle form with the hand-eye calibration matrix result H as the initial value of a nonlinear optimization, then calculates the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their true values according to a second algorithm, and corrects the hand-eye calibration matrix according to dH and dΔt.
In a possibly preferred embodiment, the first algorithm comprises the steps:
S41, the processor, from each robot pose rP_i = (rT_i, rR_i), obtains the rotation matrix R_i via the Euler-angle-to-rotation-matrix conversion;
S42, setting the offset to Δt = [Δx, Δy, Δz]^T, the coordinate value of the calibration sphere's center point in the robot coordinate system at the corresponding moment is rT_i' = R_i·Δt + rT_i;
S43, setting the hand-eye calibration matrix from the 3D camera coordinate system to the robot coordinate system as H, the sphere-center coordinate in the camera coordinate system transfers into the robot coordinate system as H·[cT_i; 1]; the robot pose and the sphere-center coordinate in the camera coordinate system at the same moment establish the equation R_i·Δt + rT_i = H·[cT_i; 1], wherein the offset Δt and the hand-eye calibration matrix H are the unknown quantities X to be solved;
S44, the processor establishes the equation of step S43 for the n groups of corresponding robot poses and calibration-sphere center coordinates and stacks the equations into the matrix form A·X = b;
S45, the processor solves the linear equation by singular value decomposition to obtain the unknown quantity X, namely the offset Δt and the hand-eye calibration matrix H.
In a possibly preferred embodiment, the second algorithm comprises the steps:
S51, the processor converts the hand-eye calibration matrix H into the Euler-angle form H = [α, β, γ, t_x, t_y, t_z], wherein the rotation matrix R_H and the Euler angles r = [α, β, γ] are related by the conversion given above;
S52, the processor sets the differences between the hand-eye calibration matrix H and the offset Δt and their true values as dH = [dα, dβ, dγ, dt_x, dt_y, dt_z] and dΔt = [dΔx, dΔy, dΔz] respectively, all of whose values tend to 0; when dα, dβ, dγ ≈ 0, sin dα ≈ dα and cos dα ≈ 1 may be taken, giving the small-angle matrix form of the rotation increment set out above;
S53, the processor substitutes H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, it can be solved by the linear least-squares method to obtain the compensations dH and dΔt, thereby obtaining a more accurate hand-eye calibration matrix.
In a possibly preferred embodiment, the step in which the master controller records the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, and the corresponding robot pose rP_i in the current robot coordinate system, comprises:
S21, after the robot moves into the camera viewing range and comes to rest, the master controller triggers the 3D camera to shoot the first group of calibration-sphere point cloud data cP_1, while recording the pose rP_1 of the robot in the robot base coordinate system at that moment; the pose rP_1 comprises the coordinate value rT_1 = [x_1, y_1, z_1]^T of the robot end effector in the robot coordinate system and its attitude rR_1 = (Rx_1, Ry_1, Rz_1), represented by Euler angles, Rx_1, Ry_1, Rz_1 being the rotation angles about the X, Y, Z axes respectively;
S22, the master controller controls the robot to change its position and attitude within the viewing range of the 3D camera and records the new robot pose information rP_2, while the 3D camera shoots the calibration-sphere point cloud data cP_2 at that moment;
S23, the master controller repeats the above steps n times so as to cover the whole viewing range of the 3D camera and the spatial degrees of freedom of the robot, obtaining n groups of calibration point cloud data.
In a possibly preferred embodiment, the step in which the processor extracts the calibration-sphere point cloud comprises: for any group of scanned point cloud data cP_i, locating the points lying on the calibration sphere within the point cloud data by means of a random sample consensus algorithm. The overall data flow through these components is sketched below.
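Pulling the device's data flow together, a minimal orchestration sketch, reusing the hypothetical helpers from the sketches above (fit_sphere_center, the least-squares sphere fit, is sketched later in the detailed description):

```python
def calibrate(robot, camera, target_poses, sphere_radius=0.05):
    """End-to-end flow: collect n groups of data, locate the sphere center in
    each cloud, solve the first algorithm, then refine with the second."""
    samples = collect_samples(robot, camera, target_poses)
    poses, centers = [], []
    for s in samples:
        on_sphere = ransac_sphere(s.cloud, sphere_radius)  # segment the sphere
        centers.append(fit_sphere_center(on_sphere))       # least-squares center
        poses.append((s.position, s.euler))
    dt, H = solve_offset_and_handeye(poses, centers)       # first algorithm
    return refine_handeye(poses, centers, H, dt)           # second algorithm
```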
The mobile robot 3D hand-eye calibration method and device provided by the invention have, in each embodiment, the following beneficial effects:
(1) The method is universal: during calibration the robot only needs to carry the calibration sphere freely within the field of view of the 3D camera while the robot pose information and the point cloud collected by the 3D camera at the same moment are recorded. No restriction is placed on the robot's form of motion during calibration, so the method can be used generally for the hand-eye calibration of robots of various types, such as robotic arms, AGV carts and AMRs.
(2) The collection of calibration data, the identification of the sphere-center coordinates and the solving of the calibration matrix can all be automated, so the method can be applied in batches, the calibration result introduces no human error, and the calibration precision and stability are improved.
(3) The method introduces a further parameter-optimization step, which effectively suppresses the amplification of the robot's positioning error by the calibration algorithm and keeps the overall accuracy of the system comparable to the positioning accuracy of the robot.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention. In the drawings:
FIG. 1 is a schematic diagram of a calibration system of a robot 3D hand-eye calibration method of the present invention;
fig. 2 is a step diagram of a robot 3D hand-eye calibration method according to the present invention.
Detailed Description
The following describes specific embodiments of the present invention in detail. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present invention.
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, based on the embodiments of the invention, which are obtained without inventive effort by a person of ordinary skill in the art, shall fall within the scope of the invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion.
The mobile robot 3D hand-eye calibration method provided by the invention is applicable to robots of various types, such as robotic arms, AGV carts and AMRs. Using a calibration sphere as the medium, the robot carries the sphere freely within the field of view of the 3D camera while its pose is tracked, so that the hand-eye calibration of the robot and the 3D camera is completed automatically.
Referring to fig. 1 to 2, a first aspect of the present invention provides a mobile robot 3D hand-eye calibration device comprising a 3D calibration camera, a main controller, a processor and a calibration sphere, wherein the viewing range of the 3D calibration camera covers all degrees of freedom of the mobile robot; the main controller is connected with the 3D calibration camera to record the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, together with the corresponding robot pose rP_i in the current robot coordinate system, forming n groups of point cloud data transmitted to the processor; the processor extracts the calibration-sphere point cloud according to a preset calibration algorithm and determines, by a least-squares fitting algorithm, the coordinate value cT_i of the sphere center point in the camera coordinate system for each group of point cloud data; it then solves the offset Δt and the hand-eye calibration matrix H from the robot poses rP_i and the corresponding calibration-sphere center coordinates cT_i according to a first algorithm, converts the hand-eye calibration matrix H into Euler-angle form as the initial value of a nonlinear optimization, calculates the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their true values according to a second algorithm, and corrects the hand-eye calibration matrix according to dH and dΔt.
Wherein the first algorithm comprises the steps:
S41, the processor, from each robot pose rP_i = (rT_i, rR_i), obtains the rotation matrix R_i via the Euler-angle-to-rotation-matrix conversion;
S42, setting the offset to Δt = [Δx, Δy, Δz]^T, the coordinate value of the calibration sphere's center point in the robot coordinate system at the corresponding moment is rT_i' = R_i·Δt + rT_i;
S43, setting the hand-eye calibration matrix from the 3D camera coordinate system to the robot coordinate system as H, the sphere-center coordinate in the camera coordinate system transfers into the robot coordinate system as H·[cT_i; 1]; the robot pose and the sphere-center coordinate in the camera coordinate system at the same moment establish the equation

R_i·Δt + rT_i = H·[cT_i; 1],

wherein the offset Δt and the hand-eye calibration matrix H are the unknown quantities X to be solved;
S44, the processor establishes the equation of step S43 for the n groups of corresponding robot poses and calibration-sphere center coordinates and stacks the equations into the matrix form A·X = b;
S45, the processor solves the linear equation by singular value decomposition to obtain the unknown quantity X, namely the offset Δt and the hand-eye calibration matrix H.
Wherein the second algorithm comprises the steps:
S51, the processor converts the hand-eye calibration matrix H into the Euler-angle form H = [α, β, γ, t_x, t_y, t_z], wherein the rotation matrix R_H and the Euler angles r = [α, β, γ] are related by the conversion given above;
S52, the processor sets the differences between the hand-eye calibration matrix H and the offset Δt and their true values as dH = [dα, dβ, dγ, dt_x, dt_y, dt_z] and dΔt = [dΔx, dΔy, dΔz] respectively, all of whose values tend to 0; when dα, dβ, dγ ≈ 0, sin dα ≈ dα and cos dα ≈ 1 may be taken, giving the small-angle matrix form of the rotation increment set out above;
S53, the processor substitutes H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, it can be solved by the linear least-squares method to obtain the compensations dH and dΔt, thereby obtaining a more accurate hand-eye calibration matrix.
Wherein the step in which the master controller records the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, and the corresponding robot pose rP_i in the current robot coordinate system, comprises:
S21, after the robot moves into the camera viewing range and comes to rest, the master controller triggers the 3D camera to shoot the first group of calibration-sphere point cloud data cP_1, while recording the pose rP_1 of the robot in the robot base coordinate system at that moment; the pose rP_1 comprises the coordinate value rT_1 = [x_1, y_1, z_1]^T of the robot end effector in the robot coordinate system and its attitude rR_1 = (Rx_1, Ry_1, Rz_1), represented by Euler angles, Rx_1, Ry_1, Rz_1 being the rotation angles about the X, Y, Z axes respectively;
S22, the master controller controls the robot to change its position and attitude within the viewing range of the 3D camera and records the new robot pose information rP_2, while the 3D camera shoots the calibration-sphere point cloud data cP_2 at that moment;
S23, the master controller repeats the above steps n times so as to cover the whole viewing range of the 3D camera and the spatial degrees of freedom of the robot, obtaining n groups of calibration point cloud data.
The step in which the processor extracts the calibration-sphere point cloud comprises: for any group of scanned point cloud data cP_i, locating the points lying on the calibration sphere within the point cloud data by means of a random sample consensus algorithm.
As shown in fig. 1 to 2, in another aspect the present invention further provides a mobile robot 3D hand-eye calibration method whose steps, in a preferred embodiment, include:
S1, calibrating the camera viewing range to cover all degrees of freedom of the mobile robot, wherein in this embodiment the calibration comprises the following steps:
1) The calibration sphere is fixed on the robot's actuator, so that no displacement occurs between the calibration sphere and the robot during calibration.
2) The robot is adjusted into the field of view of the 3D camera, so that the 3D camera can shoot the calibration sphere.
3) The robot is moved within the field of view of the 3D camera, and after each move the pose information rP_i of the robot in the robot coordinate system is recorded, comprising the position rT_i = [x_i, y_i, z_i]^T and the attitude rR_i = (Rx_i, Ry_i, Rz_i). Meanwhile, the 3D camera collects the point cloud data cP_i of the calibration sphere at the corresponding moment. In this way n groups of calibration-sphere point clouds under different robot poses, with the robot pose information rP_i = (rT_i, rR_i) at the same moments, are collected; the poses are required to cover all degrees of freedom the robot can achieve, and in this implementation n is preferably a natural number greater than or equal to 5, with i ∈ [1, n].
4) For each group of scanned point cloud data cP_i, the spherical point cloud is automatically extracted with the random sample consensus (RANSAC) algorithm, and least-squares sphere fitting is performed on it to obtain the coordinate value cT_i of the sphere center in the camera coordinate system. Each sphere-center coordinate cT_i, together with the robot pose rP_i in the base coordinate system at the same moment, forms one group of calibration data, i ∈ [1, n].
5) From the pose information of the n groups of robot poses and the coordinates of the linked calibration-sphere centers in the camera coordinate system, a linear equation is constructed and the transformation matrix H from the camera coordinate system to the robot coordinate system is solved by the least-squares method; this matrix H is the hand-eye calibration matrix from the 3D camera to the robot.
6) The hand-eye calibration matrix H obtained in the previous step is optimized, improving the calibration precision.
Specifically, step (1) further comprises the following part:
1.1 The calibration sphere is to be mounted on the robot's end effector; different types of robots have different working mechanisms, and the positions of their end effectors differ slightly. During calibration the sphere is fixed on the robot and must not slide, the robot carrying the sphere as it moves.
Further, step (2) comprises the following parts:
2.1 The robot is moved into the field of view of the 3D camera and brought to rest, and the 3D camera is triggered to shoot the first group of calibration-sphere point cloud data cP_1. At the same moment the pose rP_1 of the robot in its base coordinate system is recorded; the pose rP_1 comprises the coordinate value rT_1 = [x_1, y_1, z_1]^T of the robot end effector in the robot base coordinate system and its attitude rR_1 = (Rx_1, Ry_1, Rz_1), the pose information being represented by Euler angles, Rx_1, Ry_1, Rz_1 denoting the rotation angles about the X, Y, Z axes respectively.
2.2 The position and attitude of the robot are changed within the field of view of the 3D camera and the new pose information rP_2 is recorded, while the 3D camera shoots the calibration-sphere point cloud data cP_2 at that moment.
2.3 The above steps are repeated n times, covering the whole field of view of the 3D camera and the spatial degrees of freedom of the robot as far as possible, to obtain n groups of calibration data, n being a natural number greater than or equal to 5.
Further, step (3) comprises the following parts:
3.1 For any group of scanned point cloud data cP_i, the local point cloud lying on the calibration sphere is preliminarily located within the cloud by the RANSAC algorithm, giving the model parameters of an initial calibration sphere.
3.2 Least-squares sphere fitting is performed on the local point cloud lying on the calibration sphere to obtain the accurate sphere-center coordinate cT_i of the calibration sphere, as sketched below.
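Step 3.2 admits a closed-form linear fit: writing |p - c|² = r² as 2p·c - k = |p|² with k = |c|² - r² makes the sphere fit linear in (c, k). A sketch (the function name is an assumption):

```python
import numpy as np

def fit_sphere_center(points):
    """Algebraic least-squares sphere fit on the RANSAC inliers.

    Solves 2 p.c - k = |p|^2 for (c, k), with k = |c|^2 - r^2, so the
    problem is linear and needs no iteration."""
    A = np.hstack([2.0 * points, -np.ones((len(points), 1))])
    b = np.sum(points ** 2, axis=1)
    sol = np.linalg.lstsq(A, b, rcond=None)[0]
    c, k = sol[:3], sol[3]
    r = np.sqrt(max(c @ c - k, 0.0))   # recovered radius, usable as a sanity check
    return c
```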
Further, step (4) comprises the following parts:
4.1 From the pose information rP_i = (rT_i, rR_i) of each robot pose, the rotation matrix R_i is obtained according to the Euler-angle-to-rotation-matrix conversion.
4.2 Since the robot coordinates read from the robot and the sphere center point are not the same point, the fixed offset between the two is assumed to be Δt = [Δx, Δy, Δz]^T; the offset Δt is an unknown quantity to be solved.
4.3 From the robot pose information and the offset Δt between the sphere center and the robot end effector, the coordinate value of the sphere center in the robot coordinate system at the corresponding moment is obtained as rT_i' = R_i·Δt + rT_i.
4.4 Assuming that the hand-eye calibration matrix from the camera coordinate system to the robot coordinate system is H, the sphere-center coordinate in the camera coordinate system transfers into the robot coordinate system as H·[cT_i; 1]. The robot pose and the sphere-center coordinate in the camera coordinate system at the same moment then establish the equation

R_i·Δt + rT_i = H·[cT_i; 1],

the offset Δt and the hand-eye calibration matrix H being the unknown quantities to be solved.
4.5 For the n groups of corresponding robot pose information and calibration-sphere center coordinates, the simultaneous equations of step 4.4 are rearranged into the matrix form A·X = b: each group contributes the three scalar equations of

R_i·Δt - H(1:3, 1:4)·[cT_i; 1] = -rT_i,

which are linear in the 15-dimensional vector X = [Δx, Δy, Δz, h11, h12, ..., h34]^T. Solving the linear equation by singular value decomposition yields the unknown quantity X (i.e. the vector X), which contains the sought offset Δt and hand-eye calibration matrix H; the matrix H is the hand-eye calibration matrix from the camera coordinate system to the robot coordinate system.
Furthermore, in another preferred embodiment, since AGV-type robots have no capability of rotating about the Rx and Ry axes, the third-row equation of step 4.5 may, to suit such robots, be changed to

Δz + z_i = h31·x_i^c + h32·y_i^c + h33·z_i^c + h34,

whose simplified form is

z_i = h31·x_i^c + h32·y_i^c + h33·z_i^c - dz, wherein dz = Δz - b_3 and b_3 denotes the translation entry h34 of H.

It should be noted that the Δz and b_3 computed from the above equation are not true values, since only their difference dz is constrained. Because the radius of the calibration sphere and the design parameters of the robot are known, the true value of the calibration-sphere center's height in the robot coordinate system can be obtained from these known parameters; the true value of b_3 then follows from dz.
From this the corrected hand-eye calibration matrix H is obtained, as sketched below.
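A small sketch of this correction (hypothetical helper; true_dz is the known height of the sphere center over the end effector, e.g. the sphere radius when the sphere rests on the fork teeth):

```python
def correct_z(dt, H, true_dz):
    """Split the coupled pair (dz_offset, b3) for robots with no Rx/Ry rotation.

    The linear solve only pins down dz = dt_z - H[2, 3]; imposing the known
    offset true_dz recovers both entries while preserving dz."""
    dz = dt[2] - H[2, 3]       # the observable combination from the solve
    dt, H = dt.copy(), H.copy()
    dt[2] = true_dz            # impose the known sphere-center height
    H[2, 3] = true_dz - dz     # recover b3 so that dz is unchanged
    return dt, H
```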
further, the step (5) includes the following parts:
5.1 The hand-eye calibration matrix H is converted into the Euler-angle form H = [α, β, γ, t_x, t_y, t_z], and the hand-eye calibration matrix result H obtained by the linear solution of step (4) is taken as the initial value of the nonlinear optimization. The rotation matrix R_H and the Euler angles r = [α, β, γ] are related by the conversion relation:

R(α, β, γ) = Rz(γ)·Ry(β)·Rx(α) =
[ cosβ·cosγ   sinα·sinβ·cosγ - cosα·sinγ   cosα·sinβ·cosγ + sinα·sinγ ]
[ cosβ·sinγ   sinα·sinβ·sinγ + cosα·cosγ   cosα·sinβ·sinγ - sinα·cosγ ]
[ -sinβ       sinα·cosβ                    cosα·cosβ                  ]
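Step 5.1 needs the inverse conversion, from the rotation block of H back to Euler angles. A sketch under the same assumed Z-Y-X convention (it ignores the gimbal-lock case |β| = π/2):

```python
import numpy as np

def matrix_to_euler(R):
    """Recover r = [alpha, beta, gamma] from a rotation matrix, assuming
    R = Rz(gamma) @ Ry(beta) @ Rx(alpha) as used above."""
    beta = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))  # R[2, 0] == -sin(beta)
    alpha = np.arctan2(R[2, 1], R[2, 2])            # from sin(a)cos(b), cos(a)cos(b)
    gamma = np.arctan2(R[1, 0], R[0, 0])            # from cos(b)sin(g), cos(b)cos(g)
    return np.array([alpha, beta, gamma])
```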
5.2 Let the differences between the hand-eye calibration matrix H and the offset Δt obtained in step (4) and their true values be dH and dΔt, represented in Euler-angle form as dH = [dα, dβ, dγ, dt_x, dt_y, dt_z] and dΔt = [dΔx, dΔy, dΔz], wherein the values in dH and dΔt all tend to 0.
When dα, dβ, dγ ≈ 0, sin dα ≈ dα and cos dα ≈ 1 may be taken; the rotation increment then takes the matrix form

dR ≈
[  1    -dγ    dβ ]
[  dγ    1    -dα ]
[ -dβ    dα    1  ]
5.3 Substituting H, dH, Δt and dΔt into the equation of step (4) yields the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, it can be solved by the linear least-squares method to obtain the compensations dH and dΔt, thereby obtaining a more accurate hand-eye calibration matrix.
Experimental example
To describe the scheme of the invention more concretely, the 3D hand-eye calibration of a mobile robot will now be described in detail, taking an AGV forklift and a 3D camera as the example and referring to the accompanying drawings.
Compared with a robotic arm, an AGV forklift has only the ability to translate in the three directions X, Y, Z and to rotate about the Z axis; presenting the application flow of the method on an AGV forklift therefore illustrates its strong universality.
Meanwhile, the calibration method abandons the traditional two-step algorithm, in which the offset between the calibration sphere and the robot is solved first and the hand-eye calibration matrix afterwards, in favor of a one-step method that solves the sphere-robot offset and the hand-eye calibration matrix together, directly obtaining the final result with higher computational efficiency.
Specifically, in this experimental example, the mobile robot 3D hand-eye calibration method includes the steps of:
s1: calibration data is collected.
As shown in fig. 1, the calibration sphere is fixed on the fork teeth of the AGV forklift, the radius of the calibration sphere being known in advance as dr = 0.05 m, which determines the height of the sphere center over the plane of the robot coordinate system. The AGV forklift is controlled to move within the field of view of the 3D camera, each move changing the three spatial coordinates X, Y, Z and the spatial attitude Rz relative to the previous one, where X, Y are the coordinate values of the forklift on the two-dimensional map and Z is the fork-tooth height. Because the AGV forklift has only yaw rotation capability, with no roll or pitch capability, its attitude information is represented by the Euler angle Rz, the rotation about the Z axis.
After each move the pose information rP(x_r, y_r, z_r, Rz) is recorded and the point cloud data of the calibration sphere is collected with the 3D camera; the pose information recorded after each move of the forklift corresponds one-to-one with the point cloud data acquired by the 3D camera. For each group of point cloud data, the points lying on the calibration sphere are segmented from the scene by the RANSAC method, then least-squares fitting is performed on the sphere inliers to accurately locate the sphere-center coordinate cT(x_c, y_c, z_c). For a specific implementation see [Dan Hong, Wang Yanmin, Yang Bingwei. An automatic detection method for target balls [J]. Bulletin of Surveying and Mapping, 2013(S1): 58-60].
Calibration data collected in this example are shown in table 1.
TABLE 1
S2: solving the unknown quantity X.
From the calibration data collected in S1, a linear equation A·X = b is constructed, and singular value decomposition of the linear equation gives the least-squares solution of the vector X. The vector X comprises the offset Δt = [Δx, Δy, Δz]^T and the hand-eye calibration matrix H.
The vector X calculated in this example is:
X=[0.7094 -1.10338 -1.19314 -0.666154 0.414944 -0.610368 1.34539 0.724705 0.297334 -0.580368 0.238512 -0.0354099 -0.856986 -0.521586 1.19314]^T,
its first three entries giving Δt and the remaining twelve entries giving, row by row, the upper three rows of H.
s3: correcting the hand-eye calibration matrix H.
Because the AGV forklift has no capability of rotating about the Rx and Ry axes, the constructed equation has no constraint capability in the Z-axis direction. In this experimental example the true value of the Z-direction coordinate of the sphere center in the robot coordinate system is determined from the known sphere radius (dr = 0.05 m), from which the corrected true values of the offset Δt and the matrix H follow.
s4: solving the optimization parameters dH and dDeltat.
The unknown quantities dH and dΔt are constructed, and by substituting the known H and Δt into the linear equation A·X = b the optimization parameters dH and dΔt can be solved. In this experimental example the corrected hand-eye calibration matrix, converted into Euler-angle form, is
H = [1.34539, 0.238512, 1.19314, 132.589, 2.029, -121.325],
and the optimization matrix is
dH = [0.041, 0.003, 0.021, 1.325, -0.937, 0.534].
The final hand-eye calibration matrix is obtained by applying the compensation dH to H.
therefore, the offset of the calibration ball and the robot and the hand-eye calibration matrix can be solved in one step, and the calculation efficiency is improved.
The preferred embodiments of the invention disclosed above are intended only to assist in the explanation of the invention. The preferred embodiments are not exhaustive or to limit the invention to the precise form disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best understand and utilize the invention. The invention is to be limited only by the following claims and their full scope and equivalents, and any modifications, equivalents, improvements, etc., which fall within the spirit and principles of the invention are intended to be included within the scope of the invention.
It will be appreciated by those skilled in the art that, in addition to being implemented purely as computer-readable program code, the system, apparatus and their respective modules provided by the present invention can be implemented entirely by logically programming the method steps, in forms such as logic gates, switches, application-specific integrated circuits, programmable logic controllers and embedded microcontrollers. The system, apparatus and their modules provided by the invention may therefore be regarded as a hardware component, with the modules for implementing the various programs regarded as structures within that hardware component; modules for implementing the various functions may equally be regarded as software programs implementing the method or as structures within the hardware component.
Furthermore, all or part of the steps of the methods in the above embodiments may be implemented by a program stored in a storage medium, the program comprising several instructions for causing a single-chip microcomputer, chip or processor to perform all or part of the steps of the methods described in the embodiments of the present application; the aforementioned storage medium includes various media capable of storing program code, such as a U-disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
In addition, the various embodiments of the present invention may be combined with one another arbitrarily, and such combinations, provided they do not violate the concept of the embodiments of the present invention, should likewise be regarded as disclosed herein.

Claims (6)

1. A mobile robot 3D hand-eye calibration method, characterized by comprising the steps of:
S1, calibrating the camera viewing range to cover all degrees of freedom of the mobile robot;
S2, recording the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, together with the corresponding robot pose rP_i in the current robot coordinate system, forming n groups of point cloud data;
S3, extracting the calibration-sphere point cloud and, for each group of point cloud data, determining the coordinate value cT_i of the sphere center point in the camera coordinate system by a least-squares fitting algorithm;
S4, solving the offset Δt and the hand-eye calibration matrix H from the robot poses rP_i and the corresponding calibration-sphere center coordinates cT_i according to a first algorithm;
S5, converting the hand-eye calibration matrix H into Euler-angle form and taking the hand-eye calibration matrix result H as the initial value of a nonlinear optimization; calculating, according to a second algorithm, the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their true values, and correcting the hand-eye calibration matrix accordingly;
wherein the first algorithm comprises the steps:
S41, from each robot pose rP_i = (rT_i, rR_i), obtaining the rotation matrix R_i in matrix form by using the conversion relation from Euler angles to a rotation matrix;
S42, setting the offset as Δt = [Δx, Δy, Δz]^T, so that the coordinate value of the calibration sphere's center point in the robot coordinate system at the corresponding moment can be obtained as rT_i' = R_i·Δt + rT_i;
S43, setting the hand-eye calibration matrix from the camera coordinate system to the robot coordinate system as H, the coordinates of the sphere center point in the camera coordinate system being transferred into the robot coordinate system as H·[cT_i; 1]; the robot pose at the same moment and the coordinates of the calibration sphere's center point in the camera coordinate system establish the following equation:

R_i·Δt + rT_i = H·[cT_i; 1],

wherein the offset Δt and the hand-eye calibration matrix H are the unknown quantity X to be solved;
S44, for the n groups of corresponding robot poses and calibration-sphere center coordinates, combining the equations of step S43 and stacking them into the matrix form A·X = b;
S45, solving the linear equation by singular value decomposition to obtain the unknown quantity X, which is the sought offset Δt and hand-eye calibration matrix H;
wherein the second algorithm comprises the steps:
S51, converting the hand-eye calibration matrix H into the Euler-angle form H = [α, β, γ, t_x, t_y, t_z], wherein the rotation matrix R_H and the Euler angles r = [α, β, γ] are related by:

R(α, β, γ) = Rz(γ)·Ry(β)·Rx(α) =
[ cosβ·cosγ   sinα·sinβ·cosγ - cosα·sinγ   cosα·sinβ·cosγ + sinα·sinγ ]
[ cosβ·sinγ   sinα·sinβ·sinγ + cosα·cosγ   cosα·sinβ·sinγ - sinα·cosγ ]
[ -sinβ       sinα·cosβ                    cosα·cosβ                  ];

S52, setting the differences between the hand-eye calibration matrix H and the offset Δt and their true values as dH and dΔt, expressed in Euler-angle form as dH = [dα, dβ, dγ, dt_x, dt_y, dt_z] and dΔt = [dΔx, dΔy, dΔz], wherein the values of dH and dΔt all tend to 0; when dα, dβ, dγ ≈ 0, sin dα ≈ dα and cos dα ≈ 1 may be taken, the rotation increment then having the matrix form

dR ≈
[  1    -dγ    dβ ]
[  dγ    1    -dα ]
[ -dβ    dα    1  ];

S53, substituting H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, the equation can be solved by the linear least-squares method to obtain the compensations dH and dΔt, thereby obtaining a more accurate hand-eye calibration matrix.
2. The mobile robot 3D hand-eye calibration method according to claim 1, wherein the step of recording the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, and the corresponding robot pose rP_i in the current robot coordinate system, comprises:
S21, moving the robot into the camera viewing range and letting it stand still, and triggering the camera to shoot the first group of calibration-sphere point cloud data cP_1, while recording the pose rP_1 of the robot in the robot base coordinate system; the pose rP_1 comprises the coordinate value rT_1 = [x_1, y_1, z_1]^T of the robot end effector in the robot coordinate system and its attitude rR_1 = (Rx_1, Ry_1, Rz_1), the pose information of the robot being represented by Euler angles, Rx_1, Ry_1, Rz_1 respectively denoting the rotation angles about the X, Y, Z axes;
S22, changing the position and attitude of the robot within the camera viewing range and recording the new robot pose information rP_2, while the camera shoots the point cloud data cP_2 of the calibration sphere at that moment;
S23, repeating the above steps n times so as to cover the whole viewing range of the camera and the spatial degrees of freedom of the robot, obtaining n groups of calibration point cloud data.
3. The mobile robot 3D hand-eye calibration method according to claim 1, wherein the step of extracting the calibration-sphere point cloud in step S3 comprises: for any group of scanned point cloud data cP_i, locating the points lying on the calibration sphere within the point cloud data by a random sample consensus algorithm.
4. A mobile robot 3D hand-eye calibration device, comprising: a three-dimensional (3D) calibration camera, a main controller, a processor and a calibration sphere, wherein the viewing range of the 3D calibration camera covers all degrees of freedom of the mobile robot; the main controller is connected with the 3D calibration camera to record the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot, together with the corresponding robot pose rP_i in the current robot coordinate system, forming n groups of point cloud data transmitted to the processor; the processor extracts the calibration-sphere point cloud according to a preset calibration algorithm and determines, by a least-squares fitting algorithm, the coordinate value cT_i of the sphere center point in the camera coordinate system for each group of point cloud data; at the same time it solves the offset Δt and the hand-eye calibration matrix H from the robot poses rP_i and the corresponding calibration-sphere center coordinates cT_i according to a first algorithm, converts the hand-eye calibration matrix H into Euler-angle form with the hand-eye calibration matrix result H as the initial value of a nonlinear optimization, then calculates the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their true values according to a second algorithm, and corrects the hand-eye calibration matrix accordingly;
wherein the first algorithm comprises the steps:
S41, the processor, from each robot pose rP_i = (rT_i, rR_i), obtains the rotation matrix R_i in matrix form by using the conversion relation from Euler angles to a rotation matrix;
S42, setting the offset as Δt = [Δx, Δy, Δz]^T, the coordinate value of the calibration sphere's center point in the robot coordinate system at the corresponding moment can be obtained as rT_i' = R_i·Δt + rT_i;
S43, setting the hand-eye calibration matrix from the 3D camera coordinate system to the robot coordinate system as H, the coordinates of the sphere center point in the camera coordinate system being transferred into the robot coordinate system as H·[cT_i; 1]; the robot pose at the same moment and the coordinates of the calibration sphere's center point in the camera coordinate system establish the following equation:

R_i·Δt + rT_i = H·[cT_i; 1],

wherein the offset Δt and the hand-eye calibration matrix H are the unknown quantity X to be solved;
S44, the processor, for the n groups of corresponding robot poses and calibration-sphere center coordinates, combines the equations of step S43 and stacks them into the matrix form A·X = b;
S45, the processor solves the linear equation by singular value decomposition to obtain the unknown quantity X, which is the sought offset Δt and hand-eye calibration matrix H;
wherein the second algorithm comprises the steps:
S51, the processor converts the hand-eye calibration matrix H into the Euler-angle form H = [α, β, γ, t_x, t_y, t_z], wherein the rotation matrix R_H and the Euler angles r = [α, β, γ] are related by:

R(α, β, γ) = Rz(γ)·Ry(β)·Rx(α) =
[ cosβ·cosγ   sinα·sinβ·cosγ - cosα·sinγ   cosα·sinβ·cosγ + sinα·sinγ ]
[ cosβ·sinγ   sinα·sinβ·sinγ + cosα·cosγ   cosα·sinβ·sinγ - sinα·cosγ ]
[ -sinβ       sinα·cosβ                    cosα·cosβ                  ];

S52, the processor sets the differences between the hand-eye calibration matrix H and the offset Δt and their true values as dH and dΔt, expressed in Euler-angle form as dH = [dα, dβ, dγ, dt_x, dt_y, dt_z] and dΔt = [dΔx, dΔy, dΔz], wherein the values of dH and dΔt all tend to 0; when dα, dβ, dγ ≈ 0, sin dα ≈ dα and cos dα ≈ 1 may be taken, the rotation increment then having the matrix form

dR ≈
[  1    -dγ    dβ ]
[  dγ    1    -dα ]
[ -dβ    dα    1  ];

S53, the processor substitutes H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, the equation can be solved by the linear least-squares method to obtain the compensations dH and dΔt, thereby obtaining a more accurate hand-eye calibration matrix.
5. The mobile robot 3D hand-eye calibration device of claim 4, wherein the master controller records point cloud data under a camera coordinate system in which a calibration sphere held by the mobile robot is locatedAnd its corresponding robot pose in the current robot coordinate system +.>The method comprises the following steps:
s21, after the robot moves to the camera view finding range and then is stationary, the master controller controls the 3D camera to trigger shooting of point cloud data of the first group of calibration ballsThe method comprises the steps of carrying out a first treatment on the surface of the Simultaneously recording the pose of the robot under the robot base coordinate system>Pose->Comprises coordinate values of the robot end effector in the robot coordinate system>And its posture->The method comprises the steps of carrying out a first treatment on the surface of the Representing pose information of the robot in Euler angle manner, < >>、/>、/>Respectively indicate the winding +.>、/>、/>The rotation angle of the shaft;
S22: the master controller controls the robot to change its position and attitude within the viewing range of the 3D camera and records the new pose information $T_i$ of the robot, while the 3D camera shoots the point cloud data $P_i$ of the calibration sphere at that moment;
S23: the master controller repeats the above steps so as to cover the whole viewing range of the 3D camera and the spatial degrees of freedom of the robot, until $N$ groups of calibration point cloud data are obtained.
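The acquisition loop of steps S21–S23 reduces to a simple pattern: move, settle, capture, record. The sketch below is purely illustrative glue code; `robot.move_to`, `robot.wait_until_stationary`, `robot.get_pose` and `camera.capture_point_cloud` are hypothetical stand-ins for whatever robot and 3D-camera interfaces the master controller actually drives.

```python
def collect_calibration_data(robot, camera, target_poses):
    """S21-S23: record one (P_i, T_i) pair per commanded pose.

    target_poses should be spread over the camera's viewing range and the
    robot's degrees of freedom, giving N groups in total.
    """
    clouds, poses = [], []
    for pose in target_poses:
        robot.move_to(pose)
        robot.wait_until_stationary()                 # shoot only at rest (S21)
        clouds.append(camera.capture_point_cloud())   # point cloud P_i
        poses.append(robot.get_pose())                # (x, y, z, rx, ry, rz) = T_i
    return clouds, poses
```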
6. The mobile robot 3D hand-eye calibration device of claim 4, wherein the processor extracting the calibration-sphere point cloud comprises: for any group of scanned point cloud data $P_i$, locating the points that lie on the calibration sphere within the point cloud data by means of a random sample consensus (RANSAC) algorithm.
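Claim 6's extraction and claim 4's least-squares center fit work naturally together: a RANSAC loop fits spheres to random minimal samples, keeps the hypothesis with the most inliers, and then refits the center on all inliers with the linear least-squares fit. The sketch below is a generic RANSAC-on-a-sphere illustration; the iteration count and distance tolerance are assumed parameters, not values from the patent.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit: |p|^2 = 2 p.c + (r^2 - |c|^2) is linear in (c, k)."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = np.sum(points ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

def ransac_sphere(points, known_radius, iters=500, dist_tol=1.5, rng=None):
    """Locate the calibration-sphere points in one scan P_i (cf. claim 6).

    points: (M, 3) scan; known_radius: radius of the calibration sphere;
    dist_tol: inlier threshold in the cloud's units (mm assumed here).
    Returns (inlier mask, refitted sphere center).
    """
    rng = rng or np.random.default_rng()
    best_mask = None
    for _ in range(iters):
        sample = points[rng.choice(len(points), size=4, replace=False)]
        center, radius = fit_sphere(sample)            # 4 points fix a sphere
        if abs(radius - known_radius) > dist_tol:      # reject wrong-size fits
            continue
        d = np.abs(np.linalg.norm(points - center, axis=1) - radius)
        mask = d < dist_tol
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask = mask
    if best_mask is None:
        raise RuntimeError("no sphere of the expected radius found")
    # refit the center on all inliers (the least-squares fit named in claim 4)
    center, _ = fit_sphere(points[best_mask])
    return best_mask, center
```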
CN202110689530.1A 2021-06-21 2021-06-21 Mobile robot 3D hand-eye calibration method and device Active CN113362396B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110689530.1A CN113362396B (en) 2021-06-21 2021-06-21 Mobile robot 3D hand-eye calibration method and device

Publications (2)

Publication Number Publication Date
CN113362396A CN113362396A (en) 2021-09-07
CN113362396B true CN113362396B (en) 2024-03-26

Family

ID=77535498

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110689530.1A Active CN113362396B (en) 2021-06-21 2021-06-21 Mobile robot 3D hand-eye calibration method and device

Country Status (1)

Country Link
CN (1) CN113362396B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113814987B (en) * 2021-11-24 2022-06-03 季华实验室 Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium
CN114347027B (en) * 2022-01-08 2022-12-13 天晟智享(常州)机器人科技有限公司 Pose calibration method of 3D camera relative to mechanical arm
CN114147728B (en) * 2022-02-07 2022-05-06 杭州灵西机器人智能科技有限公司 Universal robot eye on-hand calibration method and system
CN115488878A (en) * 2022-08-29 2022-12-20 上海智能制造功能平台有限公司 Hand-eye calibration method, system, terminal and medium for robot vision system
CN116038721B (en) * 2023-04-03 2023-07-18 广东工业大学 Hand-eye calibration method and system without kinematic participation
CN117409080A (en) * 2023-11-10 2024-01-16 广州市斯睿特智能科技有限公司 Hand-eye calibration method for line scanning 3D camera

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108582076A (en) * 2018-05-10 2018-09-28 武汉库柏特科技有限公司 A kind of Robotic Hand-Eye Calibration method and device based on standard ball
WO2020024178A1 (en) * 2018-08-01 2020-02-06 深圳配天智能技术研究院有限公司 Hand-eye calibration method and system, and computer storage medium
CN109746920A (en) * 2019-03-06 2019-05-14 南京航空航天大学 A kind of industrial robot geometric parameter error calibrating method based on two-step method
CN110116411A (en) * 2019-06-06 2019-08-13 浙江汉振智能技术有限公司 A kind of robot 3D vision hand and eye calibrating method based on ball target
CN111002312A (en) * 2019-12-18 2020-04-14 江苏集萃微纳自动化系统与装备技术研究所有限公司 Industrial robot hand-eye calibration method based on calibration ball
CN111546328A (en) * 2020-04-02 2020-08-18 天津大学 Hand-eye calibration method based on three-dimensional vision measurement
CN112091971A (en) * 2020-08-21 2020-12-18 季华实验室 Robot eye calibration method and device, electronic equipment and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant