CN113362396A - Mobile robot 3D hand-eye calibration method and device - Google Patents
- Publication number
- CN113362396A (application CN202110689530.1A)
- Authority
- CN
- China
- Prior art keywords
- robot
- calibration
- hand
- coordinate system
- matrix
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 58
- 239000011159 matrix material Substances 0.000 claims abstract description 116
- 238000005457 optimization Methods 0.000 claims abstract description 11
- 239000012636 effector Substances 0.000 claims description 10
- 230000009466 transformation Effects 0.000 claims description 10
- 238000006243 chemical reaction Methods 0.000 claims description 8
- 238000000354 decomposition reaction Methods 0.000 claims description 7
- 238000005070 sampling Methods 0.000 claims description 6
- 239000000284 extract Substances 0.000 claims description 3
- 230000003044 adaptive effect Effects 0.000 abstract 1
- 230000008569 process Effects 0.000 description 9
- 239000013598 vector Substances 0.000 description 5
- 238000012986 modification Methods 0.000 description 3
- 230000004048 modification Effects 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 238000004364 calculation method Methods 0.000 description 2
- 230000008859 change Effects 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 238000003860 storage Methods 0.000 description 2
- 238000007476 Maximum Likelihood Methods 0.000 description 1
- 230000006978 adaptation Effects 0.000 description 1
- 230000003321 amplification Effects 0.000 description 1
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 230000007246 mechanism Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 238000013519 translation Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0219—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory ensuring the processing of the whole working surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/11—Complex mathematical operations for solving equations, e.g. nonlinear equations, general mathematical optimization problems
- G06F17/12—Simultaneous equations, e.g. systems of linear equations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G06F17/16—Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Abstract
The invention provides a mobile robot 3D hand-eye calibration method and device, wherein the method comprises the following steps: S1, arranging the calibration camera so that its viewing range covers all the degrees of freedom of the mobile robot; S2, recording the point cloud data, in the camera coordinate system, of a calibration sphere held by the mobile robot together with the corresponding robot pose in the current robot coordinate system, forming n groups of point cloud data; S3, extracting the point cloud of the calibration-sphere surface and determining the coordinate value of the calibration-sphere center point in each group of point cloud data by a least-squares fitting algorithm; S4, solving an offset and a hand-eye calibration matrix from the robot poses and the corresponding calibration-sphere center coordinates according to a first algorithm; S5, converting the hand-eye calibration matrix into Euler-angle form and taking the result as the initial value of a nonlinear optimization, then calculating, according to a second algorithm, the differences between the hand-eye calibration matrix, the offset, and their true values, and correcting the hand-eye calibration matrix according to these differences. The degree-of-freedom requirements placed on the robot are thereby reduced and the universality of the method is improved.
Description
Technical Field
The invention relates to the technical field of machine vision, and in particular to a 3D hand-eye calibration method and device applicable to robots with as little as one degree of freedom of movement.
Background
As robot technology penetrates ever deeper into industrial and everyday scenarios, single-purpose robots such as mechanical arms, AGV carts, and AMRs (autonomous mobile robots) have matured in production use according to the requirements of different scenarios. Used as actuators, such robots can readily replace humans in dangerous, highly repetitive, and physically demanding work such as container transport, logistics sorting, and mechanical assembly. However, a robot acting as an end effector often has no sensing capability of its own. A 3D camera, by contrast, can acquire three-dimensional information about an object and thereby perceive its spatial position and attitude. In most work scenarios, the robot therefore needs the cooperation of 3D vision for guidance in order to jointly complete intelligent work. Since 3D vision and the robot are built as independent functional modules, the role of a hand-eye calibration method is to establish the connection between the two: the object position information recognized by 3D vision is converted into coordinate values the robot can understand, guiding the robot to the target position to complete the corresponding work.
Hand-eye calibration is mainly used to establish the conversion relation between the camera coordinate system and the robot coordinate system. The main research effort in current hand-eye calibration methods concerns calibrating a mechanical arm against a 3D camera, for guiding the arm in tasks such as part loading and unloading and logistics palletizing. These methods use a specially made calibration plate or calibration sphere as a medium, recording the coordinates of the medium acquired by the 3D camera in the camera coordinate system together with the coordinates of the medium in the robot coordinate system at the same moment. The conversion relation between the two coordinate systems is then solved from the corresponding coordinate values of the same object in the different coordinate systems. Since the medium is an object with spatial extent rather than a point, it is often difficult to obtain the coordinate values of a fixed point on the medium in the camera coordinate system and the robot coordinate system respectively. The mainstream hand-eye calibration methods therefore solve the transformation between the two coordinate systems either with manual assistance or by prescribing that the robot move in a certain specific pattern; however, manually assisted hand-eye calibration has a low degree of automation and cannot be applied at scale.
For this reason, the prior art has proposed a standard-sphere-based robot hand-eye calibration method and device (CN108582076A). The method includes: acquiring the point cloud information of a standard sphere in the camera coordinate system and the position information of the robot TCP in the robot base coordinate system; locating the spherical surface of the standard sphere from the point cloud information, determining the sphere-center position, and determining the three-dimensional coordinate value of the sphere center in the camera coordinate system by a maximum-likelihood estimation algorithm; establishing an over-determined transformation-matrix equation for the three-dimensional coordinate values from the camera coordinate system to the robot base coordinate system; and determining the homogeneous transformation matrix into the robot base coordinate system using a least-squares optimization algorithm, thereby realizing hand-eye calibration of the robot.
However, this prior art is mainly aimed at six-degree-of-freedom mechanical arms, and it constrains the robot to perform spatial rotation and translation in a specific pattern. Although it achieves full automation of the calibration process, it requires the robot to have rotational motion capability in three spatial degrees of freedom. Mobile robots such as AGV carts and AMRs have only one spatial degree of freedom, so the prior art is not applicable to them. To fill this technical gap, a new solution is needed in the art.
Disclosure of Invention
The invention mainly aims to provide a mobile robot 3D hand-eye calibration method and device, so as to reduce the degree-of-freedom requirements placed on the robot and improve universality.
In order to achieve the above object, the present invention provides a mobile robot 3D hand-eye calibration method, which comprises the steps of:
S1 arranging the calibration camera so that its viewing range covers all the degrees of freedom of the mobile robot;
S2 recording the point cloud data cP_i, in the camera coordinate system, of a calibration sphere held by the mobile robot, together with its robot pose rP_i in the current robot coordinate system, forming n groups of point cloud data;
S3 extracting the point cloud of the calibration-sphere surface, and determining the coordinate value cT_i of the calibration-sphere center point in each group of point cloud data in the camera coordinate system by a least-squares fitting algorithm;
S4 solving the offset Δt and the hand-eye calibration matrix H according to a first algorithm, from the robot poses rP_i and the corresponding calibration-sphere center coordinates cT_i;
S5 converting the hand-eye calibration matrix H into Euler-angle form and taking the hand-eye calibration matrix result H as the initial value of a nonlinear optimization; then calculating, according to a second algorithm, the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their respective true values, and correcting the hand-eye calibration matrix according to dH and dΔt.
In a possible preferred embodiment, said first algorithm step comprises:
S41 for each robot pose rP_i = (rT_i, rR_i), obtaining the pose information in matrix form by using the conversion relation from Euler angles to a rotation matrix, giving the rotation matrix rR_i;
S42 setting the offset as Δt = [Δx, Δy, Δz]^T; the coordinate value of the calibration-sphere center point at the corresponding moment in the robot coordinate system is then obtained as rT_i + rR_i·Δt;
S43 setting the hand-eye calibration matrix from the camera coordinate system to the robot coordinate system as H; the sphere-center coordinates cT_i in the camera coordinate system transfer into the robot coordinate system as H·cT_i (with cT_i in homogeneous form). Since the sphere center is fixed on the robot, the robot pose and the sphere-center coordinates in the camera coordinate system at the same moment satisfy rT_i + rR_i·Δt = H·cT_i, in which the offset Δt and the hand-eye calibration matrix H are the unknown quantity X to be solved;
S44 establishing one such equation via step S43 for each of the n groups of corresponding robot poses and calibration-sphere center coordinates, and stacking them into the matrix form A·X = b;
S45 solving this linear equation by singular value decomposition to obtain the unknown quantity X, i.e. the offset Δt and the hand-eye calibration matrix H.
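The first algorithm (S41–S45) can be sketched in code. The following Python/NumPy fragment is an illustrative sketch only, not the patent's implementation; all function names are ours. It reads step S43 as the per-sample relation rT_i + rR_i·Δt = R_H·cT_i + t_H (H split into rotation R_H and translation t_H), stacks the n relations into A·X = b with X = [Δt | vec(R_H) | t_H], and solves by an SVD-based least-squares solver. The Euler-angle convention Rz·Ry·Rx is an assumption, since the patent does not fix one.

```python
import numpy as np

def euler_to_matrix(rx, ry, rz):
    """Euler angles (radians) to a rotation matrix; the patent does not fix
    the convention, so the Rz @ Ry @ Rx order here is an assumption."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def solve_linear_calibration(robot_poses, sphere_centers):
    """Stack rR_i*dt - R_H*cT_i - t_H = -rT_i into A.X = b and solve in the
    least-squares sense.  X = [dt (3) | vec(R_H) (9) | t_H (3)], 15 unknowns,
    so n >= 5 samples are needed (3 equations per sample)."""
    A_rows, b_rows = [], []
    for (rT, euler), cT in zip(robot_poses, sphere_centers):
        rR = euler_to_matrix(*euler)
        # kron(I3, cT^T) maps the row-major vec of R_H to R_H @ cT
        A_rows.append(np.hstack([rR,
                                 -np.kron(np.eye(3), cT[None, :]),
                                 -np.eye(3)]))
        b_rows.append(-rT)
    A = np.vstack(A_rows)
    b = np.hstack(b_rows)
    X, *_ = np.linalg.lstsq(A, b, rcond=None)  # SVD-based solver
    dt = X[:3]
    H = np.eye(4)
    H[:3, :3] = X[3:12].reshape(3, 3)
    H[:3, 3] = X[12:15]
    return dt, H
```

With n ≥ 5 well-spread poses the 15 unknowns are determined, which matches the document's preference for n ≥ 5; `np.linalg.lstsq` uses an SVD internally, matching step S45.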
In a possible preferred embodiment, the second algorithm step comprises:
S51 converting the hand-eye calibration matrix H into Euler-angle form, wherein the rotation part R of H and the Euler angles r = [α, β, γ] are related by the rotation-matrix composition R = Rz(γ)·Ry(β)·Rx(α);
S52 denoting by dH and dΔt the differences between the hand-eye calibration matrix H and the offset Δt and their respective true values; in Euler-angle form these are dH = [dα, dβ, dγ] and dΔt = [dΔx, dΔy, dΔz]. The values of dH and dΔt both tend to 0; when dα, dβ, dγ ≈ 0, one may approximate sin dα ≈ dα and cos dα ≈ 1 (likewise for dβ and dγ), so that in matrix form the rotation correction becomes dR ≈ [[1, -dγ, dβ], [dγ, 1, -dα], [-dβ, dα, 1]];
S53 substituting H, dH, Δt and dΔt into the formula of the first algorithm gives the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, the equation can be solved by linear least squares for the compensation quantities dH and dΔt, yielding a more accurate hand-eye calibration matrix H + dH and offset Δt + dΔt.
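The second algorithm can be sketched similarly. The patent leaves the exact correction parameterization open, so the fragment below (illustrative only; all names are ours) takes one plausible reading: it linearizes the per-sample relation rT_i + rR_i·Δt = R_H·cT_i + t_H around the first-algorithm solution using the small-angle rotation correction dR ≈ I + [dω]×, solves for the corrections [dΔt, dω, dτ] by linear least squares, and applies them.

```python
import numpy as np

def skew(v):
    """Cross-product matrix: skew(v) @ u == np.cross(v, u)."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def refine_calibration(dt, H, robot_Rs, robot_Ts, sphere_centers):
    """One Gauss-Newton-style refinement step around (dt, H).  Solves
    rR_i*d_dt + R_H*skew(cT_i)*dw - d_tau = e_i in least squares, where
    e_i is the residual of rT_i + rR_i*dt = R_H*cT_i + t_H.  The exact
    correction parameterization in the patent is under-specified; this is
    one plausible reading."""
    R_H, t_H = H[:3, :3], H[:3, 3]
    A_rows, e_rows = [], []
    for rR, rT, cT in zip(robot_Rs, robot_Ts, sphere_centers):
        e = R_H @ cT + t_H - rT - rR @ dt   # current residual
        A_rows.append(np.hstack([rR, R_H @ skew(cT), -np.eye(3)]))
        e_rows.append(e)
    A = np.vstack(A_rows)
    e = np.hstack(e_rows)
    d, *_ = np.linalg.lstsq(A, e, rcond=None)
    dt_new = dt + d[:3]
    R_new = R_H @ (np.eye(3) + skew(d[3:6]))
    U, _, Vt = np.linalg.svd(R_new)          # re-orthonormalize the rotation
    H_new = np.eye(4)
    H_new[:3, :3] = U @ Vt
    H_new[:3, 3] = t_H + d[6:9]
    return dt_new, H_new
```

Because only the rotation term is approximated, one step already removes most of the residual left by a slightly perturbed initial estimate; in practice the step could be iterated.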
In a possible preferred embodiment, said recording of the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot and of its robot pose rP_i in the current robot coordinate system comprises the following steps:
S21 moving the robot into the camera viewing range and holding it still, then triggering the camera to capture the first group of calibration-sphere point cloud data cP_1; simultaneously recording the robot pose rP_1 in the robot base coordinate system at that moment. The pose rP_1 comprises the coordinate value rT_1 of the robot end effector in the robot coordinate system and its attitude rR_1 = (Rx, Ry, Rz); the attitude information of the robot is expressed as Euler angles, Rx, Ry, Rz being the rotation angles about the X, Y, Z axes respectively;
S22 changing the position and attitude of the robot within the camera viewing range and recording the new robot pose information rP_2; simultaneously the camera captures the calibration-sphere point cloud data cP_2 at that moment;
S23 repeating the above steps until n groups of calibration point cloud data have been obtained, covering the whole viewing range of the camera and the spatial degrees of freedom of the robot.
In a possible preferred embodiment, the step of extracting the calibration-sphere point cloud in step S3 includes: for any group of scanned point cloud data cP_i, locating the point cloud lying on the calibration sphere within the point cloud data by a random sample consensus (RANSAC) algorithm.
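The RANSAC extraction can be sketched as follows (an illustrative Python/NumPy sketch; thresholds, iteration counts, and names are ours, not the patent's). Random 4-point samples define candidate spheres; candidates whose radius disagrees with the known calibration-sphere radius are rejected, and the candidate with the most inliers wins.

```python
import numpy as np

def fit_sphere_4pts(P):
    """Sphere through 4 points via the algebraic form
    x^2 + y^2 + z^2 + D*x + E*y + F*z + G = 0."""
    A = np.hstack([P, np.ones((4, 1))])
    b = -(P ** 2).sum(axis=1)
    D, E, F, G = np.linalg.solve(A, b)
    c = -0.5 * np.array([D, E, F])
    r = np.sqrt(max(c @ c - G, 0.0))
    return c, r

def ransac_sphere(points, radius, dist_tol=1e-3, iters=200, seed=0):
    """Return a boolean inlier mask for the points lying on a sphere of the
    known calibration-sphere radius.  Illustrative RANSAC loop."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(points), 4, replace=False)
        try:
            c, r = fit_sphere_4pts(points[idx])
        except np.linalg.LinAlgError:
            continue                      # degenerate (coplanar) sample
        if not np.isfinite(r) or abs(r - radius) > 10 * dist_tol:
            continue                      # radius far from the known sphere
        # distance of every point from the candidate sphere surface,
        # measured against the *known* radius
        d = np.abs(np.linalg.norm(points - c, axis=1) - radius)
        inliers = d < dist_tol
        if inliers.sum() > best.sum():
            best = inliers
    return best
```

Knowing the physical sphere radius is what makes the radius check possible; a library implementation (e.g. a point-cloud library's sphere segmentation) could be substituted for this loop.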
The invention also provides a mobile robot 3D hand-eye calibration device, which comprises: a 3D calibration camera, a master controller, a processor, and a calibration sphere. The viewing range of the 3D calibration camera covers all the degrees of freedom of the mobile robot. The master controller is connected to the 3D calibration camera so as to record the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot together with its robot pose rP_i in the current robot coordinate system, forming n groups of point cloud data that are transmitted to the processor. The processor extracts the calibration-sphere surface point cloud according to a preset calibration algorithm and determines, by a least-squares fitting algorithm, the coordinate value cT_i of the calibration-sphere center in each group of point cloud data in the camera coordinate system; solves the offset Δt and the hand-eye calibration matrix H according to a first algorithm from the robot poses rP_i and the corresponding calibration-sphere center coordinates cT_i at the same moments; converts the hand-eye calibration matrix H into Euler-angle form and takes the hand-eye calibration matrix result H as the initial value of a nonlinear optimization; and then calculates, according to a second algorithm, the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their respective true values, correcting the hand-eye calibration matrix according to dH and dΔt.
In a possible preferred embodiment, said first algorithm step comprises:
S41 the processor, for each robot pose rP_i = (rT_i, rR_i), obtains the pose information in matrix form by using the conversion relation from Euler angles to a rotation matrix, giving the rotation matrix rR_i;
S42 the offset is set as Δt = [Δx, Δy, Δz]^T, and the coordinate value of the calibration-sphere center point at the corresponding moment in the robot coordinate system is obtained as rT_i + rR_i·Δt;
S43 the hand-eye calibration matrix from the 3D camera coordinate system to the robot coordinate system is set as H; the sphere-center coordinates cT_i in the camera coordinate system transfer into the robot coordinate system as H·cT_i (with cT_i in homogeneous form). Since the sphere center is fixed on the robot, the robot pose and the sphere-center coordinates in the camera coordinate system at the same moment satisfy rT_i + rR_i·Δt = H·cT_i, in which the offset Δt and the hand-eye calibration matrix H are the unknown quantity X to be solved;
S44 the processor establishes one such equation via step S43 for each of the n groups of corresponding robot poses and calibration-sphere center coordinates, and stacks them into the matrix form A·X = b;
S45 the processor solves this linear equation by singular value decomposition to obtain the unknown quantity X, i.e. the offset Δt and the hand-eye calibration matrix H.
In a possible preferred embodiment, the second algorithm step comprises:
S51 the processor converts the hand-eye calibration matrix H into Euler-angle form, wherein the rotation part R of H and the Euler angles r = [α, β, γ] are related by the rotation-matrix composition R = Rz(γ)·Ry(β)·Rx(α);
S52 the processor sets the differences between the hand-eye calibration matrix H and the offset Δt and their respective true values to dH and dΔt; in Euler-angle form these are dH = [dα, dβ, dγ] and dΔt = [dΔx, dΔy, dΔz]. The values of dH and dΔt both tend to 0; when dα, dβ, dγ ≈ 0, one may approximate sin dα ≈ dα and cos dα ≈ 1 (likewise for dβ and dγ), so that in matrix form the rotation correction becomes dR ≈ [[1, -dγ, dβ], [dγ, 1, -dα], [-dβ, dα, 1]];
S53 the processor substitutes H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, the equation can be solved by linear least squares for the compensation quantities dH and dΔt, yielding a more accurate hand-eye calibration matrix H + dH and offset Δt + dΔt.
In a possible preferred embodiment, the step in which the master controller records the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot and the corresponding robot pose rP_i in the current robot coordinate system comprises:
S21 the master controller moves the robot into the camera viewing range and holds it still, then controls the 3D camera to capture the first group of calibration-sphere point cloud data cP_1; simultaneously it records the robot pose rP_1 in the robot base coordinate system at that moment. The pose rP_1 comprises the coordinate value rT_1 of the robot end effector in the robot coordinate system and its attitude rR_1 = (Rx, Ry, Rz); the attitude information of the robot is expressed as Euler angles, Rx, Ry, Rz being the rotation angles about the X, Y, Z axes respectively;
S22 the master controller controls the robot to change its position and attitude within the 3D camera viewing range and records the new robot pose information rP_2; simultaneously the 3D camera captures the calibration-sphere point cloud data cP_2 at that moment;
S23 the master controller repeats the above steps until n groups of calibration point cloud data have been obtained, covering the whole viewing range of the 3D camera and the spatial degrees of freedom of the robot.
In a possible preferred embodiment, the step in which the processor extracts the calibration-sphere point cloud includes: for any group of scanned point cloud data cP_i, locating the point cloud lying on the calibration sphere within the point cloud data by a random sample consensus (RANSAC) algorithm.
According to the mobile robot 3D hand-eye calibration method and device provided by the invention, the beneficial effects in each embodiment include:
(1) The method is universal: the calibration process only requires the robot to carry the calibration sphere freely within the field of view of the 3D camera while the robot pose information and the point cloud information acquired by the 3D camera at the same moment are recorded. The method places no restriction on the motion pattern of the robot during calibration and can therefore be used for the hand-eye calibration of robots of many types, such as mechanical arms, AGV carts, and AMRs.
(2) The method automates the collection of calibration data, the identification of the calibration-sphere center coordinates, and the solution of the calibration matrix. This not only enables batch application but also keeps human error out of the calibration result, improving calibration accuracy and stability.
(3) The method introduces a further parameter-optimization step, which effectively suppresses the amplification of robot positioning error by the calibration algorithm and keeps the overall accuracy of the system on par with the positioning accuracy of the robot.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of a calibration system of a robot 3D hand-eye calibration method according to the present invention;
fig. 2 is a flowchart of the robot 3D hand-eye calibration method of the present invention.
Detailed Description
The following describes embodiments of the present invention in detail. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications can be made by persons skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions.
The mobile robot 3D hand-eye calibration method provided by the invention is applicable to many kinds of robots, such as mechanical arms, AGV carts, and AMRs. Using the calibration sphere as a medium, the robot is tracked as it carries the calibration sphere freely through the field of view of the 3D camera, so that the hand-eye calibration of the robot and the 3D camera is completed automatically.
Referring to fig. 1 to 2, a first aspect of the present invention provides a mobile robot 3D hand-eye calibration device, which comprises a 3D calibration camera, a master controller, a processor, and a calibration sphere. The viewing range of the 3D calibration camera covers all the degrees of freedom of the mobile robot. The master controller is connected to the 3D calibration camera so as to record the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot together with its robot pose rP_i in the current robot coordinate system, forming n groups of point cloud data that are transmitted to the processor. The processor extracts the calibration-sphere surface point cloud according to a preset calibration algorithm and determines, by a least-squares fitting algorithm, the coordinate value cT_i of the calibration-sphere center in each group of point cloud data in the camera coordinate system; solves the offset Δt and the hand-eye calibration matrix H according to a first algorithm from the robot poses rP_i and the corresponding calibration-sphere center coordinates cT_i at the same moments; converts the hand-eye calibration matrix H into Euler-angle form and takes the hand-eye calibration matrix result H as the initial value of a nonlinear optimization; and then calculates, according to a second algorithm, the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their respective true values, correcting the hand-eye calibration matrix according to dH and dΔt.
Wherein the first algorithm step comprises:
S41 the processor, for each robot pose rP_i = (rT_i, rR_i), obtains the pose information in matrix form by using the conversion relation from Euler angles to a rotation matrix, giving the rotation matrix rR_i;
S42 the offset is set as Δt = [Δx, Δy, Δz]^T, and the coordinate value of the calibration-sphere center point at the corresponding moment in the robot coordinate system is obtained as rT_i + rR_i·Δt;
S43 the hand-eye calibration matrix from the 3D camera coordinate system to the robot coordinate system is set as H; the sphere-center coordinates cT_i in the camera coordinate system transfer into the robot coordinate system as H·cT_i (with cT_i in homogeneous form). Since the sphere center is fixed on the robot, the robot pose and the sphere-center coordinates in the camera coordinate system at the same moment satisfy rT_i + rR_i·Δt = H·cT_i;
S44 the processor establishes one such equation via step S43 for each of the n groups of corresponding robot poses and calibration-sphere center coordinates, and stacks them into the matrix form A·X = b;
S45 the processor solves this linear equation by singular value decomposition to obtain the unknown quantity X, i.e. the offset Δt and the hand-eye calibration matrix H.
Wherein the second algorithm step comprises:
S51 the processor converts the hand-eye calibration matrix H into Euler-angle form, wherein the rotation part R of H and the Euler angles r = [α, β, γ] are related by the rotation-matrix composition R = Rz(γ)·Ry(β)·Rx(α);
S52 the processor sets the differences between the hand-eye calibration matrix H and the offset Δt and their respective true values to dH and dΔt; in Euler-angle form these are dH = [dα, dβ, dγ] and dΔt = [dΔx, dΔy, dΔz]. The values of dH and dΔt both tend to 0; when dα, dβ, dγ ≈ 0, one may approximate sin dα ≈ dα and cos dα ≈ 1 (likewise for dβ and dγ), so that in matrix form the rotation correction becomes dR ≈ [[1, -dγ, dβ], [dγ, 1, -dα], [-dβ, dα, 1]];
S53 the processor substitutes H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, the equation can be solved by linear least squares for the compensation quantities dH and dΔt, yielding a more accurate hand-eye calibration matrix H + dH and offset Δt + dΔt.
The master controller records the point cloud data cP_i, in the camera coordinate system, of the calibration sphere held by the mobile robot and its robot pose rP_i in the current robot coordinate system through the following steps:
S21 the master controller moves the robot into the camera viewing range and holds it still, then controls the 3D camera to capture the first group of calibration-sphere point cloud data cP_1; simultaneously it records the robot pose rP_1 in the robot base coordinate system at that moment. The pose rP_1 comprises the coordinate value rT_1 of the robot end effector in the robot coordinate system and its attitude rR_1 = (Rx, Ry, Rz); the attitude information of the robot is expressed as Euler angles, Rx, Ry, Rz being the rotation angles about the X, Y, Z axes respectively;
S22 the master controller controls the robot to change its position and attitude within the 3D camera viewing range and records the new robot pose information rP_2; simultaneously the 3D camera captures the calibration-sphere point cloud data cP_2 at that moment;
And S23, repeating the steps by the main controller for n times to cover the whole view range of the 3D camera and the space freedom degree of the robot to obtain n groups of calibrated point cloud data.
The step of the processor extracting the calibration-sphere surface point cloud comprises: for any group of scanned point cloud data cPi, locating the point cloud on the calibration sphere within the point cloud data using a random sample consensus (RANSAC) algorithm.
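As an illustration of this step, the sphere segmentation can be sketched as below. This is a minimal RANSAC sketch, not the patent's actual implementation; the function names and the parameters `radius`, `tol` and `iters` are illustrative assumptions (using the known ball radius to reject bad models is one common design choice):

```python
import numpy as np

def fit_sphere_4pts(pts):
    # Exact sphere through 4 points: 2*(p_i - p_0) . c = |p_i|^2 - |p_0|^2
    A = 2.0 * (pts[1:] - pts[0])
    b = np.sum(pts[1:] ** 2, axis=1) - np.sum(pts[0] ** 2)
    centre = np.linalg.solve(A, b)
    return centre, np.linalg.norm(pts[0] - centre)

def ransac_sphere(cloud, radius, iters=500, tol=0.003, seed=0):
    # Repeatedly fit a sphere to 4 random points; keep the model whose
    # radius matches the known calibration ball and has the most inliers.
    rng = np.random.default_rng(seed)
    best = np.zeros(len(cloud), dtype=bool)
    for _ in range(iters):
        sample = cloud[rng.choice(len(cloud), 4, replace=False)]
        try:
            centre, r = fit_sphere_4pts(sample)
        except np.linalg.LinAlgError:
            continue  # degenerate (near-coplanar) sample
        if abs(r - radius) > 2 * tol:
            continue  # model contradicts the known ball radius
        inliers = np.abs(np.linalg.norm(cloud - centre, axis=1) - r) < tol
        if inliers.sum() > best.sum():
            best = inliers
    return cloud[best]
```

The returned inlier points are then handed to the least-squares sphere fit of the next step.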
As shown in fig. 1 to fig. 2, in another aspect, the present invention further provides a mobile robot 3D hand-eye calibration method, which in a preferred embodiment, includes the steps of:
s1 calibrating the camera viewing range to cover all the degrees of freedom of the mobile robot, wherein in this embodiment, the calibrating step includes:
1) The calibration ball is fixed on the robot end effector so that no relative displacement between the calibration ball and the robot can occur during the calibration process.
2) The robot is moved into the field of view of the 3D camera so that the 3D camera can image the calibration ball.
3) The robot is moved within the field of view of the 3D camera, and the pose information rPi of the robot in the robot coordinate system is recorded at each move, including the position rTi and the attitude rRi(Rx, Ry, Rz); simultaneously, the point cloud data cPi of the calibration sphere at the corresponding moment is collected with the 3D camera. In this way, n groups of calibration-sphere point clouds cP at different robot poses and the simultaneous robot pose information rP = (rT, rR) are acquired. All the degrees of freedom the robot can reach should be covered; in this implementation, n is preferably a natural number greater than or equal to 5, with i ∈ [1, n].
4) For each group of scanned point cloud data cPi, the spherical-surface point cloud is automatically extracted using the random sample consensus (RANSAC) algorithm, and least-squares sphere fitting is then performed on it to obtain the coordinate value cTi of the sphere center point in the camera coordinate system. Each sphere-center coordinate cTi and the robot pose rPi in the base coordinate system at the same moment together constitute one group of calibration data, with i ∈ [1, n].
5) A linear equation system is constructed from the n groups of robot pose information and the sphere-center coordinates of the attached calibration sphere in the camera coordinate system, and the transformation matrix H from the camera coordinate system to the robot coordinate system is solved by least squares; this matrix H is the hand-eye calibration matrix from the 3D camera to the robot.
6) The hand-eye calibration matrix H obtained in the previous step is optimized to improve the calibration accuracy.
Specifically, the step (1) further includes the following steps:
1.1 The calibration ball is mounted on the robot end effector; different robot types have different working mechanisms, so the exact mounting position on the end effector varies slightly. During calibration, it must be ensured that the calibration ball is fixed to the robot and cannot slide, and that the robot carries the calibration ball as it moves.
Further, the step (2) includes the following steps:
2.1 The robot is moved into the field of view of the 3D camera and then held still, and the 3D camera is triggered to capture the first group of calibration-sphere point cloud data cP1. Simultaneously, the pose rP1 of the robot in the robot base coordinate system at that moment is recorded; the pose rP1 includes the coordinate values rT1 of the robot end effector in the robot base coordinate system and its attitude rR1(Rx, Ry, Rz), the attitude information being expressed as Euler angles Rx, Ry, Rz, which denote the rotation angles about the X, Y and Z axes respectively.
2.2 The position and attitude of the robot are changed within the field of view of the 3D camera and the new robot pose information rP2 is recorded; simultaneously, the 3D camera is used to capture the calibration-sphere point cloud data cP2 at that moment.
2.3 The above steps are repeated n times, covering the whole field of view of the 3D camera and the spatial degrees of freedom of the robot as far as possible, to obtain n groups of calibration data, where n is a natural number greater than or equal to 5.
Further, the step (3) includes the following steps:
3.1 For any group of scanned point clouds cPi, the local point cloud on the calibration sphere is initially located within the full point cloud using the RANSAC algorithm, giving the model parameters of an initial calibration sphere.
3.2 Least-squares sphere fitting is performed on the local point cloud on the calibration sphere to obtain the accurate sphere-center coordinate cTi of the calibration sphere.
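The least-squares sphere fit of step 3.2 can be sketched as follows. This is an illustrative algebraic formulation, not necessarily the patent's exact one: expanding |p − c|² = r² gives 2p·c + w = |p|², with w = r² − |c|², which is linear in the centre c and the scalar w:

```python
import numpy as np

def fit_sphere_lsq(pts):
    # Algebraic least-squares sphere fit: solve 2 p.c + w = |p|^2 for the
    # centre c and w = r^2 - |c|^2, then recover r = sqrt(w + |c|^2).
    A = np.c_[2.0 * pts, np.ones(len(pts))]
    b = np.sum(pts ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre, w = sol[:3], sol[3]
    radius = np.sqrt(w + centre @ centre)
    return centre, radius
```

Applied to the RANSAC inlier points, this yields the sphere-center coordinate cTi used as calibration data.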
Further, the step (4) comprises the following steps:
4.1 From each robot pose rPi, the pose information in matrix form (rRi, rTi) is obtained via the transformation relation from Euler angles to a rotation matrix.
4.2 Since the coordinate read out by the robot and the sphere center point are not the same point, a fixed offset Δt = [Δx, Δy, Δz]ᵀ between the two is assumed; the offset Δt is an unknown quantity that needs to be solved.
4.3 From the pose information of the robot and the offset Δt between the sphere center point and the robot end effector, the coordinate value of the sphere center point at the corresponding moment in the robot coordinate system can be obtained.
4.4 Suppose the hand-eye calibration matrix from the camera coordinate system to the robot coordinate system is H, through which the sphere-center coordinate in the camera coordinate system is transferred into the robot coordinate system. From the robot pose and the sphere-center coordinate in the camera coordinate system at the same moment, the following equation can be established:
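The equation images for steps 4.3–4.4 did not survive reproduction. Based on the variables defined above, and on the 15-dimensional unknown vector X reported in the experimental example, the constraint can plausibly be reconstructed as follows; this is a reconstruction under that assumption, not the patent's original typesetting:

```latex
% Sphere centre expressed in the robot frame, reached two ways: through
% the hand-eye matrix H (camera -> robot), and through the robot pose
% (rR_i, rT_i) plus the fixed offset \Delta t:
\[
  H \begin{bmatrix} cT_i \\ 1 \end{bmatrix} = rR_i\,\Delta t + rT_i ,
  \qquad i \in [1, n].
\]
% Collecting the unknowns (\Delta t and the 12 entries of the upper 3x4
% block of H) on one side, each sample contributes three rows of a
% stacked linear system A X = b with X \in \mathbb{R}^{15}:
\[
  rR_i\,\Delta t - H \begin{bmatrix} cT_i \\ 1 \end{bmatrix} = -\,rT_i .
\]
```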
4.5 From the n groups of corresponding robot pose information and calibration sphere-center coordinates, a simultaneous system of equations is established through step 4.4, specifically as follows:
The above equations are assembled into the matrix form A·X = b:
The unknown quantity X (i.e., the vector X) can be obtained by solving this linear system using singular value decomposition; the vector X contains the solved offset Δt and the hand-eye calibration matrix H, where H is the hand-eye calibration matrix from the camera coordinate system to the robot coordinate system.
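Assuming each sample yields the constraint rRi·Δt − H·[cTi; 1] = −rTi (an inference from the variables defined in steps 4.1–4.4 and the 15-dimensional X of the experimental example), steps 4.1–4.5 can be sketched as below; the function and variable names are illustrative, not from the patent:

```python
import numpy as np

def solve_hand_eye(rR, rT, cT):
    # One-step solve of the offset dt and the 3x4 hand-eye matrix H from
    # n robot poses (rR[i], rT[i]) and sphere centres cT[i] in the camera
    # frame. Each sample gives 3 equations; the unknown vector is
    # X = [dt (3 entries), H in row-major order (12 entries)].
    n = len(rR)
    A = np.zeros((3 * n, 15))
    b = np.zeros(3 * n)
    for i in range(n):
        ct1 = np.append(cT[i], 1.0)  # homogeneous sphere centre
        rows = slice(3 * i, 3 * i + 3)
        A[rows, :3] = rR[i]
        for k in range(3):           # -(row k of H) acts on [cT; 1]
            A[3 * i + k, 3 + 4 * k: 3 + 4 * (k + 1)] = -ct1
        b[rows] = -rT[i]
    X, *_ = np.linalg.lstsq(A, b, rcond=None)  # SVD-based least squares
    return X[:3], X[3:].reshape(3, 4)          # dt, H = [R | t]
```

With full 3-DOF rotations the stacked system has full column rank; the AGV case in the next paragraph lacks Rx/Ry rotation, which is exactly why the third-row constraint must be corrected there.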
Furthermore, in another preferred embodiment, since AGV-type robots do not have the ability to rotate about the Rx and Ry axes, the third row of the equations in step 4.5 above may be modified to suit such robots:
the simplified form is:
It should be noted that the Δz and b3 solved from the above equation are not their true values. Because the radius of the calibration sphere and the design parameters of the robot are known, the true value of the calibration sphere center in the robot coordinate system can be obtained from these known parameters, and hence the true value of b3.
From this, a hand-eye calibration matrix H is obtained:
further, the step (5) comprises the following steps:
5.1 The hand-eye calibration matrix H is converted into Euler angle form, and the hand-eye calibration matrix result H obtained by the linear solve in step (4) is taken as the initial value of the nonlinear optimization, wherein the conversion between the rotation matrix R and the Euler angles is as follows:
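The conversion formula itself is an image that did not survive reproduction. The sketch below implements one common convention (R = Rz·Ry·Rx, angles in radians) and its inverse; the convention is an assumption, not necessarily the patent's exact one:

```python
import numpy as np

def euler_to_matrix(r):
    # Euler angles r = [alpha, beta, gamma] (rotations about X, Y, Z)
    # to a rotation matrix, assuming the composition R = Rz @ Ry @ Rx.
    a, b, g = r
    Rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    Rz = np.array([[np.cos(g), -np.sin(g), 0], [np.sin(g), np.cos(g), 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def matrix_to_euler(R):
    # Inverse conversion for the same convention (non-degenerate case,
    # |beta| < pi/2, so that cos(beta) > 0).
    beta = -np.arcsin(R[2, 0])
    alpha = np.arctan2(R[2, 1], R[2, 2])
    gamma = np.arctan2(R[1, 0], R[0, 0])
    return np.array([alpha, beta, gamma])
```

Libraries such as SciPy's `Rotation.from_euler` provide the same conversions with explicit axis-order strings, which avoids convention mistakes.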
5.2 Let dH and dΔt be the differences between the hand-eye calibration matrix H and the offset Δt obtained in step (4) and their respective true values; expressed in Euler angle form, they are:
When dα, dβ, dγ ≈ 0, the small-angle approximations sin dα ≈ dα and cos dα ≈ 1 can be used. Then:
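With these substitutions, the correction rotation built from (dα, dβ, dγ) reduces, to first order, to the standard skew-symmetric form; since the formula image is missing, the following is a reconstruction of that standard result rather than the patent's own typesetting:

```latex
% First-order (small-angle) form of the correction rotation dR: with
% \sin d\theta \approx d\theta, \cos d\theta \approx 1, and products of
% small terms dropped,
\[
  dR \;\approx\;
  \begin{bmatrix}
    1        & -d\gamma & d\beta   \\
    d\gamma  & 1        & -d\alpha \\
    -d\beta  & d\alpha  & 1
  \end{bmatrix}
  \;=\; I_3 + [\,d\boldsymbol{r}\,]_\times ,
\]
% which is what makes the refined equation of step 5.3 linear in the
% unknown compensation quantities.
```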
5.3 Substituting H, dH, Δt and dΔt into the formula of step (4) yields the equation A·(Δt + dΔt) = (H + dH)·b. Since H and Δt are known, this equation can be solved by linear least squares to obtain the compensation quantities dH and dΔt, thereby giving a more accurate hand-eye calibration matrix.
Experimental Example
In order to describe the scheme of the present invention more specifically, the following will describe in detail an embodiment of the mobile robot 3D hand-eye calibration of the present invention by taking an AGV forklift and a 3D camera as examples, with reference to the accompanying drawings.
Compared with a robotic arm, an AGV forklift can only translate in the three directions X, Y, Z and rotate about the Z axis; introducing the application of the method on an AGV forklift therefore demonstrates the method's strong generality.
Meanwhile, the calibration method of this scheme abandons the traditional two-step algorithm, in which the offset between the calibration ball and the robot is solved first and the hand-eye calibration matrix is solved second; instead, the ball-robot offset and the hand-eye calibration matrix are solved together in a single step, so the final result is obtained directly and the computation is more efficient.
Specifically, in this experimental example, the method for calibrating the 3D hand-eye of the mobile robot includes the steps of:
s1: and collecting calibration data.
As shown in fig. 1, the calibration sphere is fixed on the fork of the AGV forklift, and the radius dr of the calibration sphere is known in advance to be 0.05 m, so the height value of the sphere center in the robot coordinate system can be determined. The AGV forklift is controlled to move within the field of view of the 3D camera, each movement changing the three spatial coordinates X, Y, Z and the spatial attitude Rz relative to the previous one. Here, X and Y are the coordinate values of the forklift on the two-dimensional map, and Z is the fork-tooth height value. Because the AGV forklift only has the capability of yaw rotation, and no roll or pitch rotation, the attitude information of the forklift is expressed by the single Euler angle Rz of rotation about the Z axis.
After each movement of the forklift finishes, the current pose information rP = (xr, yr, zr, Rz) of the forklift is recorded, while the 3D camera is used to capture a point cloud of the calibration sphere; the pose information recorded after each move corresponds one-to-one with the point cloud data acquired by the 3D camera. For each group of point cloud data, the points on the calibration sphere surface are segmented from the scene using the RANSAC method, least-squares fitting is then applied to the spherical points, and the sphere-center coordinate cT = (xc, yc, zc) is precisely located. For a specific implementation see: Shi Hong, Wan Yanmin, Yan Puwei. An automatic detection method for target balls [J]. Bulletin of Surveying and Mapping, 2013(S1): 58-60.
The calibration data collected in this example are shown in table 1.
TABLE 1
S2: and solving the unknown quantity X.
A linear system A·X = b is constructed from the calibration data acquired in S1, and singular value decomposition is applied to obtain the least-squares solution for the vector X. The vector X comprises the offset Δt = [Δx, Δy, Δz]ᵀ and the hand-eye calibration matrix H.
The vectors X, Δ t, and H calculated in this example are:
X=[0.7094 -1.10338 -1.19314 -0.666154 0.414944 -0.610368 1.34539 0.724705 0.297334 -0.580368 0.238512 -0.0354099 -0.856986 -0.521586 1.19314]T
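The 15 entries above are consistent with X stacking the 3-vector Δt followed by the 3×4 hand-eye matrix H in row-major order; the last column of H then reproduces the translation part [1.34539, 0.238512, 1.19314] of the Euler-form H reported in S4. A sketch under that inferred layout:

```python
import numpy as np

# Unpack the solved 15-vector: offset dt first, then the 3x4 hand-eye
# matrix H row by row (layout inferred from the dimensions, not stated
# explicitly in the text).
X = np.array([0.7094, -1.10338, -1.19314,
              -0.666154, 0.414944, -0.610368, 1.34539,
              0.724705, 0.297334, -0.580368, 0.238512,
              -0.0354099, -0.856986, -0.521586, 1.19314])
dt = X[:3]               # offset between sphere centre and robot read-out point
H = X[3:].reshape(3, 4)  # [R | t], camera frame -> robot frame
```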
s3: and correcting the hand-eye calibration matrix H.
Since the AGV forklift does not have the capability of rotating about the Rx and Ry axes, the constructed equations have no constraint capability in the Z-axis direction. In this experimental example, the true Z-direction coordinate of the calibration sphere center in the robot coordinate system is known in advance; the corrected true values of Δt and H are:
s4: and solving the optimization parameters dH and d delta t.
The unknown quantities dH and dΔt are constructed and substituted into the linear equation A·X = b, from which the optimization parameters dH and dΔt are solved. In this experimental example, the corrected hand-eye calibration matrix converted into Euler angle form is
H = [1.34539, 0.238512, 1.19314, 132.589, 2.029, -121.325], and the optimization matrix is
dH = [0.041, 0.003, 0.021, 1.325, -0.937, 0.534]. The finally obtained hand-eye calibration matrix is:
therefore, the offset of the calibration ball and the robot and the one-step solution of the hand-eye calibration matrix can be realized, and the calculation efficiency is improved.
The preferred embodiments of the invention disclosed above are intended to be illustrative only. The preferred embodiments are not intended to be exhaustive or to limit the invention to the precise embodiments disclosed. Obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, to thereby enable others skilled in the art to best utilize the invention. The invention is limited only by the claims and the full scope and equivalents thereof, and any modification, equivalent replacement, or improvement made within the spirit and principle of the invention should be included in the protection scope of the invention.
It will be appreciated by those skilled in the art that, in addition to implementing the system, apparatus and various modules thereof provided by the present invention in the form of pure computer readable program code, the same procedures may be implemented entirely by logically programming method steps such that the system, apparatus and various modules thereof provided by the present invention are implemented in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system, the device and the modules thereof provided by the present invention can be considered as a hardware component, and the modules included in the system, the device and the modules thereof for implementing various programs can also be considered as structures in the hardware component; modules for performing various functions may also be considered to be both software programs for performing the methods and structures within hardware components.
In addition, all or part of the steps of the method according to the above embodiments may be implemented by a program instructing related hardware, where the program is stored in a storage medium and includes several instructions to enable a single chip, a chip, or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In addition, any combination of various different implementation manners of the embodiments of the present invention is also possible, and the embodiments of the present invention should be considered as disclosed in the embodiments of the present invention as long as the combination does not depart from the spirit of the embodiments of the present invention.
Claims (10)
1. A3D hand-eye calibration method for a mobile robot is characterized by comprising the following steps:
S1: calibrating the camera viewing range to cover all the degrees of freedom of the mobile robot;
S2: recording the point cloud data cPi, in the camera coordinate system, of the calibration sphere held by the mobile robot, together with its robot pose rPi in the current robot coordinate system, forming n groups of point cloud data;
S3: extracting the point cloud of the calibration sphere surface, and determining, by a least-squares fitting algorithm, the coordinate value cTi of the calibration sphere center point in each group of point cloud data under the camera coordinate system;
S4: solving the offset Δt and a hand-eye calibration matrix H according to a first algorithm, from the robot poses rPi and the corresponding calibration sphere-center coordinates cTi;
S5: converting the hand-eye calibration matrix H into Euler angle form and taking the hand-eye calibration matrix result H as the initial value of a nonlinear optimization; calculating, according to a second algorithm, the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their respective true values, and correcting the hand-eye calibration matrix according to the differences dH and dΔt.
2. The method for 3D hand-eye calibration of a mobile robot according to claim 1, wherein the first algorithm step comprises:
S41: obtaining, from each robot pose rPi, the pose information in matrix form using the transformation relation from Euler angles to a rotation matrix;
S42: setting the offset as Δt = [Δx, Δy, Δz]ᵀ, from which the coordinate value of the calibration sphere center point at the corresponding moment in the robot coordinate system can be obtained;
S43: setting the hand-eye calibration matrix from the camera coordinate system to the robot coordinate system as H, through which the sphere-center coordinate in the camera coordinate system is transferred into the robot coordinate system, and establishing an equation relating the robot pose and the sphere-center coordinate in the camera coordinate system at the same moment, wherein the offset Δt and the hand-eye calibration matrix H form the unknown quantity X to be solved;
S44: establishing equations via step S43 from the n groups of corresponding robot poses and calibration sphere-center coordinates, and assembling them into the matrix form A·X = b;
S45: solving the linear system using singular value decomposition to obtain the unknown quantity X, namely the offset Δt and the hand-eye calibration matrix H.
3. The 3D hand-eye calibration method for mobile robots according to claim 2, characterized in that the second algorithm step comprises:
S51: converting the hand-eye calibration matrix H into Euler angle form, wherein the conversion relationship between the rotation matrix R and the Euler angles r = [α, β, γ] is as follows:
S52: setting the differences between the hand-eye calibration matrix H and the offset Δt and their true values as dH and dΔt respectively, expressed in Euler angle form with dΔt = [dΔx, dΔy, dΔz]; the values of dH and dΔt both tend to 0, and when dα, dβ, dγ ≈ 0, the approximations sin dα ≈ dα and cos dα ≈ 1 can be used, then:
S53: substituting H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, the equation can be solved by linear least squares to obtain the compensation quantities dH and dΔt, thereby obtaining a more accurate hand-eye calibration matrix.
4. The 3D hand-eye calibration method for the mobile robot as claimed in claim 1, wherein recording the point cloud data cPi, in the camera coordinate system, of the calibration sphere held by the mobile robot, together with its robot pose rPi in the current robot coordinate system, comprises the following steps:
S21: moving the robot into the camera viewing range and then holding it still, and triggering the camera to capture the first group of calibration-sphere point cloud data cP1; simultaneously recording the pose rP1 of the robot in the robot base coordinate system at that moment, the pose rP1 including the coordinate values rT1 of the robot end effector in the robot coordinate system and its attitude rR1(Rx, Ry, Rz), the attitude information of the robot being expressed as Euler angles Rx, Ry, Rz, which denote the rotation angles about the X, Y and Z axes respectively;
S22: changing the position and attitude of the robot within the camera viewing range, and recording the new robot pose information rP2; simultaneously, the camera captures the calibration-sphere point cloud data cP2 at that moment;
S23: repeating the above steps n times to cover the whole viewing range of the camera and the spatial degrees of freedom of the robot, obtaining n groups of calibration point cloud data.
5. The 3D hand-eye calibration method for the mobile robot as claimed in claim 1, wherein the step of extracting the calibration-sphere surface point cloud in step S3 comprises: for any group of scanned point cloud data cPi, locating the point cloud on the calibration sphere within the point cloud data using a random sample consensus algorithm.
6. A mobile robot 3D hand-eye calibration device, comprising: a 3D calibration camera, a main controller, a processor and a calibration ball, wherein the viewing range of the 3D calibration camera covers all the degrees of freedom of the mobile robot; the main controller is connected with the 3D calibration camera to record the point cloud data cPi, in the camera coordinate system, of the calibration ball held by the mobile robot, together with its robot pose rPi in the current robot coordinate system, forming n groups of point cloud data which are transmitted to the processor; the processor extracts the point cloud of the calibration sphere surface according to a preset calibration algorithm, determines, by a least-squares fitting algorithm, the coordinate value cTi of the calibration sphere center point in each group of point cloud data under the camera coordinate system, solves the offset Δt and the hand-eye calibration matrix H according to a first algorithm from the simultaneous robot poses rPi and the corresponding calibration sphere-center coordinates cTi, converts the hand-eye calibration matrix H into Euler angle form and takes the hand-eye calibration matrix result as the initial value of a nonlinear optimization, then calculates, according to a second algorithm, the differences dH and dΔt between the hand-eye calibration matrix H and the offset Δt and their respective true values, and corrects the hand-eye calibration matrix according to the differences dH and dΔt.
7. The mobile robot 3D hand-eye calibration device of claim 6, wherein the first algorithm step comprises:
S41: the processor obtains, from each robot pose rPi, the pose information in matrix form using the transformation relation from Euler angles to a rotation matrix;
S42: the processor sets the offset as Δt = [Δx, Δy, Δz]ᵀ, from which the coordinate value of the calibration sphere center point at the corresponding moment in the robot coordinate system can be obtained;
S43: the processor sets the hand-eye calibration matrix from the 3D camera coordinate system to the robot coordinate system as H, through which the sphere-center coordinate in the camera coordinate system is transferred into the robot coordinate system, and establishes an equation relating the robot pose and the sphere-center coordinate in the camera coordinate system at the same moment, wherein the offset Δt and the hand-eye calibration matrix H form the unknown quantity X to be solved;
S44: the processor establishes equations via step S43 from the n groups of corresponding robot poses and calibration sphere-center coordinates, and assembles them into the matrix form A·X = b;
S45: the processor solves the linear system using singular value decomposition to obtain the unknown quantity X, namely the offset Δt and the hand-eye calibration matrix H.
8. The mobile robot 3D hand-eye calibration device of claim 7, wherein the second algorithm step comprises:
S51: the processor converts the hand-eye calibration matrix H into Euler angle form, wherein the conversion relationship between the rotation matrix R and the Euler angles r = [α, β, γ] is as follows:
S52: the processor sets the differences between the hand-eye calibration matrix H and the offset Δt and their true values as dH and dΔt respectively, expressed in Euler angle form with dΔt = [dΔx, dΔy, dΔz]; the values of dH and dΔt both tend to 0, and when dα, dβ, dγ ≈ 0, the approximations sin dα ≈ dα and cos dα ≈ 1 can be used, then:
S53: the processor substitutes H, dH, Δt and dΔt into the formula of the first algorithm to obtain the equation A·(Δt + dΔt) = (H + dH)·b; since H and Δt are known, the equation can be solved by linear least squares to obtain the compensation quantities dH and dΔt, thereby obtaining a more accurate hand-eye calibration matrix.
9. The mobile robot 3D hand-eye calibration device as claimed in claim 6, wherein the main controller recording the point cloud data cPi, in the camera coordinate system, of the calibration sphere held by the mobile robot, together with its robot pose rPi in the current robot coordinate system, comprises the following steps:
S21: the main controller moves the robot into the camera viewing range and then holds it still, and controls the 3D camera to capture the first group of calibration-sphere point cloud data cP1; simultaneously it records the pose rP1 of the robot in the robot base coordinate system at that moment, the pose rP1 including the coordinate values rT1 of the robot end effector in the robot coordinate system and its attitude rR1(Rx, Ry, Rz), the attitude information of the robot being expressed as Euler angles Rx, Ry, Rz, which denote the rotation angles about the X, Y and Z axes respectively;
S22: the main controller controls the robot to change its position and attitude within the 3D camera viewing range, and records the new robot pose information rP2; simultaneously, the 3D camera captures the calibration-sphere point cloud data cP2 at that moment;
S23: the main controller repeats the above steps n times to cover the whole viewing range of the 3D camera and the spatial degrees of freedom of the robot, obtaining n groups of calibration point cloud data.
10. The mobile robot 3D hand-eye calibration device as claimed in claim 6, wherein the processor extracting the calibration-sphere surface point cloud comprises: for any group of scanned point cloud data cPi, locating the point cloud on the calibration sphere within the point cloud data using a random sample consensus algorithm.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110689530.1A CN113362396B (en) | 2021-06-21 | 2021-06-21 | Mobile robot 3D hand-eye calibration method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110689530.1A CN113362396B (en) | 2021-06-21 | 2021-06-21 | Mobile robot 3D hand-eye calibration method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113362396A true CN113362396A (en) | 2021-09-07 |
CN113362396B CN113362396B (en) | 2024-03-26 |
Family
ID=77535498
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110689530.1A Active CN113362396B (en) | 2021-06-21 | 2021-06-21 | Mobile robot 3D hand-eye calibration method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113362396B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113814987A (en) * | 2021-11-24 | 2021-12-21 | 季华实验室 | Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium |
CN114147728A (en) * | 2022-02-07 | 2022-03-08 | 杭州灵西机器人智能科技有限公司 | Universal robot eye on-hand calibration method and system |
CN114347027A (en) * | 2022-01-08 | 2022-04-15 | 天晟智享(常州)机器人科技有限公司 | Pose calibration method of 3D camera relative to mechanical arm |
CN116038721A (en) * | 2023-04-03 | 2023-05-02 | 广东工业大学 | Hand-eye calibration method and system without kinematic participation |
CN117409080A (en) * | 2023-11-10 | 2024-01-16 | 广州市斯睿特智能科技有限公司 | Hand-eye calibration method for line scanning 3D camera |
WO2024045274A1 (en) * | 2022-08-29 | 2024-03-07 | 上海智能制造功能平台有限公司 | Hand-eye calibration method and system for robot vision system, and terminal and medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108582076A (en) * | 2018-05-10 | 2018-09-28 | 武汉库柏特科技有限公司 | A kind of Robotic Hand-Eye Calibration method and device based on standard ball |
CN109746920A (en) * | 2019-03-06 | 2019-05-14 | 南京航空航天大学 | A kind of industrial robot geometric parameter error calibrating method based on two-step method |
CN110116411A (en) * | 2019-06-06 | 2019-08-13 | 浙江汉振智能技术有限公司 | A kind of robot 3D vision hand and eye calibrating method based on ball target |
WO2020024178A1 (en) * | 2018-08-01 | 2020-02-06 | 深圳配天智能技术研究院有限公司 | Hand-eye calibration method and system, and computer storage medium |
CN111002312A (en) * | 2019-12-18 | 2020-04-14 | 江苏集萃微纳自动化系统与装备技术研究所有限公司 | Industrial robot hand-eye calibration method based on calibration ball |
CN111546328A (en) * | 2020-04-02 | 2020-08-18 | 天津大学 | Hand-eye calibration method based on three-dimensional vision measurement |
CN112091971A (en) * | 2020-08-21 | 2020-12-18 | 季华实验室 | Robot eye calibration method and device, electronic equipment and system |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108582076A (en) * | 2018-05-10 | 2018-09-28 | 武汉库柏特科技有限公司 | A kind of Robotic Hand-Eye Calibration method and device based on standard ball |
WO2020024178A1 (en) * | 2018-08-01 | 2020-02-06 | 深圳配天智能技术研究院有限公司 | Hand-eye calibration method and system, and computer storage medium |
CN109746920A (en) * | 2019-03-06 | 2019-05-14 | 南京航空航天大学 | A kind of industrial robot geometric parameter error calibrating method based on two-step method |
CN110116411A (en) * | 2019-06-06 | 2019-08-13 | 浙江汉振智能技术有限公司 | A kind of robot 3D vision hand and eye calibrating method based on ball target |
CN111002312A (en) * | 2019-12-18 | 2020-04-14 | 江苏集萃微纳自动化系统与装备技术研究所有限公司 | Industrial robot hand-eye calibration method based on calibration ball |
CN111546328A (en) * | 2020-04-02 | 2020-08-18 | 天津大学 | Hand-eye calibration method based on three-dimensional vision measurement |
CN112091971A (en) * | 2020-08-21 | 2020-12-18 | 季华实验室 | Robot eye calibration method and device, electronic equipment and system |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113814987A (en) * | 2021-11-24 | 2021-12-21 | 季华实验室 | Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium |
CN113814987B (en) * | 2021-11-24 | 2022-06-03 | 季华实验室 | Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium |
CN114347027A (en) * | 2022-01-08 | 2022-04-15 | 天晟智享(常州)机器人科技有限公司 | Pose calibration method of 3D camera relative to mechanical arm |
CN114147728A (en) * | 2022-02-07 | 2022-03-08 | 杭州灵西机器人智能科技有限公司 | Universal robot eye on-hand calibration method and system |
WO2024045274A1 (en) * | 2022-08-29 | 2024-03-07 | 上海智能制造功能平台有限公司 | Hand-eye calibration method and system for robot vision system, and terminal and medium |
CN116038721A (en) * | 2023-04-03 | 2023-05-02 | 广东工业大学 | Hand-eye calibration method and system without kinematic participation |
CN116038721B (en) * | 2023-04-03 | 2023-07-18 | 广东工业大学 | Hand-eye calibration method and system without kinematic participation |
CN117409080A (en) * | 2023-11-10 | 2024-01-16 | 广州市斯睿特智能科技有限公司 | Hand-eye calibration method for line scanning 3D camera |
Also Published As
Publication number | Publication date |
---|---|
CN113362396B (en) | 2024-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113362396A (en) | Mobile robot 3D hand-eye calibration method and device | |
CN107139178B (en) | Unmanned aerial vehicle and vision-based grabbing method thereof | |
KR101988083B1 (en) | Systems and methods for tracking location of movable target object | |
CN109333534B (en) | Preplanned real-time gait control algorithm | |
Tamadazte et al. | A direct visual servoing scheme for automatic nanopositioning | |
CN109333506B (en) | Humanoid intelligent robot system | |
CN110919658B (en) | Robot calibration method based on vision and multi-coordinate system closed-loop conversion | |
CN110370316B (en) | Robot TCP calibration method based on vertical reflection | |
CN110450163A (en) | General 3D-vision-based hand-eye calibration method without a calibration board | |
WO2018043525A1 (en) | Robot system, robot system control device, and robot system control method | |
CN111426270B (en) | Industrial robot pose measurement target device and joint position sensitive error calibration method | |
CN100417952C (en) | Vision servo system and method for automatic leakage detection platform for sealed radioactive source | |
KR102094004B1 (en) | Method for controlling a table tennis robot and a system therefor | |
CN110928311B (en) | Indoor mobile robot navigation method based on linear features under panoramic camera | |
Hvilshøj et al. | Calibration techniques for industrial mobile manipulators: Theoretical configurations and best practices | |
CN114770461A (en) | Monocular vision-based mobile robot and automatic grabbing method thereof | |
CN113043332B (en) | Arm shape measuring system and method of rope-driven flexible robot | |
CN109737871A (en) | Method for calibrating the relative position of a three-dimensional sensor and a robotic arm | |
CN110370272B (en) | Robot TCP calibration system based on vertical reflection | |
CN114034205B (en) | Box filling system and filling method | |
CN114750160A (en) | Robot control method, robot control device, computer equipment and storage medium | |
CN112123329A (en) | Robot 3D vision hand-eye calibration method | |
Tiyu et al. | Positioning and pressing elevator button by binocular vision and robot manipulator | |
Heyer et al. | Camera Calibration for Reliable Object Manipulation in Care-Providing Robot FRIEND | |
Pang et al. | Object manipulation of a humanoid robot based on visual servoing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||