CN108818535B - Robot 3D vision hand-eye calibration method - Google Patents

Robot 3D vision hand-eye calibration method

Info

Publication number
CN108818535B
CN108818535B CN201810730919.4A
Authority
CN
China
Prior art keywords
coordinate system
robot
sensor
cam
calibration plate
Prior art date
Legal status
Active
Application number
CN201810730919.4A
Other languages
Chinese (zh)
Other versions
CN108818535A (en)
Inventor
付雨
蒋鑫巍
陈贵
Current Assignee
Suzhou Hanzhen Shenmu Intelligent Technology Co., Ltd
Original Assignee
Suzhou Hanzhen Shenmu Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Hanzhen Shenmu Intelligent Technology Co Ltd filed Critical Suzhou Hanzhen Shenmu Intelligent Technology Co Ltd
Priority to CN201810730919.4A priority Critical patent/CN108818535B/en
Publication of CN108818535A publication Critical patent/CN108818535A/en
Application granted granted Critical
Publication of CN108818535B publication Critical patent/CN108818535B/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot 3D vision hand-eye calibration method comprising the following steps: S1, acquiring the posture of the robot flange relative to the robot base coordinate system and the posture of a calibration plate relative to the 3D sensor coordinate system; S2, calculating the rotation matrix of the 3D sensor coordinate system relative to the robot base coordinate system; S3, acquiring multiple groups of coordinates of a workpiece grasping point in the 3D sensor coordinate system and the corresponding groups of coordinates in the robot base coordinate system; and S4, calculating the conversion relation between the XYZ coordinate axes of the 3D sensor coordinate system and those of the robot base coordinate system. Unlike traditional hand-eye calibration, which uses a single transformation matrix to describe the conversion from the 3D sensor to the robot base coordinate system, the method obtains the position and posture conversion relations separately. It is therefore more flexible than the prior art, better meets the precision requirements of practical engineering applications, and improves hand-eye calibration precision.

Description

Robot 3D vision hand-eye calibration method
Technical Field
The invention belongs to the technical field of robot 3D vision calibration, and particularly relates to a robot 3D vision hand-eye calibration method.
Background
The rapid advance of intelligent manufacturing has driven great development of multi-joint robots: industrial robots now participate in every field of industrial manufacturing and production and have become indispensable to factory automation and intelligent processes. Robot vision gives the robot eyes. By integrating advanced image processing and three-dimensional data analysis algorithms and applying artificial intelligence techniques, robot motion is no longer limited to point-to-point moves or taught trajectories, but becomes more flexible and intelligent under visual guidance, and robot vision now excels in areas such as high-precision inspection and workpiece grasping and positioning. Unlike traditional 2D vision, which cannot provide depth or surface information, robot 3D vision better matches the capability of human eyes: a 3D sensor can provide the robot with both position and posture information of products, making industrial applications more flexible, with very broad prospects in fields such as logistics sorting, machine loading and unloading, and automotive part grasping. Admittedly, compared with traditional 2D hand-eye calibration, the hand-eye calibration methods and algorithms for robot 3D vision are more complex.
Existing 3D vision hand-eye calibration falls into two cases: Eye-in-Hand and Eye-to-Hand. In the first case, the 3D sensor is mounted at the end of the robot and moves with the manipulator; this is also called eye-in-hand calibration. In the second case, the 3D sensor is mounted on a fixed support so that its position relative to the robot base is fixed; this is also called eye-to-hand calibration.
The core of eye-to-hand 3D vision hand-eye calibration is to compute the transformation of the 3D sensor coordinate system relative to the robot base coordinate system, so that the position and posture of the workpiece obtained under the 3D sensor can be converted into the position and posture under the robot base coordinate system.
The prior art mostly uses a 4 × 4 homogeneous transformation matrix to describe the transformation from the 3D sensor coordinate system to the robot base coordinate system; the homogeneous transformation matrix comprises a 3 × 3 rotation matrix and a 3 × 1 translation vector. In practical industrial applications, however, because the 3D sensor coordinate system calibration has errors, the robot has systematic errors, and the computation of the transformation matrix itself introduces errors, among other factors, the precision obtained by expressing the whole relationship between the 3D sensor coordinate system and the robot base coordinate system with a single matrix is often insufficient. For example, the 3D robot hand-eye calibration method proposed in the literature [Tsai R Y, Lenz R K. A new technique for fully autonomous and efficient 3D robotics hand/eye calibration [J]. IEEE Transactions on Robotics and Automation, 1989, 5(3): 345-358] solves the hand-eye transformation matrix in two steps: the rotation matrix is solved first, and the translation vector is then solved using the computed rotation matrix, so the error of the rotation matrix is carried into the translation vector and the precision is insufficient. Likewise, for the robot 3D vision hand-eye calibration module in Halcon machine vision software, experimental tests show a root mean square error of about 2 mm for the translation part, which cannot meet the precision requirements of practical industrial applications.
In addition, some existing hand-eye calibration methods place high demands on the physical precision of the manipulator, the calibration plate, and other auxiliary equipment, which increases calibration cost. For example, Chinese patent application No. 201710856392.5 proposes a fully automatic hand-eye calibration method and device that uses a high-precision ceramic camera calibration plate with an asymmetric arrangement of solid and hollow circles, imposing a high precision requirement on the calibration plate. Therefore, to meet the hand-eye calibration precision required in practical industrial applications while reducing calibration cost, a new robot 3D vision hand-eye calibration method is urgently needed.
Disclosure of Invention
In view of the above, the invention provides a robot 3D vision hand-eye calibration method, which can improve the precision of robot 3D hand-eye calibration.
A robot 3D vision hand-eye calibration method comprises the following steps:
(1) installing a calibration plate on the robot flange and moving the end of the robot arm to change the position and posture of the flange, so as to obtain n groups of posture data of the flange in the robot base coordinate system and n groups of posture data of the calibration plate in the 3D sensor coordinate system, wherein n is a natural number greater than 3;
the 3D sensor is used for acquiring three-dimensional image information of the workpiece to be grasped by the robot; the surface background of the calibration plate is black, and three white points p1~p3 are arranged on its surface: point p1 is at the center of the calibration plate, point p2 lies along the X-axis direction from the center, and point p3 lies along the Y-axis direction from the center;
(2) converting the n groups of attitude data of the flange in the robot base coordinate system into corresponding rotation matrices A1~An, and simultaneously converting the n groups of attitude data of the calibration plate in the 3D sensor coordinate system into corresponding rotation matrices B1~Bn;
(3) calculating the rotation transformation matrix baseRcam of the 3D sensor coordinate system relative to the robot base coordinate system according to the rotation matrices obtained in step (2);
(4) using the robot gripper to carry a workpiece to below the 3D sensor and acquiring a three-dimensional image, obtaining multiple groups of coordinate data of the workpiece grasping point in the 3D sensor coordinate system, and reading the corresponding groups of coordinate data of the grasping point in the robot base coordinate system from the robot teach pendant;
(5) fitting linear transformation parameters between the 3D sensor coordinate system and the robot base coordinate system using the coordinate data obtained in step (4).
Further, in step (2), any group of posture data of the flange in the robot base coordinate system is converted into the rotation matrix Ai of the corresponding flange coordinate system relative to the robot base coordinate system through the following relation:
[equation image in the original: the 3×3 rotation matrix Ai expressed in terms of c1, s1, c2, s2, c3, s3]
wherein (RXi, RYi, RZi) is the i-th group of attitude data of the flange in the robot base coordinate system; RXi, RYi, RZi are respectively the Euler angles of the flange relative to the X, Y, Z axes of the robot base coordinate system; c1 = cos(RXi), s1 = sin(RXi), c2 = cos(RYi), s2 = sin(RYi), c3 = cos(RZi), s3 = sin(RZi); and i is a natural number with 1 ≤ i ≤ n.
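As a concrete illustration of this conversion, the following is a minimal NumPy sketch (not taken from the patent) that builds Ai from one group of Euler angles, assuming the XYZ sequence is composed as fixed-axis rotations R = Rz·Ry·Rx; the function name euler_xyz_to_matrix is illustrative, and the composition order should be checked against the robot controller's convention:

    import numpy as np

    def euler_xyz_to_matrix(rx, ry, rz):
        # Assumed convention: fixed-axis XYZ angles composed as Rz @ Ry @ Rx.
        # Verify against the robot controller's documentation before use.
        c1, s1 = np.cos(rx), np.sin(rx)   # c1 = cos(RXi), s1 = sin(RXi)
        c2, s2 = np.cos(ry), np.sin(ry)   # c2 = cos(RYi), s2 = sin(RYi)
        c3, s3 = np.cos(rz), np.sin(rz)   # c3 = cos(RZi), s3 = sin(RZi)
        Rx = np.array([[1, 0, 0], [0, c1, -s1], [0, s1, c1]])
        Ry = np.array([[c2, 0, s2], [0, 1, 0], [-s2, 0, c2]])
        Rz = np.array([[c3, -s3, 0], [s3, c3, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx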
Further, in step (2), any group of posture data of the calibration plate in the 3D sensor coordinate system is converted into the rotation matrix Bi of the corresponding calibration plate coordinate system relative to the 3D sensor coordinate system through the following relation:
Bi = | x1  x2  x3 |
     | y1  y2  y3 |
     | z1  z2  z3 |
Wherein: is established with p1Calibration plate coordinate system with point as origin X ' -Y ' -Z ', [ X1, Y1, Z1]TFor the unit vectors in the X' -axis direction of the calibration plate under the 3D sensor coordinate system at the corresponding time, [ X2, y2, z2]TFor the unit vectors in the direction of the Y' axis of the calibration plate under the 3D sensor coordinate system at the corresponding time, [ x3, Y3, z3]TThe unit vector in the Z' axis direction of the board is determined under the 3D sensor coordinate system at the corresponding moment.
Further, in step (3), the rotation transformation matrix baseRcam of the 3D sensor coordinate system relative to the robot base coordinate system is calculated from multiple groups of the following relations:
Ai^-1 × baseRcam × Bi = Aj^-1 × baseRcam × Bj
wherein the rotation transformation matrix baseRcam is a 3 × 3 matrix; i and j are natural numbers with 1 ≤ i ≤ n, 1 ≤ j ≤ n, and i ≠ j.
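Each pair (i, j) can be rewritten as P × baseRcam = baseRcam × Q with P = Aj × Ai^-1 and Q = Bj × Bi^-1 (the form used in step S2 below), which is linear in the nine entries of baseRcam. One possible least-squares solver, sketched here with NumPy and not necessarily the solver used in the patent, stacks these constraints, takes the SVD null space, and projects the result onto a rotation matrix:

    import numpy as np

    def solve_base_r_cam(A_list, B_list):
        # Row-major vec identities: vec(P X) = (P kron I) vec(X) and
        # vec(X Q) = (I kron Q^T) vec(X), so each pair gives M vec(X) = 0.
        I3 = np.eye(3)
        rows = []
        for i in range(len(A_list)):
            for j in range(i + 1, len(A_list)):
                P = A_list[j] @ A_list[i].T   # inverse of a rotation = transpose
                Q = B_list[j] @ B_list[i].T
                rows.append(np.kron(P, I3) - np.kron(I3, Q.T))
        M = np.vstack(rows)
        _, _, Vt = np.linalg.svd(M)
        X = Vt[-1].reshape(3, 3)              # null-space direction
        if np.linalg.det(X) < 0:              # fix the arbitrary sign
            X = -X
        U, _, Vt2 = np.linalg.svd(X)          # project onto SO(3)
        return U @ Vt2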
Further, in step (5), the linear transformation parameters between the 3D sensor coordinate system and the robot base coordinate system are calculated by fitting multiple groups of the following relations:
Xrob = a1×Xcam + b1×Ycam + c1×Zcam + d1
Yrob = a2×Xcam + b2×Ycam + c2×Zcam + d2
Zrob = a3×Xcam + b3×Ycam + c3×Zcam + d3
wherein (Xcam, Ycam, Zcam) are the coordinates of the workpiece grasping point in the 3D sensor coordinate system at a certain moment during grasping, (Xrob, Yrob, Zrob) are the coordinates of the grasping point in the robot base coordinate system at that moment, and a1~a3, b1~b3, c1~c3, d1~d3 are the linear transformation parameters between the 3D sensor coordinate system and the robot base coordinate system.
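Each of these three equations is an ordinary least-squares problem in four unknowns, so all twelve parameters can be fitted at once. A minimal sketch, assuming cam_pts and rob_pts are matched N×3 arrays (N ≥ 4) of grasp-point coordinates in the sensor and base frames:

    import numpy as np

    def fit_linear_transform(cam_pts, rob_pts):
        # Design matrix rows: [Xcam, Ycam, Zcam, 1].
        G = np.hstack((cam_pts, np.ones((cam_pts.shape[0], 1))))
        params, *_ = np.linalg.lstsq(G, rob_pts, rcond=None)
        # Row k of the result holds (a_k, b_k, c_k, d_k) for base axis k.
        return params.T

A sensor point is then mapped into the base frame with params @ np.append(point, 1.0).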
Compared with the traditional hand-eye calibration method, which uses only one transformation matrix to describe the conversion from the 3D sensor to the robot base coordinate system, the robot 3D vision hand-eye calibration method of the invention uses two separate parameter models to express the position and angle conversion relations from the 3D sensor to the robot base coordinate system. It is more flexible than the prior art, can meet the precision requirements of practical engineering applications, and improves hand-eye calibration precision. In addition, the invention needs no auxiliary hardware beyond the calibration plate: semi-automatic hand-eye calibration is realized by teach-and-replay movement of the calibration plate by the robot and acquisition of workpiece postures.
Drawings
FIG. 1 is a schematic diagram of a 3D hand-eye calibration system according to the present invention.
Fig. 2 is a schematic view of a calibration plate.
Fig. 3 is a schematic diagram of the pose of the calibration plate in the 3D sensor coordinate system.
Detailed Description
In order to more specifically describe the present invention, the following detailed description is provided for the technical solution of the present invention with reference to the accompanying drawings and the specific embodiments.
As shown in fig. 1, in the present embodiment, a 3D structured light sensor and a six-degree-of-freedom industrial robot are used to illustrate a specific implementation of a robot 3D vision hand-eye calibration method and a method for converting a position and a posture of a workpiece under the 3D sensor into a position and a posture under a robot base coordinate system based on the hand-eye calibration method.
The robot 3D vision hand-eye calibration method of the embodiment comprises the following steps:
step S1: the calibration plate shown in fig. 2 is installed on a robot flange, the tail end of a robot actuator is moved, the position and the posture of the flange are changed, and the postures of the plurality of groups of flanges relative to the base coordinates of the robot and the posture data of the calibration plate relative to the 3D sensor are obtained.
Given the known flange tool orientation RX, RY, RZ and the robot's Euler angle sequence XYZ, it is converted into the rotation matrix of the flange relative to the robot base coordinate system by the following formula:
[equation image in the original: the 3×3 rotation matrix expressed in terms of c1, s1, c2, s2, c3, s3]
wherein c1 = cos(RX), s1 = sin(RX), c2 = cos(RY), s2 = sin(RY), c3 = cos(RZ), s3 = sin(RZ).
The attitude calculation steps of the calibration plate under the 3D sensor are as follows:
Let the fixed coordinate system of the 3D sensor be X-Y-Z, and establish the calibration plate coordinate system X′-Y′-Z′ with point p1 as the origin. In the 3D sensor coordinate system at this moment, the unit vector of the calibration plate's X′ axis is [x1, y1, z1]T, the unit vector of its Y′ axis is [x2, y2, z2]T, and the unit vector of its Z′ axis is [x3, y3, z3]T. Then the posture of the calibration plate under the 3D sensor is:
B = | x1  x2  x3 |
    | y1  y2  y3 |
    | z1  z2  z3 |
the postures of the group of flanges relative to the robot base coordinate system in the embodiment are as follows:
[matrix image in the original: numeric 3×3 rotation matrix]
the corresponding calibration plate has the following postures under the 3D sensor:
[matrix image in the original: numeric 3×3 rotation matrix]
step S2: calculating a rotation matrix baseRcam of the 3D sensor relative to a robot base coordinate system, wherein the specific calculation process is as follows:
the relative position of the calibration plate relative to the flange plate is fixed, and the calculation formula of the rotation matrix of the calibration plate relative to the flange plate is as follows:
toolRcal = Ak^-1 × baseRcam × Bk
wherein: a. thekIs a rotation matrix of the k group of flange plates relative to a robot base coordinate system, BkA rotation matrix for the kth set of calibration plates relative to the 3D sensor coordinate system; since the equation is fixed on the left, taking different values of k on the right of the equation can be listed as follows:
Aj × Ai^-1 × baseRcam = baseRcam × Bj × Bi^-1
wherein 1 ≤ i ≤ n, 1 ≤ j ≤ n, and i ≠ j; baseRcam can be obtained from multiple groups of such equations.
With the rotation matrix baseRcam of the 3D sensor relative to the robot base coordinate system known, the formula for converting the attitude camRobj of the workpiece under the 3D sensor into the attitude baseRobj under the robot base coordinate system is:
baseRobj = baseRcam × camRobj
the calculated rotation matrix of the 3D sensor relative to the robot base coordinate system for this example is:
[matrix image in the original: numeric 3×3 rotation matrix]
In this example, the attitude of the workpiece under the 3D sensor at a certain moment is:
[matrix image in the original: numeric 3×3 rotation matrix]
the attitude of the workpiece under the robot base coordinate system at the moment is as follows:
[matrix image in the original: numeric 3×3 rotation matrix]
step S3: and grabbing a workpiece to the lower part of the 3D sensor by using the robot paw to acquire a three-dimensional image, acquiring multiple groups of coordinate data of workpiece grabbing points in a 3D sensor coordinate system, and reading the multiple groups of coordinate data of the corresponding workpiece grabbing points in a robot base coordinate system in a robot demonstrator.
Suppose the XYZ coordinates of a workpiece grasping point under the 3D sensor are Xcam, Ycam, Zcam, and its XYZ coordinates under the robot base coordinate system are Xrob, Yrob, Zrob. Linear data fitting is performed on the collected coordinate data; the linear transformation obtained in this example is:
Xrob = 0.081236×Xcam + 1.0075×Ycam + 0.010249×Zcam + 814.47
Yrob = 0.99192×Xcam - 0.075336×Ycam + 0.007673×Zcam + 89.179
Zrob = -0.000871816×Xcam + 0.012401×Ycam - 0.97371×Zcam + 1171.1
the root mean square error of the translational transformation portion obtained in this example was 0.25 mm. The XYZ coordinates of a group of workpiece grabbing points obtained in the example under the 3D sensor are [9.09970856, -75.4557953,1095.07800], the XYZ coordinates are converted into XYZ coordinates [750.411,112.292,103.868] under a robot base coordinate system, and the actual grabbing requirements are met.
The embodiments described above are presented to enable a person of ordinary skill in the art to make and use the invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without inventive effort. Therefore, the present invention is not limited to the above embodiments; improvements and modifications made by those skilled in the art based on this disclosure shall fall within the protection scope of the present invention.

Claims (1)

1. A robot 3D vision hand-eye calibration method comprises the following steps:
(1) installing a calibration plate on the robot flange and moving the end of the robot arm to change the position and posture of the flange, so as to obtain n groups of posture data of the flange in the robot base coordinate system and n groups of posture data of the calibration plate in the 3D sensor coordinate system, wherein n is a natural number greater than 3;
the 3D sensor is used for acquiring three-dimensional image information of the workpiece to be grasped by the robot; the surface background of the calibration plate is black, and three white points p1~p3 are arranged on its surface: point p1 is at the center of the calibration plate, point p2 lies along the X-axis direction from the center, and point p3 lies along the Y-axis direction from the center;
(2) converting the n groups of attitude data of the flange in the robot base coordinate system into corresponding rotation matrices A1~An; any group of posture data of the flange in the robot base coordinate system is converted into the rotation matrix Ai of the corresponding flange coordinate system relative to the robot base coordinate system through the following relation:
[equation image in the original: the 3×3 rotation matrix Ai expressed in terms of c1, s1, c2, s2, c3, s3]
wherein (RXi, RYi, RZi) is the i-th group of attitude data of the flange in the robot base coordinate system; RXi, RYi, RZi are respectively the Euler angles of the flange relative to the X, Y, Z axes of the robot base coordinate system; c1 = cos(RXi), s1 = sin(RXi), c2 = cos(RYi), s2 = sin(RYi), c3 = cos(RZi), s3 = sin(RZi); and i is a natural number with 1 ≤ i ≤ n;
simultaneously converting the n groups of attitude data of the calibration plate in the 3D sensor coordinate system into corresponding rotation matrices B1~Bn; any group of posture data of the calibration plate in the 3D sensor coordinate system is converted into the rotation matrix Bi of the corresponding calibration plate coordinate system relative to the 3D sensor coordinate system through the following relation:
Bi = | x1  x2  x3 |
     | y1  y2  y3 |
     | z1  z2  z3 |
Wherein: is established with p1Calibration plate coordinate system with point as origin X ' -Y ' -Z ', [ X1, Y1, Z1]TFor the unit vectors in the X' -axis direction of the calibration plate under the 3D sensor coordinate system at the corresponding time, [ X2, y2, z2]TFor the unit vectors in the direction of the Y' axis of the calibration plate under the 3D sensor coordinate system at the corresponding time, [ x3, Y3, z3]TA unit vector in the Z' axis direction of a calibration plate under a 3D sensor coordinate system at the corresponding moment;
(3) calculating the rotation transformation matrix baseRcam of the 3D sensor coordinate system relative to the robot base coordinate system from multiple groups of the following relations, using the rotation matrices obtained in step (2):
Ai^-1 × baseRcam × Bi = Aj^-1 × baseRcam × Bj
wherein the rotation transformation matrix baseRcam is a 3 × 3 matrix; i and j are natural numbers with 1 ≤ i ≤ n, 1 ≤ j ≤ n, and i ≠ j;
(4) using the robot gripper to carry a workpiece to below the 3D sensor and acquiring a three-dimensional image, obtaining multiple groups of coordinate data of the workpiece grasping point in the 3D sensor coordinate system, and reading the corresponding groups of coordinate data of the grasping point in the robot base coordinate system from the robot teach pendant;
(5) calculating the linear transformation parameters between the 3D sensor coordinate system and the robot base coordinate system by fitting multiple groups of the following relations, using the coordinate data obtained in step (4):
Xrob = a1×Xcam + b1×Ycam + c1×Zcam + d1
Yrob = a2×Xcam + b2×Ycam + c2×Zcam + d2
Zrob = a3×Xcam + b3×Ycam + c3×Zcam + d3
wherein (Xcam, Ycam, Zcam) are the coordinates of the workpiece grasping point in the 3D sensor coordinate system at a certain moment during grasping, (Xrob, Yrob, Zrob) are the coordinates of the grasping point in the robot base coordinate system at that moment, and a1~a3, b1~b3, c1~c3, d1~d3 are the linear transformation parameters between the 3D sensor coordinate system and the robot base coordinate system.
CN201810730919.4A 2018-07-05 2018-07-05 Robot 3D vision hand-eye calibration method Active CN108818535B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810730919.4A CN108818535B (en) 2018-07-05 2018-07-05 Robot 3D vision hand-eye calibration method

Publications (2)

Publication Number Publication Date
CN108818535A CN108818535A (en) 2018-11-16
CN108818535B true CN108818535B (en) 2020-04-10

Family

ID=64135572

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810730919.4A Active CN108818535B (en) 2018-07-05 2018-07-05 Robot 3D vision hand-eye calibration method

Country Status (1)

Country Link
CN (1) CN108818535B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109737871B (en) * 2018-12-29 2020-11-17 南方科技大学 Calibration method for relative position of three-dimensional sensor and mechanical arm
CN110281231B (en) * 2019-03-01 2020-09-29 浙江大学 Three-dimensional vision grabbing method for mobile robot for unmanned FDM additive manufacturing
CN110728657A (en) * 2019-09-10 2020-01-24 江苏理工学院 Annular bearing outer surface defect detection method based on deep learning
CN110640747B (en) * 2019-11-07 2023-03-24 上海电气集团股份有限公司 Hand-eye calibration method and system for robot, electronic equipment and storage medium
CN110980276B (en) * 2019-12-30 2021-08-17 南京埃克里得视觉技术有限公司 Method for implementing automatic casting blanking by three-dimensional vision in cooperation with robot
CN112123329A (en) * 2020-02-21 2020-12-25 深圳市三宝创新智能有限公司 Robot 3D vision hand-eye calibration method
CN111452048B (en) * 2020-04-09 2023-06-02 亚新科国际铸造(山西)有限公司 Calibration method and device for relative spatial position relation of multiple robots
CN111872922B (en) * 2020-07-29 2021-09-03 贵州电网有限责任公司 Three-degree-of-freedom parallel robot hand-eye calibration method based on 3D vision sensor
CN111958604A (en) * 2020-08-20 2020-11-20 扬州蓝邦数控制刷设备有限公司 Efficient special-shaped brush monocular vision teaching grabbing method based on CAD model
CN112589787B (en) * 2020-12-02 2022-09-16 上海纽钛测控技术有限公司 Visual positioning and hand-eye calibration method for loading and unloading samples of mechanical arm of feeding turntable
CN113103238A (en) * 2021-04-26 2021-07-13 福建(泉州)哈工大工程技术研究院 Hand-eye calibration method based on data optimization
CN113211444B (en) * 2021-05-20 2022-04-29 菲烁易维(重庆)科技有限公司 System and method for robot calibration
CN113997059A (en) * 2021-11-02 2022-02-01 珠海格力智能装备有限公司 Compressor workpiece assembling method, device and system and storage medium
CN114474058B (en) * 2022-02-11 2023-12-05 中国科学院自动化研究所 Visual guidance industrial robot system calibration method
CN115139283B (en) * 2022-07-18 2023-10-24 中船重工鹏力(南京)智能装备系统有限公司 Robot hand-eye calibration method based on random mark dot matrix
CN115609586B (en) * 2022-10-21 2024-06-04 华中科技大学 Robot high-precision assembly method based on grabbing pose constraint

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60252913A (en) * 1984-05-30 1985-12-13 Mitsubishi Electric Corp Robot controller
CN102294695A (en) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 Robot calibration method and calibration system
CN106767393B (en) * 2015-11-20 2020-01-03 沈阳新松机器人自动化股份有限公司 Hand-eye calibration device and method for robot
CN107443377B (en) * 2017-08-10 2020-07-17 埃夫特智能装备股份有限公司 Sensor-robot coordinate system conversion method and robot eye calibration method
CN107498558A (en) * 2017-09-19 2017-12-22 北京阿丘科技有限公司 Full-automatic hand and eye calibrating method and device

Also Published As

Publication number Publication date
CN108818535A (en) 2018-11-16

Similar Documents

Publication Publication Date Title
CN108818535B (en) Robot 3D vision hand-eye calibration method
CN108399639B (en) Rapid automatic grabbing and placing method based on deep learning
CN107351084B (en) Space manipulator system error correction method for maintenance task
CN113379849B (en) Robot autonomous recognition intelligent grabbing method and system based on depth camera
CN111347411B (en) Two-arm cooperative robot three-dimensional visual recognition grabbing method based on deep learning
CN113146620B (en) Binocular vision-based double-arm cooperative robot system and control method
CN113715016B (en) Robot grabbing method, system, device and medium based on 3D vision
CN109048897B (en) Method for teleoperation of master-slave robot
CN107253191B (en) Double-mechanical-arm system and coordination control method thereof
Su et al. A new insertion strategy for a peg in an unfixed hole of the piston rod assembly
CN111360821A (en) Picking control method, device and equipment and computer scale storage medium
CN113160334A (en) Double-robot system calibration method based on hand-eye camera
Al-Shanoon et al. Robotic manipulation based on 3-D visual servoing and deep neural networks
CN115139283B (en) Robot hand-eye calibration method based on random mark dot matrix
CN112454366A (en) Hand-eye calibration method
CN111216136A (en) Multi-degree-of-freedom mechanical arm control system, method, storage medium and computer
Hvilshøj et al. Calibration techniques for industrial mobile manipulators: Theoretical configurations and best practices
Kansal et al. Vision-based kinematic analysis of the Delta robot for object catching
Navarro-Alarcon et al. A dynamic and uncalibrated method to visually servo-control elastic deformations by fully-constrained robotic grippers
Ranjan et al. Identification and control of NAO humanoid robot to grasp an object using monocular vision
Kawasaki et al. Virtual robot teaching for humanoid hand robot using muti-fingered haptic interface
CN112123329A (en) Robot 3D vision hand-eye calibration method
Wu et al. Parallel PnP Robots
Shauri et al. Sensor integration and fusion for autonomous screwing task by dual-manipulator hand robot
Wu et al. Parallel PnP Robots: Parametric Modeling, Performance Evaluation and Design Optimization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200312

Address after: Room 1601, No.77, South Tiancheng Road, high speed rail new town, Xiangcheng District, Suzhou, Jiangsu Province

Applicant after: Suzhou Hanzhen Shenmu Intelligent Technology Co., Ltd

Address before: Hangzhou City, Zhejiang province Xihu District 310030 three Dun Shang Kun ecological Creative Park C323

Applicant before: HANCHINE. TECHNOLOGY CO., LTD.

GR01 Patent grant