CN109794963B - Robot rapid positioning method facing curved surface component - Google Patents
Robot rapid positioning method facing curved surface component
- Publication number
- CN109794963B (application CN201910044449.0A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- robot
- curved surface
- coordinates
- mapping relation
- Prior art date
- Legal status: Active (assumed; not a legal conclusion)
Classifications
- Length Measuring Devices By Optical Means (AREA)
Abstract
A robot rapid positioning method for curved surface components comprises the following steps: 1) establishing a conversion model among the robot flange, the end effector and the vision measurement system coordinate systems by measuring datum points with a laser tracker, obtaining the calibration parameters; 2) pasting triangular auxiliary reflection targets at the corner points of the curved surface member, acquiring images of the member with the vision measurement system, extracting the two-dimensional pixel coordinates of the auxiliary reflection targets and computing their three-dimensional coordinates in the vision measurement system coordinate system; 3) computing the current pose parameters of the robot from the vision measurement coordinates of the member and the calibration parameters, comparing them with the target pose, and computing the robot motion amount. The invention improves the reliability of the hand-eye calibration parameters, recovers the positioning information of the curved surface member quickly and accurately, and can be widely applied to the rapid positioning of curved surface parts in aviation, automotive and other fields.
Description
Technical Field
The invention relates to robot technology, in particular to robot positioning, and specifically to a rapid robot positioning method for curved surface components.
Background
Industrial robots are widely used in the automobile and electronics industries, but their poses are mostly controlled by off-line programming; they cannot locate targets through active identification and positioning, which severely limits their range of application. In military fields such as aerospace, special working environments and complex object characteristics further narrow the scope of robot vision applications. High-end, intelligent systems have become a research hotspot in industrial robotics, and machine-vision-assisted positioning of industrial robots has very broad application prospects in the aerospace field.
In China, some research results exist in robot vision positioning, mainly along two lines. The first is robot monocular vision positioning, in which a camera mounted at the robot end identifies the target and computes its position; its accuracy is low and it suits only regularly shaped targets such as cylinders, rings and cuboids. The second is robot structured-light scanning positioning, which reconstructs the target's shape and computes its pose with a structured-light scanning system; its accuracy is clearly higher than that of the first method, but given the complexity and particularity of aerospace component structures it is easily affected by ambient illumination and the surface properties of the member, and, more importantly, processing the point cloud produced by the scan is time-consuming, so the method lacks universality.
Disclosure of Invention
Addressing the low positioning accuracy of monocular vision and the strong environmental sensitivity of structured-light scanning in existing robot positioning, the invention aims to provide a binocular vision positioning method for robots facing curved surface components that achieves high positioning accuracy and is insensitive to the environment.
The technical scheme of the invention is as follows:
A robot rapid positioning method for a curved surface component: first, parameter calibration is performed on the robot binocular vision system, including the camera intrinsic and extrinsic parameters and the hand-eye calibration parameters; then target images are acquired with the two cameras, two-dimensional positioning feature points are extracted from the images, and distortion correction is applied; the two-dimensional coordinates of the positioning feature points are converted into three-dimensional measurement coordinates in the camera coordinate system using the camera calibration results; finally, the robot motion parameters are solved from the three-dimensional measurement coordinates, the theoretical coordinates and the hand-eye calibration parameters of the positioning feature points. The method comprises the following specific steps:
Step 1: arrange measurement points on the robot flange, the end effector and the calibration tool as coordinate-system conversion reference points, keeping the reference points of all three simultaneously within the measuring range of the laser tracker.
Step 2: acquire the relevant coordinates with the laser tracker and the cameras respectively, specifically:
(1) measure the reference points of the robot flange, the end effector and the calibration tool with the laser tracker, obtaining the measured coordinates P(Flange|Tracker), P(End-Effector|Tracker) and P(Calibration-Platform|Tracker);
(2) measure the reference points of the calibration tool with the vision measurement system, obtaining the measured coordinates P(Calibration-Platform|Camera).
Step 3: calculate the mapping relations involving the laser tracker measurement coordinate system from the coordinates measured in step 2, specifically:
(1) from the laser tracker and vision measurement system coordinates of the calibration tool reference points, P(Calibration-Platform|Tracker) and P(Calibration-Platform|Camera), calculate the mapping relation T(Camera→Tracker) between the vision measurement system and the laser tracker measurement coordinate system;
(2) from the theoretical coordinates of the robot flange reference points and their laser tracker measurements P(Flange|Tracker), calculate the mapping relation T(Flange→Tracker) between the robot flange coordinate system and the laser tracker measurement coordinate system;
(3) from the theoretical coordinates of the end effector reference points and their laser tracker measurements P(End-Effector|Tracker), calculate the mapping relation T(Tracker→Effector) between the laser tracker measurement coordinate system and the end effector coordinate system.
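Each mapping in step 3 is computed from the same reference points measured in two coordinate systems. The patent does not name the solver; a common choice is the SVD-based least-squares rigid fit (Kabsch algorithm), sketched here in NumPy with made-up point values:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid mapping dst ≈ R @ src + t (Kabsch algorithm).

    src, dst: (N, 3) arrays of the same reference points measured in two
    coordinate systems (N >= 3, not all collinear).
    Returns a 4x4 homogeneous transform."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Toy check: points rotated 90 degrees about Z and shifted recover the transform.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], float)
dst = src @ Rz.T + np.array([10.0, 20.0, 30.0])
T = rigid_transform(src, dst)
```

The same routine would serve for all three mappings, fed with the corresponding pairs of point sets (tracker/camera, theoretical/tracker).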
Step 4: further calculate the mapping relations involving the robot pose parameters from the mappings obtained in step 3, specifically:
(1) from the mapping T(Camera→Tracker) between the vision measurement system and the laser tracker measurement coordinate system and the mapping T(Flange→Tracker) between the robot flange coordinate system and the laser tracker measurement coordinate system, calculate the mapping relation T(Flange→Camera) between the robot flange and the vision measurement system coordinate system;
(2) from T(Flange→Tracker) and the mapping T(Tracker→Effector) between the laser tracker measurement coordinate system and the end effector coordinate system, calculate the mapping relation T(Flange→Effector) between the robot flange and the end effector coordinate system.
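Step 4 is pure composition and inversion of the rigid transforms from step 3, eliminating the laser tracker frame. A minimal NumPy sketch, with placeholder identity rotations and made-up translations standing in for real calibration results:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def inv_T(T):
    """Closed-form inverse of a rigid transform: [R t]^-1 = [R^T  -R^T t]."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

# Hypothetical calibration results (values are placeholders):
T_cam_to_trk = make_T(np.eye(3), [100.0, 0.0, 0.0])   # camera -> tracker
T_flg_to_trk = make_T(np.eye(3), [100.0, 50.0, 0.0])  # flange -> tracker
T_trk_to_eff = make_T(np.eye(3), [0.0, -50.0, 20.0])  # tracker -> effector

# Step 4 (1): flange -> camera, with the tracker frame cancelled out.
T_flg_to_cam = inv_T(T_cam_to_trk) @ T_flg_to_trk
# Step 4 (2): flange -> end effector.
T_flg_to_eff = T_trk_to_eff @ T_flg_to_trk
```

With identity rotations the composed translations are simply the vector sums, which makes the chaining easy to verify by hand.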
Step 5: calculate the three-dimensional coordinates of the curved surface member in the vision system coordinate system, specifically:
(1) paste triangular auxiliary reflection targets at the corner points of the curved surface member, teach the robot to a position above the member, ensure the member lies within the field of view of the vision measurement system, and acquire images of the member;
(2) extract the pixel coordinates of the triangular auxiliary reflection target corner points on the member surface with the Harris algorithm;
(3) delete redundant matching points using the epipolar constraint and calculate the three-dimensional coordinates P(Member|Camera) of the remaining feature points in the vision measurement system coordinate system.
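Once the epipolar constraint has filtered the matches, the three-dimensional coordinates of each corner follow from two-view reconstruction. The patent does not name the reconstruction method; standard linear (DLT) triangulation is assumed in this sketch, with a made-up stereo rig:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views.

    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coordinates.
    Returns the 3D point in the measurement (world) frame."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Toy stereo rig: identical intrinsics, second camera shifted 100 mm along x.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-100.0], [0.0], [0.0]])])

X_true = np.array([50.0, 20.0, 1000.0])
x1 = P1 @ np.append(X_true, 1); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1); x2 = x2[:2] / x2[2]
X_rec = triangulate(P1, P2, x1, x2)
```

With noiseless projections the reconstruction is exact; with real pixel detections the SVD gives the least-squares solution.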
Step 6: calculate the robot motion parameters from the mappings of step 4 and the vision measurement coordinates of the curved surface member from step 5, specifically:
(1) from the reconstructed three-dimensional coordinates P(Member|Camera) and the mapping T(Flange→Camera) between the robot flange and the vision measurement system coordinate system, calculate the coordinates P(Member|Flange) of the curved surface member in the robot flange coordinate system;
(2) from P(Member|Flange) and the flange-to-effector mapping T(Flange→Effector), calculate the coordinates P(Member|Effector) of the member in the end effector coordinate system;
(3) match P(Member|Effector) against the theoretical model to obtain the mapping relation M_Mea between the vision measurement system and the theoretical model coordinate system of the member, and rewrite it in the form M_Mea = (X Y Z A B C)_Mea, where X, Y, Z denote the three translational motion amounts of the robot and A, B, C the three rotational motion amounts;
(4) from the measured pose parameters M_Mea = (X Y Z A B C)_Mea and the theoretical pose parameters M_Theo = (X Y Z A B C)_Theo of the member, calculate the robot motion amount and drive the robot to the specified position.
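Sub-steps (3) and (4) reduce a 4x4 pose matrix to the six parameters (X Y Z A B C) and subtract the theoretical pose. The Euler convention below (Z-Y-X, angles in degrees, a common industrial-robot choice) is an assumption, since the patent does not fix one:

```python
import math
import numpy as np

def pose_to_xyzabc(T):
    """Decompose a 4x4 pose into (X, Y, Z, A, B, C): translation plus
    Z-Y-X Euler angles in degrees (assumed convention, R = Rz(A) Ry(B) Rx(C))."""
    X, Y, Z = T[:3, 3]
    R = T[:3, :3]
    B = math.degrees(math.asin(-R[2, 0]))
    A = math.degrees(math.atan2(R[1, 0], R[0, 0]))
    C = math.degrees(math.atan2(R[2, 1], R[2, 2]))
    return np.array([X, Y, Z, A, B, C])

# Measured pose: 10 degrees about z plus an offset; theoretical pose: identity.
a = math.radians(10.0)
T_mea = np.eye(4)
T_mea[:3, :3] = [[math.cos(a), -math.sin(a), 0],
                 [math.sin(a),  math.cos(a), 0],
                 [0, 0, 1]]
T_mea[:3, 3] = [5.0, -3.0, 12.0]

M_mea = pose_to_xyzabc(T_mea)
M_theo = np.zeros(6)
motion = M_theo - M_mea   # per-axis drive amount for the robot
```

The singular case B = ±90 degrees (gimbal lock) is ignored here; a production implementation would handle it separately.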
The invention has the beneficial effects that:
the invention can be applied to robot visual positioning. Compared with the prior art, the robot visual positioning system is simple and easy to realize, can be effectively applied to parts with various shapes, identifies and positions the target in a short time, and improves the working efficiency and reliability of the robot.
1) The laser tracker is adopted to accurately calibrate a conversion matrix between the robot and the vision measurement system, so that the reliability of the hand-eye calibration parameters is improved.
2) A high-precision binocular vision measurement system together with auxiliary reflection targets reconstructs the positioning information of the curved surface member quickly and accurately.
3) The method applies to a wide range of objects and can be used for the rapid positioning of curved surface parts in aviation, automotive and other fields.
Drawings
FIG. 1 is a schematic diagram of a high precision calibration platform.
Fig. 2 is a schematic diagram of a robot vision system calibration.
FIG. 3 is a schematic diagram of feature point extraction for curved surface member positioning.
Fig. 4 is a schematic diagram of a coordinate system conversion process of the robot visual positioning system.
Fig. 5 is a schematic diagram of robot pose adjustment.
Fig. 6 is a flow chart of the robot vision positioning.
Fig. 7 is a schematic diagram of the structure of the measuring system of the present invention.
Detailed Description
The present invention is described in further detail below with reference to the attached drawing figures.
As shown in Figs. 1-7, the robot vision positioning system of the invention comprises a robot, an end effector 3 and a vision measurement system 2; the end effector 3 is connected to the robot through the robot flange 1, and the vision measurement system 2 is mounted on the end effector 3. Before positioning, the robot and the curved surface member are placed arbitrarily, provided the member remains within the reach of the robot. Positioning is considered complete when the end effector tool is at, or a set distance above, the center of mass of the curved surface member, as shown in Fig. 7.
The robot positioning process of the present invention is shown in fig. 6, and includes the following steps.
Step 1: arrange measurement points on the robot flange, the end effector and the calibration tool as coordinate-system conversion reference points, keeping the reference points of all three simultaneously within the measuring range of the laser tracker.
Step 2: acquire the relevant coordinates with the laser tracker and the cameras respectively, specifically:
(1) measure the reference points of the robot flange, the end effector and the calibration tool with the laser tracker, obtaining the measured coordinates P(Flange|Tracker), P(End-Effector|Tracker) and P(Calibration-Platform|Tracker);
(2) measure the reference points of the calibration tool with the vision measurement system, obtaining the measured coordinates P(Calibration-Platform|Camera).
In step 2, dedicated laser tracker target balls are used for the laser tracker measurements, and optical reflection target balls for the vision measurement system.
Step 3: calculate the mapping relations involving the laser tracker measurement coordinate system from the coordinates measured in step 2, specifically:
(1) from the laser tracker and vision measurement system coordinates of the calibration tool reference points, P(Calibration-Platform|Tracker) and P(Calibration-Platform|Camera), determine the mapping relation T(Camera→Tracker) between the vision measurement system and the laser tracker measurement coordinate system;
(2) from the theoretical coordinates of the robot flange reference points and their laser tracker measurements P(Flange|Tracker), determine the mapping relation T(Flange→Tracker) between the robot flange coordinate system and the laser tracker measurement coordinate system;
(3) from the theoretical coordinates of the end effector reference points and their laser tracker measurements P(End-Effector|Tracker), determine the mapping relation T(Tracker→Effector) between the laser tracker measurement coordinate system and the end effector coordinate system.
Step 4: further calculate the mapping relations involving the robot pose parameters from the mappings obtained in step 3, specifically:
(1) from T(Camera→Tracker) and T(Flange→Tracker), calculate the mapping relation T(Flange→Camera) between the robot flange and the vision measurement system coordinate system;
(2) from T(Flange→Tracker) and T(Tracker→Effector), calculate the mapping relation T(Flange→Effector) between the robot flange and the end effector coordinate system.
Steps 1-4 constitute the calibration process of the robot vision positioning system; once calibration is complete, it need not be repeated unless the positioning system is altered or an external disturbance degrades the positioning accuracy.
Step 5: calculate the three-dimensional coordinates of the curved surface member in the vision system coordinate system, specifically:
(1) paste triangular auxiliary reflection targets at the corner points of the curved surface member, teach the robot to a position above the member, ensure the member lies within the field of view of the vision measurement system, and acquire images of the member;
(2) extract at least four positioning feature points of the part with the SIFT algorithm; as shown in Fig. 3, the extracted feature points must not all lie on one straight line;
(3) delete redundant matching points using the epipolar constraint and calculate the three-dimensional coordinates P(Member|Camera) of the remaining feature points in the vision measurement system coordinate system.
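The requirement that the four or more feature points not all be collinear can be checked with a simple 2D cross-product test. A pure-Python sketch; `valid_feature_set` is a hypothetical helper, not a function from the patent:

```python
def all_collinear(points, tol=1e-9):
    """True if every 2D point lies on one straight line (cross-product test)."""
    if len(points) < 3:
        return True
    (x0, y0), (x1, y1) = points[0], points[1]
    return all(abs((x1 - x0) * (y - y0) - (y1 - y0) * (x - x0)) <= tol
               for x, y in points[2:])

def valid_feature_set(points):
    """Step 5 (2): at least 4 positioning feature points, not all collinear."""
    return len(points) >= 4 and not all_collinear(points)

# A diagonal line of points fails; the corners of a square pass.
collinear_pts = [(0, 0), (1, 1), (2, 2), (3, 3)]
square_pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
```

Non-collinearity matters because the subsequent model matching needs the points to constrain all six pose parameters.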
Step 6: calculate the robot motion parameters from the mappings of step 4 and the vision measurement coordinates of the curved surface member from step 5; the coordinate-system conversion process is shown in Fig. 4, specifically:
(1) from the reconstructed three-dimensional coordinates P(Member|Camera) and the mapping T(Flange→Camera) between the robot flange and the vision measurement system coordinate system, calculate the coordinates P(Member|Flange) of the curved surface member in the robot flange coordinate system;
(2) from P(Member|Flange) and the flange-to-effector mapping T(Flange→Effector), calculate the coordinates P(Member|Effector) of the member in the end effector coordinate system;
(3) match P(Member|Effector) against the theoretical model to obtain the mapping relation M_Mea between the vision measurement system and the theoretical model coordinate system of the member, and rewrite it in the form M_Mea = (X Y Z A B C)_Mea, where X, Y, Z denote the three translational motion amounts of the robot and A, B, C the three rotational motion amounts;
(4) from the measured pose parameters M_Mea = (X Y Z A B C)_Mea and the theoretical pose parameters M_Theo = (X Y Z A B C)_Theo of the member, calculate the robot motion amount and drive the robot to the specified position, as shown in Fig. 5.
When the driving amount is smaller than the threshold tol, the pose adjustment stops; otherwise, steps 5 and 6 are repeated.
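This stop criterion closes a measure-compare-move loop over steps 5 and 6. A schematic NumPy sketch; `measure_pose` and `drive` are hypothetical callbacks, and the simulated robot that executes only 90 % of each command is an illustrative assumption:

```python
import numpy as np

def position_loop(measure_pose, drive, target, tol=0.1, max_iter=20):
    """Repeat steps 5 and 6: measure the pose, compare with the target,
    command the residual motion, until every axis is within tol."""
    for _ in range(max_iter):
        m = measure_pose()          # step 5: vision measurement (X Y Z A B C)
        delta = target - m          # step 6: required motion amount
        if np.max(np.abs(delta)) < tol:
            return m
        drive(delta)                # move the robot by delta
    raise RuntimeError("pose did not converge within max_iter")

# Simulated robot state and an actuator that undershoots each command by 10 %,
# so several measure/drive cycles are needed before the tolerance is met.
state = np.zeros(6)
target = np.array([100.0, 50.0, 20.0, 5.0, 0.0, 0.0])

def drive(delta):
    state[:] = state + 0.9 * delta

final = position_loop(lambda: state.copy(), drive, target, tol=0.1)
```

Because each cycle removes 90 % of the remaining error, the loop converges geometrically and exits well within `max_iter`.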
Parts of the invention that are identical to, or can be implemented with, the prior art are not described further.
Claims (3)
1. A robot rapid positioning method for a curved surface component, characterized in that parameter calibration is first performed on the robot binocular vision system, including the camera intrinsic and extrinsic parameters and the hand-eye calibration parameters; target images are then acquired with the two cameras, two-dimensional positioning feature points are extracted from the images, and distortion correction is applied; the two-dimensional coordinates of the positioning feature points are converted into three-dimensional measurement coordinates in the camera coordinate system using the camera calibration results; finally, the robot motion parameters are solved from the three-dimensional measurement coordinates, the theoretical coordinates and the hand-eye calibration parameters of the positioning feature points; the method comprises the following specific steps:
Step 1: arranging measurement points on the robot flange, the end effector and the calibration tool as coordinate-system conversion reference points, the reference points of the robot flange, the end effector and the calibration tool lying simultaneously within the measuring range of the laser tracker;
Step 2: acquiring the relevant coordinates with the laser tracker and the cameras respectively, specifically:
(1) measuring the reference points of the robot flange, the end effector and the calibration tool with the laser tracker to obtain the measured coordinates P(Flange|Tracker), P(End-Effector|Tracker) and P(Calibration-Platform|Tracker);
(2) measuring the reference points of the calibration tool with the vision measurement system to obtain the measured coordinates P(Calibration-Platform|Camera);
Step 3: calculating the mapping relations involving the laser tracker measurement coordinate system from the coordinates measured in step 2, specifically:
(1) determining the mapping relation T(Camera→Tracker) between the vision measurement system and the laser tracker measurement coordinate system from the laser tracker and vision measurement system coordinates P(Calibration-Platform|Tracker) and P(Calibration-Platform|Camera) of the calibration tool reference points;
(2) determining the mapping relation T(Flange→Tracker) between the robot flange coordinate system and the laser tracker measurement coordinate system from the theoretical coordinates of the robot flange reference points and their laser tracker measurements P(Flange|Tracker);
(3) determining the mapping relation T(Tracker→Effector) between the laser tracker measurement coordinate system and the end effector coordinate system from the theoretical coordinates of the end effector reference points and their laser tracker measurements P(End-Effector|Tracker);
Step 4: further calculating the mapping relations involving the robot pose parameters from the mappings obtained in step 3, specifically:
(1) calculating the mapping relation T(Flange→Camera) between the robot flange and the vision measurement system coordinate system from T(Camera→Tracker) and T(Flange→Tracker);
(2) calculating the mapping relation T(Flange→Effector) between the robot flange and the end effector coordinate system from T(Flange→Tracker) and T(Tracker→Effector);
Step 5: calculating the three-dimensional coordinates of the curved surface member in the vision system coordinate system, specifically:
(1) pasting triangular auxiliary reflection targets at the corner points of the curved surface member, teaching the robot to a position above the member, ensuring the member lies within the field of view of the vision measurement system, and acquiring images of the member;
(2) extracting the pixel coordinates of the triangular auxiliary reflection target corner points on the member surface with the Harris algorithm;
(3) deleting redundant matching points using the epipolar constraint and calculating the three-dimensional coordinates P(Member|Camera) of the remaining feature points in the vision measurement system coordinate system;
Step 6: calculating the robot motion parameters from the mappings of step 4 and the vision measurement coordinates of step 5, specifically:
(1) calculating the coordinates P(Member|Flange) of the curved surface member in the robot flange coordinate system from the reconstructed three-dimensional coordinates P(Member|Camera) and the mapping T(Flange→Camera);
(2) calculating the coordinates P(Member|Effector) of the member in the end effector coordinate system from P(Member|Flange) and the mapping T(Flange→Effector);
(3) matching P(Member|Effector) against the theoretical model to obtain the mapping relation M_Mea between the vision measurement system and the theoretical model coordinate system of the member, and rewriting it in the form M_Mea = (X Y Z A B C)_Mea, where X, Y, Z denote the three translational motion amounts of the robot and A, B, C the three rotational motion amounts;
(4) calculating the robot motion amount from the measured pose parameters M_Mea = (X Y Z A B C)_Mea and the theoretical pose parameters M_Theo = (X Y Z A B C)_Theo of the member, and driving the robot to the specified position.
2. The method as claimed in claim 1, wherein at least three measurement points are arranged on each of the robot flange, the end effector and the calibration tool as reference points.
3. The method as claimed in claim 1, wherein the pose adjustment stops when the driving amount for driving the robot to the specified position is smaller than the threshold tol; otherwise steps 5 and 6 are repeated.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910011810 | 2019-01-07 | | |
| CN201910011810X | 2019-01-07 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN109794963A | 2019-05-24 |
| CN109794963B | 2021-06-01 |
Family
ID=66559526
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201910044449.0A (Active) | Robot rapid positioning method facing curved surface component | 2019-01-07 | 2019-01-17 |
Country Status (1)
| Country | Link |
|---|---|
| CN | CN109794963B (en) |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
CN110136208B (en) * | 2019-05-20 | 2020-03-17 | 北京无远弗届科技有限公司 | Joint automatic calibration method and device for robot vision servo system |
CN110370286B (en) * | 2019-08-13 | 2022-04-12 | 西北工业大学 | Method for identifying rigid body space position of dead axle motion based on industrial robot and monocular camera |
CN112109073B (en) * | 2019-08-30 | 2022-10-04 | 上汽通用五菱汽车股份有限公司 | Robot off-line program correcting device and method |
CN112082482B (en) * | 2020-09-09 | 2021-12-17 | 易思维(杭州)科技有限公司 | Visual positioning method for workpiece with edge feature only, application and precision evaluation method |
CN113580137B (en) * | 2021-08-12 | 2023-09-22 | 天津大学 | Mobile robot base-workpiece relative pose determining method based on vision measurement |
CN113643384B (en) * | 2021-10-12 | 2022-02-08 | 深圳荣耀智能机器有限公司 | Coordinate system calibration method, automatic assembly method and device |
CN113945152B (en) * | 2021-10-18 | 2023-09-08 | 易思维(杭州)科技有限公司 | Method for recovering measurement function of single-line structured light three-dimensional sensor by utilizing three-dimensional block |
CN114098968B (en) * | 2021-12-28 | 2022-05-27 | 珠海维尔康生物科技有限公司 | Quick positioning and tracking device of auxiliary robot |
CN115371564B (en) * | 2022-10-24 | 2023-03-07 | 南京航空航天大学 | Method and system for calibrating relative pose of linear laser sensor and robot flange plate |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
JP2013526423A (en) * | 2010-05-14 | 2013-06-24 | コグネックス・テクノロジー・アンド・インベストメント・コーポレーション | Apparatus and method for robust calibration between machine vision system and robot |
CN104864807A (en) * | 2015-04-10 | 2015-08-26 | 深圳大学 | Manipulator hand-eye calibration method based on active binocular vision |
CN105716525A (en) * | 2016-03-30 | 2016-06-29 | 西北工业大学 | Robot end effector coordinate system calibration method based on laser tracker |
CN106952262A (en) * | 2017-04-25 | 2017-07-14 | 大连理工大学 | A kind of deck of boat analysis of Machining method based on stereoscopic vision |
CN108098762A (en) * | 2016-11-24 | 2018-06-01 | 广州映博智能科技有限公司 | A kind of robotic positioning device and method based on novel visual guiding |
Non-Patent Citations (1)
| Title |
|---|
| Implementation of an Industrial Robot Visual Positioning System; Feng Zhigang et al.; Aeronautical Science & Technology; 2018-06-15; Vol. 29, No. 6; pp. 48-53 |
Legal Events
| Date | Code | Title |
|---|---|---|
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |
| | GR01 | Patent grant |