CN109794963B - Robot rapid positioning method facing curved surface component - Google Patents

Robot rapid positioning method facing curved surface component

Info

Publication number
CN109794963B
Authority
CN
China
Prior art keywords
coordinate system
robot
curved surface
coordinates
mapping relation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910044449.0A
Other languages
Chinese (zh)
Other versions
CN109794963A (en)
Inventor
黄翔
李泷杲
李�根
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics
Publication of CN109794963A
Application granted granted Critical
Publication of CN109794963B

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

A robot rapid positioning method for curved surface components comprises the following steps: 1) establishing a conversion model among the robot flange, end effector and vision measurement system coordinate systems by measuring datum points with a laser tracker, obtaining the calibration parameters; 2) pasting triangular auxiliary reflective targets at the corner points of the curved surface member, acquiring an image of the curved surface member with the vision measurement system, extracting the two-dimensional pixel coordinates of the auxiliary reflective targets and calculating the three-dimensional coordinates of the targets in the vision measurement system coordinate system; 3) calculating the current pose parameters of the robot from the vision measurement coordinates of the curved surface member and the calibration parameters, comparing them with the target pose, and calculating the motion amount of the robot. The invention improves the reliability of the hand-eye calibration parameters, reconstructs the positioning information of the curved surface member quickly and accurately, and can be widely applied to the rapid positioning of curved surface parts in aviation, automotive and other fields.

Description

Robot rapid positioning method facing curved surface component
Technical Field
The invention relates to a robot technology, in particular to a robot positioning technology, and specifically relates to a robot rapid positioning method facing to a curved surface component.
Background
Industrial robots are widely used in the automotive and electronics industries, but their poses are mostly controlled by off-line programming; targets cannot be actively identified and positioned, which greatly limits the range of application. In military fields such as aerospace, the special working environments and complex object characteristics further narrow the range in which robot vision can be applied. At present, high-end, intelligent systems are becoming a research hotspot in the industrial robot field, and machine-vision-assisted industrial robot positioning has very broad application prospects in aerospace.
In China, there are research results in the field of robot vision positioning, mainly in two forms. The first is monocular vision positioning, in which the target position is identified and calculated by a camera mounted at the robot end; this method has low positioning accuracy and is only suitable for regularly shaped targets such as cylinders, rings and cuboids. The second is structured-light scanning positioning, which reconstructs the target shape and calculates the pose with a structured-light scanning system; its accuracy is clearly better than that of the first method, but given the complexity and particularity of aerospace component structures, the positioning is easily affected by ambient illumination and the surface properties of the component, and, more importantly, the point-cloud processing of the scanning result is time-consuming, so the method also lacks universality.
Disclosure of Invention
The invention aims to provide a robot binocular vision positioning method for curved surface components that offers high positioning accuracy and is insensitive to the environment, addressing the low accuracy of monocular vision and the strong environmental sensitivity of structured-light scanning in existing robot positioning.
The technical scheme of the invention is as follows:
A robot rapid positioning method for curved surface components is characterized in that, first, the parameters of the robot binocular vision system are calibrated, including the camera intrinsic and extrinsic parameters and the hand-eye calibration parameters; then a target image is acquired with the two cameras, the two-dimensional positioning feature points in the images are extracted, and distortion correction is carried out; the two-dimensional coordinates of the positioning feature points are converted into three-dimensional measurement coordinates in the camera coordinate system using the camera calibration result; finally, the robot motion parameters are solved from the three-dimensional measurement coordinates, the theoretical coordinates and the hand-eye calibration parameters of the positioning feature points. The method comprises the following specific steps:
step 1, arranging measuring points on a robot flange, an end effector and a calibration tool as coordinate system conversion reference points, wherein the reference points of the robot flange, the end effector and the calibration tool are simultaneously positioned in a measuring range of a laser tracker, and the number of the reference points on each mechanism is not less than 3;
step 2, acquiring relevant coordinates by using a laser tracker and a camera respectively, and specifically:
(1) measuring the robot flange, end effector and calibration tool datum points with the laser tracker to obtain the measurement coordinates $^{T}P_{F}$ (Flange_Measured_Tracker), $^{T}P_{E}$ (End-Effector_Measured_Tracker) and $^{T}P_{P}$ (Calibration-Platform_Measured_Tracker), where the superscript denotes the coordinate system in which the points are expressed (T: laser tracker, C: camera/vision system, F: flange, E: end effector, M: theoretical model) and the subscript denotes the measured object;
(2) measuring the calibration tool datum points with the vision measurement system to obtain the measurement coordinates $^{C}P_{P}$ (Calibration-Platform_Measured_Camera);
Step 3, calculating a mapping relation related to a measurement coordinate system of the laser tracker according to the coordinates obtained by measurement in the step 2, specifically:
(1) from the laser tracker and vision measurement system coordinates $^{T}P_{P}$ and $^{C}P_{P}$ of the calibration tool datum points, calculating the mapping relation $^{T}H_{C}$ between the vision measurement system and the laser tracker measurement coordinate system;
(2) from the theoretical coordinates $^{F}P_{F}$ of the robot flange datum points and the laser tracker measurement coordinates $^{T}P_{F}$, calculating the mapping relation $^{T}H_{F}$ between the robot flange coordinate system and the laser tracker measurement coordinate system;
(3) from the theoretical coordinates $^{E}P_{E}$ of the end effector datum points and the laser tracker measurement coordinates $^{T}P_{E}$, calculating the mapping relation $^{E}H_{T}$ between the laser tracker measurement coordinate system and the end effector coordinate system;
Step 4, further calculating a mapping relation related to the robot pose parameters according to the mapping relation obtained in the step 3, specifically:
(1) from the vision-tracker mapping relation $^{T}H_{C}$ and the flange-tracker mapping relation $^{T}H_{F}$, calculating the mapping relation $^{F}H_{C} = (^{T}H_{F})^{-1}\,^{T}H_{C}$ between the robot flange and the vision measurement system coordinate system;
(2) from the flange-tracker mapping relation $^{T}H_{F}$ and the tracker-end-effector mapping relation $^{E}H_{T}$, calculating the mapping relation $^{E}H_{F} = {}^{E}H_{T}\,^{T}H_{F}$ between the robot flange and the end effector coordinate system;
Step 5, calculating the three-dimensional coordinates of the curved surface component in the coordinate system of the visual system, which specifically comprises the following steps:
(1) pasting triangular auxiliary reflective targets at the corner points of the curved surface member, teaching the robot to a position above the curved surface member, ensuring that the curved surface member is within the field of view of the vision measurement system, and acquiring an image of the curved surface member;
(2) extracting the pixel coordinates of the triangular auxiliary reflective target corner points on the surface of the curved surface member using the Harris algorithm;
(3) deleting redundant matching points based on the epipolar constraint and calculating the three-dimensional coordinates $^{C}P_{W}$ of the remaining feature points in the vision measurement system coordinate system (W denotes the curved surface workpiece);
Step 6, calculating the motion parameters of the robot according to the mapping relations in step 4 and the vision measurement coordinates of the curved surface member in step 5, specifically:
(1) from the reconstructed three-dimensional coordinates $^{C}P_{W}$ and the flange-vision mapping relation $^{F}H_{C}$, calculating the coordinates $^{F}P_{W} = {}^{F}H_{C}\,^{C}P_{W}$ of the curved surface component in the robot flange coordinate system;
(2) from the coordinates $^{F}P_{W}$ of the curved surface component in the robot flange coordinate system and the flange-end-effector mapping relation $^{E}H_{F}$, calculating the coordinates $^{E}P_{W} = {}^{E}H_{F}\,^{F}P_{W}$ of the curved surface component in the end effector coordinate system;
(3) matching the coordinates $^{E}P_{W}$ of the curved surface member in the end effector coordinate system against the theoretical model $^{M}P_{W}$ to obtain the mapping relation $M_{Mea}$ between the vision measurement system and the theoretical model coordinate system of the curved surface member, and rewriting the mapping relation in the form $M_{Mea} = (X\;Y\;Z\;A\;B\;C)_{Mea}$, where X, Y, Z denote the three translational motion amounts of the robot and A, B, C denote the three rotational motion amounts of the robot;
(4) comparing the measured pose parameters $M_{Mea} = (X\;Y\;Z\;A\;B\;C)_{Mea}$ of the curved surface member with the theoretical pose parameters $M_{Theo} = (X\;Y\;Z\;A\;B\;C)_{Theo}$, calculating the motion amount of the robot, and driving the robot to the specified position.
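To make steps 6 (3)-(4) concrete, the minimal sketch below rewrites a fitted 4x4 homogeneous transform as pose parameters (X Y Z A B C) and forms the robot motion amount as the parameter difference. It assumes the Z-Y-X Euler convention used on many industrial controllers (e.g. KUKA's A-B-C angles); the patent does not specify a convention, and H_mea and M_theo are illustrative placeholders:

```python
import numpy as np

def transform_to_pose(H):
    """Rewrite a 4x4 homogeneous transform as (X, Y, Z, A, B, C).

    Assumes the Z-Y-X Euler convention (A about z, B about y, C about x);
    the patent does not state which convention it uses. Angles in degrees.
    """
    X, Y, Z = H[:3, 3]
    R = H[:3, :3]
    B = np.degrees(np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0])))
    A = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    C = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return np.array([X, Y, Z, A, B, C])

# Comparing measured and theoretical pose parameters (values illustrative):
H_mea = np.eye(4)                       # fitted pose from the model matching
M_mea = transform_to_pose(H_mea)
M_theo = np.array([800.0, 0.0, 600.0, 0.0, 90.0, 0.0])  # hypothetical target
motion = M_theo - M_mea                 # robot motion amount
```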
The invention has the beneficial effects that:
the invention can be applied to robot visual positioning. Compared with the prior art, the robot visual positioning system is simple and easy to realize, can be effectively applied to parts with various shapes, identifies and positions the target in a short time, and improves the working efficiency and reliability of the robot.
1) The laser tracker is adopted to accurately calibrate a conversion matrix between the robot and the vision measurement system, so that the reliability of the hand-eye calibration parameters is improved.
2) And a high-precision binocular vision measurement system and an auxiliary reflection target are adopted to quickly and accurately reconstruct the positioning information of the curved surface member.
3) The method is applicable to a wide range of objects and can be used for the rapid positioning of curved surface parts in aviation, automotive and other fields.
Drawings
FIG. 1 is a schematic diagram of a high precision calibration platform.
Fig. 2 is a schematic diagram of a robot vision system calibration.
FIG. 3 is a schematic diagram of feature point extraction for curved surface member positioning.
Fig. 4 is a schematic diagram of a coordinate system conversion process of the robot visual positioning system.
Fig. 5 is a schematic diagram of robot pose adjustment.
Fig. 6 is a flow chart of the robot vision positioning.
Fig. 7 is a schematic diagram of the structure of the measuring system of the present invention.
Detailed Description
The present invention is described in further detail below with reference to the attached drawing figures.
As shown in figures 1-7, the robot vision positioning system of the invention comprises a robot, an end effector 3 and a vision measurement system 2; the end effector 3 is connected to the robot through a robot flange 1, and the vision measurement system 2 is mounted on the end effector 3. Before positioning, the robot and the curved surface component are placed at arbitrary positions, provided the curved surface component is within the reach of the robot. The invention regards positioning as complete when the end effector tool is at, or a specified distance above, the center of mass of the curved surface member, as shown in fig. 7.
The robot positioning process of the present invention is shown in fig. 6, and includes the following steps.
Step 1, arranging measuring points on a robot flange 1, an end effector 3 and a calibration tool as coordinate system conversion reference points, wherein the reference points of the robot flange 1, the end effector 3 and the calibration tool can be measured by a laser tracker at the same time, the reference points of the calibration tool can be measured by a vision measuring system 2, the number of the reference points on each mechanism is not less than 3, and the calibration tool is shown in figure 1;
step 2, acquiring relevant coordinates by using a laser tracker and a camera respectively, as shown in fig. 2, specifically:
(1) measuring the robot flange, end effector and calibration tool datum points with the laser tracker to obtain the measurement coordinates $^{T}P_{F}$ (Flange_Measured_Tracker), $^{T}P_{E}$ (End-Effector_Measured_Tracker) and $^{T}P_{P}$ (Calibration-Platform_Measured_Tracker);
(2) measuring the calibration tool datum points with the vision measurement system to obtain the measurement coordinates $^{C}P_{P}$ (Calibration-Platform_Measured_Camera);
In step 2, a dedicated laser tracker target ball is used for the laser tracker measurements, and an optical reflective target ball is used for the vision measurement system measurements;
step 3, calculating a mapping relation related to a measurement coordinate system of the laser tracker according to the coordinates obtained by measurement in the step 2, specifically:
(1) from the laser tracker and vision measurement system coordinates $^{T}P_{P}$ and $^{C}P_{P}$ of the calibration tool datum points, determining the mapping relation $^{T}H_{C}$ between the vision measurement system and the laser tracker measurement coordinate system;
(2) from the theoretical coordinates $^{F}P_{F}$ of the robot flange datum points and the laser tracker measurement coordinates $^{T}P_{F}$, determining the mapping relation $^{T}H_{F}$ between the robot flange coordinate system and the laser tracker measurement coordinate system;
(3) from the theoretical coordinates $^{E}P_{E}$ of the end effector datum points and the laser tracker measurement coordinates $^{T}P_{E}$, determining the mapping relation $^{E}H_{T}$ between the laser tracker measurement coordinate system and the end effector coordinate system; a computational sketch of this point-set registration is given below;
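Each mapping relation in step 3 can be obtained by a least-squares rigid registration of the three or more corresponding datum points. The patent does not name a solver; the sketch below uses the classical Kabsch/SVD solution, with illustrative names:

```python
import numpy as np

def fit_rigid_transform(P_src, P_dst):
    """Least-squares rigid transform H (4x4) such that P_dst ~ R @ P_src + t.

    P_src, P_dst: (N, 3) arrays of corresponding datum points, N >= 3 and
    not collinear. Classical Kabsch/SVD solution.
    """
    c_src, c_dst = P_src.mean(axis=0), P_dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((P_src - c_src).T @ (P_dst - c_dst))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    H = np.eye(4)
    H[:3, :3], H[:3, 3] = R, t
    return H

# e.g. the camera->tracker mapping $^{T}H_{C}$ from the calibration platform:
# H_T_C = fit_rigid_transform(P_platform_camera, P_platform_tracker)
```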
Step 4, further calculating a mapping relation related to the robot pose parameters according to the mapping relation obtained in the step 3, specifically:
(1) from the vision-tracker mapping relation $^{T}H_{C}$ and the flange-tracker mapping relation $^{T}H_{F}$, calculating the mapping relation $^{F}H_{C} = (^{T}H_{F})^{-1}\,^{T}H_{C}$ between the robot flange and the vision measurement system coordinate system;
(2) from the flange-tracker mapping relation $^{T}H_{F}$ and the tracker-end-effector mapping relation $^{E}H_{T}$, calculating the mapping relation $^{E}H_{F} = {}^{E}H_{T}\,^{T}H_{F}$ between the robot flange and the end effector coordinate system, as composed in the sketch below;
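Step 4 is pure matrix algebra on the step-3 results. A minimal sketch, assuming each mapping relation is stored as a 4x4 homogeneous NumPy matrix (the identity placeholders stand in for the fitted transforms):

```python
import numpy as np

# Placeholders standing in for the step-3 results (4x4 homogeneous matrices):
H_T_C = np.eye(4)   # camera  -> tracker       ($^{T}H_{C}$)
H_T_F = np.eye(4)   # flange  -> tracker       ($^{T}H_{F}$)
H_E_T = np.eye(4)   # tracker -> end effector  ($^{E}H_{T}$)

# Step 4 (1): flange/vision mapping (the hand-eye matrix)
H_F_C = np.linalg.inv(H_T_F) @ H_T_C    # camera -> flange
# Step 4 (2): flange/end-effector mapping
H_E_F = H_E_T @ H_T_F                   # flange -> end effector
```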
Steps 1-4 constitute the calibration process of the robot vision positioning system; once calibration is complete, it does not need to be repeated unless the positioning system is altered or the positioning accuracy degrades under external influence;
step 5, calculating the three-dimensional coordinates of the curved surface component in the coordinate system of the visual system, which specifically comprises the following steps:
(1) pasting triangular auxiliary reflective targets at the corner points of the curved surface member, teaching the robot to a position above the curved surface member, ensuring that the curved surface member is within the field of view of the vision measurement system, and acquiring an image of the curved surface member;
(2) extracting at least four positioning feature points of the part using the SIFT algorithm, the extracted feature points not lying on a single straight line, as shown in FIG. 3;
(3) deleting redundant matching points based on the epipolar constraint and calculating the three-dimensional coordinates $^{C}P_{W}$ of the remaining feature points in the vision measurement system coordinate system; a stereo-vision sketch of steps (2)-(3) is given below;
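Steps 5 (2)-(3) can be realized with standard stereo-vision tools. The sketch below uses OpenCV's Harris detector (as in the claims; this part of the description also mentions SIFT), filters candidate matches by point-to-epipolar-line distance, and triangulates the survivors; the projection matrices, fundamental matrix and threshold are assumed to come from the binocular calibration, and all names are illustrative:

```python
import cv2
import numpy as np

def locate_targets(img_l, img_r, P_l, P_r, F, tol=1.0):
    """Reconstruct 3D target-corner coordinates in the camera frame.

    img_l, img_r: grayscale stereo pair; P_l, P_r: 3x4 projection matrices
    and F: fundamental matrix, all from the binocular calibration; tol:
    max point-to-epipolar-line distance in pixels.
    """
    corners = []
    for img in (img_l, img_r):
        resp = cv2.cornerHarris(np.float32(img), blockSize=2, ksize=3, k=0.04)
        ys, xs = np.where(resp > 0.01 * resp.max())   # keep strong corners
        corners.append(np.stack([xs, ys], axis=1).astype(np.float64))
    # Epipolar constraint: keep pairs whose right point lies near the
    # epipolar line F @ x_l of the left point; delete the rest as redundant.
    matches_l, matches_r = [], []
    for xl in corners[0]:
        line = F @ np.append(xl, 1.0)                 # epiline in right image
        for xr in corners[1]:
            d = abs(np.append(xr, 1.0) @ line) / np.hypot(line[0], line[1])
            if d < tol:
                matches_l.append(xl)
                matches_r.append(xr)
    if not matches_l:
        return np.empty((0, 3))
    Xh = cv2.triangulatePoints(P_l, P_r,
                               np.array(matches_l).T, np.array(matches_r).T)
    return (Xh[:3] / Xh[3]).T                         # N x 3 camera-frame coords
```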
Step 6, calculating the motion parameters of the robot according to the mapping relations in step 4 and the vision measurement coordinates of the curved surface member in step 5, the coordinate system conversion process being as shown in FIG. 4, specifically:
(1) from the reconstructed three-dimensional coordinates $^{C}P_{W}$ and the flange-vision mapping relation $^{F}H_{C}$, calculating the coordinates $^{F}P_{W} = {}^{F}H_{C}\,^{C}P_{W}$ of the curved surface component in the robot flange coordinate system;
(2) from the coordinates $^{F}P_{W}$ of the curved surface component in the robot flange coordinate system and the flange-end-effector mapping relation $^{E}H_{F}$, calculating the coordinates $^{E}P_{W} = {}^{E}H_{F}\,^{F}P_{W}$ of the curved surface component in the end effector coordinate system;
(3) matching the coordinates $^{E}P_{W}$ of the curved surface member in the end effector coordinate system against the theoretical model $^{M}P_{W}$ to obtain the mapping relation $M_{Mea}$ between the vision measurement system and the theoretical model coordinate system of the curved surface member, and rewriting the mapping relation in the form $M_{Mea} = (X\;Y\;Z\;A\;B\;C)_{Mea}$, where X, Y, Z denote the three translational motion amounts of the robot and A, B, C denote the three rotational motion amounts of the robot;
(4) comparing the measured pose parameters $M_{Mea} = (X\;Y\;Z\;A\;B\;C)_{Mea}$ of the curved surface member with the theoretical pose parameters $M_{Theo} = (X\;Y\;Z\;A\;B\;C)_{Theo}$, calculating the motion amount of the robot and driving the robot to the specified position, as shown in fig. 5.
When the driving quantity is smaller than the threshold tol, the pose adjustment stops; otherwise, steps 5 and 6 are repeated, forming the closed loop sketched below.
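A minimal sketch of this measure-compare-move loop; measure_pose and move_robot are hypothetical stand-ins for the vision pipeline of steps 5-6 and the robot driver, and the threshold value is illustrative:

```python
import numpy as np

TOL = 0.1  # threshold "tol"; value and units are illustrative

def positioning_loop(measure_pose, move_robot, M_theo, max_iter=10):
    """Iterate steps 5-6 until the driving quantity falls below tol.

    measure_pose() returns the current (X Y Z A B C) pose parameters from
    the vision pipeline; move_robot(delta) commands the relative motion.
    """
    for _ in range(max_iter):
        delta = M_theo - measure_pose()    # robot motion amount
        if np.linalg.norm(delta) < TOL:    # driving quantity below threshold
            return True                    # pose adjustment complete
        move_robot(delta)
    return False
```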
Parts of the invention not detailed herein are the same as, or can be implemented with, the prior art.

Claims (3)

1. A robot rapid positioning method for curved surface components, characterized in that, first, the parameters of the robot binocular vision system are calibrated, including the camera intrinsic and extrinsic parameters and the hand-eye calibration parameters; then a target image is acquired with the two cameras, the two-dimensional positioning feature points in the images are extracted, and distortion correction is carried out; the two-dimensional coordinates of the positioning feature points are converted into three-dimensional measurement coordinates in the camera coordinate system using the camera calibration result; finally, the robot motion parameters are solved from the three-dimensional measurement coordinates, the theoretical coordinates and the hand-eye calibration parameters of the positioning feature points; the method comprises the following specific steps:
step 1, arranging measuring points on a robot flange, an end effector and a calibration tool as coordinate system conversion reference points, wherein the reference points of the robot flange, the end effector and the calibration tool are simultaneously in a measuring range of a laser tracker;
step 2, acquiring relevant coordinates by using a laser tracker and a camera respectively, and specifically:
(1) measuring the robot flange, end effector and calibration tool datum points with the laser tracker to obtain the measurement coordinates $^{T}P_{F}$, $^{T}P_{E}$ and $^{T}P_{P}$, where the superscript denotes the coordinate system in which the points are expressed (T: laser tracker, C: vision system, F: flange, E: end effector, M: theoretical model) and the subscript denotes the measured object;
(2) measuring the calibration tool datum points with the vision measurement system to obtain the measurement coordinates $^{C}P_{P}$;
Step 3, calculating a mapping relation related to a measurement coordinate system of the laser tracker according to the coordinates obtained by measurement in the step 2, specifically:
(1) from the laser tracker and vision measurement system coordinates $^{T}P_{P}$ and $^{C}P_{P}$ of the calibration tool datum points, determining the mapping relation $^{T}H_{C}$ between the vision measurement system and the laser tracker measurement coordinate system;
(2) from the theoretical coordinates $^{F}P_{F}$ of the robot flange datum points and the laser tracker measurement coordinates $^{T}P_{F}$, determining the mapping relation $^{T}H_{F}$ between the robot flange coordinate system and the laser tracker measurement coordinate system;
(3) from the theoretical coordinates $^{E}P_{E}$ of the end effector datum points and the laser tracker measurement coordinates $^{T}P_{E}$, determining the mapping relation $^{E}H_{T}$ between the laser tracker measurement coordinate system and the end effector coordinate system;
Step 4, further calculating a mapping relation related to the robot pose parameters according to the mapping relation obtained in the step 3, specifically:
(1) from the vision-tracker mapping relation $^{T}H_{C}$ and the flange-tracker mapping relation $^{T}H_{F}$, calculating the mapping relation $^{F}H_{C} = (^{T}H_{F})^{-1}\,^{T}H_{C}$ between the robot flange and the vision measurement system coordinate system;
(2) from the flange-tracker mapping relation $^{T}H_{F}$ and the tracker-end-effector mapping relation $^{E}H_{T}$, calculating the mapping relation $^{E}H_{F} = {}^{E}H_{T}\,^{T}H_{F}$ between the robot flange and the end effector coordinate system;
Step 5, calculating the three-dimensional coordinates of the curved surface component in the coordinate system of the visual system, which specifically comprises the following steps:
(1) pasting triangular auxiliary reflective targets at the corner points of the curved surface member, teaching the robot to a position above the curved surface member, ensuring that the curved surface member is within the field of view of the vision measurement system, and acquiring an image of the curved surface member;
(2) extracting the pixel coordinates of the triangular auxiliary reflective target corner points on the surface of the curved surface component using the Harris algorithm;
(3) deleting redundant matching points based on the epipolar constraint and calculating the three-dimensional coordinates $^{C}P_{W}$ of the remaining feature points in the vision measurement system coordinate system;
And 6, calculating the motion parameters of the robot according to the mapping relation in the step 4 and the vision measurement coordinates of the curved surface member in the step 5, specifically:
(1) from the reconstructed three-dimensional coordinates $^{C}P_{W}$ and the flange-vision mapping relation $^{F}H_{C}$, calculating the coordinates $^{F}P_{W} = {}^{F}H_{C}\,^{C}P_{W}$ of the curved surface component in the robot flange coordinate system;
(2) from the coordinates $^{F}P_{W}$ of the curved surface component in the robot flange coordinate system and the flange-end-effector mapping relation $^{E}H_{F}$, calculating the coordinates $^{E}P_{W} = {}^{E}H_{F}\,^{F}P_{W}$ of the curved surface component in the end effector coordinate system;
(3) matching the coordinates $^{E}P_{W}$ of the curved surface member in the end effector coordinate system against the theoretical model $^{M}P_{W}$ to obtain the mapping relation $M_{Mea}$ between the vision measurement system and the theoretical model coordinate system of the curved surface member, and rewriting the mapping relation in the form $M_{Mea} = (X\;Y\;Z\;A\;B\;C)_{Mea}$, where X, Y, Z denote the three translational motion amounts of the robot and A, B, C denote the three rotational motion amounts of the robot;
(4) comparing the measured pose parameters $M_{Mea} = (X\;Y\;Z\;A\;B\;C)_{Mea}$ of the curved surface member with the theoretical pose parameters $M_{Theo} = (X\;Y\;Z\;A\;B\;C)_{Theo}$, calculating the motion amount of the robot and driving the robot to the specified position.
2. The method as claimed in claim 1, wherein the robot flange, the end effector and the calibration tool are each provided with at least three measuring points as reference points.
3. The method as claimed in claim 1, wherein the pose adjustment is stopped when the driving amount for driving the robot to the specified position is smaller than the threshold tol; otherwise, steps 5 and 6 are repeated.
CN201910044449.0A 2019-01-07 2019-01-17 Robot rapid positioning method facing curved surface component Active CN109794963B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910011810 2019-01-07
CN201910011810X 2019-01-07

Publications (2)

Publication Number Publication Date
CN109794963A CN109794963A (en) 2019-05-24
CN109794963B true CN109794963B (en) 2021-06-01

Family

ID=66559526

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910044449.0A Active CN109794963B (en) 2019-01-07 2019-01-17 Robot rapid positioning method facing curved surface component

Country Status (1)

Country Link
CN (1) CN109794963B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110136208B (en) * 2019-05-20 2020-03-17 北京无远弗届科技有限公司 Joint automatic calibration method and device for robot vision servo system
CN110370286B (en) * 2019-08-13 2022-04-12 西北工业大学 Method for identifying rigid body space position of dead axle motion based on industrial robot and monocular camera
CN112109073B (en) * 2019-08-30 2022-10-04 上汽通用五菱汽车股份有限公司 Robot off-line program correcting device and method
CN112082482B (en) * 2020-09-09 2021-12-17 易思维(杭州)科技有限公司 Visual positioning method for workpiece with edge feature only, application and precision evaluation method
CN113580137B (en) * 2021-08-12 2023-09-22 天津大学 Mobile robot base-workpiece relative pose determining method based on vision measurement
CN113643384B (en) * 2021-10-12 2022-02-08 深圳荣耀智能机器有限公司 Coordinate system calibration method, automatic assembly method and device
CN113945152B (en) * 2021-10-18 2023-09-08 易思维(杭州)科技有限公司 Method for recovering measurement function of single-line structured light three-dimensional sensor by utilizing three-dimensional block
CN114098968B (en) * 2021-12-28 2022-05-27 珠海维尔康生物科技有限公司 Quick positioning and tracking device of auxiliary robot
CN115371564B (en) * 2022-10-24 2023-03-07 南京航空航天大学 Method and system for calibrating relative pose of linear laser sensor and robot flange plate


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013526423A (en) * 2010-05-14 2013-06-24 コグネックス・テクノロジー・アンド・インベストメント・コーポレーション Apparatus and method for robust calibration between machine vision system and robot
CN104864807A (en) * 2015-04-10 2015-08-26 深圳大学 Manipulator hand-eye calibration method based on active binocular vision
CN105716525A (en) * 2016-03-30 2016-06-29 西北工业大学 Robot end effector coordinate system calibration method based on laser tracker
CN108098762A (en) * 2016-11-24 2018-06-01 广州映博智能科技有限公司 A kind of robotic positioning device and method based on novel visual guiding
CN106952262A (en) * 2017-04-25 2017-07-14 大连理工大学 A kind of deck of boat analysis of Machining method based on stereoscopic vision

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Implementation of a visual positioning system for industrial robots; Feng Zhigang et al.; Aeronautical Science & Technology; 2018-06-15; Vol. 29, No. 6; pp. 48-53 *

Also Published As

Publication number Publication date
CN109794963A (en) 2019-05-24

Similar Documents

Publication Publication Date Title
CN109794963B (en) Robot rapid positioning method facing curved surface component
CN110370286B (en) Method for identifying rigid body space position of dead axle motion based on industrial robot and monocular camera
CN107139178B (en) Unmanned aerial vehicle and vision-based grabbing method thereof
CN111156925A (en) Three-dimensional measurement method for large component based on line structured light and industrial robot
CN109291048B (en) Real-time online programming system and method for grinding and polishing industrial robot
CN110906863B (en) Hand-eye calibration system and calibration method for line-structured light sensor
CN111426270B (en) Industrial robot pose measurement target device and joint position sensitive error calibration method
CN111415391A (en) Multi-view camera external orientation parameter calibration method adopting inter-shooting method
CN105806309A (en) Robot zero calibration system and method based on laser triangulation ranging
CN111043963A (en) Three-dimensional scanning system measuring method of carriage container based on two-dimensional laser radar
CN112017248B (en) 2D laser radar camera multi-frame single-step calibration method based on dotted line characteristics
CN112958960B (en) Robot hand-eye calibration device based on optical target
CN109059755B (en) High-precision hand-eye calibration method for robot
CN112109072B (en) Accurate 6D pose measurement and grabbing method for large sparse feature tray
CN115284292A (en) Mechanical arm hand-eye calibration method and device based on laser camera
CN113681559B (en) Line laser scanning robot hand-eye calibration method based on standard cylinder
CN115139283A (en) Robot hand-eye calibration method based on random mark dot matrix
CN113781558B (en) Robot vision locating method with decoupling gesture and position
CN110962127A (en) Auxiliary calibration device for tail end pose of mechanical arm and calibration method thereof
CN111121628A (en) Calibration method of three-dimensional scanning system of carriage container based on two-dimensional laser radar
CN112700505B (en) Binocular three-dimensional tracking-based hand and eye calibration method and device and storage medium
CN111906767A (en) Vision rectification mechanical arm based on binocular structured light and rectification method
CN114046889B (en) Automatic calibration method for infrared camera
CN111028298B (en) Convergent binocular system for rigid coordinate system space transformation calibration
CN112123329A (en) Robot 3D vision hand-eye calibration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant