CN104260112A - Robot hand and eye locating method - Google Patents


Info

Publication number
CN104260112A
Authority
CN
China
Application number
CN201410477611.5A
Other languages
Chinese (zh)
Other versions
CN104260112B (en)
Inventor
杨小亭
李汉舟
罗华
张栋栋
Original Assignee
西安航天精密机电研究所
Priority date
Filing date
Publication date
Application filed by 西安航天精密机电研究所
Priority to CN201410477611.5A
Publication of CN104260112A
Application granted
Publication of CN104260112B
Legal status: Active


Abstract

The invention belongs to the technical field of mechanical automation, and in particular relates to a robot hand-eye locating method. The method comprises: attaching a calibration plate to a reference workpiece; establishing a visual coordinate system; calculating the coordinates of all points on the reference workpiece in the visual coordinate system; moving the reference workpiece together with the calibration plate under the robot station; and obtaining the coordinates of all points on the reference workpiece by converting between the visual coordinate system and the robot coordinate system. The method is suitable for robot assembly-line work requiring a large locating range and high locating precision.

Description

Robot hand-eye localization method
Technical field
The invention belongs to the technical field of mechanical automation and relates to a localization method, in particular to a robot hand-eye localization method.
Background technology
With the continuous progress of science and technology, the degree of industrial automation grows ever higher, and robots play an essential role in raising productivity, guaranteeing product quality and lowering production cost. As robots become common in production, accurately determining the position of the workpiece during operation is of great importance.
Traditionally, when a robot is mounted on an assembly line, workpiece localization depends on the precision with which the robot is installed and the precision with which workpieces are placed on the line. The drawback of this approach is obvious: both precisions are controlled manually, so the robot frequently localizes workpieces inaccurately and cannot work efficiently.
To solve this problem, a technique has been developed in which the robot is paired with a vision system that replaces manual positioning of the workpiece; this has become a classical arrangement on assembly lines.
In the prevailing arrangement on the market, a vision system is installed on the working robot; during operation the vision system localizes the workpiece directly and then directly controls the robot. This arrangement, however, is suitable only for lines with a small localization range.
The urgent problem is therefore how to localize accurately over a large range on the assembly line.
Summary of the invention
To solve the problem described in the background art, the present invention proposes a robot hand-eye localization method suitable for robot assembly-line work with a large localization range and high localization precision.
The concrete technical scheme of the present invention is as follows:
1. A robot hand-eye localization method, characterized by comprising the following steps:
1) Establish the visual coordinate system:
1.1) Choose any two points on the reference workpiece as shooting points; the remaining points on the reference workpiece are calculation points. Denote the two shooting points B1 and B2.
1.2) Adjust the position of the reference workpiece on the line so that the projection of the B1–B2 line in the image coordinate system is parallel to the ordinate axis of the image coordinate system.
1.3) Attach a calibration plate to the reference workpiece. The calibration plate carries a dot matrix in which the line through each row of dots is parallel to the abscissa axis of the image coordinate system and the line through each column of dots is parallel to the ordinate axis. Set any dot of the matrix as the origin; the line through the dots in the horizontal direction of the origin is the abscissa axis, and the line through the dots in the vertical direction is the ordinate axis, establishing the visual coordinate system (x_v, y_v).
2) Calculate the coordinates of each point in the visual coordinate system:
2.1) Calculate the actual physical size coefficients from the ratio of the actual distance between any two dots of the matrix on the calibration plate to the difference of their photographed pixel values:
K_x = L/M_x,  K_y = L/M_y
where L is the actual distance between any two dots of the matrix, a fixed value;
M_x is the difference of the photographed pixel values of the two dots in the abscissa direction, and M_y the difference in the ordinate direction;
K_x is the actual physical size coefficient in the abscissa direction, and K_y that in the ordinate direction.
2.2) From the photographed pixel values of point B1 and of the visual-coordinate-system origin, calculate the coordinates of B1 in the visual coordinate system:
x_B1^v = K_x·(x_B1^p − x_0^p),  y_B1^v = −K_y·(y_B1^p − y_0^p)   (1)
where (x_B1^p, y_B1^p) are the pixel values of point B1, (x_B1^v, y_B1^v) are the coordinates of B1 in the visual coordinate system, and (x_0^p, y_0^p) are the pixel values of the visual-coordinate-system origin.
2.3) From the actual structural dimensions of the reference workpiece, namely the position (dx, dy) of each remaining point relative to B1 and the angle θ of each remaining point relative to the B1–B2 line, calculate the coordinates of every point on the reference workpiece in the visual coordinate system:
X_v = dx·cosθ − dy·sinθ + x_B1^v,  Y_v = dx·sinθ + dy·cosθ + y_B1^v   (2)
where (dx, dy) is the position of a remaining point relative to B1;
θ is the angle between the line joining the remaining point to B1 and the B1–B2 line.
3) Move the reference workpiece together with the calibration plate under the robot station.
4) Convert the visual coordinate system to the robot coordinate system:
Judge whether the robot is a punching robot or an assembly robot; if a punching robot, perform step 4.1); if an assembly robot, perform step 4.2).
4.1) Obtain the coordinates of every point on the reference workpiece in the punching-robot coordinate system:
4.1.1) Move the reference workpiece under the station of the punching robot and obtain the coordinate value of the dot-matrix origin in the robot coordinate system; this value is the translation from the visual coordinate system to the robot coordinate system.
4.1.2) Calculate the deflection angle between the ordinate axes of the visual and robot coordinate systems and the deflection angle between their abscissa axes:
4.1.2.1) Take at least two columns of dots in the matrix, measure the angle between each column line and the robot ordinate axis, and average the angles to obtain the deflection angle between the ordinate axes.
4.1.2.2) Take at least two rows of dots in the matrix, measure the angle between each row line and the robot abscissa axis, and average the angles to obtain the deflection angle between the abscissa axes.
4.1.3) From the actual lengths of the dot-matrix connecting lines and their projection relation to the robot coordinate system, obtain the ratios L'_x/L_x and L'_y/L_y of the line lengths in the robot coordinate system to the actual line lengths.
4.1.4) The conversion formula from the visual coordinate system to the robot coordinate system is:
x_r = (X_v·cosθ_y − Y_v·sinθ_y)/cosθ_y · L'_x/L_x + x_0^r
y_r = −(Y_v·cosθ_x + X_v·sinθ_x)/cosθ_x · L'_y/L_y + y_0^r   (3)
where (x_0^r, y_0^r) is the coordinate value of the dot-matrix origin in the robot coordinate system;
L_x, L_y are the actual lengths of the dot-matrix connecting lines;
L'_x, L'_y are the lengths of the dot-matrix connecting lines in the robot coordinate system;
θ_x is the deflection angle between the abscissa axes of the visual and robot coordinate systems;
θ_y is the deflection angle between the ordinate axes of the visual and robot coordinate systems.
4.2) Obtain the coordinates of every point on the reference workpiece in the assembly-robot coordinate system:
4.2.1) Move the reference workpiece under the station of the assembly robot, adjust the robot end-effector mounting plate to be parallel to the reference workpiece, and record the end-effector rotation angle at that moment.
4.2.2) Obtain the coordinate value of the dot-matrix origin in the robot coordinate system as the translation from the visual coordinate system to the robot coordinate system.
4.2.3) Measure the angle between any row line of the dot matrix and the robot abscissa axis, and the angle between any column line of the dot matrix and the robot ordinate axis; average them to obtain the deflection angle between the visual and robot coordinate systems.
4.2.4) The conversion formula from the visual coordinate system to the robot coordinate system is:
x_1^r = x_r + L·cos(θ_1 + θ_L) − L·cos(θ_1 + θ_L + θ_0)
y_1^r = y_r + L·sin(θ_1 + θ_L) − L·sin(θ_1 + θ_L + θ_0)   (4)
where L is the distance from the robot rotation-axis point to the alignment point on the end-effector mounting plate;
θ_L is the angle between the visual-coordinate-system abscissa and the line joining the end-effector rotation-axis point and the alignment point on the mounting plate;
θ_1 is the deflection angle between the visual and robot coordinate systems;
θ_0 is the end-effector rotation angle.
5) The robot picks up the workpiece according to the workpiece coordinates obtained in the robot coordinate system.
The image coordinate system above is the coordinate system whose origin is the upper-left corner of the captured image, whose abscissa axis runs along the horizontal direction of the image, and whose ordinate axis runs along the vertical direction of the image.
The invention has the following advantages:
1. The present invention converts the visual coordinate system into the robot coordinate system, so the robot positions accurately when performing assembly-line work over a large range.
2. The present invention provides vision calibration methods for two kinds of robot process, and is applicable to vision-robot applications of various uses such as arc welding, carrying, palletizing, boxing and assembly.
Brief description of the drawings
Fig. 1 is a front view of the reference workpiece;
Fig. 2 is a schematic diagram of the dot matrix on the calibration plate;
Fig. 3 shows the relation between the punching-robot coordinate system and the visual coordinate system;
Fig. 4 shows the relation between the assembly-robot coordinate system and the visual coordinate system.
Detailed description of the invention
The present invention proposes a robot hand-eye localization method with a large localization range and high localization precision; the technical scheme is described below with reference to the accompanying drawings.
Because the camera shooting position differs from the robot operating position, the workpiece travels a certain distance along the line after vision localization before the robot operates on it; before the robot works, the visual coordinate system must therefore be converted into the robot coordinate system.
Step 1) Establish the visual coordinate system:
Step 1.1) As shown in Fig. 1, choose any two points on the reference workpiece as shooting points; the remaining points on the reference workpiece are calculation points. Denote the two shooting points B1 and B2.
Step 1.2) Adjust the position of the reference workpiece on the line so that the projection of the line connecting the two shooting points (B1 and B2) in the image coordinate system is parallel to the ordinate axis of the image coordinate system.
Here the image coordinate system is the coordinate system (X, Y) whose origin O is the upper-left corner of the captured image, whose abscissa axis X runs along the horizontal direction of the image, and whose ordinate axis Y runs along the vertical direction of the image.
Step 1.3) Attach a calibration plate to the reference workpiece (see Fig. 2). The calibration plate carries a dot matrix (OABC); ensure that the line through each row of dots on the calibration plate is parallel to the X axis of the image coordinate system and the line through each column of dots is parallel to the Y axis. Set any dot of the matrix as the origin; the line through the dots in the horizontal direction of the origin is the abscissa axis, and the line through the dots in the vertical direction is the ordinate axis, establishing the visual coordinate system (x_v, y_v).
As shown in Fig. 2, point O is taken as the origin of the visual coordinate system; the O–A line is the ordinate axis and the O–B line is the abscissa axis.
Step 2) Calculate the coordinates of each point in the visual coordinate system:
Step 2.1) Calculate the actual physical size coefficients from the ratio of the actual distance between any two dots of the matrix on the calibration plate to the difference of their photographed pixel values:
K_x = L/M_x,  K_y = L/M_y
where L is the actual distance between any two dots of the matrix, a fixed value;
M_x is the difference of the photographed pixel values of the two dots in the abscissa direction, and M_y the difference in the ordinate direction;
K_x is the actual physical size coefficient in the abscissa direction, and K_y that in the ordinate direction.
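The scale computation of step 2.1) can be sketched as follows. This is a minimal illustration only; the function name, point values and units (millimetres, pixels) are assumptions, not part of the patent.

```python
# Hypothetical sketch of step 2.1): derive the physical-size coefficients
# K_x = L/M_x and K_y = L/M_y from two imaged calibration dots whose real
# spacing L is known.

def size_coefficients(L, dot_a_px, dot_b_px):
    """Return (K_x, K_y) in mm per pixel for the abscissa and ordinate."""
    M_x = abs(dot_b_px[0] - dot_a_px[0])  # pixel difference along the abscissa
    M_y = abs(dot_b_px[1] - dot_a_px[1])  # pixel difference along the ordinate
    return L / M_x, L / M_y

# Dots 10 mm apart on the plate, imaged 40 px apart on both axes:
K_x, K_y = size_coefficients(10.0, (100.0, 200.0), (140.0, 240.0))
print(K_x, K_y)  # 0.25 0.25
```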
Step 2.2) From the photographed pixel values of point B1 and of the visual-coordinate-system origin, calculate the coordinates of B1 in the visual coordinate system:
x_B1^v = K_x·(x_B1^p − x_0^p),  y_B1^v = −K_y·(y_B1^p − y_0^p)   (1)
where (x_B1^p, y_B1^p) are the pixel values of point B1, (x_B1^v, y_B1^v) are the coordinates of B1 in the visual coordinate system, and (x_0^p, y_0^p) are the pixel values of the visual-coordinate-system origin.
Step 2.3) From the actual structural dimensions of the reference workpiece, namely the position (dx, dy) of each remaining point relative to B1 and the angle θ of each remaining point relative to the B1–B2 line, calculate the coordinates of every point on the reference workpiece in the visual coordinate system:
X_v = dx·cosθ − dy·sinθ + x_B1^v,  Y_v = dx·sinθ + dy·cosθ + y_B1^v   (2)
where (dx, dy) is the position of a remaining point relative to B1;
θ is the angle between the line joining the remaining point to B1 and the B1–B2 line.
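Formulas (1) and (2) of steps 2.2) and 2.3) can be combined into a short sketch. Function names and sample values are assumed; angles are in radians.

```python
import math

def b1_visual(K_x, K_y, b1_px, origin_px):
    """Formula (1): visual coordinates of B1 from pixel values; the sign flip
    on y accounts for the image ordinate pointing downward."""
    x_v = K_x * (b1_px[0] - origin_px[0])
    y_v = -K_y * (b1_px[1] - origin_px[1])
    return x_v, y_v

def workpiece_point_visual(dx, dy, theta, b1_v):
    """Formula (2): rotate the offset (dx, dy) of a calculation point by the
    angle theta to the B1-B2 line, then translate by B1's visual coordinates."""
    X_v = dx * math.cos(theta) - dy * math.sin(theta) + b1_v[0]
    Y_v = dx * math.sin(theta) + dy * math.cos(theta) + b1_v[1]
    return X_v, Y_v

b1 = b1_visual(0.25, 0.25, (140.0, 240.0), (100.0, 200.0))
print(b1)                                          # (10.0, -10.0)
print(workpiece_point_visual(8.0, 0.0, 0.0, b1))   # (18.0, -10.0)
```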
Step 3) After completing step 2), move the reference workpiece together with the calibration plate under the robot station.
Step 4) Convert the visual coordinate system to the robot coordinate system:
In general there are two types of robot: punching robots, which complete screw-hole operations, and assembly robots, which complete installation work. A punching robot needs only coordinates, while an assembly robot needs, in addition to coordinates, the rotation angle of its end-effector mounting plate.
Therefore, when completing step 4), first judge the type of robot and carry out the conversion from the visual coordinate system to the robot coordinate system for the corresponding case:
judge whether the robot is a punching robot or an assembly robot; if a punching robot, perform step 4.1); if an assembly robot, perform step 4.2).
Step 4.1) Obtain the coordinates of every point on the reference workpiece in the punching-robot coordinate system as follows:
Because a cylinder and a screw-fastening machine are installed at the end of the punching robot, the robot coordinate plane is not parallel to the plane of the reference workpiece.
Step 4.1.1) Move the reference workpiece under the station of the punching robot and obtain the coordinate value of the dot-matrix origin of step 1.3) (point O in Fig. 2) in the robot coordinate system; this value is the translation from the visual coordinate system to the robot coordinate system.
Step 4.1.2) Calculate the deflection angle between the ordinate axes of the visual and robot coordinate systems and the deflection angle between their abscissa axes:
Step 4.1.2.1) Take at least two columns of dots in the matrix, measure the angle between each column line and the robot ordinate axis, and average the angles to obtain the deflection angle between the ordinate axes.
Step 4.1.2.2) Take at least two rows of dots in the matrix, measure the angle between each row line and the robot abscissa axis, and average the angles to obtain the deflection angle between the abscissa axes.
As shown in Fig. 3 and Fig. 4: first, the robot moves to obtain the coordinates of points A, B and C of the vision calibration plate, then
a) calculate the mean deflection angle θ_y of the two lines OA and BC from the y axis in the robot coordinate system (θ_y is positive when deflected to the left of the y axis, negative to the right);
b) calculate the mean deflection angle θ_x of the two lines OB and AC from the x axis in the robot coordinate system (θ_x is positive when deflected below the x axis, negative above).
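Under the corner naming of Fig. 2 (O the origin, O–A along the ordinate, O–B along the abscissa, C the opposite corner), the averaging in a) and b) might be sketched as follows; the sign convention and function names are assumptions.

```python
import math

def deflection_angles(O, A, B, C):
    """Mean deflection of the calibration grid in the robot frame:
    theta_y from the column lines OA and BC against the robot y axis,
    theta_x from the row lines OB and AC against the robot x axis."""
    def from_y(p, q):  # signed angle of the line p->q measured from the y axis
        return math.atan2(q[0] - p[0], q[1] - p[1])
    def from_x(p, q):  # signed angle of the line p->q measured from the x axis
        return math.atan2(q[1] - p[1], q[0] - p[0])
    theta_y = (from_y(O, A) + from_y(B, C)) / 2.0
    theta_x = (from_x(O, B) + from_x(A, C)) / 2.0
    return theta_x, theta_y

# A perfectly aligned 10 mm grid gives zero deflection on both axes:
print(deflection_angles((0, 0), (0, 10), (10, 0), (10, 10)))  # (0.0, 0.0)
```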
Step 4.1.3) From the actual lengths of the dot-matrix connecting lines and their projection relation to the robot coordinate system, obtain the ratios of the line lengths in the robot coordinate system to the actual line lengths.
Specifically: because a projection relation exists, each projected length carries a proportionality coefficient. From OABC obtain the segment length L'_x along the robot abscissa and L'_y along the robot ordinate, together with the actual transverse length L_x and longitudinal length L_y of OABC, and calculate the abscissa and ordinate proportionality coefficients respectively.
Step 4.1.4) After completing steps 4.1.1) to 4.1.3), the conversion formula from the visual coordinate system to the robot coordinate system is:
x_r = (X_v·cosθ_y − Y_v·sinθ_y)/cosθ_y · L'_x/L_x + x_0^r
y_r = −(Y_v·cosθ_x + X_v·sinθ_x)/cosθ_x · L'_y/L_y + y_0^r   (3)
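A sketch of the punching-robot conversion of step 4.1.4) follows, assuming a transform built from per-axis deflection angles θ_x and θ_y, the length ratios L'_x/L_x and L'_y/L_y, and the origin translation (x_0^r, y_0^r). The exact grouping of trigonometric terms is an assumption, since the printed formula is difficult to recover from this text; all names are hypothetical.

```python
import math

def vision_to_punch_robot(X_v, Y_v, theta_x, theta_y,
                          lx_ratio, ly_ratio, origin_r):
    """Assumed form of formula (3): deflection-corrected vision coordinates,
    scaled by the projection length ratios L'/L and shifted by the
    dot-matrix origin (x_0^r, y_0^r) in the robot frame."""
    x_r = (X_v * math.cos(theta_y) - Y_v * math.sin(theta_y)) \
          / math.cos(theta_y) * lx_ratio + origin_r[0]
    y_r = -(Y_v * math.cos(theta_x) + X_v * math.sin(theta_x)) \
          / math.cos(theta_x) * ly_ratio + origin_r[1]
    return x_r, y_r

# With no deflection and unit ratios the mapping reduces to a translation
# (with the y axis flipped between the two frames):
print(vision_to_punch_robot(5.0, 3.0, 0.0, 0.0, 1.0, 1.0, (100.0, 200.0)))
# (105.0, 197.0)
```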
The assembly robot is a four-joint robot; during work it requires not only coordinate values but also the end rotation angle.
Step 4.2) Obtain the coordinates of every point on the reference workpiece in the assembly-robot coordinate system as follows:
Step 4.2.1) Move the reference workpiece under the station of the assembly robot and adjust the robot end-effector mounting plate to be parallel to the reference workpiece, so that the module to be assembled can be placed accurately on the reference workpiece; record the end rotation angle θ_0 at this moment.
Step 4.2.2) Obtain the coordinate value of the dot-matrix origin of step 1.3) (point O in Fig. 2) in the robot coordinate system; this value is the translation from the visual coordinate system to the robot coordinate system.
Step 4.2.3) Measure the angle between any row line of the dot matrix and the robot abscissa axis, and the angle between any column line of the dot matrix and the robot ordinate axis; average them to obtain the deflection angle between the visual and robot coordinate systems.
Specifically, referring to Fig. 2 and Fig. 4: align the mounting hole on the assembly-robot end mounting plate with the origin O and with points A and B of the dot matrix in turn, obtain the coordinates of the three points in the robot coordinate system, calculate the angle between the O–B line and the X axis and the angle between the O–A line and the Y axis in the robot coordinate system, and average them to obtain θ_1, the rotation angle of the robot coordinate system relative to the visual coordinate system.
Step 4.2.4) After completing steps 4.2.1) to 4.2.3), the conversion formula from the visual coordinate system to the robot coordinate system is:
x_1^r = x_r + L·cos(θ_1 + θ_L) − L·cos(θ_1 + θ_L + θ_0)
y_1^r = y_r + L·sin(θ_1 + θ_L) − L·sin(θ_1 + θ_L + θ_0)   (4)
where L is the distance from the end rotation-axis point to the alignment point on the mounting plate;
θ_L is the angle between the visual-coordinate-system abscissa and the line joining the end rotation-axis point and the alignment point on the mounting plate;
θ_1 is the deflection angle between the visual and robot coordinate systems;
θ_0 is the end rotation angle.
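Formula (4) can be sketched as follows (function name and sample values assumed; angles in radians). The second term of the y component is taken as a sine, on the assumption that the cosine printed there is a misprint, since the correction describes the circular motion of the alignment point when the plate rotates by θ_0; with θ_0 = 0 the correction vanishes and the pick point is unchanged.

```python
import math

def assembly_pick_point(x_r, y_r, L, theta_1, theta_L, theta_0):
    """Formula (4): shift the pick point by the displacement of the alignment
    point (at distance L, initial direction theta_1 + theta_L) when the
    end-effector mounting plate rotates by theta_0."""
    a = theta_1 + theta_L
    x_1 = x_r + L * math.cos(a) - L * math.cos(a + theta_0)
    y_1 = y_r + L * math.sin(a) - L * math.sin(a + theta_0)
    return x_1, y_1

print(assembly_pick_point(50.0, 80.0, 10.0, 0.3, 0.2, 0.0))  # (50.0, 80.0)
```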
Step 5) The robot picks up the workpiece according to the workpiece coordinates obtained in the robot coordinate system.

Claims (2)

1. A robot hand-eye localization method, characterized by comprising the following steps:
1) Establish the visual coordinate system:
1.1) Choose any two points on the reference workpiece as shooting points; the remaining points on the reference workpiece are calculation points. Denote the two shooting points B1 and B2.
1.2) Adjust the position of the reference workpiece on the line so that the projection of the B1–B2 line in the image coordinate system is parallel to the ordinate axis of the image coordinate system.
1.3) Attach a calibration plate to the reference workpiece. The calibration plate carries a dot matrix in which the line through each row of dots is parallel to the abscissa axis of the image coordinate system and the line through each column of dots is parallel to the ordinate axis. Set any dot of the matrix as the origin; the line through the dots in the horizontal direction of the origin is the abscissa axis, and the line through the dots in the vertical direction is the ordinate axis, establishing the visual coordinate system (x_v, y_v).
2) Calculate the coordinates of each point in the visual coordinate system:
2.1) Calculate the actual physical size coefficients from the ratio of the actual distance between any two dots of the matrix on the calibration plate to the difference of their photographed pixel values:
K_x = L/M_x,  K_y = L/M_y
where L is the actual distance between any two dots of the matrix, a fixed value;
M_x is the difference of the photographed pixel values of the two dots in the abscissa direction, and M_y the difference in the ordinate direction;
K_x is the actual physical size coefficient in the abscissa direction, and K_y that in the ordinate direction.
2.2) From the photographed pixel values of point B1 and of the visual-coordinate-system origin, calculate the coordinates of B1 in the visual coordinate system:
x_B1^v = K_x·(x_B1^p − x_0^p),  y_B1^v = −K_y·(y_B1^p − y_0^p)   (1)
where (x_B1^p, y_B1^p) are the pixel values of point B1, (x_B1^v, y_B1^v) are the coordinates of B1 in the visual coordinate system, and (x_0^p, y_0^p) are the pixel values of the visual-coordinate-system origin.
2.3) From the actual structural dimensions of the reference workpiece, namely the position (dx, dy) of each remaining point relative to B1 and the angle θ of each remaining point relative to the B1–B2 line, calculate the coordinates of every point on the reference workpiece in the visual coordinate system:
X_v = dx·cosθ − dy·sinθ + x_B1^v,  Y_v = dx·sinθ + dy·cosθ + y_B1^v   (2)
where (dx, dy) is the position of a remaining point relative to B1;
θ is the angle between the line joining the remaining point to B1 and the B1–B2 line.
3) Move the reference workpiece together with the calibration plate under the robot station.
4) Convert the visual coordinate system to the robot coordinate system:
Judge whether the robot is a punching robot or an assembly robot; if a punching robot, perform step 4.1); if an assembly robot, perform step 4.2).
4.1) Obtain the coordinates of every point on the reference workpiece in the punching-robot coordinate system:
4.1.1) Move the reference workpiece under the station of the punching robot and obtain the coordinate value of the dot-matrix origin in the robot coordinate system; this value is the translation from the visual coordinate system to the robot coordinate system.
4.1.2) Calculate the deflection angle between the ordinate axes of the visual and robot coordinate systems and the deflection angle between their abscissa axes:
4.1.2.1) Take at least two columns of dots in the matrix, measure the angle between each column line and the robot ordinate axis, and average the angles to obtain the deflection angle between the ordinate axes.
4.1.2.2) Take at least two rows of dots in the matrix, measure the angle between each row line and the robot abscissa axis, and average the angles to obtain the deflection angle between the abscissa axes.
4.1.3) From the actual lengths of the dot-matrix connecting lines and their projection relation to the robot coordinate system, obtain the ratios L'_x/L_x and L'_y/L_y of the line lengths in the robot coordinate system to the actual line lengths.
4.1.4) The conversion formula from the visual coordinate system to the robot coordinate system is:
x_r = (X_v·cosθ_y − Y_v·sinθ_y)/cosθ_y · L'_x/L_x + x_0^r
y_r = −(Y_v·cosθ_x + X_v·sinθ_x)/cosθ_x · L'_y/L_y + y_0^r   (3)
where (x_0^r, y_0^r) is the coordinate value of the dot-matrix origin in the robot coordinate system;
L_x, L_y are the actual lengths of the dot-matrix connecting lines;
L'_x, L'_y are the lengths of the dot-matrix connecting lines in the robot coordinate system;
θ_x is the deflection angle between the abscissa axes of the visual and robot coordinate systems;
θ_y is the deflection angle between the ordinate axes of the visual and robot coordinate systems.
4.2) Obtain the coordinates of every point on the reference workpiece in the assembly-robot coordinate system:
4.2.1) Move the reference workpiece under the station of the assembly robot, adjust the robot end-effector mounting plate to be parallel to the reference workpiece, and record the end-effector rotation angle at that moment.
4.2.2) Obtain the coordinate value of the dot-matrix origin in the robot coordinate system as the translation from the visual coordinate system to the robot coordinate system.
4.2.3) Measure the angle between any row line of the dot matrix and the robot abscissa axis, and the angle between any column line of the dot matrix and the robot ordinate axis; average them to obtain the deflection angle between the visual and robot coordinate systems.
4.2.4) The conversion formula from the visual coordinate system to the robot coordinate system is:
x_1^r = x_r + L·cos(θ_1 + θ_L) − L·cos(θ_1 + θ_L + θ_0)
y_1^r = y_r + L·sin(θ_1 + θ_L) − L·sin(θ_1 + θ_L + θ_0)   (4)
where L is the distance from the robot rotation-axis point to the alignment point on the end-effector mounting plate;
θ_L is the angle between the visual-coordinate-system abscissa and the line joining the end-effector rotation-axis point and the alignment point on the mounting plate;
θ_1 is the deflection angle between the visual and robot coordinate systems;
θ_0 is the end-effector rotation angle.
5) The robot picks up the workpiece according to the workpiece coordinates obtained in the robot coordinate system.
2. The robot hand-eye localization method according to claim 1, characterized in that the image coordinate system is the coordinate system whose origin is the upper-left corner of the captured image, whose abscissa axis runs along the horizontal direction of the image, and whose ordinate axis runs along the vertical direction of the image.
CN201410477611.5A 2014-09-18 2014-09-18 A kind of Robot Hand-eye localization method Active CN104260112B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410477611.5A CN104260112B (en) 2014-09-18 2014-09-18 A kind of Robot Hand-eye localization method


Publications (2)

Publication Number Publication Date
CN104260112A true CN104260112A (en) 2015-01-07
CN104260112B CN104260112B (en) 2016-05-18

Family

ID=52151756


Country Status (1)

Country Link
CN (1) CN104260112B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60252913A (en) * 1984-05-30 1985-12-13 Mitsubishi Electric Corp Robot controller
JPH0299802A (en) * 1988-10-07 1990-04-11 Fanuc Ltd Setting method of coordinate system in visual sensor using hand eye
CN1586833A (en) * 2004-07-15 2005-03-02 上海交通大学 Single eye visual sensor for welding robot and its hand-eye relation quick marking method
CN101186038A (en) * 2007-12-07 2008-05-28 北京航空航天大学 Method for demarcating robot stretching hand and eye
CN101630409A (en) * 2009-08-17 2010-01-20 北京航空航天大学 Hand-eye vision calibration method for robot hole boring system
US20110320039A1 (en) * 2010-06-25 2011-12-29 Hon Hai Precision Industry Co., Ltd. Robot calibration system and calibrating method thereof
CN103170973A (en) * 2013-03-28 2013-06-26 上海理工大学 Man-machine cooperation device and method based on Kinect video camera

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107431788A (en) * 2015-02-18 2017-12-01 西门子医疗保健诊断公司 The alignment of the pallet based on image and tube seat positioning in vision system
CN107431788B (en) * 2015-02-18 2020-12-08 西门子医疗保健诊断公司 Method and system for image-based tray alignment and tube slot positioning in a vision system
US10725060B2 (en) 2015-02-18 2020-07-28 Siemens Healthcare Diagnostics Inc. Image-based tray alignment and tube slot localization in a vision system
CN106217372A (en) * 2015-06-02 2016-12-14 精工爱普生株式会社 Robot, robot controller and robot system
CN108025435A (en) * 2015-07-17 2018-05-11 艾沛克斯品牌公司 With the vision system calibrated automatically
CN106767393A (en) * 2015-11-20 2017-05-31 沈阳新松机器人自动化股份有限公司 The hand and eye calibrating apparatus and method of robot
CN106767393B (en) * 2015-11-20 2020-01-03 沈阳新松机器人自动化股份有限公司 Hand-eye calibration device and method for robot
CN105619411A (en) * 2016-03-22 2016-06-01 中国船舶重工集团公司第七一六研究所 Stacking method for six-axis industrial robot
CN105965495A (en) * 2016-05-12 2016-09-28 英华达(上海)科技有限公司 Mechanical arm positioning method and system
CN106054874A (en) * 2016-05-19 2016-10-26 歌尔股份有限公司 Visual positioning calibrating method and device, and robot
CN106054874B (en) * 2016-05-19 2019-04-26 歌尔股份有限公司 Vision positioning scaling method, device and robot
CN108078628A (en) * 2016-12-02 2018-05-29 王健 The robot space-location method of view-based access control model error compensation
CN107322627A (en) * 2017-06-29 2017-11-07 安徽新兴翼凌机电发展有限公司 A kind of industrial robot sucked type instrument hand positioner
CN109693235A (en) * 2017-10-23 2019-04-30 中国科学院沈阳自动化研究所 A kind of Prosthetic Hand vision tracking device and its control method
CN108326850A (en) * 2018-01-10 2018-07-27 温州大学 A kind of accurate mobile mechanical arm of robot reaches the method and system of designated position
CN108772824A (en) * 2018-06-06 2018-11-09 深圳市恒晨电器有限公司 A kind of screw machine hand teaching alignment method
CN109648554B (en) * 2018-12-14 2019-08-30 佛山市奇创智能科技有限公司 Robot calibration method, device and system
CN109648554A (en) * 2018-12-14 2019-04-19 佛山市奇创智能科技有限公司 Robot calibration method, device and system
CN110465944A (en) * 2019-08-09 2019-11-19 琦星智能科技股份有限公司 Calculation method based on the industrial robot coordinate under plane visual
CN110465944B (en) * 2019-08-09 2021-03-16 琦星智能科技股份有限公司 Method for calculating coordinates of industrial robot based on plane vision

Also Published As

Publication number Publication date
CN104260112B (en) 2016-05-18

Similar Documents

Publication Publication Date Title
US9782899B2 (en) Calibration method for coordinate system of robot manipulator
CN103353758B (en) A kind of Indoor Robot navigation method
CN103162622B (en) The Portable ball target of single camera vision system and use thereof and measuring method thereof
US7899577B2 (en) Measuring system and calibration method
CN100476345C (en) Method for measuring geometric parameters of spatial circle based on technique of binocular stereoscopic vision
EP2981397B1 (en) A robot system and method for calibration
JP3946711B2 (en) Robot system
CN102485441B (en) Robot positioning method and calibration method
CN103425100B (en) The direct teaching control method of robot based on equalising torque
CN107214703B (en) Robot self-calibration method based on vision-assisted positioning
CN105458483B (en) Postwelding weld joint tracking robot automatic deviation correction and ultrasonic impact system
CN104200086A (en) Wide-baseline visible light camera pose estimation method
CN102107374B (en) On-line detection method for diameter size of shaft disc type part
CN104972362A (en) Intelligent force control robot grinding system and method
CN104786226A (en) Posture and moving track positioning system and method of robot grabbing online workpiece
CN106338245B (en) A kind of non-contact traverse measurement method of workpiece
CN104759945A (en) Mobile hole-making robot standard alignment method based on high precision industrial camera
US20170054954A1 (en) System and method for visually displaying information on real objects
CN101532827B (en) Deviation correction method for measurement of rail wear based on laser vision
US9043024B2 (en) Vision correction method for tool center point of a robot manipulator
KR20160010868A (en) Automated machining head with vision and procedure
CN105300375A (en) Robot indoor positioning and navigation method based on single vision
DE102011015987A1 (en) System and method for visual presentation of information on real objects
CN103101060A (en) Sensing calibration method for robot tool center point
EP3088843B1 (en) System and method for aligning a coordinated movement machine reference frame with a measurement system reference frame

Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
GR01 Patent grant
C14 Grant of patent or utility model