CN104260112A — Robot hand-and-eye locating method
Publication number: CN104260112A (application CN201410477611.5A)
Authority: CN (China)
Abstract
The invention belongs to the technical field of mechanical automation and relates in particular to a robot hand-eye locating method. The method comprises: sticking a calibration plate onto a reference workpiece; establishing a vision coordinate system; calculating the coordinates of all points on the reference workpiece in the vision coordinate system; moving the reference workpiece together with the calibration plate under the robot station; and obtaining the coordinates of all points on the reference workpiece in the robot coordinate system through a conversion between the vision coordinate system and the robot coordinate system. The method is suited to robot production-line work that requires a large locating range and high locating precision.
Description
Technical field
The invention belongs to the technical field of mechanical automation and relates to a locating method, specifically to a robot hand-eye locating method.
Background technology
With the continuous progress of science and technology, industrial automation keeps rising, and robots play an essential role in improving productivity, guaranteeing product quality, and lowering production cost. As robots become common in production, how a robot accurately obtains the position of a workpiece during operation is very important.
The traditional practice, when mounting a robot on a production line, is to rely on the robot's installation accuracy and on the placement accuracy of the workpiece on the line to control workpiece locating during operation. The drawback of this practice is apparent: both the installation accuracy of the robot and the placement accuracy of the workpiece are controlled manually, so the robot frequently locates the workpiece inaccurately during operation and cannot finish its work efficiently.
To address this problem, a technique has been developed in which a robot paired with a vision system replaces manual workpiece locating; this has become a classic arrangement on production lines.
The robot-plus-vision arrangement currently on the market installs the vision system on the working robot itself: during operation the vision system locates the workpiece directly and then directly controls the robot. This arrangement, however, is only suitable for production lines with a small locating range.
Therefore, the urgent problem at present is how to locate accurately on production lines with a large range.
Summary of the invention
To solve the problem described in the background, the present invention proposes a robot hand-eye locating method suited to robot production-line work with a large locating range and high locating precision.
The concrete technical scheme of the present invention is:
1. A robot hand-eye locating method, characterized by comprising the following steps:
1) Establish the vision coordinate system:
1.1) Choose any two points on the reference workpiece as shooting points, the remaining points on the reference workpiece being calculation points; denote the two points B1 and B2.
1.2) Adjust the position of the reference workpiece on the production line so that the projection of the line B1B2 in the image coordinate system is parallel to the ordinate axis of the image coordinate system.
1.3) Stick a calibration plate onto the reference workpiece. The calibration plate carries a dot matrix in which the line of every row of dots is parallel to the abscissa axis of the image coordinate system and the line of every column of dots is parallel to the ordinate axis. Take any dot of the matrix as the origin; the line of dots in the horizontal direction through the origin is the abscissa axis and the line of dots in the vertical direction is the ordinate axis, establishing the vision coordinate system (x^v, y^v).
2) Calculate the coordinate of each point in the vision coordinate system:
2.1) Calculate the actual physical size coefficients from the ratio of the actual distance between any two dots of the matrix on the calibration plate to the difference of their captured pixel values. The concrete formulas are:
K_x = L / M_x,  K_y = L / M_y
wherein the actual distance between the two dots of the matrix is a fixed value, denoted L; M_x is the difference of the captured pixel values of the two dots in the abscissa direction and M_y the difference in the ordinate direction; K_x is the actual physical size coefficient in the abscissa direction and K_y the coefficient in the ordinate direction.
2.2) From the captured pixel value of point B1 and the pixel value of the vision-frame origin, calculate the coordinate of B1 in the vision coordinate system. Writing (u_1, v_1) for the pixel value of B1, (u_0, v_0) for the pixel value of the origin, and (x_1^v, y_1^v) for the coordinate of B1 in the vision frame (notation reconstructed; the original formula is reproduced only as an image), the concrete formula is:
x_1^v = K_x (u_1 − u_0),  y_1^v = K_y (v_1 − v_0)
2.3) From the actual structural dimensions of the reference workpiece — that is, the position (dx, dy) of each remaining point relative to B1, and the angle θ between the line joining that point to B1 and the line B1B2 — calculate the coordinates of all points on the reference workpiece in the vision coordinate system. Since B1B2 is parallel to the ordinate axis, with r = sqrt(dx² + dy²), a formula consistent with these quantities (the original is reproduced only as an image) is:
x^v = x_1^v + r sin θ,  y^v = y_1^v + r cos θ
3) Move the reference workpiece together with the calibration plate under the robot station.
4) Convert the vision coordinate system to the robot coordinate system:
Judge whether the robot is a punching robot or an assembly robot; if a punching robot, perform step 4.1); if an assembly robot, perform step 4.2).
4.1) Obtain the coordinates of all points on the reference workpiece in the punching-robot coordinate system:
4.1.1) Move the reference workpiece under the station of the punching robot and obtain the coordinate value of the dot-matrix origin in the robot coordinate system; use this coordinate value as the translation of the vision coordinate system relative to the robot coordinate system.
4.1.2) Calculate the deflection angle between the ordinate axes of the vision and robot coordinate systems and the deflection angle between their abscissa axes:
4.1.2.1) Take at least two columns of dots of the matrix, take the angle of each column line against the robot-frame ordinate axis, and obtain the deflection angle of the vision-frame ordinate axis from the robot-frame ordinate axis by summing and averaging.
4.1.2.2) Take at least two rows of dots of the matrix, take the angle of each row line against the robot-frame abscissa axis, and obtain the deflection angle of the vision-frame abscissa axis from the robot-frame abscissa axis by summing and averaging.
4.1.3) From the physical length of the dot-matrix lines on the calibration plate and the projection relation with the robot coordinate system, obtain the ratio of the physical length of a dot line to its length in the robot coordinate system.
4.1.4) The conversion formula from the vision coordinate system to the robot coordinate system (a form reconstructed from the listed quantities; the original formula is reproduced only as an image) is:
x^r = x_0^r + (L'_x / L_x) x^v cos θ_x − (L'_y / L_y) y^v sin θ_y
y^r = y_0^r + (L'_x / L_x) x^v sin θ_x + (L'_y / L_y) y^v cos θ_y
wherein (x_0^r, y_0^r) is the coordinate value of the dot-matrix origin in the robot coordinate system; L_x, L_y are the physical lengths of the dot-matrix lines; L'_x, L'_y are the lengths of the dot-matrix lines in the robot coordinate system; θ_x is the deflection angle between the abscissa axes of the vision and robot coordinate systems; θ_y is the deflection angle between their ordinate axes.
4.2) Obtain the coordinates of all points on the reference workpiece in the assembly-robot coordinate system:
4.2.1) Move the reference workpiece under the station of the assembly robot, adjust the robot-end mounting plate until it is parallel to the reference workpiece, and record the rotation angle of the robot end at that moment.
4.2.2) Obtain the coordinate value of the dot-matrix origin in the robot coordinate system as the translation of the vision coordinate system relative to the robot coordinate system.
4.2.3) Take the angle between the dot line of any row of the matrix and the abscissa axis of the robot coordinate system, and the angle between the dot line of any column and the ordinate axis of the robot coordinate system; compute the deflection angle between the vision and robot coordinate systems by summing and averaging.
4.2.4) The conversion formula from the vision coordinate system to the robot coordinate system (a form reconstructed from the listed quantities; the original formula is reproduced only as an image) is:
x^r = x_0^r + x^v cos θ_1 − y^v sin θ_1 + L cos(θ_l + θ_0)
y^r = y_0^r + x^v sin θ_1 + y^v cos θ_1 + L sin(θ_l + θ_0)
wherein L is the distance from the robot rotation-axis point to the alignment point on the robot-end mounting plate; θ_l is the angle between the line joining the robot-end rotation-axis point to the alignment point on the mounting plate and the vision-frame abscissa; θ_1 is the deflection angle between the vision and robot coordinate systems; θ_0 is the rotation angle of the robot end.
5) The robot picks up the workpiece according to the workpiece coordinates obtained in the robot coordinate system.
The image coordinate system above is the coordinate system established with the upper-left corner of the captured image as the origin, the horizontal direction of the image as the abscissa axis, and the vertical direction of the image as the ordinate axis.
The advantages of the invention are:
1. The invention converts the vision coordinate system into the robot coordinate system, so the robot locates accurately when working over a large range on a production line.
2. The invention analyzes hand-eye calibration for two kinds of process robots and is suited to vision-robot applications of various uses such as arc welding, handling, palletizing, boxing, and assembly.
Description of the drawings
Fig. 1 is the front view of the reference workpiece;
Fig. 2 is the schematic diagram of the dot matrix on the calibration plate;
Fig. 3 is the relation diagram of the punching-robot coordinate system and the vision coordinate system;
Fig. 4 is the relation diagram of the assembly-robot coordinate system and the vision coordinate system.
Detailed description of the invention
The present invention proposes a robot hand-eye locating method with a large locating range and high locating precision; the technical scheme is described below with reference to the drawings.
Because the camera's shooting position differs from the robot's operating position, the workpiece is transmitted a certain distance along the production line after the vision system finishes shooting and locating before the robot operates on it; therefore, before the robot works, the conversion between the vision coordinate system and the robot coordinate system must be carried out.
Step 1) Establish the vision coordinate system:
Step 1.1) As shown in Fig. 1, choose any two points on the reference workpiece as shooting points, the remaining points being calculation points; denote the two points B1 and B2.
Step 1.2) Adjust the position of the reference workpiece on the production line so that the projection of the line between the two shooting points (B1 and B2) in the image coordinate system is parallel to the ordinate axis of the image coordinate system.
Here the image coordinate system is the coordinate system (X, Y) established with the upper-left corner of the captured image as the origin O, the horizontal direction of the image as the abscissa axis X, and the vertical direction of the image as the ordinate axis Y.
Step 1.3) Stick a calibration plate onto the reference workpiece (see Fig. 2). The calibration plate carries a dot matrix (OABC). Ensure that the line of every row of dots on the plate is parallel to the X axis of the image coordinate system and the line of every column of dots is parallel to the Y axis. Take any dot of the matrix as the origin; the line of dots in the horizontal direction through the origin is the abscissa axis and the line of dots in the vertical direction is the ordinate axis, establishing the vision coordinate system (x^v, y^v).
As shown in Fig. 2, taking point O as the origin of the vision coordinate system, the line of O and A is the ordinate axis and the line of O and B is the abscissa axis.
Step 2) Calculate the coordinate of each point in the vision coordinate system:
Step 2.1) Calculate the actual physical size coefficients from the ratio of the actual distance between any two dots of the matrix on the calibration plate to the difference of their captured pixel values. The concrete formulas are:
K_x = L / M_x,  K_y = L / M_y
wherein the actual distance between the two dots of the matrix is a fixed value, denoted L; M_x is the difference of the captured pixel values of the two dots in the abscissa direction and M_y the difference in the ordinate direction; K_x is the actual physical size coefficient in the abscissa direction and K_y the coefficient in the ordinate direction.
Step 2.2) From the captured pixel value of point B1 and the pixel value of the vision-frame origin, calculate the coordinate of B1 in the vision coordinate system. Writing (u_1, v_1) for the pixel value of B1, (u_0, v_0) for the pixel value of the origin, and (x_1^v, y_1^v) for the coordinate of B1 in the vision frame (notation reconstructed; the original formula is reproduced only as an image), the concrete formula is:
x_1^v = K_x (u_1 − u_0),  y_1^v = K_y (v_1 − v_0)
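Steps 2.1 and 2.2 can be sketched as follows; this is a minimal illustration, and the function names and sample pixel values are illustrative only, not from the patent:

```python
def scale_coefficient(L, p_a, p_b, axis):
    """Actual physical size coefficient (step 2.1): the known distance L
    between two dots of the calibration-plate matrix divided by the
    difference of their captured pixel values along `axis`
    (0 = abscissa, 1 = ordinate); use a dot pair aligned with that axis."""
    return L / abs(p_b[axis] - p_a[axis])

def vision_coords(pixel, origin_pixel, k_x, k_y):
    """Vision-frame coordinate of a shot point (step 2.2): scale the pixel
    offset from the vision-frame origin by the size coefficients."""
    return (k_x * (pixel[0] - origin_pixel[0]),
            k_y * (pixel[1] - origin_pixel[1]))

# Illustrative values: dots 10 mm apart, vision-frame origin at pixel (100, 100).
k_x = scale_coefficient(10.0, (100.0, 100.0), (150.0, 100.0), 0)  # 10 / 50
k_y = scale_coefficient(10.0, (100.0, 100.0), (100.0, 140.0), 1)  # 10 / 40
b1_v = vision_coords((320.0, 240.0), (100.0, 100.0), k_x, k_y)
```

With these sample values the coefficients are 0.2 mm/px and 0.25 mm/px, placing B1 at (44, 35) in the vision frame.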
Step 2.3) From the actual structural dimensions of the reference workpiece — that is, the position (dx, dy) of each remaining point relative to B1, and the angle θ between the line joining that point to B1 and the line B1B2 — calculate the coordinates of all points on the reference workpiece in the vision coordinate system. Since B1B2 is parallel to the ordinate axis, with r = sqrt(dx² + dy²), a formula consistent with these quantities (the original is reproduced only as an image) is:
x^v = x_1^v + r sin θ,  y^v = y_1^v + r cos θ
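Step 2.3 can be sketched as follows. The rotation form is an assumption (the patent's formula appears only as an image); `r` and `theta` are reconstructed names:

```python
import math

def rest_point_vision_coords(b1_v, dx, dy, theta):
    """Vision-frame coordinate of a calculation point (step 2.3): the point
    sits at (dx, dy) relative to B1, i.e. at distance r = hypot(dx, dy), and
    its line to B1 makes angle theta (radians) with line B1B2. Since B1B2 is
    parallel to the ordinate axis, the offset of length r is laid off at
    theta from that axis. Assumed form, not the patent's exact formula."""
    r = math.hypot(dx, dy)
    return (b1_v[0] + r * math.sin(theta), b1_v[1] + r * math.cos(theta))
```

For example, a point 5 mm from B1 along B1B2 itself (theta = 0) lands 5 mm further along the ordinate axis.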
Step 3) After completing step 2), move the reference workpiece together with the calibration plate under the robot station.
Step 4) Convert the vision coordinate system to the robot coordinate system:
In general there are two types of robot: the punching robot, which completes the screw-hole operations, and the assembly robot, which completes installation work. The punching robot needs only coordinates to be provided; the assembly robot needs, besides the coordinates, the rotation angle of the robot-end mounting plate.
Therefore, when performing step 4), the type of the robot must first be judged, and the conversion from the vision coordinate system to the robot coordinate system is carried out in two cases:
Judge whether the robot is a punching robot or an assembly robot; if a punching robot, perform step 4.1); if an assembly robot, perform step 4.2).
Step 4.1) Obtain the coordinates of all points on the reference workpiece in the punching-robot coordinate system as follows:
Because the end of the punching robot carries a cylinder and a screw-fastening device, the robot coordinate plane may not be parallel to the plane of the reference workpiece.
Step 4.1.1) Move the reference workpiece under the station of the punching robot and obtain the coordinate value in the robot coordinate system of the origin of the dot matrix from step 1.3) (point O in Fig. 2); use this coordinate value as the translation of the vision coordinate system relative to the robot coordinate system.
Step 4.1.2) Calculate the deflection angle between the ordinate axes of the vision and robot coordinate systems and the deflection angle between their abscissa axes:
Step 4.1.2.1) Take at least two columns of dots of the matrix, take the angle of each column line against the robot-frame ordinate axis, and obtain the deflection angle of the vision-frame ordinate axis from the robot-frame ordinate axis by summing and averaging.
Step 4.1.2.2) Take at least two rows of dots of the matrix, take the angle of each row line against the robot-frame abscissa axis, and obtain the deflection angle of the vision-frame abscissa axis from the robot-frame abscissa axis by summing and averaging.
As shown in Fig. 3 and Fig. 4: first the robot moves to and obtains the coordinates of calibration-plate points A, B, and C, then:
a) calculate the mean deflection angle θ_y of the two lines OA and BC against the y axis in the robot coordinate system (θ_y is positive when deflected to the left of the y axis, negative to the right);
b) calculate the mean deflection angle θ_x of the two lines OB and AC against the x axis in the robot coordinate system (θ_x is positive when deflected below the x axis, negative above).
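The averaging in a) and b) can be sketched with a hypothetical helper; note that it uses a counterclockwise-positive convention for both angles, which is simpler than, and differs from, the left/below sign convention stated in the text:

```python
import math

def deflection_angles(o, a, b, c):
    """Mean deflection of the calibration square OABC measured in the robot
    frame (steps 4.1.2.1 and 4.1.2.2): theta_y from the column lines OA and
    BC against the robot y axis, theta_x from the row lines OB and AC against
    the robot x axis. Inputs are (x, y) robot-frame coordinates; O-A is a
    column and O-B a row, as in Fig. 2."""
    def angle_to_x(p, q):  # deflection of segment pq from the x axis, CCW-positive
        return math.atan2(q[1] - p[1], q[0] - p[0])
    def angle_to_y(p, q):  # deflection of segment pq from the y axis, CCW-positive
        return math.atan2(p[0] - q[0], q[1] - p[1])
    theta_y = (angle_to_y(o, a) + angle_to_y(b, c)) / 2.0
    theta_x = (angle_to_x(o, b) + angle_to_x(a, c)) / 2.0
    return theta_x, theta_y
```

For an unrotated square both angles are zero; for a square rotated rigidly by a small angle, both averages recover that angle.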
Step 4.1.3) From the physical length of the dot-matrix lines on the calibration plate and the projection relation with the robot coordinate system, obtain the ratio of the physical length of a dot line to its length in the robot coordinate system.
The specific practice is: because a projection relation exists, the projected lengths carry proportionality coefficients. From OABC obtain the segment length L'_x along the robot-frame abscissa axis and the segment length L'_y along the ordinate axis, together with the physical lateral length L_x and longitudinal length L_y of OABC, and calculate the abscissa and ordinate proportionality coefficients respectively.
Step 4.1.4) After completing steps 4.1.1) to 4.1.3), the conversion formula from the vision coordinate system to the robot coordinate system (a form reconstructed from the quantities above; the original formula is reproduced only as an image) is:
x^r = x_0^r + (L'_x / L_x) x^v cos θ_x − (L'_y / L_y) y^v sin θ_y
y^r = y_0^r + (L'_x / L_x) x^v sin θ_x + (L'_y / L_y) y^v cos θ_y
where (x_0^r, y_0^r) is the robot-frame coordinate of the dot-matrix origin obtained in step 4.1.1).
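The punching-robot conversion can be sketched as follows. The affine form is an assumption reconstructed from the quantities the patent lists (translation, per-axis deflection angles, per-axis scale ratios), not the patent's exact formula:

```python
import math

def vision_to_punch_robot(p_v, origin_r, theta_x, theta_y, sx, sy):
    """Assumed affine conversion for the punching robot (step 4.1.4):
    scale each axis by the robot-length / physical-length ratio
    (sx = L'_x / L_x, sy = L'_y / L_y), rotate by the CCW-positive per-axis
    deflection angles theta_x, theta_y, and translate by the robot-frame
    coordinate of the dot-matrix origin."""
    xv, yv = p_v
    xr = origin_r[0] + sx * xv * math.cos(theta_x) - sy * yv * math.sin(theta_y)
    yr = origin_r[1] + sx * xv * math.sin(theta_x) + sy * yv * math.cos(theta_y)
    return xr, yr

# With zero deflection and unit scale ratios the conversion is a pure translation.
p_r = vision_to_punch_robot((10.0, 5.0), (100.0, 200.0), 0.0, 0.0, 1.0, 1.0)
```

A useful sanity check of the design: with equal deflection angles and unit scales the map reduces to an ordinary rotation plus translation.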
The assembly robot is a four-joint robot; during work it needs not only the coordinate values but also the end rotation angle.
Step 4.2) Obtain the coordinates of all points on the reference workpiece in the assembly-robot coordinate system as follows:
Step 4.2.1) Move the reference workpiece under the station of the assembly robot and adjust the robot-end mounting plate until it is parallel to the reference workpiece, so that the module to be assembled can be placed accurately on the reference workpiece; record the robot-end rotation angle θ_0 at that moment.
Step 4.2.2) Obtain the coordinate value in the robot coordinate system of the origin of the dot matrix from step 1.3) (point O in Fig. 2); use this coordinate value as the translation of the vision coordinate system relative to the robot coordinate system.
Step 4.2.3) Take the angle between the dot line of any row of the matrix and the abscissa axis of the robot coordinate system, and the angle between the dot line of any column on the calibration plate and the ordinate axis of the robot coordinate system; compute the deflection angle between the vision and robot coordinate systems by summing and averaging.
The specific practice, with reference to Fig. 2 and Fig. 4, is: align the mounting hole on the assembly robot's end plate to the origin O and to points A and B of the dot matrix in turn, obtain the coordinates of the three points in the robot coordinate system, calculate in the robot frame the angle of line OB against the X axis and the angle of line OA against the Y axis, and take their mean θ_1, which is the rotation angle of the robot coordinate system relative to the vision coordinate system.
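The θ_1 averaging just described can be sketched with a hypothetical helper, assuming CCW-positive angles and that O-B lies along a row and O-A along a column as in Fig. 2:

```python
import math

def assembly_deflection(o_r, a_r, b_r):
    """Step 4.2.3: from the robot-frame coordinates of dots O, A and B,
    average the deflection of line OB from the robot x axis and of line OA
    from the robot y axis to get theta_1, the rotation of the robot frame
    relative to the vision frame."""
    ang_x = math.atan2(b_r[1] - o_r[1], b_r[0] - o_r[0])  # OB vs. the x axis
    ang_y = math.atan2(o_r[0] - a_r[0], a_r[1] - o_r[1])  # OA vs. the y axis
    return (ang_x + ang_y) / 2.0
```

For an unrotated plate the result is zero; for a plate rotated rigidly by some angle, both terms recover that angle, so averaging them suppresses per-measurement noise.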
Step 4.2.4) After completing steps 4.2.1) to 4.2.3), the conversion formula from the vision coordinate system to the robot coordinate system (a form reconstructed from the listed quantities; the original formula is reproduced only as an image) is:
x^r = x_0^r + x^v cos θ_1 − y^v sin θ_1 + L cos(θ_l + θ_0)
y^r = y_0^r + x^v sin θ_1 + y^v cos θ_1 + L sin(θ_l + θ_0)
wherein L is the distance from the robot-end rotation-axis point to the alignment point on the mounting plate; θ_l is the angle between the line joining the robot-end rotation-axis point to the alignment point on the mounting plate and the vision-frame abscissa; θ_1 is the deflection angle between the vision and robot coordinate systems; θ_0 is the rotation angle of the robot end.
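The assembly-robot conversion can be sketched as follows. This is an assumed form (rotation by the frame deflection, translation by the dot-matrix origin, plus an end-plate offset of length L at angle θ_l corrected by the recorded end rotation θ_0); all names are reconstructions, not the patent's:

```python
import math

def vision_to_assembly_robot(p_v, origin_r, theta_1, L_arm, theta_l, theta_0):
    """Assumed conversion for the assembly robot (step 4.2.4): rotate the
    vision-frame point by the frame deflection theta_1 and translate by the
    robot-frame coordinate of the dot-matrix origin, then add the
    alignment-point offset: length L_arm at angle theta_l corrected by the
    recorded end rotation theta_0."""
    xv, yv = p_v
    xr = origin_r[0] + xv * math.cos(theta_1) - yv * math.sin(theta_1)
    yr = origin_r[1] + xv * math.sin(theta_1) + yv * math.cos(theta_1)
    # compensate the tool-flange alignment offset for the end rotation
    xr += L_arm * math.cos(theta_l + theta_0)
    yr += L_arm * math.sin(theta_l + theta_0)
    return xr, yr

# With zero deflection and zero arm offset the conversion is a pure translation.
p_r = vision_to_assembly_robot((10.0, 5.0), (100.0, 200.0), 0.0, 0.0, 0.0, 0.0)
```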
Step 5) The robot picks up the workpiece according to the workpiece coordinates obtained in the robot coordinate system.
Claims (2)
1. A robot hand-eye locating method, characterized by comprising the following steps:
1) Establish the vision coordinate system:
1.1) Choose any two points on the reference workpiece as shooting points, the remaining points on the reference workpiece being calculation points; denote the two points B1 and B2.
1.2) Adjust the position of the reference workpiece on the production line so that the projection of the line B1B2 in the image coordinate system is parallel to the ordinate axis of the image coordinate system.
1.3) Stick a calibration plate onto the reference workpiece. The calibration plate carries a dot matrix in which the line of every row of dots is parallel to the abscissa axis of the image coordinate system and the line of every column of dots is parallel to the ordinate axis. Take any dot of the matrix as the origin; the line of dots in the horizontal direction through the origin is the abscissa axis and the line of dots in the vertical direction is the ordinate axis, establishing the vision coordinate system (x^v, y^v).
2) Calculate the coordinate of each point in the vision coordinate system:
2.1) Calculate the actual physical size coefficients from the ratio of the actual distance between any two dots of the matrix on the calibration plate to the difference of their captured pixel values. The concrete formulas are:
K_x = L / M_x,  K_y = L / M_y
wherein the actual distance between the two dots of the matrix is a fixed value, denoted L; M_x is the difference of the captured pixel values of the two dots in the abscissa direction and M_y the difference in the ordinate direction; K_x is the actual physical size coefficient in the abscissa direction and K_y the coefficient in the ordinate direction.
2.2) From the captured pixel value of point B1 and the pixel value of the vision-frame origin, calculate the coordinate of B1 in the vision coordinate system. Writing (u_1, v_1) for the pixel value of B1, (u_0, v_0) for the pixel value of the origin, and (x_1^v, y_1^v) for the coordinate of B1 in the vision frame (notation reconstructed; the original formula is reproduced only as an image), the concrete formula is:
x_1^v = K_x (u_1 − u_0),  y_1^v = K_y (v_1 − v_0)
2.3) From the actual structural dimensions of the reference workpiece — that is, the position (dx, dy) of each remaining point relative to B1, and the angle θ between the line joining that point to B1 and the line B1B2 — calculate the coordinates of all points on the reference workpiece in the vision coordinate system. Since B1B2 is parallel to the ordinate axis, with r = sqrt(dx² + dy²), a formula consistent with these quantities (the original is reproduced only as an image) is:
x^v = x_1^v + r sin θ,  y^v = y_1^v + r cos θ
3) Move the reference workpiece together with the calibration plate under the robot station.
4) Convert the vision coordinate system to the robot coordinate system:
Judge whether the robot is a punching robot or an assembly robot; if a punching robot, perform step 4.1); if an assembly robot, perform step 4.2).
4.1) Obtain the coordinates of all points on the reference workpiece in the punching-robot coordinate system:
4.1.1) Move the reference workpiece under the station of the punching robot and obtain the coordinate value of the dot-matrix origin in the robot coordinate system; use this coordinate value as the translation of the vision coordinate system relative to the robot coordinate system.
4.1.2) Calculate the deflection angle between the ordinate axes of the vision and robot coordinate systems and the deflection angle between their abscissa axes:
4.1.2.1) Take at least two columns of dots of the matrix, take the angle of each column line against the robot-frame ordinate axis, and obtain the deflection angle of the vision-frame ordinate axis from the robot-frame ordinate axis by summing and averaging.
4.1.2.2) Take at least two rows of dots of the matrix, take the angle of each row line against the robot-frame abscissa axis, and obtain the deflection angle of the vision-frame abscissa axis from the robot-frame abscissa axis by summing and averaging.
4.1.3) From the physical length of the dot-matrix lines on the calibration plate and the projection relation with the robot coordinate system, obtain the ratio of the physical length of a dot line to its length in the robot coordinate system.
4.1.4) The conversion formula from the vision coordinate system to the robot coordinate system (a form reconstructed from the listed quantities; the original formula is reproduced only as an image) is:
x^r = x_0^r + (L'_x / L_x) x^v cos θ_x − (L'_y / L_y) y^v sin θ_y
y^r = y_0^r + (L'_x / L_x) x^v sin θ_x + (L'_y / L_y) y^v cos θ_y
wherein (x_0^r, y_0^r) is the coordinate value of the dot-matrix origin in the robot coordinate system; L_x, L_y are the physical lengths of the dot-matrix lines; L'_x, L'_y are the lengths of the dot-matrix lines in the robot coordinate system; θ_x is the deflection angle between the abscissa axes of the vision and robot coordinate systems; θ_y is the deflection angle between their ordinate axes.
4.2) Obtain the coordinates of all points on the reference workpiece in the assembly-robot coordinate system:
4.2.1) Move the reference workpiece under the station of the assembly robot, adjust the robot-end mounting plate until it is parallel to the reference workpiece, and record the rotation angle of the robot end at that moment.
4.2.2) Obtain the coordinate value of the dot-matrix origin in the robot coordinate system as the translation of the vision coordinate system relative to the robot coordinate system.
4.2.3) Take the angle between the dot line of any row of the matrix and the abscissa axis of the robot coordinate system, and the angle between the dot line of any column and the ordinate axis of the robot coordinate system; compute the deflection angle between the vision and robot coordinate systems by summing and averaging.
4.2.4) The conversion formula from the vision coordinate system to the robot coordinate system (a form reconstructed from the listed quantities; the original formula is reproduced only as an image) is:
x^r = x_0^r + x^v cos θ_1 − y^v sin θ_1 + L cos(θ_l + θ_0)
y^r = y_0^r + x^v sin θ_1 + y^v cos θ_1 + L sin(θ_l + θ_0)
wherein L is the distance from the robot rotation-axis point to the alignment point on the robot-end mounting plate; θ_l is the angle between the line joining the robot-end rotation-axis point to the alignment point on the mounting plate and the vision-frame abscissa; θ_1 is the deflection angle between the vision and robot coordinate systems; θ_0 is the rotation angle of the robot end.
5) The robot picks up the workpiece according to the workpiece coordinates obtained in the robot coordinate system.
2. The robot hand-eye locating method according to claim 1, characterized in that: the image coordinate system is the coordinate system established with the upper-left corner of the captured image as the origin, the horizontal direction of the image as the abscissa axis, and the vertical direction of the image as the ordinate axis.
Priority and publication data
Priority application: CN201410477611.5A, "A kind of Robot Hand-eye localization method", filed 2014-09-18.
Publications: CN104260112A, published 2015-01-07; CN104260112B (granted), published 2016-05-18.
Family
ID=52151756
Family Applications (1)
Application Number  Title  Priority Date  Filing Date 

CN201410477611.5A Active CN104260112B (en)  20140918  20140918  A kind of Robot Handeye localization method 
Country Status (1)
Country  Link 

CN (1)  CN104260112B (en) 
Citations (7)
Publication number  Priority date  Publication date  Assignee  Title 

JPS60252913A (en) *  1984-05-30  1985-12-13  Mitsubishi Electric Corp  Robot controller 
JPH0299802A (en) *  1988-10-07  1990-04-11  Fanuc Ltd  Method for setting a coordinate system in a visual sensor using a hand-eye configuration 
CN1586833A (en) *  2004-07-15  2005-03-02  Shanghai Jiao Tong University  Monocular vision sensor for a welding robot and fast hand-eye calibration method 
CN101186038A (en) *  2007-12-07  2008-05-28  Beihang University  Robot hand-eye calibration method 
CN101630409A (en) *  2009-08-17  2010-01-20  Beihang University  Hand-eye vision calibration method for a robot drilling system 
US20110320039A1 (en) *  2010-06-25  2011-12-29  Hon Hai Precision Industry Co., Ltd.  Robot calibration system and calibrating method thereof 
CN103170973A (en) *  2013-03-28  2013-06-26  University of Shanghai for Science and Technology  Human-machine cooperation device and method based on a Kinect camera 

2014
 2014-09-18  CN application CN201410477611.5A granted as patent CN104260112B (en); status: Active
Cited By (20)
Publication number  Priority date  Publication date  Assignee  Title 

CN107431788A (en) *  2015-02-18  2017-12-01  Siemens Healthcare Diagnostics Inc.  Image-based tray alignment and tube slot positioning in a vision system 
CN107431788B (en) *  2015-02-18  2020-12-08  Siemens Healthcare Diagnostics Inc.  Method and system for image-based tray alignment and tube slot positioning in a vision system 
US10725060B2 (en)  2015-02-18  2020-07-28  Siemens Healthcare Diagnostics Inc.  Image-based tray alignment and tube slot localization in a vision system 
CN106217372A (en) *  2015-06-02  2016-12-14  Seiko Epson Corporation  Robot, robot controller and robot system 
CN108025435A (en) *  2015-07-17  2018-05-11  Apex Brands, Inc.  Vision system with automatic calibration 
CN106767393A (en) *  2015-11-20  2017-05-31  Shenyang SIASUN Robot & Automation Co., Ltd.  Hand-eye calibration device and method for a robot 
CN106767393B (en) *  2015-11-20  2020-01-03  Shenyang SIASUN Robot & Automation Co., Ltd.  Hand-eye calibration device and method for a robot 
CN105619411A (en) *  2016-03-22  2016-06-01  The 716th Research Institute of China Shipbuilding Industry Corporation  Palletizing method for a six-axis industrial robot 
CN105965495A (en) *  2016-05-12  2016-09-28  Inventec Appliances (Shanghai) Co., Ltd.  Mechanical arm positioning method and system 
CN106054874A (en) *  2016-05-19  2016-10-26  Goertek Inc.  Visual positioning calibration method and device, and robot 
CN106054874B (en) *  2016-05-19  2019-04-26  Goertek Inc.  Visual positioning calibration method and device, and robot 
CN108078628A (en) *  2016-12-02  2018-05-29  Wang Jian  Robot spatial positioning method based on visual error compensation 
CN107322627A (en) *  2017-06-29  2017-11-07  Anhui Xinxing Yiling Electromechanical Development Co., Ltd.  Suction-type tool hand positioning device for an industrial robot 
CN109693235A (en) *  2017-10-23  2019-04-30  Shenyang Institute of Automation, Chinese Academy of Sciences  Visual tracking device for a prosthetic hand and control method thereof 
CN108326850A (en) *  2018-01-10  2018-07-27  Wenzhou University  Method and system for accurately moving a robot manipulator to a designated position 
CN108772824A (en) *  2018-06-06  2018-11-09  Shenzhen Hengchen Electrical Appliance Co., Ltd.  Teaching alignment method for a screw-driving manipulator 
CN109648554B (en) *  2018-12-14  2019-08-30  Foshan Qichuang Intelligent Technology Co., Ltd.  Robot calibration method, device and system 
CN109648554A (en) *  2018-12-14  2019-04-19  Foshan Qichuang Intelligent Technology Co., Ltd.  Robot calibration method, device and system 
CN110465944A (en) *  2019-08-09  2019-11-19  Qixing Intelligent Technology Co., Ltd.  Method for calculating industrial robot coordinates under planar vision 
CN110465944B (en) *  2019-08-09  2021-03-16  Qixing Intelligent Technology Co., Ltd.  Method for calculating coordinates of an industrial robot based on planar vision 
Also Published As
Publication number  Publication date 

CN104260112B (en)  2016-05-18 
Similar Documents
Publication  Publication Date  Title 

US9782899B2 (en)  Calibration method for coordinate system of robot manipulator  
CN103353758B (en)  Indoor robot navigation method  
CN103162622B (en)  Portable sphere target for a single-camera vision system and measurement method using the same  
US7899577B2 (en)  Measuring system and calibration method  
CN100476345C (en)  Method for measuring geometric parameters of a spatial circle based on binocular stereo vision  
EP2981397B1 (en)  A robot system and method for calibration  
JP3946711B2 (en)  Robot system  
CN102485441B (en)  Robot positioning method and calibration method  
CN103425100B (en)  Direct teaching control method for a robot based on torque balancing  
CN107214703B (en)  Robot self-calibration method based on vision-assisted positioning  
CN105458483B (en)  Post-weld seam tracking robot automatic deviation correction and ultrasonic impact system  
CN104200086A (en)  Wide-baseline visible-light camera pose estimation method  
CN102107374B (en)  Online detection method for the diameter of shaft-disc parts  
CN104972362A (en)  Intelligent force control robot grinding system and method  
CN104786226A (en)  Pose and trajectory positioning system and method for a robot grasping workpieces online  
CN106338245B (en)  Non-contact traverse measurement method for workpieces  
CN104759945A (en)  Datum alignment method for a mobile hole-making robot based on a high-precision industrial camera  
US20170054954A1 (en)  System and method for visually displaying information on real objects  
CN101532827B (en)  Deviation correction method for measurement of rail wear based on laser vision  
US9043024B2 (en)  Vision correction method for tool center point of a robot manipulator  
KR20160010868A (en)  Automated machining head with vision and procedure  
CN105300375A (en)  Robot indoor positioning and navigation method based on monocular vision  
DE102011015987A1 (en)  System and method for visual presentation of information on real objects  
CN103101060A (en)  Sensing calibration method for robot tool center point  
EP3088843B1 (en)  System and method for aligning a coordinated movement machine reference frame with a measurement system reference frame 
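The central operation shared by this patent family — converting point coordinates between a visual (camera) coordinate system and the robot coordinate system, as the abstract describes — is commonly implemented as a least-squares rigid transform fitted to corresponding calibration points. The sketch below is a generic 2D Kabsch/Procrustes alignment, not the specific method claimed in CN104260112B; the function names and sample points are illustrative only.

```python
import numpy as np

def fit_rigid_transform(vision_pts, robot_pts):
    """Least-squares rigid transform (R, t) mapping vision-frame points
    onto robot-frame points (2D Kabsch/Procrustes alignment)."""
    P = np.asarray(vision_pts, dtype=float)
    Q = np.asarray(robot_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)               # cross-covariance of centered point sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def vision_to_robot(R, t, point):
    """Map one point from the vision frame into the robot frame."""
    return R @ np.asarray(point, dtype=float) + t

# Synthetic check: a frame rotated by 30 degrees and translated by (10, -5).
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([10.0, -5.0])
vision = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 3.0]])
robot = vision @ R_true.T + t_true          # the same points in robot coordinates
R, t = fit_rigid_transform(vision, robot)
```

Given at least two non-coincident calibration points observed in both frames, `fit_rigid_transform` recovers the rotation matrix and translation vector; `vision_to_robot` then maps any workpiece point located by the camera into robot coordinates.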
Legal Events
Date  Code  Title  Description 

PB01  Publication  
C06  Publication  
SE01  Entry into force of request for substantive examination  
C10  Entry into substantive examination  
GR01  Patent grant  
C14  Grant of patent or utility model 