CN104236456A - Robot hand-eye calibration method based on two-degree-of-freedom three-dimensional visual sensor - Google Patents

Robot hand-eye calibration method based on two-degree-of-freedom three-dimensional visual sensor

Info

Publication number
CN104236456A
Authority
CN
China
Prior art keywords
vision sensor
coordinate system
matrix
robot
points
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410446352.XA
Other languages
Chinese (zh)
Other versions
CN104236456B (en)
Inventor
谭治英
骆敏舟
李涛
方世辉
罗艳
赵娜娜
郑俊君
孙晓瑜
黄海卫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Institutes of Physical Science of CAS
Institute of Advanced Manufacturing Technology
Original Assignee
Hefei Institutes of Physical Science of CAS
Institute of Advanced Manufacturing Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei Institutes of Physical Science of CAS, Institute of Advanced Manufacturing Technology filed Critical Hefei Institutes of Physical Science of CAS
Priority to CN201410446352.XA priority Critical patent/CN104236456B/en
Publication of CN104236456A publication Critical patent/CN104236456A/en
Application granted granted Critical
Publication of CN104236456B publication Critical patent/CN104236456B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Manipulator (AREA)

Abstract

The present invention provides a robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor, comprising the following steps: first, calculation of the coordinate transformation matrix T0 between the vision coordinate system at the initial position and the robot base coordinate system; second, after the vision sensor is rotated from the initial position by angles θ and φ in the vertical and horizontal directions respectively, calculation of the transformation matrices Tθ and Tφ between the rotated coordinate systems and the initial coordinate system; third, the hand-eye transformation matrix is obtained by multiplying the individual transformation matrices. The method is simple, offers high measurement accuracy and is easy to popularize, and can effectively meet the needs of robot hand-eye calibration.

Description

A robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor
Technical field
The present invention relates to the technical field of industrial robot vision calibration, and in particular to a robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor.
Background technology
With the application and development of 3D vision sensor technology, more and more robots use 3D vision sensors as an important tool for autonomous navigation. In the prior art, however, navigation capability is limited by the field of view of the 3D vision sensor: to expand the field of view, the sensor must be given additional degrees of freedom in the horizontal and vertical directions, so that the 3D vision sensor can be driven to collect scene information over a larger field angle.
Summary of the invention
In view of the deficiencies in the prior art, the technical problem to be solved by the present invention is to provide a robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor. The method is computationally simple, achieves high measurement accuracy and is easy to popularize, and can effectively meet the needs of robot hand-eye calibration.
In order to solve the above technical problem, the technical solution adopted by the present invention is a robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor, comprising the following steps:
Step 1: four points are marked in advance within the field of view of the 3D vision sensor, such that no three of the four points are collinear and the four points are not coplanar. The point-cloud information of these four points is collected by the 3D vision sensor fixed at the initial position on the robot prototype, giving the three-dimensional coordinates of the four marked points in the 3D vision sensor coordinate system. A three-dimensional dynamic measuring device is then used to measure the three-dimensional coordinates of the same four points in space in the robot base coordinate system. From these two sets of coordinates the transformation matrix between the coordinate system of the 3D vision sensor at the initial position and the robot base coordinate system is obtained.
Step 2: the 3D vision sensor is driven by a motor under control to rotate by a certain angle in the horizontal direction, and the point-cloud data of the four spatial points is collected at the current sensor pose; the three-dimensional coordinates of the four points are calculated. Using the two groups of three-dimensional coordinates of the four spatial points in the 3D vision sensor coordinate system, the rotation axis of the sensor coordinate system is calculated, and the transformation matrix between the rotated sensor coordinate system and the initial coordinate system is established as a function of the rotation angle. The corresponding matrix for a rotation of the 3D vision sensor in the vertical direction is calculated in the same way.
Step 3: the hand-eye transformation matrix is obtained by multiplying the individual transformation matrices.
Compared with the prior art, by adopting the above technical scheme, the robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor of the present invention adds horizontal and vertical degrees of freedom to the sensor to expand its field of view and drives the 3D vision sensor to collect scene information over a larger field angle, thereby overcoming the limitation of navigation capability imposed by the sensor's field of view. It also solves the calculation of the transformation between the sensor coordinate system after an arbitrary horizontal or vertical rotation and the robot base coordinate system. The calibration method rests on a rigorous mathematical derivation, and its computational accuracy is high.
The beneficial effects of the invention are that the method is simple, the measurement accuracy is high and it is easy to popularize, so that it can effectively meet the needs of robot hand-eye calibration.
Accompanying drawing explanation
The present invention is described in detail below with reference to the accompanying drawings and in conjunction with an embodiment, from which the advantages and implementation of the present invention will become more apparent. The content shown in the drawings is only for explanation of the present invention and does not constitute a limitation of the present invention in any sense. In the drawings:
Fig. 1 is a schematic diagram of the calibration process between the two-degree-of-freedom 3D vision sensor of the present invention and the robot base coordinate system; in the figure, (1) is the NDI three-dimensional dynamic measuring instrument, (2) are the four non-coplanar marked points in space, (3) is the robot base coordinate system, (4) is the coordinate system of the 3D vision sensor at the initial position, (5) is the coordinate system after the 3D vision sensor rotates vertically by θ, (7) denotes the corresponding rotation axis, and (9) and (10) are two points on that rotation axis; (6) is the coordinate system after the 3D vision sensor is rotated again from position (5), (8) denotes the corresponding rotation axis, and (11) and (12) are two points on that rotation axis;
Fig. 2 is a flowchart of the specific operations of the embodiment of the present invention;
Fig. 3 is a schematic diagram of the transformation between the initial coordinate system of the 3D vision sensor of the present invention and the robot base coordinate system; in the figure, $o\text{-}xyz$ is the world coordinate system, $o_r\text{-}x_r y_r z_r$ is the robot base coordinate system, $o_v\text{-}x_v y_v z_v$ is the coordinate system of the 3D vision sensor, and $o_{NDI}\text{-}x_{NDI} y_{NDI} z_{NDI}$ is the coordinate system of the NDI three-dimensional dynamic measuring instrument;
Fig. 4 is a schematic diagram of the transformation between the 3D vision sensor coordinate system of the present invention and the coordinate system after rotation about the vertical axis; in the figure, the coordinate systems before and after the 3D vision sensor rotates by angle θ about the line through the two points $P_{c1}$ and $P_{c2}$ are denoted $o_{v1}\text{-}x_{v1} y_{v1} z_{v1}$ and $o_{v2}\text{-}x_{v2} y_{v2} z_{v2}$, respectively.
Embodiment
The present invention is described further below in conjunction with an embodiment and the accompanying drawings:
As shown in Figures 1 to 4, in the robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor of the present invention, let the rotation angles of the robot head in the vertical and horizontal directions relative to the initial position be θ and φ, respectively. The calibration process is divided into the following steps:
Step 1: calculation of the coordinate transformation matrix $T_0$ between the vision coordinate system at the initial position and the robot base coordinate system.
Four non-coplanar points $P_1$, $P_2$, $P_3$, $P_4$ are chosen in space, and their three-dimensional coordinates in the vision coordinate system are obtained by the vision sensor and denoted respectively:
$$P_{V1}(x_{v1}, y_{v1}, z_{v1}),\ P_{V2}(x_{v2}, y_{v2}, z_{v2}),\ P_{V3}(x_{v3}, y_{v3}, z_{v3}),\ P_{V4}(x_{v4}, y_{v4}, z_{v4})$$
By controlling the motion of each joint of the arm and solving the forward kinematics, the three-dimensional coordinates of these four points in the robot base coordinate system are obtained and denoted respectively:
$$P_{R1}(x_{r1}, y_{r1}, z_{r1}),\ P_{R2}(x_{r2}, y_{r2}, z_{r2}),\ P_{R3}(x_{r3}, y_{r3}, z_{r3}),\ P_{R4}(x_{r4}, y_{r4}, z_{r4})$$
The transformation matrix $T_0$ is then calculated as
$$
T_0 = \begin{bmatrix} x_{r1} & x_{r2} & x_{r3} & x_{r4} \\ y_{r1} & y_{r2} & y_{r3} & y_{r4} \\ z_{r1} & z_{r2} & z_{r3} & z_{r4} \\ 1 & 1 & 1 & 1 \end{bmatrix} \cdot \begin{bmatrix} x_{v1} & x_{v2} & x_{v3} & x_{v4} \\ y_{v1} & y_{v2} & y_{v3} & y_{v4} \\ z_{v1} & z_{v2} & z_{v3} & z_{v4} \\ 1 & 1 & 1 & 1 \end{bmatrix}^{-1}
$$
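As a minimal numerical sketch of this computation (not part of the original description; the helper `transform_from_points` and the sample coordinates are illustrative), the matrix product above can be evaluated directly with numpy:

```python
import numpy as np

def transform_from_points(pts_src, pts_dst):
    """Solve T from [dst; 1] = T [src; 1] using four point correspondences,
    i.e. the formula T = Dst * inv(Src) in homogeneous form."""
    src = np.vstack([np.asarray(pts_src, float).T, np.ones(4)])  # 4x4, columns are points
    dst = np.vstack([np.asarray(pts_dst, float).T, np.ones(4)])
    return dst @ np.linalg.inv(src)

# Four non-coplanar marker points in the vision frame (illustrative values).
pts_vision = np.array([[0.1, 0.0, 0.5],
                       [0.3, 0.1, 0.6],
                       [0.0, 0.2, 0.7],
                       [0.2, 0.3, 0.4]])

# Simulate the same points in the robot base frame with a known ground-truth pose.
ang = np.deg2rad(30.0)
R_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0,          0.0,         1.0]])
t_true = np.array([0.8, -0.2, 0.3])
pts_robot = pts_vision @ R_true.T + t_true

T0 = transform_from_points(pts_vision, pts_robot)
print(np.round(T0, 4))  # upper-left 3x3 block recovers R_true, last column t_true
```

The same construction is reused below for the matrix $T_t$, with the coordinates measured before and after the rotation taking the roles of the two point sets.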
Step 2: after the vision sensor rotates from the initial position by angle θ in the vertical direction and by angle φ in the horizontal direction, calculation of the transformation matrices $T_\theta$ and $T_\varphi$ between the rotated coordinate systems and the initial coordinate system.
1) At the initial position, the coordinates of four non-coplanar points in space are collected and denoted respectively
$$P_{V1}^{1}(x_{v1}^{1}, y_{v1}^{1}, z_{v1}^{1}),\ P_{V2}^{1}(x_{v2}^{1}, y_{v2}^{1}, z_{v2}^{1}),\ P_{V3}^{1}(x_{v3}^{1}, y_{v3}^{1}, z_{v3}^{1}),\ P_{V4}^{1}(x_{v4}^{1}, y_{v4}^{1}, z_{v4}^{1})$$
After the vision module rotates by an arbitrary angle in the vertical direction, the coordinates of the same four spatial points in the current coordinate system are obtained by the sensor and denoted respectively
$$P_{V1}^{2}(x_{v1}^{2}, y_{v1}^{2}, z_{v1}^{2}),\ P_{V2}^{2}(x_{v2}^{2}, y_{v2}^{2}, z_{v2}^{2}),\ P_{V3}^{2}(x_{v3}^{2}, y_{v3}^{2}, z_{v3}^{2}),\ P_{V4}^{2}(x_{v4}^{2}, y_{v4}^{2}, z_{v4}^{2})$$
Using the principle shown in Fig. 3, the corresponding transformation matrix $T_t$ is calculated as
$$
T_t = \begin{bmatrix} x_{v1}^{1} & x_{v2}^{1} & x_{v3}^{1} & x_{v4}^{1} \\ y_{v1}^{1} & y_{v2}^{1} & y_{v3}^{1} & y_{v4}^{1} \\ z_{v1}^{1} & z_{v2}^{1} & z_{v3}^{1} & z_{v4}^{1} \\ 1 & 1 & 1 & 1 \end{bmatrix} \cdot \begin{bmatrix} x_{v1}^{2} & x_{v2}^{2} & x_{v3}^{2} & x_{v4}^{2} \\ y_{v1}^{2} & y_{v2}^{2} & y_{v3}^{2} & y_{v4}^{2} \\ z_{v1}^{2} & z_{v2}^{2} & z_{v3}^{2} & z_{v4}^{2} \\ 1 & 1 & 1 & 1 \end{bmatrix}^{-1}
$$
2) The transformation matrix $T_t$ is used to calculate two arbitrary points $P_{c1} = (x_{c1}, y_{c1}, z_{c1})'$ and $P_{c2} = (x_{c2}, y_{c2}, z_{c2})'$ on the rotation axis, which, because points on the rotation axis are unchanged by the rotation, satisfy
$$
\begin{bmatrix} x_{c1} & x_{c2} \\ y_{c1} & y_{c2} \\ z_{c1} & z_{c2} \\ 1 & 1 \end{bmatrix} = T_t \begin{bmatrix} x_{c1} & x_{c2} \\ y_{c1} & y_{c2} \\ z_{c1} & z_{c2} \\ 1 & 1 \end{bmatrix}
$$
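A hedged numerical sketch of this sub-step (assuming $T_t$ is available as a 4x4 numpy array; the helper `rotation_axis_points` is illustrative and not part of the original description):

```python
import numpy as np

def rotation_axis_points(T_t):
    """Return two points on the rotation axis of T_t = [R d; 0 1], i.e. points p
    satisfying T_t [p; 1] = [p; 1], equivalently (I - R) p = d (pure rotation assumed)."""
    R, d = T_t[:3, :3], T_t[:3, 3]
    # One fixed point: least-squares solution of the singular system (I - R) p = d.
    p_c1 = np.linalg.lstsq(np.eye(3) - R, d, rcond=None)[0]
    # Axis direction: eigenvector of R associated with eigenvalue 1.
    w, v = np.linalg.eig(R)
    axis = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    p_c2 = p_c1 + axis / np.linalg.norm(axis)
    return p_c1, p_c2
```

Any pair of distinct points on the axis works, since every point of the axis satisfies the fixed-point equation above.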
3) The transformation matrix $T_\theta$ is split into a rotation part and a translation part, which are calculated separately; denoting the translation by $d$ and the rotation by $R$,
$$
T_\theta = \begin{bmatrix} R & d \\ 0 & 1 \end{bmatrix}
$$
The initial coordinate system of the vision module is $o_{v1}\text{-}x_{v1} y_{v1} z_{v1}$, and the coordinate system after rotating by angle θ about the fixed axis is $o_{v2}\text{-}x_{v2} y_{v2} z_{v2}$. By geometric symmetry, the point $O = (x_0, y_0, z_0)$ is the center of the circle along which the origin $o_{v1}$ rotates. Using the two conditions that $o_{v1}O$ is perpendicular to the rotation axis and that $O$ lies on the rotation axis, the following equations can be established:
$$x_0 (x_{c2}-x_{c1}) + y_0 (y_{c2}-y_{c1}) + z_0 (z_{c2}-z_{c1}) = 0$$
$$\frac{x_0 - x_{c1}}{x_{c2}-x_{c1}} = \frac{y_0 - y_{c1}}{y_{c2}-y_{c1}} = \frac{z_0 - z_{c1}}{z_{c2}-z_{c1}}$$
Solving the three equations above yields:
$$
\begin{bmatrix} x_0 \\ y_0 \\ z_0 \end{bmatrix} = \begin{bmatrix} x_{c2}-x_{c1} & y_{c2}-y_{c1} & z_{c2}-z_{c1} \\ y_{c2}-y_{c1} & x_{c1}-x_{c2} & 0 \\ z_{c2}-z_{c1} & 0 & x_{c1}-x_{c2} \end{bmatrix}^{-1} \begin{bmatrix} 0 \\ x_{c1}(y_{c2}-y_{c1}) - y_{c1}(x_{c2}-x_{c1}) \\ x_{c1}(z_{c2}-z_{c1}) - z_{c1}(x_{c2}-x_{c1}) \end{bmatrix}
$$
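The 3x3 linear system above can be evaluated directly; a short sketch, not part of the original description (the helper name is illustrative, and the matrix as written is singular when $x_{c1} = x_{c2}$, so that case must be avoided):

```python
import numpy as np

def rotation_center_of_origin(p_c1, p_c2):
    """Point O: foot of the perpendicular dropped from the sensor origin o_v1
    onto the rotation axis through p_c1 and p_c2, via the 3x3 system above."""
    x1, y1, z1 = p_c1
    x2, y2, z2 = p_c2
    A = np.array([[x2 - x1, y2 - y1, z2 - z1],
                  [y2 - y1, x1 - x2, 0.0],
                  [z2 - z1, 0.0,     x1 - x2]])
    b = np.array([0.0,
                  x1 * (y2 - y1) - y1 * (x2 - x1),
                  x1 * (z2 - z1) - z1 * (x2 - x1)])
    return np.linalg.solve(A, b)   # (x0, y0, z0)
```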
Using the geometric relationships of the rotation and the translation of the coordinate system, one obtains:
$$\frac{-x_0(d_1-x_0) - y_0(d_2-y_0) - z_0(d_3-z_0)}{\sqrt{x_0^2+y_0^2+z_0^2}\,\sqrt{(d_1-x_0)^2+(d_2-y_0)^2+(d_3-z_0)^2}} = \cos\theta
$$
$$d_1(x_{c2}-x_{c1}) + d_2(y_{c2}-y_{c1}) + d_3(z_{c2}-z_{c1}) = 0$$
$$(d_1-x_0)^2 + (d_2-y_0)^2 + (d_3-z_0)^2 = x_0^2 + y_0^2 + z_0^2$$
Solving the above system of quadratic equations in the three unknowns $d_1$, $d_2$, $d_3$ with Maple software yields its analytic solution.
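The analytic solution above is obtained with Maple; as a non-authoritative alternative sketch, the same three equations can also be solved numerically once concrete values are known (scipy assumed available, all numbers illustrative; the equal-radius equation is used to clear the denominator of the cosine equation):

```python
import numpy as np
from scipy.optimize import fsolve

# Illustrative inputs: O = (x0, y0, z0) from the previous step, the axis
# direction p_c2 - p_c1, and the commanded rotation angle theta.
x0, y0, z0 = 0.05, 0.10, 0.00
axis = np.array([0.0, 0.0, 1.0])            # p_c2 - p_c1 (here the z axis)
theta = np.deg2rad(30.0)
r2 = x0**2 + y0**2 + z0**2                  # squared radius |o_v1 O|^2

def equations(d):
    d1, d2, d3 = d
    # Cosine equation, with its denominator replaced by r2 (valid given eq3).
    eq1 = -x0*(d1 - x0) - y0*(d2 - y0) - z0*(d3 - z0) - np.cos(theta) * r2
    eq2 = d1*axis[0] + d2*axis[1] + d3*axis[2]                 # d is perpendicular to the axis
    eq3 = (d1 - x0)**2 + (d2 - y0)**2 + (d3 - z0)**2 - r2      # equal radii
    return [eq1, eq2, eq3]

d1, d2, d3 = fsolve(equations, [0.1, 0.0, 0.0])   # converges to one of the two roots (+/- theta)
print(round(d1, 4), round(d2, 4), round(d3, 4))
```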
4) Using the calculated translation vector $d$ and the property that the coordinates of any point on the rotation axis do not change under the rotation, the following equation is obtained:
$$
\begin{bmatrix} x_{c1} & x_{c2} \\ y_{c1} & y_{c2} \\ z_{c1} & z_{c2} \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} R & d \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_{c1} & x_{c2} \\ y_{c1} & y_{c2} \\ z_{c1} & z_{c2} \\ 1 & 1 \end{bmatrix}
$$
By solving the above system of linear equations, the corresponding rotation matrix $R$ is obtained; combining steps 1) to 4), the transformation matrix $T_\theta$ is obtained.
After the vision sensor rotates from the initial position by angle φ in the horizontal direction, the transformation matrix $T_\varphi$ between the corresponding coordinate systems can be calculated by the same steps as above.
Step 3: the hand-eye transformation matrix is obtained by multiplying the individual transformation matrices. Once the transformation matrices $T_0$, $T_\theta$ and $T_\varphi$ have been calculated by the three steps above, any point $(x, y, z)'$ within the field of view measured by the sensor can be converted to its coordinates in the robot coordinate system through the composed transformation.
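A hedged sketch of this final conversion, assuming $T_0$, $T_\theta$ and $T_\varphi$ are already available as 4x4 numpy arrays; since the exact composition formula is not reproduced in this text, the multiplication order $T_0 T_\theta T_\varphi$ shown below is an assumption rather than a formula stated above:

```python
import numpy as np

def sensor_point_to_robot_base(p_sensor, T0, T_theta, T_phi):
    """Map a point (x, y, z)' measured in the rotated sensor frame to the robot
    base frame by chaining the calibrated transforms (order assumed: T0*Ttheta*Tphi)."""
    T_hand_eye = T0 @ T_theta @ T_phi          # composed hand-eye matrix
    p_h = np.append(np.asarray(p_sensor, dtype=float), 1.0)   # homogeneous coordinates
    return (T_hand_eye @ p_h)[:3]

# Example usage with identity placeholders standing in for calibrated matrices.
I4 = np.eye(4)
print(sensor_point_to_robot_base((0.2, -0.1, 0.8), I4, I4, I4))
```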
Although the present invention has been described above with reference to the accompanying drawings, it is not limited to the above embodiment. The above embodiment is only illustrative, not restrictive. Under the inspiration of the present invention, those of ordinary skill in the art can make many variations without departing from the inventive concept, and all of these fall within the scope of protection of the present invention.

Claims (2)

1. A robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor, characterized by comprising the following steps:
Step 1: calculation of the coordinate transformation matrix $T_0$ between the vision coordinate system at the initial position and the robot base coordinate system; four points are marked in advance within the field of view of the 3D vision sensor, such that no three of the four points are collinear and the four points are not coplanar; the point-cloud information of these four points is collected by the 3D vision sensor fixed at the initial position on the robot prototype, giving the three-dimensional coordinates of the four marked points in the 3D vision sensor coordinate system; a three-dimensional dynamic measuring device is used to measure the three-dimensional coordinates of the same four points in space in the robot base coordinate system; from these the transformation matrix $T_0$ between the coordinate system of the 3D vision sensor at the initial position and the robot base coordinate system is obtained;
Step 2: after the vision sensor rotates from the initial position by angle θ in the vertical direction and by angle φ in the horizontal direction, calculation of the transformation matrices $T_\theta$ and $T_\varphi$ between the rotated coordinate systems and the initial coordinate system; the 3D vision sensor is driven by a motor under control to rotate by an arbitrary angle θ in the vertical direction, and the point-cloud data of the four spatial points is collected at the current sensor pose; the three-dimensional coordinates of the four points are calculated; using the two groups of three-dimensional coordinates of the four spatial points in the 3D vision sensor coordinate system, the rotation axis of the sensor coordinate system is calculated, and the transformation matrix $T_\theta$ between the rotated sensor coordinate system and the initial coordinate system is established from the rotation angle; when the 3D vision sensor rotates by an arbitrary angle in the horizontal direction, the corresponding transformation matrix is calculated by the same method as $T_\theta$;
Step 3: the hand-eye transformation matrix is obtained by multiplying the individual transformation matrices; when the vision module performs vertical and horizontal rotations relative to the robot base coordinate system, its transformation matrix with respect to the robot base coordinate system is the product of the above transformation matrices.
2. The robot hand-eye calibration method based on a two-degree-of-freedom 3D vision sensor according to claim 1, characterized in that, when the rotation of the 3D vision sensor in the horizontal or vertical direction is calibrated, the two points on the rotation axis are obtained by using the property that the three-dimensional coordinates of any point on the rotation axis in the 3D sensor coordinate system do not change with the rotation.
CN201410446352.XA 2014-09-04 2014-09-04 Robot hand-eye calibration method based on two-degree-of-freedom three-dimensional visual sensor Active CN104236456B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410446352.XA CN104236456B (en) 2014-09-04 2014-09-04 Robot hand-eye calibration method based on two-degree-of-freedom three-dimensional visual sensor

Publications (2)

Publication Number Publication Date
CN104236456A 2014-12-24
CN104236456B CN104236456B (en) 2016-10-26

Family

ID=52225009

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410446352.XA Active CN104236456B (en) 2014-09-04 2014-09-04 Robot hand-eye calibration method based on two-degree-of-freedom three-dimensional visual sensor

Country Status (1)

Country Link
CN (1) CN104236456B (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62214403A (en) * 1986-03-17 1987-09-21 Yaskawa Electric Mfg Co Ltd Calibration method of robot system with visual sensor
JPH04365586A (en) * 1991-06-14 1992-12-17 Toyota Autom Loom Works Ltd Optical axis aligning method and orthogonal axis aligning method for hand eye
CN101349542B (en) * 2008-06-27 2010-04-14 东南大学 Vision measuring apparatus of large size part
CN103175485A (en) * 2013-02-20 2013-06-26 天津工业大学 Method for visually calibrating aircraft turbine engine blade repair robot
CN103512482B (en) * 2013-10-14 2016-01-06 中国科学院电工研究所 A kind of super-conductive magnetic suspension rotor attitude measurement signal calibration device
CN103854291B (en) * 2014-03-28 2017-08-29 中国科学院自动化研究所 Camera marking method in four-degree-of-freedom binocular vision system
CN103925879A (en) * 2014-04-24 2014-07-16 中国科学院合肥物质科学研究院 Indoor robot vision hand-eye relation calibration method based on 3D image sensor

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105902312A (en) * 2016-05-20 2016-08-31 深圳市智图医疗技术有限责任公司 Calibration method of surgical navigation tool
CN105902312B (en) * 2016-05-20 2019-06-25 深圳市智图医疗技术有限责任公司 A kind of scaling method of guiding tool of operation
CN106553195A (en) * 2016-11-25 2017-04-05 中国科学技术大学 Object 6DOF localization method and system during industrial robot crawl
CN106553195B (en) * 2016-11-25 2018-11-27 中国科学技术大学 Object 6DOF localization method and system during industrial robot crawl
CN107450885A (en) * 2017-07-21 2017-12-08 上海交通大学 A kind of coordinate transform method for solving of industrial robot and three-dimension sensor
CN107450885B (en) * 2017-07-21 2020-09-08 上海交通大学 Coordinate transformation solving method for industrial robot and three-dimensional sensor
CN109596126A (en) * 2017-09-30 2019-04-09 北京柏惠维康科技有限公司 A kind of determination method and apparatus of robot space coordinates transformational relation
CN109597318A (en) * 2017-09-30 2019-04-09 北京柏惠维康科技有限公司 A kind of method and apparatus of robot space registration
CN109591007A (en) * 2017-09-30 2019-04-09 北京柏惠维康科技有限公司 A kind of robot space register method and device
CN109591007B (en) * 2017-09-30 2022-02-22 北京柏惠维康科技有限公司 Robot space registration method and device
CN109360243A (en) * 2018-09-28 2019-02-19 上海爱观视觉科技有限公司 A kind of scaling method of the movable vision system of multiple degrees of freedom
US11847797B2 (en) 2018-09-28 2023-12-19 Anhui Eyevolution Technology Co., Ltd. Calibration method for multi-degree-of-freedom movable vision system
CN111207747A (en) * 2018-11-21 2020-05-29 中国科学院沈阳自动化研究所 Spatial positioning method based on HoloLens glasses
CN111207747B (en) * 2018-11-21 2021-09-28 中国科学院沈阳自动化研究所 Spatial positioning method based on HoloLens glasses
WO2020114046A1 (en) * 2018-12-04 2020-06-11 中冶赛迪重庆信息技术有限公司 Coordinate system calibration method and system, computer readable storage medium, and device

Also Published As

Publication number Publication date
CN104236456B (en) 2016-10-26

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant