CN103115615B - Fully-automatic calibration method for hand-eye robot based on exponential product model - Google Patents

Fully-automatic calibration method for hand-eye robot based on exponential product model Download PDF

Info

Publication number
CN103115615B
CN103115615B CN201310030955.7A
Authority
CN
China
Prior art keywords
robot
joint
binocular camera
camera
hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310030955.7A
Other languages
Chinese (zh)
Other versions
CN103115615A (en)
Inventor
王海霞 (Wang Haixia)
卢晓 (Lu Xiao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Zhuo Xintong Intelligent Technology Co ltd
Original Assignee
Shandong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University of Science and Technology filed Critical Shandong University of Science and Technology
Priority to CN201310030955.7A priority Critical patent/CN103115615B/en
Publication of CN103115615A publication Critical patent/CN103115615A/en
Application granted granted Critical
Publication of CN103115615B publication Critical patent/CN103115615B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Abstract

The invention discloses a fully automatic calibration method for a hand-eye robot based on the product-of-exponentials model, and in particular a fully automatic calibration method in which all three parts — the robot body, the hand-eye relation, and the camera — are calibrated together. The method requires only a mirror and several beacons, with no additional auxiliary measuring instruments. Its main features are: first, the principle of mirror reflection is used to transform hand information outside the field of view of the end camera into the camera coordinate system, completing the hand-eye calibration; second, the end camera carried by the robot serves as the measuring instrument and a checkerboard on the spatial mirror surface serves as the measuring target, which increases the number of target images the camera captures during the rotation of each robot joint, so that camera calibration and joint-parameter calibration are achieved simultaneously under a special motion strategy. The fully automatic calibration method is simple and effective.

Description

A fully automatic calibration method for a hand-eye robot based on the product-of-exponentials model
Technical field
The present invention relates to the field of hand-eye robot calibration, and more particularly to a fully automatic calibration method for a hand-eye robot based on the product-of-exponentials model, intended to improve the absolute positioning accuracy of the hand-eye robot.
Background technology
A hand-eye robot is a robot with a camera mounted at its end, where the camera plays the role of the eyes; its flexibility has attracted wide attention. The overall calibration accuracy is an important indicator of whether such a robot can be applied in practice. The overall calibration of a hand-eye robot comprises camera calibration, hand-eye calibration, and robot body calibration. Camera calibration determines the intrinsic parameters of the camera mounted on the robot end; hand-eye calibration determines the relation between the robot end-effector tool and the camera mounted on it; robot body calibration determines the parameters of each robot joint, with different kinematic models defining different parameters. Each of these calibrations is an important problem in the field of robot vision, and each is rather complicated, which brings much inconvenience to the application of hand-eye robots. Researchers have therefore tried to simplify the calibration process while maintaining accuracy: some methods perform camera calibration and hand-eye calibration simultaneously, some perform hand-eye calibration and body calibration simultaneously, and some perform camera calibration and body calibration simultaneously, but no method yet performs all three at once, and existing calibration procedures are complex and require manual interaction. A simple and effective automatic method for the overall calibration of a hand-eye robot is therefore urgently needed.
Summary of the invention
The task of the present invention is to overcome the deficiencies of existing hand-eye robot calibration methods and to provide a fully automatic calibration method for a hand-eye robot based on the product-of-exponentials model.
Its technical solution is as follows:
A fully automatic calibration method for a hand-eye robot based on the product-of-exponentials model comprises the following steps:
a. Provide an industrial robot, an end binocular camera, one plane mirror, and several beacons. The end binocular camera is detachably mounted on the robot end; beacons are pasted on the mirror surface of the plane mirror and at the tool control point of the robot end. Then enter step b.
b. Determine the initial position of the robot, and select a suitable position to place the plane mirror with its beacons within the field of view of the end binocular camera. Use the end binocular camera to acquire one mirror image of the plane mirror; this image contains the beacon information on the mirror surface and the beacon information of the tool control point of the robot end reflected in the mirror, and is used to calibrate the hand-eye relation. Then enter step c.
c. Starting from the last joint, rotate the joints one by one under control, and during the rotation of each joint use the end binocular camera to capture one image of the mirror-surface beacons at fixed angular intervals until the mirror leaves the field of view of the end binocular camera. Suppose m_i images are obtained during the motion of the i-th joint, i = 1, ..., n, where n is the number of robot joints; the \sum_{i=1}^{n} m_i images are used to calibrate the camera and the robot joint parameters.
In the above step a, the product-of-exponentials kinematic model of the robot is set as:

g_{wt}(\theta) = e^{\theta_1 \hat{\xi}_1} e^{\theta_2 \hat{\xi}_2} \cdots e^{\theta_n \hat{\xi}_n}

where g_{wt}(\theta) is the transformation of the end-tool coordinate system T relative to the world coordinate system W when the robot is at \theta = (\theta_1, \theta_2, \theta_3, ..., \theta_n), and e^{\theta_i \hat{\xi}_i} is the matrix exponential of the motion twist;
In the above step b, the hand-eye transformation to be solved is denoted A_{ch}, where A_{oc} and A_{ch'} are known data;
In the above step c, suppose the points measured by the end binocular camera during the rotation of the i-th joint are denoted x_{ij} \in R^{4 \times num}, j = 1, ..., m_i, given as homogeneous 3-D spatial coordinates, where m_i is the number of camera measurements during the rotation of the i-th joint and num is the number of measured spatial points. The joint-parameter equation is:

x_{i,a} = e^{\theta_{ij} \hat{\xi}_{ij}} x_{i,b}, \quad i = 1, ..., n, \quad j = 1, 2, 3, ..., m_i, \quad a, b \in \{1, ..., m_i\},

where x_{i,a} and x_{i,b} are the homogeneous coordinate values of any two of the measurement groups in x_i, and the twist parameters \xi_{ij} are obtained directly from this formula. Since x_i contains m_i groups of measurement data and every two groups yield one solution, pairwise combination yields C_{m_i}^2 solutions, which are averaged to give the final joint parameter:

\xi_i = \frac{1}{C_{m_i}^2} \sum_{j=1}^{C_{m_i}^2} \xi_{ij}

where \xi_i is the motion twist coordinate.
The present invention has the following advantageous effects:
1. Simple to implement. No extra measuring equipment is needed: the hand-eye relation, the camera, and the robot body can all be calibrated using only the \sum_{i=1}^{n} m_i images acquired by the self-carried end camera during the rotation of each robot joint.
2. Little introduced error. The only error source is the measurement accuracy of the end camera.
3. High accuracy. The error sources are few, and the robot joint parameters are calibrated using images from the already-calibrated camera, so the accuracy is high.
Accompanying drawing explanation
The present invention is further described below with reference to the accompanying drawings and embodiments.
Fig. 1 is the robot kinematic model diagram of the present invention.
Fig. 2 is the hand-eye calibration principle diagram of the present invention.
Fig. 3 is the overall calibration principle diagram of the present invention.
Embodiment
Fig. 1 shows the robot kinematic model. At the initial position, the world coordinate system W and the tool coordinate system T are both established on the robot end camera; the world coordinate system then remains fixed at the initial position, while the tool coordinate system moves with the robot. Assuming the robot has n serial revolute joints, the product-of-exponentials kinematic model of the robot is:

g_{wt}(\theta) = e^{\theta_1 \hat{\xi}_1} e^{\theta_2 \hat{\xi}_2} \cdots e^{\theta_n \hat{\xi}_n} \quad (1)

where g_{wt}(\theta) is the transformation of the end-tool coordinate system T relative to the world coordinate system W when the robot is at \theta = (\theta_1, \theta_2, \theta_3, ..., \theta_n), and e^{\theta_i \hat{\xi}_i} is the matrix exponential of the motion twist, expressed as:

e^{\theta \hat{\xi}} = \begin{bmatrix} e^{\theta \hat{\omega}} & (I - e^{\theta \hat{\omega}}) r \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} R_{3 \times 3} & t_{3 \times 1} \\ 0 & 1 \end{bmatrix}, \quad \omega \neq 0 \quad (2)

where

\hat{\omega} = \begin{bmatrix} 0 & -\omega_3 & \omega_2 \\ \omega_3 & 0 & -\omega_1 \\ -\omega_2 & \omega_1 & 0 \end{bmatrix}

is the skew-symmetric matrix of \omega. The motion twist coordinate \xi_i can be expressed as:

\xi_i = \begin{bmatrix} \omega_i \\ \omega_i \times r_i \end{bmatrix} \in R^6 \quad (3)

where \omega_i \in R^3 and r_i \in R^3, i = 1, ..., n, are the unit direction of the joint rotation axis in the world coordinate system and the position vector of a point on the rotation axis, respectively; together they are called the twist parameters, also known as the joint parameters. The coordinates of a spatial point in the world coordinate system are obtained from:

X_w = g_{wt}(\theta) X_t \quad (4)

where X_t, X_w \in R^{4 \times 1} are the homogeneous coordinates of the spatial point in the tool and world coordinate systems, respectively.
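Formulas (1), (2), and (4) translate almost directly into code. The following sketch (illustrative only, not part of the patent disclosure; the joint axes, angles, and points used below are hypothetical values chosen to exercise the formulas) implements the twist exponential and the product-of-exponentials chain in Python with NumPy:

```python
import numpy as np

def exp_twist(omega, r, theta):
    """Twist exponential of formula (2): rotation by theta about the unit
    axis omega passing through the point r (revolute joint, omega != 0)."""
    omega = np.asarray(omega, dtype=float)
    r = np.asarray(r, dtype=float)
    # omega-hat: the skew-symmetric matrix of omega
    wx = np.array([[0.0, -omega[2], omega[1]],
                   [omega[2], 0.0, -omega[0]],
                   [-omega[1], omega[0], 0.0]])
    # Rodrigues formula for e^{theta * omega-hat}
    R = np.eye(3) + np.sin(theta) * wx + (1.0 - np.cos(theta)) * wx @ wx
    g = np.eye(4)
    g[:3, :3] = R
    g[:3, 3] = (np.eye(3) - R) @ r  # translation part (I - e^{theta*omega-hat}) r
    return g

def g_wt(joint_params, thetas):
    """Product-of-exponentials model of formula (1): chain the joint exponentials.
    joint_params is a list of (omega, r) pairs, one per joint."""
    g = np.eye(4)
    for (omega, r), theta in zip(joint_params, thetas):
        g = g @ exp_twist(omega, r, theta)
    return g

# Formula (4): a homogeneous tool-frame point X_t maps to world coordinates
# as X_w = g_wt(theta) @ X_t.
```

As a quick sanity check of the translation term in formula (2): a joint with axis ω = (0, 0, 1) through r = (1, 0, 0) leaves the on-axis point (1, 0, 0) fixed for any θ.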
Fig. 2 shows the hand-eye calibration schematic. First, a suitable pose is selected as the robot initial position, and the mirror image of the plane mirror within the field of view is acquired; this image contains the beacon information on the mirror surface and the beacon information of the end gripper reflected in the mirror. The hand-eye relation is obtained from these two parts of information as follows.
Since the robot end is not within the camera's field of view, the mirror is used to reflect it into the field of view. As shown in Fig. 2, C, H, and O are the camera coordinate system, the hand coordinate system, and the mirror coordinate system, respectively; C' and H' are the corresponding virtual-image coordinate systems; X is a spatial point and X' is its virtual image. Before describing the hand-eye calibration process, two propositions are given:
Proposition 1: A coordinate system reflected in the mirror changes handedness: if it was originally left-handed it becomes right-handed after reflection, and vice versa.
Proposition 2: By the duality of points and cameras, a camera at point C observing point X' is equivalent to a camera at point C' observing point X.
Suppose the homogeneous coordinates of points X' and X in the camera frame C and the mirror frame O are denoted X'_c, X'_o, X_c, and X_o, and the coordinate transformations between O and C, C and C', C and H', C' and H', and C and H are A_{oc}, A_{cc'}, A_{ch'}, A_{c'h'}, and A_{ch}, respectively, where A_{oc} and A_{ch'} are known data and A_{ch} is the hand-eye transformation to be solved. Then X'_c and X'_o, and X_c and X_o, satisfy the following relations:

X'_o = A_{oc} X'_c \quad (5)

X_o = A_{oc} X_c \quad (6)

According to Proposition 1:

X_o = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} X'_o = B X'_o, \quad B = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (7)
Substituting formulas (5) and (6) into (7) and rearranging gives:

X_c = A_{oc}^{-1} B A_{oc} X'_c \quad (8)

Let X_{c'} denote the homogeneous coordinates of X in the coordinate system C'. By Proposition 2, X'_c = X_{c'}, so formula (8) becomes:

X_c = A_{oc}^{-1} B A_{oc} X_{c'} \quad (9)

Hence the transformation A_{cc'} between the camera coordinate systems C and C' is:

A_{cc'} = A_{oc}^{-1} B A_{oc} \quad (10)
From the transformation relations between the coordinate systems in Fig. 2:

A_{c'h'} = A_{cc'}^{-1} A_{ch'} \quad (11)

In addition, by Proposition 2:

A_{ch} = A_{c'h'} \quad (12)

Substituting formulas (10) and (11) into formula (12) gives:

A_{ch} = A_{oc}^{-1} B A_{oc} A_{ch'} \quad (13)

Thus the hand-eye relation A_{ch} is uniquely determined by formula (13).
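Numerically, formula (13) is a one-line composition. The sketch below is illustrative only: `A_oc` and `A_ch_virtual` stand in for measured transforms, and the numeric values in the usage are hypothetical. It also makes Proposition 1 visible, since the rotational part of A_{cc'} = A_{oc}^{-1} B A_{oc} has determinant -1:

```python
import numpy as np

def rt(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    A = np.eye(4)
    A[:3, :3] = R
    A[:3, 3] = t
    return A

# B of formula (7): reflection across the mirror plane (z = 0 of the mirror frame O)
B = np.diag([1.0, 1.0, -1.0, 1.0])

def hand_eye_from_mirror(A_oc, A_ch_virtual):
    """Formula (13): A_ch = A_oc^{-1} B A_oc A_ch', i.e. the real hand-eye
    transform recovered from the mirror pose A_oc and the camera-to-virtual-hand
    transform A_ch' observed through the mirror."""
    return np.linalg.inv(A_oc) @ B @ A_oc @ A_ch_virtual
```

When the mirror frame coincides with the camera frame (A_oc = I), formula (13) reduces to A_ch = B A_ch', i.e. the virtual hand pose simply reflected back through the mirror plane.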
As shown in Fig. 3, the present invention needs no extra measuring instrument; only a mirror and several beacons are required. Its main features are the following. First, the principle of mirror reflection is used to transform hand information outside the field of view of the end camera into the camera coordinate system, completing the hand-eye calibration. Second, the end camera carried by the robot is used as the measuring instrument, the checkerboard (beacon) on the spatial mirror surface is used as the measuring target, and fixed points within the spatial field of view serve as target points; the target points are photographed during the rotation of each single joint. The camera intrinsic parameters are first calibrated from these images, and the 3-D spatial coordinates of the target points are then recovered. The intrinsic parameters of the end binocular camera are calibrated using the planar method proposed by Zhang Zhengyou, and the robot joint parameters are calibrated as follows:
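The closed-form core of Zhang Zhengyou's planar method can be sketched as follows. This is an illustrative reconstruction with synthetic, noise-free homographies, not the patent's implementation: the intrinsic values and plane poses in the test are hypothetical, and a real pipeline would first estimate the homographies from checkerboard corners and finish with nonlinear refinement.

```python
import numpy as np

def _v(H, i, j):
    """Row v_ij of Zhang's linear system, built from columns i and j of H."""
    h_i, h_j = H[:, i], H[:, j]
    return np.array([
        h_i[0] * h_j[0],
        h_i[0] * h_j[1] + h_i[1] * h_j[0],
        h_i[1] * h_j[1],
        h_i[2] * h_j[0] + h_i[0] * h_j[2],
        h_i[2] * h_j[1] + h_i[1] * h_j[2],
        h_i[2] * h_j[2],
    ])

def intrinsics_from_homographies(Hs):
    """Zhang's closed-form intrinsics: each plane homography H = K [r1 r2 t]
    gives two linear constraints on B = K^{-T} K^{-1}; solve for B via SVD,
    then factor the intrinsic matrix K out of B."""
    V = []
    for H in Hs:
        V.append(_v(H, 0, 1))                 # h1^T B h2 = 0
        V.append(_v(H, 0, 0) - _v(H, 1, 1))   # h1^T B h1 = h2^T B h2
    b = np.linalg.svd(np.asarray(V))[2][-1]   # null vector (B11,B12,B22,B13,B23,B33)
    if b[0] < 0:
        b = -b                                 # B is defined up to scale; fix sign
    B11, B12, B22, B13, B23, B33 = b
    v0 = (B12 * B13 - B11 * B23) / (B11 * B22 - B12 ** 2)
    lam = B33 - (B13 ** 2 + v0 * (B12 * B13 - B11 * B23)) / B11
    alpha = np.sqrt(lam / B11)
    beta = np.sqrt(lam * B11 / (B11 * B22 - B12 ** 2))
    gamma = -B12 * alpha ** 2 * beta / lam
    u0 = gamma * v0 / alpha - B13 * alpha ** 2 / lam
    return np.array([[alpha, gamma, u0], [0.0, beta, v0], [0.0, 0.0, 1.0]])
```

Each homography contributes two constraints, so at least three views of the plane in general position are needed to determine the five intrinsic parameters.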
Suppose the points measured by the end camera during the rotation of the i-th joint are denoted x_{ij} \in R^{4 \times num}, j = 1, ..., m_i, given as homogeneous 3-D spatial coordinates, where m_i is the number of camera measurements during the rotation of the i-th joint and num is the number of measured spatial points. The joint-parameter equation is:

x_{i,a} = e^{\theta_{ij} \hat{\xi}_{ij}} x_{i,b}, \quad i = 1, ..., n, \quad j = 1, 2, 3, ..., m_i, \quad a, b \in \{1, ..., m_i\} \quad (14)

where x_{i,a} and x_{i,b} are the homogeneous coordinate values of any two of the measurement groups in x_i, and the twist parameters \xi_{ij} are obtained directly from this formula. Since x_i contains m_i groups of measurement data and every two groups yield one solution, pairwise combination yields C_{m_i}^2 solutions, which are averaged to give the final joint parameter:

\xi_i = \frac{1}{C_{m_i}^2} \sum_{j=1}^{C_{m_i}^2} \xi_{ij} \quad (15)
Technical details not addressed above are implemented by adopting or adapting the prior art.
It should be noted that, under the guidance of this specification, those skilled in the art can make various straightforward variations, such as equivalent or obviously modified forms; all such variations fall within the protection scope of the present invention.

Claims (2)

1. A fully automatic calibration method for a hand-eye robot based on the product-of-exponentials model, characterized by comprising the following steps:
a. Provide an industrial robot, an end binocular camera, one plane mirror, and several beacons. The end binocular camera is detachably mounted on the robot end; beacons are pasted on the mirror surface of the plane mirror and at the tool control point of the robot end. Then enter step b.
b. Determine the initial position of the robot, and select a suitable position to place the plane mirror with its beacons within the field of view of the end binocular camera. Use the end binocular camera to acquire one mirror image of the plane mirror; this image contains the beacon information on the mirror surface and the beacon information of the tool control point of the robot end reflected in the mirror, and is used to calibrate the hand-eye relation. Then enter step c.
c. Starting from the last joint, rotate the joints one by one under control, and during the rotation of each joint use the end binocular camera to capture one image of the mirror-surface beacons at fixed angular intervals until the mirror leaves the field of view of the end binocular camera. Suppose m_i images are obtained during the motion of the i-th joint, i = 1, ..., n, where n is the number of robot joints; the \sum_{i=1}^{n} m_i images are used to calibrate the end binocular camera and the robot joint parameters.
2. The fully automatic calibration method for a hand-eye robot based on the product-of-exponentials model according to claim 1, characterized in that:
In the above step a, the product-of-exponentials kinematic model of the robot is set as:

g_{wt}(\theta) = e^{\theta_1 \hat{\xi}_1} e^{\theta_2 \hat{\xi}_2} \cdots e^{\theta_n \hat{\xi}_n}

where g_{wt}(\theta) is the transformation of the end-tool coordinate system T relative to the world coordinate system W when the robot is at \theta = (\theta_1, \theta_2, \theta_3, ..., \theta_n), and e^{\theta_i \hat{\xi}_i} is the matrix exponential of the motion twist;
In the above step b, the hand-eye transformation to be solved is denoted A_{ch}, where A_{oc} and A_{ch'} are known data; A_{oc} represents the coordinate transformation between the mirror surface and the end binocular camera, and A_{oc}^{-1} is its inverse matrix; A_{ch'} is the coordinate transformation between the end binocular camera and the virtual image of the robot end; A_{oc} and A_{ch'} can be obtained directly by measurement; B = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix};
In the above step c, suppose the points measured by the end binocular camera during the rotation of the i-th joint are denoted x_{ij} \in R^{4 \times num}, i = 1, ..., n, j = 1, ..., m_i, given as homogeneous 3-D spatial coordinates, where m_i is the number of measurements by the end binocular camera during the rotation of the i-th joint and num is the number of measured spatial points; the joint-parameter equation is:

x_{i,a} = e^{\theta_{ij} \hat{\xi}_{ij}} x_{i,b}, \quad i = 1, ..., n, \quad j = 1, 2, 3, ..., m_i, \quad a, b \in \{1, ..., m_i\},

where x_{i,a} and x_{i,b} are the homogeneous coordinate values of any two of the measurement groups in x_i, and the twist parameters \xi_{ij} are obtained directly from this formula; since x_i contains m_i groups of measurement data and every two groups yield one solution, pairwise combination yields C_{m_i}^2 solutions, which are averaged to give the final joint parameter:

\xi_i = \frac{1}{C_{m_i}^2} \sum_{j=1}^{C_{m_i}^2} \xi_{ij}

where \xi_i is the motion twist coordinate.
CN201310030955.7A 2013-01-28 2013-01-28 Fully-automatic calibration method for hand-eye robot based on exponential product model Active CN103115615B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310030955.7A CN103115615B (en) 2013-01-28 2013-01-28 Fully-automatic calibration method for hand-eye robot based on exponential product model


Publications (2)

Publication Number Publication Date
CN103115615A CN103115615A (en) 2013-05-22
CN103115615B true CN103115615B (en) 2015-01-21

Family

ID=48414035

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310030955.7A Active CN103115615B (en) 2013-01-28 2013-01-28 Fully-automatic calibration method for hand-eye robot based on exponential product model

Country Status (1)

Country Link
CN (1) CN103115615B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9211643B1 (en) 2014-06-25 2015-12-15 Microsoft Technology Licensing, Llc Automatic in-situ registration and calibration of robotic arm/sensor/workspace system
CN105598957A (en) * 2016-01-27 2016-05-25 国机集团科学技术研究院有限公司 Industrial robot kinematic modelling method and system
CN106113035B (en) * 2016-06-16 2018-04-24 华中科技大学 A kind of Six-DOF industrial robot end-of-arm tooling coordinate system caliberating device and method
CN107369184B (en) * 2017-06-23 2020-02-28 中国科学院自动化研究所 Synchronous calibration method for hybrid binocular industrial robot system and other devices
CN110246193B (en) * 2019-06-20 2021-05-14 南京博蓝奇智能科技有限公司 Industrial robot end camera online calibration method
CN110370316B (en) * 2019-06-20 2021-12-10 重庆大学 Robot TCP calibration method based on vertical reflection
CN111829492B (en) * 2020-07-24 2021-11-30 中交第二航务工程局有限公司 Laser plummet application-based contact measurement method
CN112809668B (en) * 2020-12-30 2022-08-30 上海媒智科技有限公司 Method, system and terminal for automatic hand-eye calibration of mechanical arm
CN112907673B (en) * 2021-03-19 2021-10-22 深圳创维-Rgb电子有限公司 Positioning method, positioning device, terminal equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101096101A (en) * 2006-06-26 2008-01-02 北京航空航天大学 Robot foot-eye calibration method and device
CN101630409A (en) * 2009-08-17 2010-01-20 北京航空航天大学 Hand-eye vision calibration method for robot hole boring system
CN101733749A (en) * 2009-12-22 2010-06-16 哈尔滨工业大学 Multidomain uniform modeling and emulation system of space robot
CN102022989A (en) * 2010-09-29 2011-04-20 山东科技大学 Robot calibration method based on exponent product model
EP2402124A2 (en) * 2010-06-30 2012-01-04 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method and device for determining structural parameters of a robot

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A practical positioning method for hand-eye vision systems; Wang Haixia et al.; Computer Engineering and Applications; 2007-12-31; Vol. 43, No. 24; pp. 235-238 *
A fast hand-eye calibration method for dual-robot systems; Wei Zhenzhong et al.; Optics and Precision Engineering; 2011-08-31; Vol. 19, No. 8; pp. 1895-1902 *

Also Published As

Publication number Publication date
CN103115615A (en) 2013-05-22


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C53 Correction of patent of invention or patent application
CB03 Change of inventor or designer information

Inventor after: Wang Haixia

Inventor after: Lu Xiao

Inventor after: Li Yuxia

Inventor after: Fan Binghui

Inventor before: Wang Haixia

Inventor before: Lu Xiao

COR Change of bibliographic data

Free format text: CORRECT: INVENTOR; FROM: WANG HAIXIA LU XIAO TO: WANG HAIXIA LU XIAO LI YUXIA FAN BINGHUI

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20220608

Address after: 266590 room 402, industrialization building, 579 qianwangang Road, Huangdao District, Qingdao City, Shandong Province

Patentee after: Qingdao Zhuo Xintong Intelligent Technology Co.,Ltd.

Address before: 266590 No. 579, Qian Wan Gang Road, Qingdao economic and Technological Development Zone, Shandong

Patentee before: SHANDONG University OF SCIENCE AND TECHNOLOGY