CN1686682A - Adaptive motion selection method used for robot on line hand eye calibration - Google Patents

Adaptive motion selection method used for robot on line hand eye calibration

Info

Publication number
CN1686682A
Authority
CN
China
Prior art keywords
motion
hand-eye
new
utilize
hand
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN 200510025780
Other languages
Chinese (zh)
Inventor
张婧
石繁槐
王建华
刘允才
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN 200510025780 priority Critical patent/CN1686682A/en
Publication of CN1686682A publication Critical patent/CN1686682A/en
Pending legal-status Critical Current

Landscapes

  • Manipulator (AREA)

Abstract

An adaptive motion selection method for online robot hand-eye calibration includes: using the parameters of the first five motions of the robot hand to calculate the initial values of three thresholds for motion selection; performing online hand-eye calibration on the basis of motion selection; calculating the rotation error and translation error; adaptively changing the three thresholds; performing motion selection and hand-eye calibration again; and cyclically repeating the above steps.

Description

Adaptive motion selection method for online robot hand-eye calibration
Technical field
The present invention relates to an adaptive motion selection method for online robot hand-eye calibration, which can be widely applied in robot three-dimensional vision measurement, visual servoing, tactile perception and related fields. It belongs to the field of advanced manufacturing and automation.
Background art
Robot hand-eye calibration is a fundamental problem in machine vision, and its accuracy largely determines the precision of machine vision applications on a robot. Hand-eye calibration consists of measuring the relative position and orientation between the robot hand and a camera fixed on the end effector (robot hand) of the robot arm. Most earlier methods solved this kind of problem by applying iterative optimization algorithms to the homogeneous transformation equation AX=XB (Y.C. Shiu and S. Ahmad, "Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX=XB," IEEE Trans. Robot. Automat., vol. 5, pp. 16-29, Feb. 1989), where A denotes the motion of the robot hand, B denotes the corresponding camera motion, and X is the spatial transformation between the camera and the robot hand to be calibrated. Because the iterative optimization cannot be computed in real time, hand-eye calibration in this case can only be performed offline. Later, Angeles et al. (J. Angeles, G. Soucy and F.P. Ferrie, "The online solution of the hand-eye problem," IEEE Trans. Robot. Automat., vol. 16, pp. 720-731, Dec. 2000) and Andreff et al. (N. Andreff, R. Horaud, and B. Espiau, "Robot hand-eye calibration using structure-from-motion," Int. J. Robot. Res., 20(3): 228-248, 2001) almost simultaneously proposed online hand-eye calibration techniques, overcoming the drawback that conventional methods cannot calibrate in real time.
No matter which method is adopted for hand-eye calibration, computing the hand-eye relation requires at least two independent robot motions, and the rotation axes of the two motions must not be parallel. Therefore, when the collected hand-eye motion data contain degenerate cases, such as pure rotation or pure translation, the exact solution of the hand-eye transformation cannot be obtained. During online hand-eye calibration, however, the motion of the robot is determined by the concrete application rather than designed for hand-eye calibration, so the motion data collected for calibration are likely to contain degenerate cases. In addition, when the rotation angle of a sampled motion is very small, its translation is large, or the angle between the rotation axes of two motions is very small, the final calibration result will have large errors. To avoid these situations, the Chinese invention patent "Online robot hand-eye calibration method based on motion selection" (application number 200510025252.0) proposed improving online hand-eye calibration through motion selection. That method first sets, from experience, three thresholds for motion selection: the minimum threshold α of the angle between the rotation axes of two successive robot hand motions, the minimum threshold β of the rotation angle of each robot hand motion, and the maximum threshold d of the norm of the translational component of each robot hand motion. Starting from the first sampled hand-eye motion, two hand-eye motion pairs satisfying the thresholds are selected in turn for computing the hand-eye transformation. This method greatly improves the accuracy of online hand-eye calibration. Its shortcoming, however, is that α, β and d are set empirically without considering the characteristics of the motion sequence itself, so the method is not robust to different situations; moreover, once set, the thresholds never change, and if they are set improperly the number of calibrations may be too small.
Summary of the invention
The object of the present invention is to address the deficiencies of the prior art by providing an adaptive motion selection method for online robot hand-eye calibration that can, according to the application at hand, automatically compute the thresholds used for motion selection, increase the number of online calibrations, and improve the accuracy of robot hand-eye calibration.
Technical scheme of the present invention: make full use of the motion characteristics reflected by the hand-eye motion data sequence sampled during online calibration to determine the thresholds adaptively. First, the parameters of the first five motions of the robot hand are used to compute the initial values of the three thresholds used for motion selection; online hand-eye calibration based on motion selection is then carried out. After each calibration, the rotation error and translation error are computed and the three thresholds are changed adaptively according to these errors. The new thresholds are then used to perform motion selection and hand-eye calibration again. Repeating this cycle, the robot's online hand-eye calibration can be carried out continuously.
The adaptive online hand-eye calibration method of the present invention specifically comprises the following steps:
1. Use the first five motions of the robot hand to set the three motion selection thresholds, namely sin(α), β and d, as the mean values of the sine of the angle between the rotation axes of every two successive motions, the rotation angle of each motion, and the norm of the translational component of each motion, respectively;
2. Search for and select, according to the motion selection algorithm, a hand-eye motion pair satisfying the β and d threshold conditions. If motion selection has been carried out five times in a row without satisfying the thresholds, reduce β and increase d, then continue the search with the new thresholds. Repeat this cycle until a qualified hand-eye motion pair is found as the first hand-eye motion pair (A', B');
3. Search for and select, according to the motion selection algorithm, a hand-eye motion pair satisfying the sin(α), β and d threshold conditions. If motion selection has been carried out five times in a row without satisfying the thresholds, reduce sin(α) and β and increase d, then continue the search with the new thresholds. Repeat this cycle until a qualified second hand-eye motion pair (A'', B'') is found;
4. Use the two hand-eye motion pairs (A', B') and (A'', B'') with the Andreff linear hand-eye calibration equations to compute the hand-eye transformation matrix X, obtaining one hand-eye calibration result;
5. Compute the root-mean-square errors of the rotation and translation components of the hand-eye transformation matrix X and record the number of calibrations. If fewer than 5 calibrations have been performed, change the thresholds by increasing sin(α) and β and reducing d, take the second hand-eye motion pair (A'', B'') used in the last calibration as the new first hand-eye motion pair (A', B'), continue searching the subsequent sampled data with the changed thresholds using the method of step 3 to select a new second hand-eye motion pair (A'', B''), and perform a new hand-eye calibration using the method of step 4, until the number of calibrations reaches 5 or more;
6. Use the rotation and translation root-mean-square errors of the five groups of hand-eye transformation matrices X obtained in the preceding calibrations, together with the five groups of thresholds, and apply cubic polynomial regression to predict new thresholds. Take the second hand-eye motion pair (A'', B'') used in the last calibration as the new first hand-eye motion pair (A', B'), continue searching the subsequent sampled data with the new thresholds using the method of step 3 to select a new second hand-eye motion pair (A'', B''), and perform a new hand-eye calibration using the method of step 4. Repeat this step to carry out online calibration.
In practical application, the three thresholds can be set adaptively and the motion selection and hand-eye calibration computations can be carried out automatically by software. The motion selection method proposed by the present invention sets the three motion selection thresholds adaptively according to the kinematic properties of the motion sequence itself, which increases the number of calibrations performed by motion-selection-based online hand-eye calibration and at the same time improves the calibration accuracy.
The adaptive motion selection method proposed by the present invention can be widely applied in robot three-dimensional vision measurement, visual servoing, tactile perception and other fields, and has considerable practical value.
Description of drawings
Fig. 1 is a schematic diagram of the robot hand-eye calibration model of the present invention.
Fig. 2 is a schematic diagram of the motion selection algorithm for online hand-eye calibration of the present invention.
Specific embodiments
To better explain the technical scheme of the present invention, it is described in further detail below in conjunction with the drawings and embodiments.
1. Use the first five motions of the robot hand to set the three motion selection thresholds sin(α), β and d as the mean values of the sine of the angle between the rotation axes of every two successive motions, the rotation angle of each motion, and the norm of the translational component of each motion. Let P_i and Q_i denote the homogeneous matrices of the camera pose and the robot hand pose from the i-th sampling (i a natural number), respectively, and let X be the homogeneous matrix of the hand-eye transformation to be solved.
$$(\sin\alpha,\ \beta,\ d) = \frac{1}{5}\left(\sum_{n=1}^{5}\sin\alpha_n,\ \sum_{n=1}^{5}\beta_n,\ \sum_{n=1}^{5}d_n\right)$$
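For illustration only (the following sketch is not part of the patent text), step 1 could be implemented as below in Python with NumPy, assuming the first six sampled hand poses Q_1 to Q_6 are available as 4x4 homogeneous matrices; the function and variable names are hypothetical.

    import numpy as np

    def rotation_angle_axis(R):
        # Rotation angle and unit axis of a 3x3 rotation matrix (valid away from 0 and pi).
        angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
        w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
        norm = np.linalg.norm(w)
        axis = w / norm if norm > 1e-12 else np.array([1.0, 0.0, 0.0])
        return angle, axis

    def initial_thresholds(hand_poses):
        # hand_poses: the first six 4x4 hand poses Q_1..Q_6, giving five motions A_1..A_5.
        motions = [np.linalg.inv(hand_poses[k]) @ hand_poses[k + 1] for k in range(5)]
        betas, ds, axes = [], [], []
        for A in motions:
            angle, axis = rotation_angle_axis(A[:3, :3])
            betas.append(angle)                      # rotation angle of the motion
            ds.append(np.linalg.norm(A[:3, 3]))      # norm of the translational component
            axes.append(axis)
        # Sine of the angle between the rotation axes of successive motions.
        sines = [np.linalg.norm(np.cross(axes[k], axes[k + 1])) for k in range(len(axes) - 1)]
        return float(np.mean(sines)), float(np.mean(betas)), float(np.mean(ds))

Whether the axis-angle sines are averaged over four successive pairs or five values is not pinned down by the formula above, so the indexing here is an assumption.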
2. According to the motion selection algorithm, search for and select a hand-eye motion pair (A', B') satisfying the β and d threshold conditions. First let start = 6; from the 6th and 7th sampled hand-eye poses, the homogeneous matrix A_1 of the first robot hand motion and the homogeneous matrix B_1 of the corresponding camera motion are obtained. As shown in Fig. 1, P_1 and P_2 are the homogeneous matrices of the camera pose with respect to the calibration reference at times 1 and 2, and Q_1 and Q_2 are the homogeneous matrices of the robot hand pose in the robot base coordinate system at times 1 and 2. Then:
$$A_1 = Q_{\mathrm{start}}^{-1} Q_7, \qquad B_1 = P_{\mathrm{start}}^{-1} P_7$$
Let (A', B') = (A_1, B_1). If the rotation angle of A' is greater than or equal to β and the norm of the translational component of A' is less than or equal to d, the current motion pair (A', B') satisfies the given threshold conditions; otherwise, recompute the hand-eye motion pair (A', B') from the 6th and 8th sampled hand-eye poses and again check whether the rotation angle of A' and the norm of its translation satisfy the above conditions. If motion selection has been carried out five times in a row, i.e. A' = Q_start^{-1} Q_i with i - start = 5, and the thresholds still cannot be satisfied, reduce β and increase d by multiplying β by 0.8 and d by 1.2, and then set start = i. Continue the search with the new thresholds, repeating this cycle until a qualified hand-eye motion pair is found as the first hand-eye motion pair (A', B'). Suppose that at this point (A', B') has been computed from hand-eye poses sampled up to the (i+1)-th sample (i.e. i hand-eye motions have elapsed). Fig. 2 shows the complete procedure for finding the first hand-eye motion pair that satisfies the conditions.
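A minimal sketch of the first-pair search just described, including the relaxation of β and d after five consecutive failures (again an assumption about one possible implementation, not the inventors' code; it reuses rotation_angle_axis from the earlier sketch and treats the pose lists Q and P as 0-indexed Python lists):

    import numpy as np

    def select_first_pair(Q, P, beta, d, start, max_tries=5):
        # Q, P: lists of 4x4 hand and camera poses; start: index of the anchor pose.
        i = start + 1
        while i < len(Q):
            A = np.linalg.inv(Q[start]) @ Q[i]
            B = np.linalg.inv(P[start]) @ P[i]
            angle, _ = rotation_angle_axis(A[:3, :3])
            if angle >= beta and np.linalg.norm(A[:3, 3]) <= d:
                return (A, B), i, beta, d                # qualifying pair (A', B')
            if i - start >= max_tries:                   # five consecutive failures
                beta, d, start = beta * 0.8, d * 1.2, i  # relax thresholds, move the anchor
            i += 1
        raise RuntimeError("sampled poses exhausted before a qualifying pair was found")

The second-pair search of step 3 follows the same pattern, with an additional test that the sine of the angle between the rotation axes of A' and the candidate motion is at least sin(α).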
3. Starting from the (i+1)-th sampled hand-eye pose (i.e. after the i-th motion), search for and select, according to the motion selection algorithm, a hand-eye motion pair (A'', B'') satisfying the sin(α), β and d threshold conditions, as shown in Fig. 2. First let start = i; from the (i+1)-th and (i+2)-th hand-eye poses, the homogeneous matrix A_{i+1} of the robot hand motion and the homogeneous matrix B_{i+1} of the camera motion are obtained:
$$A_{i+1} = Q_{i+1}^{-1} Q_{i+2}, \qquad B_{i+1} = P_{i+1}^{-1} P_{i+2}$$
If (A ", B ")=(A I+1, B I+1).If the mould of the translational component of A " the anglec of rotation more than or equal to β, A " is smaller or equal to d and A ' and A " rotating shaft between the sine value of angle more than or equal to sin (α); then current motion to (A ", B ") satisfies given threshold condition; otherwise again by the trick Attitude Calculation trick motion of the i+1 time and the i+3 time sampling to (whether the sine value of the angle between the rotating shaft of the anglec of rotation of A ", B ") also judges A ", A mould and the A ' and the A of translational component " " satisfies above condition.If motion is selected to have carried out continuously 5 times, promptly A ′ ′ = Q start - 1 Q i , I-start=5 still can not satisfy threshold value, then reduces α sine value and β, increases d, makes sin that (α) multiply by 1.2, and β multiply by 0.8, makes d multiply by 1.2.start=i。Utilize new threshold value to continue to seek then, so move in circles, until find qualified second trick motion to (A ", B ").
4. Use the two hand-eye motion pairs (A', B') and (A'', B'') with the Andreff linear hand-eye calibration equations to compute the hand-eye transformation matrix X, obtaining one hand-eye calibration result.
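The patent relies on Andreff's linear formulation of AX = XB; the sketch below is one common way to set up and solve that linear system from the two selected motion pairs (a reconstruction under standard assumptions about the column-major vec operator and the identity vec(MXN) = (N^T kron M) vec(X), not the patented implementation itself):

    import numpy as np

    def solve_hand_eye_linear(pairs):
        # pairs: [(A', B'), (A'', B'')], 4x4 homogeneous motions satisfying A X = X B.
        # Rotation part: (I3 kron RA - RB^T kron I3) vec(RX) = 0 with column-major vec.
        M = np.vstack([np.kron(np.eye(3), A[:3, :3]) - np.kron(B[:3, :3].T, np.eye(3))
                       for A, B in pairs])
        _, _, Vt = np.linalg.svd(M)
        Rx = Vt[-1].reshape(3, 3, order="F")        # null-space vector back into a 3x3 matrix
        # The null vector is only defined up to scale and sign: rescale so det(Rx) = +1,
        # then project onto the nearest rotation matrix.
        det = np.linalg.det(Rx)
        Rx = Rx * (np.sign(det) / abs(det) ** (1.0 / 3.0))
        U, _, Wt = np.linalg.svd(Rx)
        Rx = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Wt)]) @ Wt
        # Translation part: (RA - I3) tX = RX tB - tA, stacked over both pairs (least squares).
        C = np.vstack([A[:3, :3] - np.eye(3) for A, _ in pairs])
        g = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in pairs])
        tX, *_ = np.linalg.lstsq(C, g, rcond=None)
        X = np.eye(4)
        X[:3, :3], X[:3, 3] = Rx, tX
        return X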
5. Compute the root-mean-square errors of the rotation and translation components of the hand-eye transformation matrix X and record the number of calibrations. If fewer than 5 calibrations have been performed, change the thresholds by increasing sin(α) and β and reducing d: multiply sin(α) and β by 1.2 and d by 0.8. Take the second hand-eye motion pair (A'', B'') used in the last calibration as the new first hand-eye motion pair (A', B'); with the changed thresholds, continue searching the subsequent sampled data as in step 3 to select a new second hand-eye motion pair (A'', B''), and perform a new hand-eye calibration as in step 4, until the number of calibrations reaches 5 or more.
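The text does not spell out what quantity the rotation and translation root-mean-square errors are computed over; one plausible reading, used in the sketch below together with the step-5 threshold tightening, is the RMS residual of AX = XB over the motion pairs used in the calibration (both the interpretation and the names are assumptions):

    import numpy as np

    def calibration_errors(X, pairs):
        # RMS residuals of A X = X B, split into rotation and translation parts.
        Rx, tX = X[:3, :3], X[:3, 3]
        rot_res, trans_res = [], []
        for A, B in pairs:
            rot_res.append(np.linalg.norm(A[:3, :3] @ Rx - Rx @ B[:3, :3]))
            trans_res.append(np.linalg.norm((A[:3, :3] - np.eye(3)) @ tX
                                            - (Rx @ B[:3, 3] - A[:3, 3])))
        return (float(np.sqrt(np.mean(np.square(rot_res)))),
                float(np.sqrt(np.mean(np.square(trans_res)))))

    def tighten_thresholds(sin_alpha, beta, d):
        # Step 5: while fewer than five calibrations have been done, demand larger rotation
        # angles and axis angles and smaller translations in the next selection round.
        return sin_alpha * 1.2, beta * 1.2, d * 0.8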
6. Use the rotation and translation root-mean-square errors of the five groups of hand-eye transformation matrices X obtained in the preceding calibrations, together with the five groups of thresholds, and apply cubic polynomial regression to predict new thresholds. Take the second hand-eye motion pair (A'', B'') used in the last calibration as the new first hand-eye motion pair (A', B'); with the new thresholds, continue searching the subsequent sampled data as in step 3 to select a new second hand-eye motion pair (A'', B''), and perform a new hand-eye calibration as in step 4. Repeat this step to carry out online calibration.
The steps for predicting the new thresholds by cubic polynomial regression are as follows:
(1) Use the first four groups of rotation-error data and the corresponding sin(α) values with the cubic polynomial regression algorithm to obtain the curve relation between the rotation error and sin(α). The cubic polynomial regression model is:
$$y = b_0 + b_1 x + b_2 x^2 + b_3 x^3$$
where y is the threshold sin(α), x is the rotation error, and b_0, b_1, b_2, b_3 are the coefficients to be computed. After computing b_0, b_1, b_2 and b_3, set x equal to the fifth group's rotation error and compute y, which is the new sin(α) value.
(2) Using a method similar to that for sin(α), first compute one group of coefficients b_0, b_1, b_2, b_3 from the first four groups of rotation errors and the corresponding β values, and use the fifth group's rotation error to predict a new threshold β_1. Then compute another group of coefficients b_0', b_1', b_2', b_3' from the first four groups of translation errors and the corresponding β values, and use the fifth group's translation error to predict a new threshold β_2. Let β = 3·β_1 + β_2.
(3) Using a method similar to that for sin(α), compute one group of coefficients b_0, b_1, b_2, b_3 from the first four groups of translation-error data and the corresponding d values, obtaining the curve relation between the translation error and d. Then use the fifth group's rotation error to predict the new d value.
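Putting sub-steps (1) to (3) together, the step-6 prediction could look like the sketch below; it follows the text literally, including the β = 3β_1 + β_2 combination and the evaluation of the d curve at the fifth rotation error, and assumes the five most recent errors and thresholds are kept in lists (all names are hypothetical):

    import numpy as np

    def predict_thresholds(rot_err, trans_err, sin_a_hist, beta_hist, d_hist):
        # Each argument holds the five most recent values, oldest first.
        def cubic_predict(x_hist, y_hist, x_new):
            # Fit y = b0 + b1*x + b2*x^2 + b3*x^3 to the first four samples, evaluate at x_new.
            coeffs = np.polyfit(x_hist[:4], y_hist[:4], 3)   # highest power first
            return float(np.polyval(coeffs, x_new))

        sin_a_new = cubic_predict(rot_err, sin_a_hist, rot_err[4])   # sub-step (1)
        beta1 = cubic_predict(rot_err, beta_hist, rot_err[4])        # sub-step (2), rotation-error fit
        beta2 = cubic_predict(trans_err, beta_hist, trans_err[4])    # sub-step (2), translation-error fit
        beta_new = 3.0 * beta1 + beta2                               # combination as stated in (2)
        # Sub-step (3): the d curve is fitted on translation errors but, per the text,
        # evaluated at the fifth rotation error.
        d_new = cubic_predict(trans_err, d_hist, rot_err[4])
        return sin_a_new, beta_new, d_new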

Claims (1)

1. An adaptive motion selection method for online robot hand-eye calibration, characterized by comprising the following concrete steps:
1) using the first five motions of the robot hand to set the three motion selection thresholds, namely sin(α), β and d, as the mean values of the sine of the angle between the rotation axes of every two successive motions, the rotation angle of each motion, and the norm of the translational component of each motion, respectively;
2) searching for and selecting, according to the motion selection algorithm, a hand-eye motion pair satisfying the β and d threshold conditions; if motion selection has been carried out five times in a row without satisfying the thresholds, reducing β and increasing d, then continuing the search with the new thresholds; repeating this cycle until a qualified hand-eye motion pair is found as the first hand-eye motion pair (A', B');
3) searching for and selecting, according to the motion selection algorithm, a hand-eye motion pair satisfying the sin(α), β and d threshold conditions; if motion selection has been carried out five times in a row without satisfying the thresholds, reducing sin(α) and β and increasing d, then continuing the search with the new thresholds; repeating this cycle until a qualified second hand-eye motion pair (A'', B'') is found;
4) using the two hand-eye motion pairs (A', B') and (A'', B'') with the Andreff linear hand-eye calibration equations to compute the hand-eye transformation matrix X, obtaining one hand-eye calibration result;
5) computing the root-mean-square errors of the rotation and translation components of the hand-eye transformation matrix X and recording the number of calibrations; if fewer than 5 calibrations have been performed, changing the thresholds by increasing sin(α) and β and reducing d, taking the second hand-eye motion pair (A'', B'') used in the last calibration as the new first hand-eye motion pair (A', B'), continuing to search the subsequent sampled data with the changed thresholds using the method of step 3) to select a new second hand-eye motion pair (A'', B''), and performing a new hand-eye calibration using the method of step 4), until the number of calibrations is greater than or equal to 5;
6) using the rotation and translation root-mean-square errors of the five groups of hand-eye transformation matrices X obtained in the preceding calibrations, together with the five groups of thresholds, and applying cubic polynomial regression to predict new thresholds; taking the second hand-eye motion pair (A'', B'') used in the last calibration as the new first hand-eye motion pair (A', B'), continuing to search the subsequent sampled data with the new thresholds using the method of step 3) to select a new second hand-eye motion pair (A'', B''), and performing a new hand-eye calibration using the method of step 4); repeating this step to carry out online calibration.
CN 200510025780 2005-05-12 2005-05-12 Adaptive motion selection method used for robot on line hand eye calibration Pending CN1686682A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200510025780 CN1686682A (en) 2005-05-12 2005-05-12 Adaptive motion selection method used for robot on line hand eye calibration

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 200510025780 CN1686682A (en) 2005-05-12 2005-05-12 Adaptive motion selection method used for robot on line hand eye calibration

Publications (1)

Publication Number Publication Date
CN1686682A true CN1686682A (en) 2005-10-26

Family

ID=35304724

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200510025780 Pending CN1686682A (en) 2005-05-12 2005-05-12 Adaptive motion selection method used for robot on line hand eye calibration

Country Status (1)

Country Link
CN (1) CN1686682A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100408279C (en) * 2006-06-26 2008-08-06 北京航空航天大学 Robot foot-eye calibration method and device
CN104842371A (en) * 2015-05-29 2015-08-19 山东大学 Robot hand-eye calibration method based on non-minimized and non-optimized algorithm
CN108942934A (en) * 2018-07-23 2018-12-07 珠海格力电器股份有限公司 Determine the method and device of hand and eye calibrating
CN109159114A (en) * 2018-08-16 2019-01-08 郑州大学 The accuracy method of SCARA manipulator fixed camera vision system hand and eye calibrating
CN117103286A (en) * 2023-10-25 2023-11-24 杭州汇萃智能科技有限公司 Manipulator eye calibration method and system and readable storage medium
CN117103286B (en) * 2023-10-25 2024-03-19 杭州汇萃智能科技有限公司 Manipulator eye calibration method and system and readable storage medium


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication