CN105157725A - Hand-eye calibration method employing two-dimension laser vision sensor and robot - Google Patents


Info

Publication number: CN105157725A (application CN201510460526.2A; granted publication CN105157725B)
Authority: CN (China)
Application number: CN201510460526.2A
Other languages: Chinese (zh)
Other versions: CN105157725B (en)
Inventors: 张铁, 李波, 邹焱飚
Original assignee: 华南理工大学 (South China University of Technology)
Application filed by 华南理工大学; priority to CN201510460526.2A; application granted and published as CN105157725B


Abstract

A hand-eye calibration method employing a two-dimensional laser vision sensor and a robot comprises the following steps: step A, establishing the mathematical model of the calibration algorithm; step B, formulating the implementation steps of the hand-eye calibration. Step A derives the mathematical model on which the calibration algorithm rests, and step B sets out the concrete operating procedure for performing the hand-eye calibration with the two-dimensional laser vision sensor and the robot. The method has the advantages of being simple, practical, flexible and accurate.

Description

Hand-eye calibration method for a two-dimensional laser vision sensor and a robot

Technical field

The present invention relates to hand-eye calibration technology for a laser sensor and a robot, and in particular to a hand-eye calibration method for a two-dimensional laser vision sensor and a robot. In this method an industrial robot serves as the pose-changing actuator, and the robot itself together with the two-dimensional laser sensor serves as the measuring device. The robot changes its pose and drives the laser sensor to measure a number of fixed spatial points; for each point, the robot attitude and the coordinates of the point in both the robot base frame and the laser sensor measurement frame are recorded, and an algorithm then solves the transformation between the sensor measurement frame and the robot end flange frame. This constitutes the hand-eye calibration of the two-dimensional laser sensor.

Background art

Because vision systems offer good detection and positioning performance, robotic vision has become a focus and emphasis of robotics research. Vision sensing is attracting ever more attention for its rich information content, high sensitivity, high precision, and non-contact operation.

At present, the images acquired by visual sensing include images formed under natural light, images formed under artificial ordinary light, and structured-light images that use a laser as the active light source. In certain special industrial environments, for example at a welding site with strong arc light, dust and smoke, a conventional CCD camera suffers severe interference, cannot work well, and is of poor practicability. By contrast, the two-dimensional laser sensor described here is based on the triangulation principle: it measures the cross-sectional profile of an object with a line laser beam, uses an optical filter matched to the laser wavelength to reject all stray light including arc light, and its internal optical receiving assembly and CMOS area detector image only the laser stripe. Such a sensor has no moving parts, is sturdy and durable, and is immune to interference from arc light, smoke and spatter.

As an active light source, a laser offers high energy, high brightness, good monochromaticity and strong immunity to interference, so the two-dimensional laser vision sensor has great development prospects. A CCD camera is an area-array vision sensor, whereas the two-dimensional laser sensor is a line (profile) vision sensor. Machine vision, as a core technology of both the inspection field and the artificial intelligence field, can greatly improve the flexibility and efficiency of robot operation. The mapping between the vision coordinate frame and the robot end joint frame must be obtained by hand-eye calibration, and the calibration accuracy largely determines the working accuracy of the robot; calibrating the vision system to high accuracy is therefore the technical problem to be solved.

Summary of the invention

The object of the present invention is to overcome the shortcomings and deficiencies of the prior art by providing a hand-eye calibration method for a two-dimensional laser vision sensor and a robot. The method comprises establishing the mathematical model of the calibration algorithm and formulating the implementation steps of the hand-eye calibration, and it is simple, practical, flexible and accurate.

The object of the present invention is achieved through the following technical solution. The hand-eye calibration method for a two-dimensional laser vision sensor and a robot employs a two-dimensional laser sensor whose interior integrates an optical receiving assembly and a CMOS area detector, together with a robot (including its controller and teach pendant). The sensor is fixedly mounted on the robot end flange by a bracket, forming an eye-in-hand system. Denote the robot base frame {Base}, the flange frame of the robot's sixth joint {End}, and the sensor measurement frame {M}. The object of the hand-eye calibration is to solve the transformation matrix ^End T_M of frame {M} relative to frame {End}.

When the line laser beam emitted by the two-dimensional laser sensor is projected onto the surface of the measured object, the beam forms an image consistent with the surface profile. The beam contains a series of P continuous, evenly spaced laser sampling points, and the sensor returns the Z-axis and X-axis coordinates of these P points in the sensor measurement frame {M}.

The method also employs a computer to acquire the measurement data of the two-dimensional laser sensor, to enter the calibration data, and to run the calibration algorithm.

The method additionally requires some accessories, such as a calibration board.

The hand-eye calibration method for the two-dimensional laser vision sensor and the robot comprises the following steps:

Step A: establish the mathematical model of the calibration algorithm;

Step B: formulate the implementation steps of the hand-eye calibration.

Step A comprises the following steps:

A1) Obtain the coordinate of spatial point P1 in the robot base frame {Base}, ^B P_1 = [^B x_1, ^B y_1, ^B z_1, 1]^T, and its coordinate in the two-dimensional laser sensor measurement frame {M}, ^M P_1 = [^M x_1, ^M y_1, ^M z_1, 1]^T. Denote the hand-eye matrix, i.e. the transformation matrix of frame {M} relative to the robot end flange frame {End}, as

\[
{}^{End}T_M =
\begin{bmatrix}
r_{11} & r_{12} & r_{13} & \Delta x \\
r_{21} & r_{22} & r_{23} & \Delta y \\
r_{31} & r_{32} & r_{33} & \Delta z \\
0 & 0 & 0 & 1
\end{bmatrix},
\]

and denote the transformation of frame {End} relative to frame {Base} as ^Base T_End.
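The quantities defined in A1) can be pictured concretely. A minimal sketch in Python (the variable names, helper function and example coordinate values are ours, not the patent's):

```python
# A spatial point P1 expressed in the robot base frame {Base} and in the
# sensor measurement frame {M}, as homogeneous column vectors (example values).
B_P1 = [412.0, -35.5, 287.2, 1.0]   # [Bx1, By1, Bz1, 1]^T
M_P1 = [12.4, 0.0, 96.1, 1.0]       # [Mx1, My1, Mz1, 1]^T; My1 = 0 for a 2-D profile sensor

def hand_eye_matrix(r, t):
    """Assemble the 4x4 homogeneous hand-eye matrix EndTM from a 3x3
    rotation (rows of r_ij) and a translation (dx, dy, dz)."""
    (r11, r12, r13), (r21, r22, r23), (r31, r32, r33) = r
    dx, dy, dz = t
    return [[r11, r12, r13, dx],
            [r21, r22, r23, dy],
            [r31, r32, r33, dz],
            [0.0, 0.0, 0.0, 1.0]]

# Before calibration all 12 entries are unknown; the identity is a placeholder.
identity = hand_eye_matrix([(1, 0, 0), (0, 1, 0), (0, 0, 1)], (0, 0, 0))
```
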

A2) From the transformation relations between the frames, ^Base T_End ^End T_M ^M P_1 = ^B P_1; premultiplying both sides by the inverse (^Base T_End)^{-1} gives:

\[
\left({}^{Base}T_{End}\right)^{-1}\,{}^{B}P_1 = {}^{End}T_M\,{}^{M}P_1, \qquad (1)
\]

where (^Base T_End)^{-1} is the inverse of the transformation matrix of frame {End} relative to frame {Base}, ^B P_1 is the coordinate of spatial point P1 in {Base}, ^End T_M is the transformation matrix of frame {M} relative to the flange frame {End}, and ^M P_1 is the coordinate of P1 in the sensor measurement frame {M}.

Writing T = (^Base T_End)^{-1} ^B P_1 = [c_1, d_1, e_1, 1]^T gives:

\[
T = {}^{End}T_M\,{}^{M}P_1, \qquad (2)
\]

where T is a column matrix. Expanding formula (2):

\[
\begin{bmatrix} c_1 \\ d_1 \\ e_1 \\ 1 \end{bmatrix} =
\begin{bmatrix}
r_{11} & r_{12} & r_{13} & \Delta x \\
r_{21} & r_{22} & r_{23} & \Delta y \\
r_{31} & r_{32} & r_{33} & \Delta z \\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix} {}^{M}x_1 \\ {}^{M}y_1 \\ {}^{M}z_1 \\ 1 \end{bmatrix}. \qquad (3)
\]

Because the two-dimensional sensor measures only within its X-Z plane, ^M y_1 = 0, so expanding formula (3) further gives:

\[
\begin{cases}
r_{11}\,{}^{M}x_1 + r_{13}\,{}^{M}z_1 + \Delta x = c_1 \\
r_{21}\,{}^{M}x_1 + r_{23}\,{}^{M}z_1 + \Delta y = d_1 \\
r_{31}\,{}^{M}x_1 + r_{33}\,{}^{M}z_1 + \Delta z = e_1
\end{cases} \qquad (4)
\]

The meanings of the symbols are as in formula (3).
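Equations (1) and (2) can be checked numerically. The sketch below (with an illustrative flange pose of our own choosing) premultiplies the inverse of a rigid transform onto a base-frame point to produce the column [c1, d1, e1, 1]^T; `rigid_inverse` uses the closed form [R^T | -R^T t] of the inverse of a rigid transform:

```python
def rigid_inverse(T):
    """Inverse of a 4x4 rigid transform: rotation transposed, translation -R^T t."""
    R = [row[:3] for row in T[:3]]
    t = [row[3] for row in T[:3]]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    mt = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [mt[0]], Rt[1] + [mt[1]], Rt[2] + [mt[2]], [0.0, 0.0, 0.0, 1.0]]

def apply(T, p):
    """Multiply a 4x4 homogeneous matrix by a homogeneous point."""
    return [sum(T[i][j] * p[j] for j in range(4)) for i in range(4)]

# Illustrative flange pose BaseTEnd (90 degrees about Z plus a translation)
# and an illustrative base-frame point B_P1.
Base_T_End = [[0.0, -1.0, 0.0, 100.0],
              [1.0,  0.0, 0.0,  50.0],
              [0.0,  0.0, 1.0, 300.0],
              [0.0,  0.0, 0.0,   1.0]]
B_P1 = [100.0, 150.0, 320.0, 1.0]

# Equation (1)-(2): the left-hand side collapses to the column [c1, d1, e1, 1]^T.
c1, d1, e1, _ = apply(rigid_inverse(Base_T_End), B_P1)
```
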

A3) Similarly to spatial point P1, for spatial points P2 and P3:

\[
\begin{cases}
r_{11}\,{}^{M}x_2 + r_{13}\,{}^{M}z_2 + \Delta x = c_2 \\
r_{21}\,{}^{M}x_2 + r_{23}\,{}^{M}z_2 + \Delta y = d_2 \\
r_{31}\,{}^{M}x_2 + r_{33}\,{}^{M}z_2 + \Delta z = e_2
\end{cases} \qquad (5)
\]

\[
\begin{cases}
r_{11}\,{}^{M}x_3 + r_{13}\,{}^{M}z_3 + \Delta x = c_3 \\
r_{21}\,{}^{M}x_3 + r_{23}\,{}^{M}z_3 + \Delta y = d_3 \\
r_{31}\,{}^{M}x_3 + r_{33}\,{}^{M}z_3 + \Delta z = e_3
\end{cases} \qquad (6)
\]

The symbols in formulas (5) and (6) have the same meanings as in formula (4).

Collecting the first rows of formulas (4), (5) and (6):

\[
\begin{cases}
r_{11}\,{}^{M}x_1 + r_{13}\,{}^{M}z_1 + \Delta x = c_1 \\
r_{11}\,{}^{M}x_2 + r_{13}\,{}^{M}z_2 + \Delta x = c_2 \\
r_{11}\,{}^{M}x_3 + r_{13}\,{}^{M}z_3 + \Delta x = c_3
\end{cases} \qquad (7)
\]

The symbols have the same meanings as in formulas (3) to (6).

Formula (7) can be written in the following matrix form:

\[
\begin{bmatrix}
{}^{M}x_1 & {}^{M}z_1 & 1 \\
{}^{M}x_2 & {}^{M}z_2 & 1 \\
{}^{M}x_3 & {}^{M}z_3 & 1
\end{bmatrix}
\begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix}
=
\begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix}. \qquad (8)
\]

Premultiplying both sides by the inverse of the coefficient matrix gives:

\[
\begin{bmatrix} r_{11} \\ r_{13} \\ \Delta x \end{bmatrix}
=
\begin{bmatrix}
{}^{M}x_1 & {}^{M}z_1 & 1 \\
{}^{M}x_2 & {}^{M}z_2 & 1 \\
{}^{M}x_3 & {}^{M}z_3 & 1
\end{bmatrix}^{-1}
\begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix}. \qquad (9)
\]

The symbols in formulas (8) and (9) have the same meanings as in formula (7).

Thus r11, r13 and Δx have been solved.
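Equation (9) is an ordinary 3x3 linear solve. A self-contained sketch follows (Cramer's rule; the point coordinates, and the right-hand side generated from r11 = 0.8, r13 = 0.6, Δx = 3.0, are illustrative values of our own):

```python
def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(A, b):
    """Solve A x = b for a 3x3 A by Cramer's rule; requires the three
    calibration points not to be collinear in the sensor plane (det != 0)."""
    d = det3(A)
    x = []
    for k in range(3):
        Ak = [[b[i] if j == k else A[i][j] for j in range(3)] for i in range(3)]
        x.append(det3(Ak) / d)
    return x

# Rows are [Mx_i, Mz_i, 1] for points P1, P2, P3, as in equation (8).
A = [[10.0,  5.0, 1.0],
     [20.0,  7.0, 1.0],
     [15.0, 30.0, 1.0]]
c = [14.0, 23.2, 33.0]        # generated from r11=0.8, r13=0.6, dx=3.0
r11, r13, dx = solve3(A, c)   # recovers 0.8, 0.6, 3.0 up to rounding
```

The same `solve3` call with right-hand sides (d1, d2, d3) and (e1, e2, e3) yields the second and third rows, as in formulas (10) and (11) below.
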

Likewise, from formulas (4), (5) and (6):

\[
\begin{bmatrix} r_{21} \\ r_{23} \\ \Delta y \end{bmatrix}
=
\begin{bmatrix}
{}^{M}x_1 & {}^{M}z_1 & 1 \\
{}^{M}x_2 & {}^{M}z_2 & 1 \\
{}^{M}x_3 & {}^{M}z_3 & 1
\end{bmatrix}^{-1}
\begin{bmatrix} d_1 \\ d_2 \\ d_3 \end{bmatrix}, \qquad (10)
\]

\[
\begin{bmatrix} r_{31} \\ r_{33} \\ \Delta z \end{bmatrix}
=
\begin{bmatrix}
{}^{M}x_1 & {}^{M}z_1 & 1 \\
{}^{M}x_2 & {}^{M}z_2 & 1 \\
{}^{M}x_3 & {}^{M}z_3 & 1
\end{bmatrix}^{-1}
\begin{bmatrix} e_1 \\ e_2 \\ e_3 \end{bmatrix}, \qquad (11)
\]

where the superscript -1 denotes the inverse of the coefficient matrix.

Thus r11, r13, Δx, and also r21, r23, Δy and r31, r33, Δz, have been solved.

A4) To determine the hand-eye matrix ^End T_M completely, r12, r22 and r32 remain to be solved. By the properties of an attitude (rotation) matrix:

\[
\begin{cases}
r_{11}^2 + r_{21}^2 + r_{31}^2 = 1 \\
r_{12}^2 + r_{22}^2 + r_{32}^2 = 1 \\
r_{13}^2 + r_{23}^2 + r_{33}^2 = 1
\end{cases} \qquad (12)
\]

and the column vectors (r11, r21, r31)^T, (r12, r22, r32)^T and (r13, r23, r33)^T are unit vectors and pairwise orthogonal; here the r_ij are the entries of the transformation matrix of frame {M} relative to the flange frame {End}.

Hence:

\[
\begin{cases}
r_{11} r_{12} + r_{21} r_{22} + r_{31} r_{32} = 0 \\
r_{12} r_{13} + r_{22} r_{23} + r_{32} r_{33} = 0 \\
r_{12}^2 + r_{22}^2 + r_{32}^2 = 1
\end{cases} \qquad (13)
\]

Formula (13) solves r12, r22 and r32. Normalizing the solved column vectors yields unit vectors, and thus the hand-eye matrix ^End T_M has been solved.
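In practice step A4) reduces to a cross product: once the first and third rotation columns from (9)-(11) are normalized, the second column satisfying the orthonormality conditions (12)-(13) is their cross product (the sign convention below assumes a right-handed frame). A sketch, with illustrative column values of our own:

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

# Illustrative (unnormalized) first and third rotation columns from (9)-(11).
col1 = normalize([2.0, 0.0, 0.0])   # (r11, r21, r31)
col3 = normalize([0.0, 0.0, 5.0])   # (r13, r23, r33)
col2 = cross(col3, col1)            # (r12, r22, r32); col3 x col1 = col2 right-handed
```
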

A5) From the coordinate pose transformation relation:

\[
{}^{B}P' = {}^{Base}T_{End}\,{}^{End}T_M\,{}^{M}P, \qquad (14)
\]

where ^B P' is the theoretical coordinate of spatial point P in frame {Base} derived from the calibration result ^End T_M, and ^M P is the coordinate of that point in frame {M}. Comparing ^B P' with the measured ^B P checks the accuracy of the hand-eye calibration. A calibration computed from only the three points P1, P2, P3 may still contain a large error; to improve the accuracy and the fault tolerance of the calibration algorithm, N spatial points (N > 3) are chosen, 3 points are taken from them at a time (C(N,3) combinations in total), and the combination with the smallest error is selected as the calibration result.
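The exhaustive search of A5) over all C(N,3) subsets can be sketched as follows; `calibrate` and `predict` are placeholders for the solver of steps A2)-A4) and the error check of equation (14), not names from the patent:

```python
from itertools import combinations

def best_calibration(points, calibrate, predict):
    """points: list of (M_P, B_P) measurement pairs.
    For every 3-point subset, solve a candidate hand-eye result with
    `calibrate`, score it against ALL points with `predict` (per-point
    error), and return the (result, error) with the smallest total error."""
    best, best_err = None, float("inf")
    for triple in combinations(points, 3):
        T = calibrate(triple)
        err = sum(predict(T, mp, bp) for mp, bp in points)
        if err < best_err:
            best, best_err = T, err
    return best, best_err
```
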

In step A2), the transformation relation between the frames is premultiplied on both sides by the inverse (^Base T_End)^{-1} to obtain formula (1); writing T = (^Base T_End)^{-1} ^B P_1 = [c_1, d_1, e_1, 1]^T gives formula (2), whose expansion is formula (3) and, further, the three scalar equations of formula (4), with the symbols as defined above.

In step A3), two further spatial points P2 and P3 are taken in the same way as P1, giving formulas (5) and (6); combining formulas (4), (5) and (6) row by row gives formula (7), which is written in matrix form as formula (8) and premultiplied by the inverse of the coefficient matrix to give formula (9), solving r11, r13 and Δx; formulas (10) and (11) follow likewise, the inverse being that of the same coefficient matrix. Thus r11, r13, Δx, r21, r23, Δy, r31, r33 and Δz have all been solved.

Step B comprises the following steps:

B1) Place the calibration board at a suitable position in space. Operate the robot so that the laser line emitted by the two-dimensional laser sensor mounted on its end flange falls on a chosen point on the calibration board, denoted P. From the geometric relation between the calibration board and the laser stripe, read the coordinate ^M P of point P in the laser sensor measurement frame {M}. Keeping the robot stationary, read and record the current pose ^Base T_End of the robot end flange frame {End} relative to the base frame {Base}.

B2) Operate the robot so that the TCP of a previously established tool frame reaches point P, and read the coordinate ^B P of P in the robot base frame {Base}. Take the obtained ^M P and ^B P as one group of calibration data.

B3) Repeat steps B1) and B2) to obtain N different groups of calibration data.

B4) Enter the N groups of calibration data obtained in step B3) into the calibration program; the computer then calculates the hand-eye calibration result ^End T_M.
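The four steps above amount to a collect-then-solve loop. A minimal sketch of the data flow (the record type, stub reader functions and their names are ours, not the patent's):

```python
from dataclasses import dataclass

@dataclass
class CalibrationRecord:
    M_P: tuple          # point P in the sensor frame {M}, read in B1)
    Base_T_End: list    # flange pose read from the teach pendant in B1)
    B_P: tuple          # point P in the robot base frame {Base}, read in B2)

def collect(read_sensor, read_pose, read_tcp, n):
    """Repeat B1)-B2) n times (step B3) and return the nominal data set,
    ready to be handed to the step-A solver (step B4)."""
    return [CalibrationRecord(read_sensor(), read_pose(), read_tcp())
            for _ in range(n)]
```
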

Compared with the prior art, the present invention has the following advantages and effects:

1. The invention is practical, flexible and simple to use, and its hand-eye calibration accuracy is high.

2. The invention suits the hand-eye system formed by a two-dimensional laser vision sensor and a robot; the laser vision sensor replaces a conventional CCD camera well and meets the application demands of robots in special environments.

Brief description of the drawings

Fig. 1a is a schematic diagram of obtaining the coordinates of a spatial calibration point in the sensor measurement frame, together with the current robot pose.

Fig. 1b is a partial enlarged view of the laser sensor.

Fig. 2 is a schematic diagram of obtaining the coordinates of a spatial calibration point in the robot base frame.

Detailed description of the embodiments

The present invention is described in further detail below with reference to an embodiment and the accompanying drawings, but embodiments of the present invention are not limited thereto.

Embodiment

The two-dimensional laser sensor is mounted on the end flange of the robot (including its controller and teach pendant) by a mounting bracket. The sensor communicates with a computer, which receives the measurement data the sensor returns.

When the laser beam emitted by the two-dimensional laser sensor is projected onto the surface of the measured object, the beam forms an image consistent with the surface profile. The beam contains a series of P continuous, evenly spaced laser sampling points, and the sensor returns the Z-axis and X-axis coordinates of these P points in the sensor measurement frame.

With the aid of a calibration board, the laser sensor and the robot acquire the data required for hand-eye calibration. The computer acquires the sensor measurement data, enters the calibration data, runs the calibration algorithm, and solves the hand-eye relation from these data.

The present embodiment additionally uses a calibration board as an accessory.

As shown in Fig. 1a and Fig. 1b, step A (establishing the mathematical model of the calibration algorithm) comprises the following steps:

A1) Obtain the coordinate of spatial point P1 in the robot base frame {Base} (1), ^B P_1 = [^B x_1, ^B y_1, ^B z_1, 1]^T, and its coordinate in the measurement frame {M} (2) of the two-dimensional laser sensor, ^M P_1 = [^M x_1, ^M y_1, ^M z_1, 1]^T. Denote the hand-eye matrix, i.e. the transformation of frame {M} relative to the flange frame {End} (3), as ^End T_M, and the transformation of {End} relative to {Base} as ^Base T_End.

All 12 variables in

\[
{}^{End}T_M =
\begin{bmatrix}
r_{11} & r_{12} & r_{13} & \Delta x \\
r_{21} & r_{22} & r_{23} & \Delta y \\
r_{31} & r_{32} & r_{33} & \Delta z \\
0 & 0 & 0 & 1
\end{bmatrix}
\]

are unknown. ^Base T_End is likewise a 4x4 matrix, but its value is known and can be read directly from the robot teach pendant (4).

A2) From the transformation relations between the frames, premultiplying both sides by the inverse (^Base T_End)^{-1} gives formula (1); writing T = (^Base T_End)^{-1} ^B P_1 = [c_1, d_1, e_1, 1]^T gives formula (2), whose expansion is formula (3) and, further, the three scalar equations of formula (4), with the symbols as defined above.

A3) Similarly, spatial points P2 and P3 give formulas (5) and (6); combining formulas (4), (5) and (6) row by row gives formula (7), written in matrix form as formula (8). Premultiplying both sides by the inverse of the coefficient matrix gives formula (9), solving r11, r13 and Δx; formulas (10) and (11) follow likewise. Thus r11, r13, Δx, and also r21, r23, Δy and r31, r33, Δz, have been solved.

A4) To determine the hand-eye matrix ^End T_M completely, r12, r22 and r32 remain to be solved. By the properties of the attitude matrix, formula (12) holds, and the column vectors (r11, r21, r31)^T, (r12, r22, r32)^T and (r13, r23, r33)^T of the rotation part are unit vectors and pairwise orthogonal; here the r_ij are entries of the transformation matrix of frame {M} relative to the flange frame {End}. Formula (13) therefore solves r12, r22 and r32. Normalizing the solved column vectors yields unit vectors, and thus the hand-eye matrix ^End T_M has been solved.

A5) From the coordinate pose transformation relation, formula (14), ^B P' = ^Base T_End ^End T_M ^M P, where ^B P' is the theoretical coordinate of spatial point P in frame {Base} derived from the calibration result and ^M P is its coordinate in frame {M}. Comparing ^B P' with the measured ^B P checks the accuracy of the hand-eye calibration. Since a calibration from only the three points P1, P2, P3 may still contain a large error, in this embodiment 5 spatial points are chosen to improve the accuracy and fault tolerance of the algorithm; 3 points are taken from them at a time, C(5,3) = 10 combinations in total, and the combination with the smallest error is selected as the calibration result.

Step B (formulating the implementation steps of the hand-eye calibration) comprises the following steps:

B1) As shown in Fig. 1, place the calibration board (5) at a suitable position in space, within the measuring range of the sensor. Using the robot teach pendant (4), operate the robot so that the laser line emitted by the two-dimensional laser sensor (2) on the end flange (3) falls on a point P on the calibration board (5). From the geometric relation in Fig. 1, the minimum Z value Z_min among the samples currently returned by the sensor (2) is the Z-axis coordinate of point P in the measurement frame {M} (2), and the X-axis value paired with Z_min is the X-axis coordinate of P in {M}; in this way the coordinate ^M P of P in {M} is read. Keeping the robot stationary, read from the teach pendant (4) and record the Euler angles α, β, γ and the offsets Δf_x, Δf_y, Δf_z that represent the current pose of the flange frame {End} (3) relative to the base frame {Base} (1). The pose matrix ^Base T_End is then obtained from:

\[
{}^{Base}T_{End} =
\begin{bmatrix}
c\alpha\, c\beta & c\alpha\, s\beta\, s\gamma - s\alpha\, c\gamma & c\alpha\, s\beta\, c\gamma + s\alpha\, s\gamma & \Delta f_x \\
s\alpha\, c\beta & s\alpha\, s\beta\, s\gamma + c\alpha\, c\gamma & s\alpha\, s\beta\, c\gamma - c\alpha\, s\gamma & \Delta f_y \\
-s\beta & c\beta\, s\gamma & c\beta\, c\gamma & \Delta f_z \\
0 & 0 & 0 & 1
\end{bmatrix},
\]

where ^Base T_End is the pose matrix of the flange frame {End} relative to the base frame {Base}, and sα, cα denote sin α, cos α respectively, the remaining symbols by analogy.

B2) As shown in Fig. 2, operate the robot so that the TCP of a previously established tool frame {Tool} (6) reaches point P, and read the coordinate ^B P of P in the base frame {Base} (1). Take the obtained ^M P and ^B P as one group of calibration data.
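The two readings of B1) can be sketched in Python (the function names and sample data are ours; the rotation follows the Euler-angle matrix given above):

```python
import math

def pose_from_euler(alpha, beta, gamma, dfx, dfy, dfz):
    """Build BaseTEnd from the Euler angles and offsets read from the
    teach pendant, per the Z-Y-X Euler matrix in step B1)."""
    sa, ca = math.sin(alpha), math.cos(alpha)
    sb, cb = math.sin(beta), math.cos(beta)
    sg, cg = math.sin(gamma), math.cos(gamma)
    return [[ca * cb, ca * sb * sg - sa * cg, ca * sb * cg + sa * sg, dfx],
            [sa * cb, sa * sb * sg + ca * cg, sa * sb * cg - ca * sg, dfy],
            [-sb,     cb * sg,                cb * cg,                dfz],
            [0.0,     0.0,                    0.0,                    1.0]]

def pick_point(samples):
    """samples: list of (x, z) profile points returned by the sensor.
    Point P is the sample with the minimum Z value, as described in B1)."""
    return min(samples, key=lambda s: s[1])
```
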

B3) Repeat steps B1) and B2) to obtain N different groups of calibration data.

B4) Enter the N groups of calibration data obtained in step B3) into the calibration program; the computer (7) then calculates the hand-eye calibration result ^End T_M according to the mathematical model of the calibration algorithm.

The above embodiment is a preferred embodiment of the present invention, but embodiments of the present invention are not restricted to it; any change, modification, substitution, combination or simplification made without departing from the spirit and principle of the present invention shall be an equivalent substitute and is included within the protection scope of the present invention.

Claims (6)

1. A hand-eye calibration method for a two-dimensional laser vision sensor and a robot, characterized by comprising the following steps:
Step A: establish the mathematical model of the calibration algorithm;
Step B: formulate the implementation steps of the hand-eye calibration;
wherein step B comprises the following steps:
B1) placing a calibration board at a suitable position in space, operating the robot so that the laser line emitted by the two-dimensional laser sensor mounted on its end flange falls on a point P on the calibration board, reading the coordinate ^M P of point P in the laser sensor measurement frame {M} from the geometric relation between the calibration board and the laser stripe, and, keeping the robot stationary, reading and recording the current pose ^Base T_End of the robot end flange frame {End} relative to the robot base frame {Base};
B2) operating the robot so that the TCP of a previously established tool frame reaches said point P, reading the coordinate ^B P of P in the robot base frame {Base}, and taking the obtained ^M P and ^B P as one group of calibration data;
B3) repeating steps B1) and B2) until N different groups of calibration data are obtained;
B4) entering the N groups of calibration data obtained in step B3) into a calibration program, and calculating the hand-eye calibration result ^End T_M by computer.
2. the hand and eye calibrating method of two-dimensional laser vision sensor as claimed in claim 1 and robot, it is characterized in that, described steps A comprises the following steps:
A1) 1, space P is obtained 1at basis coordinates system of the robot { coordinate in Base} bp 1, 1p 1=[ 1x 1, 1y 1, 1z 1, 1] t, acquisition point P 1at two-dimensional laser sensor measurement coordinate system, { coordinate in M} is mp 1, mp 1=[ mx 1, my 1, mz 1, 1] t, note trick matrix, namely { relative to robot end's flange coordinate system, { transformation matrix of End} is M} coordinate system T M E n d = r 11 r 12 r 13 Δ x r 21 r 22 r 23 Δ y r 31 r 32 r 33 Δ z 0 0 0 1 , { relative to coordinate system, { transformation relation of Base} is End} note coordinate system
A2) according to the transformation relation between coordinate system, obtain:
by equation two ends premultiplication matrix inverse, obtain:
( T E n d B a s e ) - 1 P B 1 = T M E n d P M 1 , - - - ( 1 )
In formula, for coordinate system End} relative to coordinate system the transformation matrix of Base} inverse, bp is 1, space P 1basis coordinates system of robot the coordinate in Base}, for coordinate system M} relative to robot end's flange coordinate system the transformation matrix of End}, mp 1for a P 1at the two-dimensional laser sensor measurement coordinate system { coordinate in M};
Note T = ( T E n d B a s e ) - 1 P B 1 = [ c 1 , d 1 , e 1 , 1 ] T , Obtain:
T P B 1 = T M E n d P M 1 , - - - ( 2 )
In formula, T be column matrix, bp 1for 1, space P 1basis coordinates system of robot the coordinate in Base}, for coordinate system M} relative to robot end's flange coordinate system the transformation matrix of End}, mp 1for a P 1at the two-dimensional laser sensor measurement coordinate system { coordinate in M};
Formula (2) is launched to obtain:
c 1 d 1 e 1 1 = r 11 r 12 r 13 Δ x r 21 r 22 r 23 Δ y r 31 r 31 r 33 Δ z 0 0 0 1 x M 1 y M 1 z M 1 1 , - - - ( 3 )
In formula, c 1 d 1 e 1 1 For column matrix, r 11 r 12 r 13 Δ x r 21 r 22 r 23 Δ y r 31 r 32 r 33 Δ z 0 0 0 1 For coordinate system M} relative to robot end's flange coordinate system the concrete form of the transformation matrix of End}, x M 1 y M 1 z M 1 1 For column matrix, represent some P 1at two-dimensional laser sensor measurement coordinate system { coordinate in M};
Formula (3) is launched further obtain:
r 11 x M 1 + r 13 z M 1 + Δ x = c 1 r 21 x M 1 + r 23 z M 1 + Δ y = d 1 r 31 x M 1 + r 33 z M 1 + Δ z = e 1 , - - - ( 4 )
A3) for space point P 2and P 3, have:
r 11 x M 2 + r 13 z M 2 + Δ x = c 2 r 21 x M 2 + r 23 z M 2 + Δ y = d 2 r 31 x M 2 + r 33 z M 2 + Δ z = e 2 , - - - ( 5 )
r 11 x M 3 + r 13 z M 3 + Δ x = c 3 r 21 x M 3 + r 23 z M 3 + Δ y = d 3 r 31 x M 3 + r 33 z M 3 + Δ z = e 3 , - - - ( 6 )
Obtained by formula (4), formula (5) and formula (6):
r 11 x M 1 + r 13 z M 1 + Δ x = c 1 r 11 x M 2 + r 13 z M 2 + Δ x = c 2 r 11 x M 3 + r 13 z M 3 + Δ x = c 3 , - - - ( 7 )
Write formula (7) as following form:
x M 1 z M 1 1 x M 2 z M 2 1 x M 3 z M 3 1 r 11 r 13 Δ x = c 1 c 2 c 3 , - - - ( 8 )
Formula (7) is converted further:
r 11 r 13 Δ x = x M 1 z M 1 1 x M 2 z M 2 1 x M 3 z M 3 1 - 1 c 1 c 2 c 3 , - - - ( 9 )
So far, r has been solved 11, r 13with Δ x;
Likewise, from the second and third equations of formulas (4), (5) and (6):
$$\begin{bmatrix} r_{21} \\ r_{23} \\ \Delta y \end{bmatrix} = \begin{bmatrix} x_{M1} & z_{M1} & 1 \\ x_{M2} & z_{M2} & 1 \\ x_{M3} & z_{M3} & 1 \end{bmatrix}^{-1} \begin{bmatrix} d_1 \\ d_2 \\ d_3 \end{bmatrix}, \quad (10)$$
$$\begin{bmatrix} r_{31} \\ r_{33} \\ \Delta z \end{bmatrix} = \begin{bmatrix} x_{M1} & z_{M1} & 1 \\ x_{M2} & z_{M2} & 1 \\ x_{M3} & z_{M3} & 1 \end{bmatrix}^{-1} \begin{bmatrix} e_1 \\ e_2 \\ e_3 \end{bmatrix}, \quad (11)$$
So far, $r_{11}$, $r_{13}$, $\Delta x$, $r_{21}$, $r_{23}$, $\Delta y$, $r_{31}$, $r_{33}$ and $\Delta z$ have all been solved. In the formulas, the superscript $-1$ denotes the inverse of the $3 \times 3$ coefficient matrix;
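Steps A2)-A3) reduce the calibration to three $3 \times 3$ linear systems that share one coefficient matrix (formulas (8)-(11)). The numpy sketch below illustrates solving all three at once; the numeric measurement values are purely hypothetical and not taken from the patent:

```python
import numpy as np

# Hypothetical measured data for three calibration points.
# Each row of M_pts is (x_M, z_M) in the 2D laser sensor frame {M}
# (y_M = 0, since the sensor measures within its laser plane).
# Each row of T_pts is the corresponding target T_i = (Base_T_End,i)^-1 @ B_P_i,
# i.e. the vector [c_i, d_i, e_i] from formula (2).
M_pts = np.array([[10.0, 120.0],
                  [35.0, 140.0],
                  [-20.0, 155.0]])
T_pts = np.array([[1.0, 2.0, 3.0],
                  [1.5, 2.2, 3.4],
                  [0.8, 1.9, 3.1]])

# Coefficient matrix of formulas (8)-(11): rows [x_Mi, z_Mi, 1].
A = np.column_stack([M_pts[:, 0], M_pts[:, 1], np.ones(3)])

# Solve the three systems simultaneously: the columns of X are
# [r11, r13, dx], [r21, r23, dy] and [r31, r33, dz].
X = np.linalg.solve(A, T_pts)
r11, r13, dx = X[:, 0]
r21, r23, dy = X[:, 1]
r31, r33, dz = X[:, 2]
```

Note that the three points must not be collinear in the $(x_M, z_M)$ plane, otherwise the coefficient matrix is singular and `np.linalg.solve` raises `LinAlgError`.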
A4) To determine the hand-eye relation matrix ${}^{End}_{M}T$ completely, $r_{12}$, $r_{22}$ and $r_{32}$ are solved from the properties of the attitude (rotation) matrix:
$$\begin{cases} r_{11}^2 + r_{21}^2 + r_{31}^2 = 1 \\ r_{12}^2 + r_{22}^2 + r_{32}^2 = 1 \\ r_{13}^2 + r_{23}^2 + r_{33}^2 = 1 \end{cases}, \quad (12)$$
and the vectors $\vec{r}_1 = [r_{11}, r_{21}, r_{31}]^T$, $\vec{r}_2 = [r_{12}, r_{22}, r_{32}]^T$ and $\vec{r}_3 = [r_{13}, r_{23}, r_{33}]^T$ are unit vectors and pairwise orthogonal;
In the formulas, $\vec{r}_1$, $\vec{r}_2$ and $\vec{r}_3$ are the columns of the rotation part of the transformation matrix of {M} relative to the robot end-flange coordinate system {End}; for a right-handed frame {M}, the second column therefore satisfies:
$$\vec{r}_2 = \vec{r}_3 \times \vec{r}_1, \quad (13)$$
Formula (13) yields $r_{12}$, $r_{22}$ and $r_{32}$; the solved vectors $\vec{r}_1$ and $\vec{r}_3$ are unitized (normalized), giving normalized $\vec{r}_1$ and $\vec{r}_3$ vectors; so far, the hand-eye relation matrix ${}^{End}_{M}T$ has been solved;
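Step A4) can be illustrated with a minimal numpy sketch. The column values below are illustrative assumptions, and the cross-product form used for formula (13) assumes {M} is a right-handed frame:

```python
import numpy as np

# First and third columns of the rotation part of End_T_M, as solved
# from formulas (9)-(11); the values here are illustrative only.
r1 = np.array([0.8, 0.0, 0.6])   # [r11, r21, r31]
r3 = np.array([-0.6, 0.0, 0.8])  # [r13, r23, r33]

# Unitize to counter measurement noise (the normalization in step A4).
r1 = r1 / np.linalg.norm(r1)
r3 = r3 / np.linalg.norm(r3)

# For a right-handed frame {M}, the missing second column (the y axis
# of {M} expressed in {End}) is the cross product z x x, formula (13):
r2 = np.cross(r3, r1)            # [r12, r22, r32]

# Assemble the full rotation; it must be orthonormal with det = +1.
R = np.column_stack([r1, r2, r3])
assert np.allclose(R @ R.T, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)
```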
A5) From the coordinate pose transformation relation:
$$P_B' = {}^{Base}_{End}T\,{}^{End}_{M}T\,P_M, \quad (14)$$
where $P_B'$ is the theoretical coordinate value in the coordinate system {Base} of the above space point $P$, derived from the calibration result, and $P_M$ is the coordinate of this point in coordinate system {M}; comparing $P_B'$ with the measured $P_B$ checks the precision of the hand-eye calibration. A result obtained from only the spatial points $P_1$, $P_2$, $P_3$ may still carry a large error; to improve the accuracy and fault tolerance of the calibration algorithm, $N$ ($N > 3$) spatial points are chosen, from which 3 points are taken at a time, giving $C_N^3$ combinations in total, and the combination with the minimum error is chosen as the calibration result.
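The selection procedure of step A5) can be sketched as follows. This is a minimal numpy illustration, not code from the patent: the function names (`calibrate_from_triplet`, `best_calibration`) are invented, and using the vectors $T_i = ({}^{Base}_{End}T_i)^{-1}\,{}^{B}P_i$ as targets is an assumption consistent with formula (2):

```python
import itertools

import numpy as np

def calibrate_from_triplet(M_pts3, T_pts3):
    """Solve the hand-eye matrix End_T_M from three point pairs.

    M_pts3: 3x2 array of (x_M, z_M) in the sensor frame {M} (y_M = 0).
    T_pts3: 3x3 array of the corresponding [c, d, e] from formula (2).
    """
    A = np.column_stack([M_pts3[:, 0], M_pts3[:, 1], np.ones(3)])
    X = np.linalg.solve(A, T_pts3)        # formulas (9)-(11), all at once
    r1 = X[0] / np.linalg.norm(X[0])      # [r11, r21, r31], unitized
    r3 = X[1] / np.linalg.norm(X[1])      # [r13, r23, r33], unitized
    r2 = np.cross(r3, r1)                 # [r12, r22, r32], right-handed {M}
    T = np.eye(4)
    T[:3, :3] = np.column_stack([r1, r2, r3])
    T[:3, 3] = X[2]                       # [dx, dy, dz]
    return T

def best_calibration(M_pts, T_pts):
    """Try every C(N,3) triplet; keep the one with minimum mean error."""
    n = len(M_pts)
    # Homogeneous sensor coordinates [x_M, 0, z_M, 1] for all N points.
    homog = np.column_stack([M_pts[:, 0], np.zeros(n), M_pts[:, 1], np.ones(n)])
    best_T, best_err = None, np.inf
    for idx in itertools.combinations(range(n), 3):
        idx = list(idx)
        T = calibrate_from_triplet(M_pts[idx], T_pts[idx])
        pred = (T @ homog.T).T[:, :3]     # theoretical points, as in step A5
        err = np.mean(np.linalg.norm(pred - T_pts, axis=1))
        if err < best_err:
            best_T, best_err = T, err
    return best_T, best_err
```

Any triplet whose points are collinear in the laser plane would make the coefficient matrix singular; a robust implementation would skip such combinations.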
3. The hand-eye calibration method of a two-dimensional laser vision sensor and a robot as claimed in claim 2, characterized in that, in step A1), the coordinate of space point $P_1$ in the robot base coordinate system {Base} is obtained as ${}^{B}P_1 = [{}^{B}x_1, {}^{B}y_1, {}^{B}z_1, 1]^T$, and the coordinate of point $P_1$ in the two-dimensional laser sensor measurement coordinate system {M} is obtained as ${}^{M}P_1 = [{}^{M}x_1, {}^{M}y_1, {}^{M}z_1, 1]^T$.
4. The hand-eye calibration method of a two-dimensional laser vision sensor and a robot as claimed in claim 2, characterized in that, in step A4), the property that the vectors $\vec{r}_1$, $\vec{r}_2$ and $\vec{r}_3$ are unit vectors and pairwise orthogonal is used to solve $r_{12}$, $r_{22}$ and $r_{32}$; to reduce calibration error, the solved vectors $\vec{r}_1$ and $\vec{r}_3$ are unitized, yielding normalized $\vec{r}_1$ and $\vec{r}_3$ vectors; so far, the hand-eye relation matrix ${}^{End}_{M}T$ has been solved.
5. The hand-eye calibration method of a two-dimensional laser vision sensor and a robot as claimed in claim 2, characterized in that, in step A5), the coordinate pose transformation relation gives:
$$P_B' = {}^{Base}_{End}T\,{}^{End}_{M}T\,P_M,$$
where $P_B'$ is the theoretical coordinate value in the robot base coordinate system {Base} of the space point $P$, derived from the hand-eye calibration result, and $P_M$ is the coordinate of this point in coordinate system {M}; comparing $P_B'$ with $P_B$ checks the precision of the hand-eye calibration; to improve the accuracy and fault tolerance of the calibration algorithm, $N$ ($N > 3$) spatial points are chosen, from which 3 points at a time are substituted as one group into the mathematical model of the calibration algorithm, giving $C_N^3$ combinations in total; the combination with the minimum error is chosen as the calibration result.
6. The hand-eye calibration method of a two-dimensional laser vision sensor and a robot as claimed in claim 1, characterized in that, in step B1), the calibration board is placed at a suitable position in space; the robot is manipulated so that the laser line emitted by the two-dimensional laser sensor mounted on its end flange is projected onto point P on the calibration board; according to the geometric relationship between the calibration board and the laser sensor, the coordinate value ${}^{M}P$ of point P in the laser sensor measurement coordinate system {M} is read.
CN201510460526.2A 2015-07-29 2015-07-29 A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot CN105157725B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510460526.2A CN105157725B (en) 2015-07-29 2015-07-29 A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot

Publications (2)

Publication Number Publication Date
CN105157725A true CN105157725A (en) 2015-12-16
CN105157725B CN105157725B (en) 2018-06-29

Family

ID=54798664

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510460526.2A CN105157725B (en) 2015-07-29 2015-07-29 A kind of hand and eye calibrating method of two-dimensional laser visual sensor and robot

Country Status (1)

Country Link
CN (1) CN105157725B (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080088165A (en) * 2007-03-29 2008-10-02 삼성중공업 주식회사 Robot calibration method
CN101630409A (en) * 2009-08-17 2010-01-20 北京航空航天大学 Hand-eye vision calibration method for robot hole boring system
CN102794763A (en) * 2012-08-31 2012-11-28 江南大学 Systematic calibration method of welding robot guided by line structured light vision sensor
CN102927908A (en) * 2012-11-06 2013-02-13 中国科学院自动化研究所 Robot eye-on-hand system structured light plane parameter calibration device and method
KR20130029883A (en) * 2011-09-16 2013-03-26 대우조선해양 주식회사 Laser vision system calibration method using working joint
CN103558850A (en) * 2013-07-26 2014-02-05 无锡信捷电气股份有限公司 Laser vision guided welding robot full-automatic movement self-calibration method
US20140100694A1 (en) * 2012-10-05 2014-04-10 Beckman Coulter, Inc. System and method for camera-based auto-alignment
CN103991006A (en) * 2014-04-01 2014-08-20 浙江大学 Calibration method and device for robot hole forming platform vision measurement system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘苏宜 et al.: "Simultaneous calibration of the camera and the hand-eye relation in laser vision robotic welding", Journal of South China University of Technology (Natural Science Edition) *
王胜华 et al.: "Hand-eye calibration method for robots using a fixed point with varying poses", Journal of Tsinghua University (Science and Technology) *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107871328A (en) * 2016-09-28 2018-04-03 康耐视公司 The calibration method that NI Vision Builder for Automated Inspection and NI Vision Builder for Automated Inspection are realized
CN106426172A (en) * 2016-10-27 2017-02-22 深圳元启智能技术有限公司 Calibration method and system for industrial robot tool coordinate system
CN106426172B (en) * 2016-10-27 2019-04-16 深圳元启智能技术有限公司 A kind of scaling method and system of industrial robot tool coordinates system
CN106335061A (en) * 2016-11-11 2017-01-18 福州大学 Hand-eye relation calibration method based on four-freedom-degree robot
CN106643805B (en) * 2016-12-30 2020-07-14 上海交通大学 Method for calibrating position of laser positioning sensor in AGV
CN106839979A (en) * 2016-12-30 2017-06-13 上海交通大学 The hand and eye calibrating method of line structured laser sensor
CN106839979B (en) * 2016-12-30 2019-08-23 上海交通大学 The hand and eye calibrating method of line structured laser sensor
CN106643805A (en) * 2016-12-30 2017-05-10 上海交通大学 Position calibration method of laser positioning sensor in AGV (automated guided vehicle)
CN107256567A (en) * 2017-01-22 2017-10-17 梅卡曼德(北京)机器人科技有限公司 A kind of automatic calibration device and scaling method for industrial robot trick camera
CN107256567B (en) * 2017-01-22 2020-08-07 梅卡曼德(北京)机器人科技有限公司 Automatic calibration device and calibration method for hand-eye camera of industrial robot
CN107253190B (en) * 2017-01-23 2020-09-01 梅卡曼德(北京)机器人科技有限公司 High-precision robot hand-eye camera automatic calibration device and use method thereof
CN107253190A (en) * 2017-01-23 2017-10-17 梅卡曼德(北京)机器人科技有限公司 The device and its application method of a kind of high precision machines people trick automatic camera calibration
TWI712473B (en) 2017-05-22 2020-12-11 達明機器人股份有限公司 Method for calibrating coordinate of robot arm
CN107152911A (en) * 2017-06-01 2017-09-12 无锡中车时代智能装备有限公司 Based on the PSD dot laser sensors fed back and the scaling method of robot relative position
CN107560563B (en) * 2017-07-28 2019-10-18 华南理工大学 A kind of calibration of line laser three-dimensional measuring apparatus and error compensating method
CN107560563A (en) * 2017-07-28 2018-01-09 华南理工大学 A kind of line laser three-dimensional measuring apparatus demarcation and error compensating method
CN109528274A (en) * 2017-09-22 2019-03-29 清华大学深圳研究生院 A kind of method for registering and device
CN108972544A (en) * 2018-06-21 2018-12-11 华南理工大学 A kind of vision laser sensor is fixed on the hand and eye calibrating method of robot
CN109048893A (en) * 2018-07-27 2018-12-21 浙江工业大学 A kind of mechanical arm localization method based on monocular RGB camera
CN108972559A (en) * 2018-08-20 2018-12-11 上海嘉奥信息科技发展有限公司 Hand and eye calibrating method based on infrared stereoscopic vision positioning system and mechanical arm
CN111070199A (en) * 2018-10-18 2020-04-28 杭州海康威视数字技术股份有限公司 Hand-eye calibration assessment method and robot
CN109470138A (en) * 2018-10-22 2019-03-15 江苏集萃微纳自动化系统与装备技术研究所有限公司 The On-line Measuring Method of part
CN109623206A (en) * 2018-12-19 2019-04-16 清华大学 Method for optimizing the welding gun pose of segregation reasons in the welding of robot pipeline
CN110136208A (en) * 2019-05-20 2019-08-16 北京无远弗届科技有限公司 A kind of the joint automatic calibration method and device of Visual Servoing System
CN110355464A (en) * 2019-07-05 2019-10-22 上海交通大学 Visual Matching Method, system and the medium of laser processing
WO2021012124A1 (en) * 2019-07-19 2021-01-28 西门子(中国)有限公司 Robot hand-eye calibration method and apparatus, computing device, medium and product
CN110480642A (en) * 2019-10-16 2019-11-22 遨博(江苏)机器人有限公司 Industrial robot and its method for utilizing vision calibration user coordinate system
CN110974421A (en) * 2019-12-13 2020-04-10 杭州三坛医疗科技有限公司 Calibration method and system for TCP of surgical robot and storage medium

Also Published As

Publication number Publication date
CN105157725B (en) 2018-06-29

Similar Documents

Publication Publication Date Title
Pérez et al. Robot guidance using machine vision techniques in industrial environments: A comparative review
CN105798909B (en) Robot Zero positioning System and method for based on laser and vision
CN102419178B (en) Mobile robot positioning system and method based on infrared road sign
US9852512B2 (en) Reduced homography based on structural redundancy of conditioned motion
CN101995231B (en) Three-dimensional detection system for surface of large thin-shell object and detection method thereof
AU2008296518B2 (en) System and method for three-dimensional measurement of the shape of material objects
CN102289306B (en) Attitude sensing equipment and positioning method thereof as well as method and device for controlling mouse pointer
TWI512548B (en) Moving trajectory generation method
CN103337066B (en) 3D obtains the calibration steps of system
CN103499302B (en) The camshaft diameter dimension On-line Measuring Method of structure based light Vision imaging system
KR101954855B1 (en) Use of intensity variations of light patterns for depth mapping of objects in a volume
CN105992900B (en) System and method for the orientation of computing device
CN100580697C (en) Method and apparatus for determining absolute position of a tip of an elongate object on a plane surface with invariant features
CN104279960B (en) Method for measuring size of object through mobile device
JP2014102246A (en) Position attitude detection system
CN103759670B (en) A kind of object dimensional information getting method based on numeral up short
US20140015963A1 (en) Portable three-dimensional metrology with data displayed on the measured surface
CN103234512B (en) Triaxial air bearing table high-precision attitude angle and angular velocity measuring device
CN103438826B (en) The three-dimension measuring system of the steel plate that laser combines with vision and method
CN103093223B (en) A kind of method for rapidly positioning of light spot image center
CN102207371B (en) Three-dimensional point coordinate measuring method and measuring apparatus thereof
JP5842248B2 (en) Marker
CN102175221B (en) Vehicle-mounted mobile photographic surveying system based on fisheye lens
JP2014511480A (en) System for measuring the position and movement of objects
CN102692214B (en) Narrow space binocular vision measuring and positioning device and method

Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
GR01 Patent grant