CN103162622B - Single-camera vision system, portable ball target used therewith, and measuring method thereof - Google Patents

Single-camera vision system, portable ball target used therewith, and measuring method thereof

Info

Publication number
CN103162622B
CN103162622B CN201310064021.5A
Authority
CN
China
Prior art keywords
sphere
camera
ball
coordinate system
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310064021.5A
Other languages
Chinese (zh)
Other versions
CN103162622A (en)
Inventor
赵宏
谷飞飞
马跃洋
卜鹏辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xian Jiaotong University
Original Assignee
Xian Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xian Jiaotong University filed Critical Xian Jiaotong University
Priority to CN201310064021.5A priority Critical patent/CN103162622B/en
Publication of CN103162622A publication Critical patent/CN103162622A/en
Application granted granted Critical
Publication of CN103162622B publication Critical patent/CN103162622B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention provides a single-camera vision system, a portable ball target for use with it, and a measuring method. The ball target is composed of spheres, a connecting block, and a series of probes of different shapes and lengths; each probe consists of an extension bar and a measuring head. The connecting block supports the spheres and connects the probe; its spherical surface carries 14 evenly distributed threaded holes, and the extension bar and each sphere are attached to it with screws. Different connecting rods can be designed to suit different measurement occasions. The sphere part consists of more than three non-coplanar spheres, so a single image taken by one camera suffices to quickly calibrate the camera's intrinsic parameters and determine the sphere-center positions, thereby obtaining the three-dimensional coordinates of the measured point and providing a feasible way to acquire depth information with a monocular system. After the camera moves, its extrinsic parameters can be quickly recalibrated from a single image of the ball target's sphere part, enabling spatial data stitching and effectively extending the measurement range and measurement speed.

Description

Monocular vision system, portable ball target used by the monocular vision system, and measuring method of the monocular vision system
[ technical field ]
The invention belongs to the technical field of vision measurement, and relates to a portable ball target which can be used for quick calibration, measurement and data splicing of a monocular vision measurement system.
[ background of the invention ]
With increasing requirements for workpiece surface inspection, vision measurement technology is finding ever wider application. Compared with a stereoscopic vision system, which needs several cameras, a single camera requires neither a camera-synchronization device nor a mechanical fixture to maintain the cameras' relative positions, so the system is convenient to use, inexpensive, and easy to implement. However, when a monocular vision system is used for measurement, the depth information of a three-dimensional object cannot be recovered directly; three-dimensional depth is usually determined by forming a triangulation constraint with extra aids such as a line laser, by constraining the measured object to image at a specific position, or by constraining the measured surface with marker points or a calibration surface, which makes the measurement process cumbersome and limits its application. Calibrating the intrinsic and extrinsic parameters at the same time is also time-consuming and labor-intensive, which hinders rapid workpiece measurement. In addition, a monocular vision system has a limited measurement range: only local information can be measured on a large workpiece, and stitching the three-dimensional data with a stitching target has become an effective way to extend the measurement range. At present the commonly used stitching target is mainly a planar target, which suffers from excessive distortion, or even becomes unobservable, when the camera's moving viewing angle is too large.
[ summary of the invention ]
To address the defects or shortcomings of existing monocular measurement systems, the invention aims to provide a monocular vision system that is more flexible and simple and whose viewing angle is not limited to a single target face, together with a portable ball target for its use and a measuring method, so that the monocular measurement system can be calibrated, used for measurement, and have its data stitched quickly. Because a sphere has a continuous outer contour, it presents no viewing-angle limitation over the full 360-degree field and is convenient to calibrate, overcoming the defect that a planar target distorts excessively, or cannot be observed at all, when the camera's moving viewing angle is too large. The invention adopts spheres as both the stitching target and the calibration target, and the calibration and stitching processes are completed synchronously using the ball target.
In order to achieve the purpose, the invention adopts the following technical scheme:
a portable ball target for rapid calibration and measurement of a monocular vision system mainly comprises spheres, a connecting block and a series of probes of different shapes and lengths; the connecting block is used to support each sphere and connect a probe, the probe consists of an extension bar and a measuring head, and different extension bars are designed to suit different measurement occasions; the sphere part consists of more than three non-coplanar spheres, and each sphere is connected to the connecting block by a screw.
As a preferred embodiment of the present invention, the connection block is in the shape of a sphere.
A monocular vision system based on the portable ball target comprises the portable ball target and a camera, the camera being able to see the complete sphere part of the ball target.
The measuring method based on the monocular vision system comprises: (1) shooting a ball-target image with the monocular vision system and geometrically calibrating the camera's intrinsic-parameter matrix K, based on imaging of the absolute conic, from the camera's perspective imaging model and the sphere projection model, where K comprises the camera's optical center (u_0, v_0), focal length f and skew factor s; (2) determining the three-dimensional coordinates of each sphere center in the camera coordinate system using the calibrated intrinsic parameters and the spherical perspective-imaging geometric model; (3) calculating the distance from each sphere's center in the ball target to the measuring head; (4) determining the measuring-head position, i.e. the three-dimensional coordinates of the point to be measured, from the positional relation between the ball target's spheres and the measuring head.
As a preferred embodiment of the present invention, the specific method of the above step (1) comprises the steps of:
(5.1) extracting the elliptical contour formed by each sphere's image in the camera using image-processing methods and an edge-detection operator, and fitting the corresponding elliptic curve equation C_i by least squares, where i is the index of the sphere;
(5.2) based on the spherical perspective projection imaging model, the homography matrix between the camera coordinate system O_c X_c Y_c Z_c and the sphere coordinate system O_si X_si Y_si Z_si of the i-th sphere is H_i = K R_i diag{1, 1, Λ_i}, where Λ_i = Z_i0 / r_i, Z_i0 is the distance between the i-th sphere's center and the camera's optical center, r_i is the radius of the i-th sphere, and R_i is the rotation matrix. The quadratic curve equation C_i of the ellipse formed by the sphere's image in the camera has the following representation in dual space:

C_i^* = H_i C̄_i^* H_i^T = K R_i diag(1, 1, -Λ_i^2) R_i^T K^T = K K^T - o_i o_i^T,    formula (5.1)

where o_i is the imaged coordinate of the i-th sphere's center, r_3i denotes column 3 of the corresponding rotation matrix R_i, and K is the matrix of the camera's intrinsic parameters.
As a preferred embodiment of the present invention, in step (5.2), when solving the camera intrinsic parameters K, the absolute conic is introduced; its image under the homography matrix is w = (K K^T)^{-1}, represented in dual space as w^* = K K^T. The homography matrix between the imaging elliptic curve equations of each pair of spheres in the ball target is H_AB = C_A^* C_B, with A, B = 1, 2, 3 and A ≠ B, where C_A^* is the dual-space representation of sphere A's imaging elliptic curve equation C_A, and C_B is the imaging elliptic curve equation of sphere B. According to multiple-view geometry, the homography matrix of each pair of imaging ellipses possesses an eigenvector lying on the line through the corresponding ellipse pair, called the epipolar line of the homography matrix and denoted l_AB (A, B = 1, 2, 3 and A ≠ B); the eigenvalue corresponding to each eigenvector is called a pole and denoted v_AB (A, B = 1, 2, 3 and A ≠ B). l_AB, v_AB and w satisfy [l_AB]_× w v_AB = 0; w is obtained by singular value decomposition, and the camera intrinsic parameters K are then obtained from w^* = K K^T by orthogonal decomposition.
As a preferred embodiment of the present invention, the specific method of step (2) is: transform formula (5.1) of step (5.2) to obtain

β_i K^{-1} C_i^* K^{-T} = R_i diag(1, 1, -Λ_i^2) R_i^T,

where β_i is the imaging scale factor of the i-th sphere; singular value decomposition of the left-hand side yields a form consistent with the right-hand side, giving the rotation matrix R_i and Λ_i, and hence the three-dimensional coordinates of each sphere center.
As a preferred embodiment of the present invention, the position coordinates (x, y, z) of the point to be measured are calculated according to the following:
(x_i - x)^2 + (y_i - y)^2 + (z_i - z)^2 = d_i^2,    equation (8.1)

where i is the index of the sphere, (x_i, y_i, z_i) are the coordinates of the i-th sphere's center, and d_i is its distance to the measuring head; the i spheres yield i equations of the form (8.1), which are solved jointly for the position coordinates (x, y, z) of the point to be measured.
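The joint solution of the distance equations (8.1) can be sketched in numpy; the helper name `stylus_position` and the linearization by subtracting the first equation are illustrative assumptions, not the patent's own implementation:

```python
import numpy as np

def stylus_position(centers, dists):
    """Solve the system (8.1): ||O_i - P||^2 = d_i^2 for the measuring-head
    position P, given n sphere centers and the self-calibrated distances d_i.
    Subtracting the first equation from the others cancels ||P||^2 and
    linearizes the system."""
    c0, d0 = centers[0], dists[0]
    # 2 (c_i - c_0) . P = (||c_i||^2 - d_i^2) - (||c_0||^2 - d_0^2)
    A = 2.0 * (centers[1:] - c0)
    b = (np.sum(centers[1:] ** 2, axis=1) - dists[1:] ** 2) - (c0 @ c0 - d0 ** 2)
    P, *_ = np.linalg.lstsq(A, b, rcond=None)
    return P
```

With four or more non-coplanar sphere centers the linearized system is full rank and a single least-squares solve recovers the measuring-head position.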
As a preferred embodiment of the present invention, before measurement the ball target is self-calibrated to obtain the distance between each sphere and the measuring head. The self-calibration method is: the measuring head is held fixed at one point while the ball target is rotated about it through several angles and m images are taken; using the constraint that each sphere center stays at a constant radius from the measuring head (concentric spheres), an error-minimization function is built over all sphere centers, the spatial position of the point to be measured is obtained by minimizing the total distance error, and finally the distance from each sphere's center to that point is obtained.
A method for mobile calibration and data stitching based on the monocular vision system: (10.1) first place the ball target at a suitable position, ensuring that the camera can see the sphere part of the ball target; (10.2) establish a world coordinate system O_w X_w Y_w Z_w by selecting three spheres of the ball target in order: the origin O_w of the world coordinate system is the center of the first sphere; the X_w axis of the world coordinate system is along the vector O₂O₁ from the center of the second sphere to the center of the first sphere; the Z_w axis of the world coordinate system is O_wZ_w = O₂O₁ × O₃O₁; and the Y_w axis of the world coordinate system is O_wY_w = O_wZ_w × O₂O₁; (10.3) from step (10.2) obtain the relation [R_w-1, T_w-1] between the camera coordinate system and the world coordinate system; when the camera moves to any other position i, the relation between that position's camera coordinate system and the world coordinate system is [R_w-i, T_w-i]. Taking the camera coordinate system at the first position as the reference coordinate system, the data measured at all other positions are unified into the reference coordinate system, realizing data stitching.
Compared with the prior art, the invention has the following beneficial effects: during measurement, the different spheres of the ball target are photographed to obtain the position coordinates of the point to be measured. Because a sphere has a continuous outer contour, it presents no viewing-angle limitation over the full 360-degree field and is convenient to calibrate, overcoming the defect that a planar target distorts excessively, or cannot be observed at all, when the camera's moving viewing angle is too large.
[ description of the drawings ]
FIG. 1 is a schematic view of a ball target structure and a perspective imaging model of a camera according to the present invention;
FIG. 2 is a schematic view of a partial perspective imaging model of a ball target sphere according to the present invention;
FIG. 3 is a schematic diagram of the principle of ball-target self-calibration and measuring-head position calculation of the present invention;
fig. 4 is a schematic diagram of camera movement position calibration of the ball target of the present invention.
[ detailed description ]
The present invention is described in further detail below with reference to the attached drawings.
Fig. 1 is a schematic view of the ball-target structure and the camera's perspective imaging model according to the present invention, in which the reference numerals denote: 1. measuring head; 2. extension bar; 3. sphere; 4. screw; 5. spherical connecting block. O_c X_c Y_c Z_c is the camera coordinate system, O_c is the camera's optical center, and the Z_c axis coincides with the direction of the camera's optical axis. O_s1, O_s2, O_s3 are the centers of the three spheres used to establish the target coordinate system O_s X_s Y_s Z_s. uv denotes the image pixel coordinate system (i.e. the image coordinate system in which the object is imaged in the camera), and (u_0, v_0) denotes the image center.
The invention is described concretely by taking a ball target consisting of 4 spheres as an example. The probe of the ball target consists of a measuring head 1 and an extension bar 2; the measuring head 1 is connected to the spherical connecting block 5 through the extension bar 2. Fourteen evenly distributed threaded holes are drilled on the surface of the spherical connecting block 5, and the extension bar 2 and the spheres 3 are attached with screws; the number of threaded holes ensures that the required number of spheres can be installed as needed. Every two camera positions are required to have overlapping fields of view. The method calibrates the camera's intrinsic parameters from the relation between the imaging of spheres in space and the absolute conic, then obtains each sphere's center position from the spherical perspective imaging model and a matrix-decomposition method, and from these computes the spatial three-dimensional coordinates of the measuring head. Finally, the ball target is used as a stitching target to calibrate the relation between the camera's moving positions, realizing data stitching across positions.
Fig. 2 is a schematic diagram of the perspective imaging model of one sphere of the ball target. O_c X_c Y_c Z_c is the camera coordinate system; O_si X_si Y_si Z_si is the coordinate system of the i-th sphere, established with the camera's optical center as the origin and the line joining the optical center and the sphere center as the Z axis. The distance from the sphere center O_i to the camera's optical center O_c is Z_i0 and the sphere's radius is r_i. By ray tracing, the points where the outgoing rays are tangent to the sphere form a circle, the contour generator (the contour generator is the actual imaged contour of each sphere); its imaging ellipse in the camera image plane is denoted C_i, and uv denotes the image pixel coordinate system.
Fig. 3 is a schematic diagram of the principle of ball-target self-calibration and measuring-head position calculation, in which the reference numerals denote: P_0 is the three-dimensional coordinate of the point to be measured, i.e. the position of the target's measuring head; O_s X_s Y_s Z_s is the target coordinate system; d_1 ~ d_4 are the distances from the centers of the four spheres to the measuring head.
FIG. 4 is a schematic diagram of calibrating camera-movement positions that can all see the ball target. The reference numerals denote: O_ci X_ci Y_ci Z_ci is the camera coordinate system constructed at the i-th camera position; O_s X_s Y_s Z_s is the target coordinate system; Pos(i) is the camera's moving position; [R_ab, T_ab] is the transformation from coordinate system a to coordinate system b, where the subscripts a, b stand for the camera coordinate system at each position and the target coordinate system.
The steps will be specifically described below.
The first stage is as follows: hardware set-up
Referring to fig. 1, a monocular vision measuring system is set up and the camera's perspective imaging model is established. The sphere part of the ball target comprises more than three spheres 3 connected by the spherical connecting block 5; the spheres 3 are attached to the spherical connecting block 5 by screws 4, and the radii r_i, number and size of the spheres can be chosen according to the measurement field of view. The extension bar 2 of the probe is connected to the spherical connecting block 5, with its length and shape determined by the actual measurement situation; the measuring head 1 is mounted at the end of the extension bar, and the positions of the spheres relative to the measuring head 1 are determined by self-calibration of the ball target. The ball target is placed at a suitable position in the camera's field of view, the measuring head 1 is brought into contact with the measured point, and an image is taken.
The second stage: single-image calibration of the camera intrinsic parameters K based on the ball target and the absolute conic.
Referring to fig. 2, each sphere of the sphere part images in perspective as an ellipse on the image plane (the plane in which the actual sphere forms its image in the camera). The intrinsic parameter matrix K of the camera is formed mainly from the camera's optical center and focal length. The calibration process is as follows:
2.1. Using image-processing methods such as binarization, dilation and thresholding together with an edge-detection operator, extract the elliptical contour formed by each sphere, and fit the elliptic curve equation C_i (i = 1~4) by least squares, where i is the index of the sphere in the ball target. Store the projected conic C_i of each sphere in a fixed order for later use.
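The least-squares conic fit of step 2.1 can be sketched as follows. This is a minimal numpy version; the helper name `fit_conic` and the direct SVD solution are illustrative assumptions — a production fit would normalize coordinates and enforce the ellipse constraint:

```python
import numpy as np

def fit_conic(points):
    """Least-squares fit of a conic a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
    to 2-D edge points; returns the symmetric 3x3 conic matrix C (up to scale)."""
    x, y = points[:, 0], points[:, 1]
    # Design matrix: one row [x^2, xy, y^2, x, y, 1] per edge point
    D = np.column_stack([x ** 2, x * y, y ** 2, x, y, np.ones_like(x)])
    # The coefficient vector is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(D)
    a, b, c, d, e, f = Vt[-1]
    return np.array([[a, b / 2, d / 2],
                     [b / 2, c, e / 2],
                     [d / 2, e / 2, f]])
```

Each fitted matrix C_i then satisfies p^T C_i p ≈ 0 for homogeneous contour points p, the form used by the calibration steps that follow.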
2.2. Based on the spherical perspective projection imaging model, there is only a rotation between the camera coordinate system O_c X_c Y_c Z_c and the i-th sphere's coordinate system O_si X_si Y_si Z_si, so the homography matrix between the two is H_i = K R_i diag{1, 1, Λ_i}, where Λ_i = Z_i0 / r_i, Z_i0 is the distance between the i-th sphere's center and the camera's optical center, r_i is the radius of the i-th sphere, and R_i is the rotation matrix. The imaging elliptic quadratic curve equation of the sphere's contour generator on the image plane is C_i = H_i^{-T} C̄_i H_i^{-1} = H_i^{-T} diag(1, 1, -1) H_i^{-1}, which in dual space has the representation:

C_i^* = H_i C̄_i^* H_i^T = K R_i diag(1, 1, -Λ_i^2) R_i^T K^T = K K^T - o_i o_i^T    (1)

where o_i is the imaged coordinate of the i-th sphere's center and r_3i denotes column 3 of the corresponding rotation matrix R_i.
The absolute conic (IAC) is the quadratic curve on the plane at infinity π_∞ = (0, 0, 0, 1)^T; its image under the homography H = KR is w = (K K^T)^{-1}, which can be represented in dual space as

w^* = K K^T    (2)

Imaging three spheres in space yields three equations of the form (3). Compute the homography matrix H_AB = C_A^* C_B between each pair of imaging elliptic curve equations, where C_A^* is the representation of C_A in dual space. According to multiple-view geometry, the homography matrix of each pair of imaging ellipses possesses an eigenvector lying on the line through the corresponding ellipse pair, called the epipolar line of the homography matrix and denoted l_AB (A, B = 1, 2, 3 and A ≠ B); the eigenvalue corresponding to each eigenvector is called a pole and denoted v_AB (A, B = 1, 2, 3 and A ≠ B). l_AB, v_AB and w satisfy equation (3):

[l_AB]_× w v_AB = 0    (3)

w is obtained by singular value decomposition, and the camera intrinsic parameters K are then obtained from the orthogonal decomposition of formula (2).
The third stage: based on the ball-target imaging model, obtain the position O_i of each sphere center by matrix singular value decomposition, and determine the positional relation between the sphere centers and the measuring head through self-calibration of the ball target. Once calibrated, these positional parameters can be used as known quantities in subsequent workpiece-surface measurements as long as the ball target is not altered (by adding or removing spheres, changing the extension bar, etc.).
3.1. A simple rearrangement of formula (1) of stage 2 gives

β_i K^{-1} C_i^* K^{-T} = R_i diag(1, 1, -Λ_i^2) R_i^T    (4)

where β_i is the imaging scale factor of the i-th sphere. The left-hand side of the equation is a symmetric matrix; singular value decomposition puts it into the form of the right-hand side, yielding the rotation matrix R_i and Λ_i, and hence the three-dimensional coordinates of each sphere center.
3.2. Referring to fig. 3, the three-dimensional coordinates of the sphere centers are obtained by the calculation of step 3.1. To obtain an accurate positional relation between the sphere centers and the measuring head, the ball target is self-calibrated before it is used for three-dimensional measurement. The measuring head is held fixed at one point while the ball target is rotated about it through several angles and m images are taken; the number of spheres used in the ball target is n = 4. Let the center O_i (i = 1~n) of the i-th sphere in the j-th image have coordinates (x_i^j, y_i^j, z_i^j), which can be determined using step 3.1. The distance d_i^j from the i-th sphere to the measuring head at the j-th position and the measuring-head position (x, y, z) are unknown quantities:

(x_i^j - x)^2 + (y_i^j - y)^2 + (z_i^j - z)^2 = (d_i^j)^2    (5)

For each sphere, m - 1 constraint equations can be obtained from the constraint that its radius about the measuring head is unchanged (concentric spheres); combining all sphere centers, an error-minimization function is built:

min Σ_{i=1}^{n} Σ_{j=1}^{m} (d_i^j - d_i)^2    (6)

Minimizing the total distance error gives an optimized spatial position of the point to be measured; substituting this position back into (5) as a known quantity then gives the distance d_i between each sphere's center and the point to be measured. The resulting distance values are used as known quantities in the subsequent measurement process. Because the self-calibration needs to be performed only once, in subsequent measurements a single image suffices to obtain the three-dimensional coordinates of the point to be measured.
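The self-calibration step can be sketched with a linear least-squares variant: because each sphere's centers across the m images lie on a sphere about the fixed measuring head, differencing against the first image cancels the unknown radii. This is an assumed simplification of the minimization (6), not the patent's exact optimization:

```python
import numpy as np

def self_calibrate(centers):
    """Ball-target self-calibration (step 3.2).  `centers` has shape (m, n, 3):
    sphere-center coordinates for m images of n balls, with the measuring head
    held fixed.  Returns the head position P and the n distances d_i."""
    m, n, _ = centers.shape
    rows, rhs = [], []
    for i in range(n):
        p = centers[:, i, :]
        for j in range(1, m):
            # ||p_j - P||^2 = ||p_0 - P||^2  =>  2 (p_j - p_0) . P = ||p_j||^2 - ||p_0||^2
            rows.append(2.0 * (p[j] - p[0]))
            rhs.append(p[j] @ p[j] - p[0] @ p[0])
    P, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    d = np.linalg.norm(centers - P, axis=2).mean(axis=0)
    return P, d
```

The distances d_i returned here play the role of the known quantities carried into the single-image measurements that follow.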
The fourth stage: mobile calibration of the monocular measurement system and data stitching based on the ball target.
Referring to fig. 4, first a world coordinate system O_w X_w Y_w Z_w is established by the vector orthonormalization method with the stationary spheres as reference and used as the stitching coordinate system; coordinate transformation of the camera between different positions is realized through this stitching coordinate system.
4.1. Place the ball target at a suitable position in the field of view; the camera takes an image at position 1 and is then moved to position i, and the cameras at both positions are required to see the sphere part of the ball target. If more positions are needed for measuring a large workpiece surface, each position is likewise required to see the sphere part of the target. The spheres photographed at each position must be numbered in the same order to guarantee a consistent world coordinate system.
4.2. Establish the world coordinate system with the stationary spheres as reference; three spheres define a Cartesian rectangular coordinate system by the orthonormalization method. Select three specific spheres, e.g. those numbered 1~3. First place the world origin O_w at the center of sphere 1; the vector O₂O₁ from the center of sphere 2 to the center of sphere 1 defines the X_w direction, from which the Z_w axis is determined as O_wZ_w = O₂O₁ × O₃O₁, and finally the Y_w axis is O_wY_w = O_wZ_w × O₂O₁. After the coordinate-axis directions are determined, the vectors of all directions are normalized to unit length, completing the establishment of the world coordinate system.
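The frame construction of step 4.2 can be sketched directly; `build_world_frame` is an illustrative helper, and the returned convention p_cam = R·p_world + O1 is an assumption:

```python
import numpy as np

def build_world_frame(O1, O2, O3):
    """World (stitching) frame from three non-collinear sphere centers given
    in camera coordinates: origin at O1, Xw along O2->O1, Zw = Xw x (O3->O1),
    Yw = Zw x Xw.  Returns (R, t) with p_cam = R @ p_world + t."""
    x = O1 - O2                       # Xw direction: second center to first center
    z = np.cross(x, O1 - O3)          # Zw = O2O1 x O3O1
    y = np.cross(z, x)                # Yw = Zw x O2O1
    # Unitize each axis; columns of R are the world axes in camera coordinates
    R = np.column_stack([v / np.linalg.norm(v) for v in (x, y, z)])
    return R, O1
```

The cross-product construction guarantees a right-handed orthonormal frame regardless of the spheres' exact spacing, provided the three centers are not collinear.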
4.3. When the camera is at position 1, the world coordinate system is established by step 4.2 and the relation [R_w-1, T_w-1] between the camera coordinate system and the world coordinate system is determined; after the camera moves to the i-th position, the relation [R_w-i, T_w-i] between that position's camera coordinate system and the world coordinate system is determined in the same way. The camera coordinate system at the first position is chosen as the reference coordinate system, and the data measured at all positions are unified into it, realizing data stitching. The relation of the i-th camera coordinate system to the position-1 camera coordinate system is:

R_1-i = R_w-i R_w-1^{-1}
T_1-i = T_w-i - R_w-i R_w-1^{-1} T_w-1    (7)
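Equation (7) and the stitching step can be sketched as follows, assuming the convention that [R_w-k, T_w-k] maps world points into camera-k coordinates (p_k = R_w-k·p_w + T_w-k); the helper names are illustrative:

```python
import numpy as np

def relative_pose(R_w1, T_w1, R_wi, T_wi):
    """Equation (7): pose of camera position i relative to position 1."""
    R_1i = R_wi @ R_w1.T              # rotations are orthonormal, so inverse = transpose
    T_1i = T_wi - R_1i @ T_w1
    return R_1i, T_1i

def stitch_to_reference(points_i, R_1i, T_1i):
    """Map points measured in camera-i coordinates (one per row) back into
    the reference (position-1) camera frame for data stitching."""
    return (points_i - T_1i) @ R_1i   # row-vector form of R_1i.T @ (p - T_1i)
```

With the first-position frame fixed as reference, every later scan is mapped through its [R_1-i, T_1-i] and the partial measurements accumulate in one coordinate system.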
the invention provides a feasible way for realizing the depth information acquisition under the monocular measurement system, and has the advantages of quick and convenient implementation and low cost. The method has the advantages that the space point measurement is completed while the camera calibration is realized, and the measurement accuracy can be optimized by increasing the number of the spheres at the sphere part and keeping the measuring tip to be fixed and rotating the sphere target for multiple shots. The invention simultaneously provides a feasible way for the mobile calibration and data splicing of the multi-view system and the monocular measurement system, the calibration principle of the multi-view system is similar to that of the monocular mobile calibration, and the coordinate system I and the data splicing can be completed through the step of the fourth stage.

Claims (10)

1. A portable ball target for rapid calibration and measurement of a monocular vision system, mainly composed of spheres and a probe, the probe consisting of an extension bar (2) and a measuring head (1), characterized in that: the portable ball target further comprises a connecting block (5), the connecting block (5) being used to support each sphere and connect the probe; the sphere part is composed of more than three non-coplanar spheres, and each sphere is connected to the connecting block by a screw (4).
2. The portable ball target of claim 1, wherein: the connecting block (5) is in a spherical shape.
3. A monocular vision system based on the portable ball target of claim 1, characterized in that: the monocular vision system comprises the portable ball target and a camera that can see the complete sphere part of the ball target.
4. A measuring method based on the monocular vision system of claim 3, characterized in that: (1) a ball-target image is shot by the monocular vision system, and the camera intrinsic parameters K are calibrated, based on absolute-conic imaging geometry, from the camera perspective imaging model and the sphere projection model, the camera intrinsic parameters K comprising the camera optical center (u_0, v_0), focal length f and skew factor s; (2) the three-dimensional coordinates of each sphere center in the camera coordinate system are determined using the calibrated intrinsic parameters and the spherical perspective-imaging geometric model; (3) the distance from each sphere's center in the ball target to the measuring head is calculated; (4) the measuring-head position, i.e. the three-dimensional coordinates of the point to be measured, is determined from the positional relation between the ball target's spheres and the measuring head.
5. The measurement method according to claim 4, characterized in that: the specific method of the step (1) comprises the following steps:
(5.1) extracting the elliptical contour formed by each sphere's image in the camera using image-processing methods and an edge-detection operator, and fitting the corresponding elliptic curve equation C_i by least squares, where i is the index of the sphere;
(5.2) based on the spherical perspective projection imaging model, the homography matrix between the camera coordinate system O_c X_c Y_c Z_c and the sphere coordinate system O_si X_si Y_si Z_si of the i-th sphere is H_i = K R_i diag{1, 1, Λ_i}, where Λ_i = Z_i0 / r_i, Z_i0 is the distance between the i-th sphere's center and the camera's optical center, r_i is the radius of the i-th sphere, and R_i is the rotation matrix. The quadratic curve equation C_i of the ellipse formed by the sphere's image in the camera has the following representation in dual space:

C_i^* = H_i C̄_i^* H_i^T = K R_i diag(1, 1, -Λ_i^2) R_i^T K^T = K K^T - o_i o_i^T,    formula (5.1)

where o_i is the imaged coordinate of the i-th sphere's center, r_3i denotes column 3 of the corresponding rotation matrix R_i, and K is the matrix of the camera's intrinsic parameters.
6. The measurement method according to claim 5, characterized in that: in the step (5.2), when the internal parameter K of the camera is solved,introducing an absolute quadratic curve, wherein the image of the absolute quadratic curve under the homography matrix is w ═ (KK)T)-1Expressed in dual space as: w is a*=KKT(ii) a The homography matrix between the imaging elliptic curve equations of every two spheres in the sphere target isA, B ≠ 1,2,3 and a ≠ B, wherein,imaging elliptic curve equation C for sphere AARepresentation in dual space, CBAn imaging ellipse wireless equation of the sphere B; according to the view geometry theory, the homography matrix of each pair of imaging ellipses comprises a feature vector, called epipolar line of the homography matrix, denoted l, passing through the corresponding ellipse pairABWhere a, B is 1,2,3 and a ≠ B, the eigenvalue corresponding to each eigenvector is called a pole, and is denoted as vABA, B ≠ B.l ≠ 1,2,3AB、vABAnd w satisfies the following formula (6.1):
[l_AB]_× · w · v_AB = 0,   formula (6.1)
stacking formula (6.1) over all pairs gives a homogeneous linear system whose null vector, found by singular value decomposition, yields w; the internal parameter matrix K of the camera is then obtained from the formula w* = K·K^T by orthogonal (Cholesky-type) decomposition.
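Constraint (6.1) is linear in the six entries of the symmetric matrix w, so a few line/pole pairs suffice to recover w up to scale by SVD, after which K follows from a triangular factorization of w* = K·K^T. The sketch below uses synthetic pole/polar pairs (made-up numbers) in place of the l_AB, v_AB that would be extracted from real sphere images; `pole_rows` and the flip-based upper Cholesky are illustrative implementation choices, not the patent's code.

```python
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 760.0, 240.0],
              [0.0, 0.0, 1.0]])          # assumed ground-truth intrinsics
omega = np.linalg.inv(K @ K.T)           # image of the absolute conic, w = (K K^T)^-1

def pole_rows(l, v):
    """Rows of the linear system [l]x . w . v = 0 in the 6 entries of w."""
    v1, v2, v3 = v
    M = np.array([[v1, v2, v3, 0.0, 0.0, 0.0],     # w @ v written as M @ w6
                  [0.0, v1, 0.0, v2, v3, 0.0],
                  [0.0, 0.0, v1, 0.0, v2, v3]])
    lx = np.array([[0.0, -l[2], l[1]],             # cross-product matrix [l]x
                   [l[2], 0.0, -l[0]],
                   [-l[1], l[0], 0.0]])
    return lx @ M

# Three synthetic pole/polar pairs standing in for the v_AB, l_AB of claim 6.
poles = [np.array([100.0, 50.0, 1.0]),
         np.array([-80.0, 200.0, 1.0]),
         np.array([300.0, -40.0, 1.0])]
A = np.vstack([pole_rows(omega @ v, v) for v in poles])

_, _, Vt = np.linalg.svd(A)
w6 = Vt[-1]                              # null vector = entries of w up to scale
W = np.array([[w6[0], w6[1], w6[2]],
              [w6[1], w6[3], w6[4]],
              [w6[2], w6[4], w6[5]]])
if W[2, 2] < 0:
    W = -W                               # fix the sign so W is positive definite

w_star = np.linalg.inv(W)                # w* = K K^T up to scale
E = np.eye(3)[::-1]                      # anti-diagonal flip, for an *upper* factor
L = np.linalg.cholesky(E @ w_star @ E)
K_rec = E @ L @ E                        # upper triangular: w* = K_rec K_rec^T
K_rec /= K_rec[2, 2]                     # remove the remaining scale

print(np.round(K_rec, 3))
```

With exact synthetic data the recovered matrix matches the assumed intrinsics; with real, noisy conics more pairs and a least-squares null vector would be used.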
7. The measurement method according to claim 5, characterized in that: the specific method of the step (2) is as follows: transforming the formula (5.1) in the step (5.2) yields:
β_i·K^(−1)·C_i*·K^(−T) = R_i·diag(1, 1, −Λ_i²)·R_i^T,
wherein β_i is the imaging scale factor of the ith ball;
the left-hand side of the equation is decomposed by singular values into the form of the right-hand side, thereby obtaining the rotation matrix R_i and Λ_i, and hence the three-dimensional coordinates of each sphere center.
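The decomposition step of claim 7 can be sketched as follows: for a symmetric matrix, the SVD coincides (up to signs) with the eigendecomposition, and K^(−1)·C_i*·K^(−T) has a repeated eigenvalue equal to the unknown scale β_i plus one "odd" eigenvalue −β_i·Λ_i², whose eigenvector is r_3i. All inputs below (K, r_3, Λ, radius, the 0.37 scale) are made-up example values.

```python
import numpy as np

# Example ground truth: recover R_i's third column and Lambda_i (claim 7).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
r3 = np.array([0.2, -0.1, 1.0]); r3 /= np.linalg.norm(r3)
Lam = 6.0                               # Lambda_i = Z_i0 / r_i
radius = 0.05                           # sphere radius r_i (metres)

# Dual conic, known only up to an arbitrary scale (conics are homogeneous).
o = np.sqrt(1.0 + Lam**2) * (K @ r3)
C_star = 0.37 * (K @ K.T - np.outer(o, o))      # 0.37 = unknown scale

M = np.linalg.inv(K) @ C_star @ np.linalg.inv(K).T   # symmetric 3x3
vals, vecs = np.linalg.eigh(M)          # eigenvalues of beta * (1, 1, -Lam^2)

# Two eigenvalues are equal (= beta); the remaining one is -beta * Lam^2.
d01, d02, d12 = abs(vals[0]-vals[1]), abs(vals[0]-vals[2]), abs(vals[1]-vals[2])
odd = 2 if d01 == min(d01, d02, d12) else (1 if d02 == min(d01, d02, d12) else 0)
pair = [i for i in range(3) if i != odd]
beta = 0.5 * (vals[pair[0]] + vals[pair[1]])
Lam_rec = np.sqrt(-vals[odd] / beta)
r3_rec = vecs[:, odd]
if r3_rec[2] < 0:
    r3_rec = -r3_rec                    # choose the camera-facing direction

center = Lam_rec * radius * r3_rec      # 3-D sphere centre: Z_i0 * r3 = Lam * r * r3
print(Lam_rec, np.round(center, 4))
```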
8. The measurement method according to claim 4, characterized in that: the position coordinates (x, y, z) of the point to be measured are calculated according to the following formula:
(x_i − x)² + (y_i − y)² + (z_i − z)² = d_i²,   formula (8.1)
wherein i is the index of the ball, (x_i, y_i, z_i) are the coordinates of the center of the ith ball, and d_i is the distance between that ball and the measuring head; the i balls yield i instances of formula (8.1), which are solved simultaneously to obtain the position coordinates (x, y, z) of the point to be measured.
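A common way to solve the sphere-intersection system of formula (8.1) is to subtract the first equation from the others, which cancels the quadratic terms and leaves a linear least-squares problem in (x, y, z). A minimal sketch with made-up sphere centres and a known probe-tip position (`trilaterate` is an illustrative helper, not named in the patent):

```python
import numpy as np

def trilaterate(centers, dists):
    """Least-squares solution of (x_i-x)^2 + (y_i-y)^2 + (z_i-z)^2 = d_i^2
    (formula 8.1), linearised by subtracting the first equation from the rest.
    centers: (n, 3) sphere-centre coordinates; dists: (n,) centre-to-tip distances."""
    c0, d0 = centers[0], dists[0]
    A = 2.0 * (centers[1:] - c0)
    b = (np.sum(centers[1:]**2, axis=1) - np.sum(c0**2)
         - dists[1:]**2 + d0**2)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Example with four made-up sphere centres and a known probe-tip position.
tip = np.array([0.1, -0.2, 0.9])
centers = np.array([[0.0, 0.0, 1.0],
                    [0.3, 0.1, 1.2],
                    [-0.2, 0.25, 0.8],
                    [0.15, -0.3, 1.1]])
dists = np.linalg.norm(centers - tip, axis=1)
print(np.round(trilaterate(centers, dists), 6))   # recovers the tip position
```

At least four non-coplanar centres are needed for a unique solution, which matches the patent's requirement of more than three non-coplanar balls.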
9. The measurement method according to claim 8, characterized in that: before measurement, the ball target is first self-calibrated to obtain the distance between each ball and the measuring head. The self-calibration method is as follows: the measuring head is held fixed at one point; the ball target is rotated about the measuring head through several angles while m images are captured; an error-minimization function is established over all sphere centers using the constraint that each sphere center keeps a constant distance from the fixed point (spheres of unchanged radius, concentric about it); the spatial position of the point to be measured is obtained by minimizing the total distance error, and finally the distance between each sphere center and the point to be measured is obtained.
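The constant-radius constraint of claim 9 admits a simple linear formulation: the positions of one ball centre observed while the target pivots about the fixed tip all lie on a sphere about that tip, and an algebraic sphere fit recovers both the tip and the radius. This is an illustrative stand-in (made-up synthetic poses, hypothetical helper `fit_fixed_point`) for the patent's error-minimization, not its exact procedure.

```python
import numpy as np

def fit_fixed_point(points):
    """Algebraic sphere fit: from ball-centre positions observed while the
    target pivots about a fixed stylus tip, recover the tip q and the constant
    ball-to-tip distance d. Solves |p|^2 - 2 p.q + (|q|^2 - d^2) = 0, which is
    linear in the unknowns (q, |q|^2 - d^2)."""
    A = np.hstack([2.0 * points, -np.ones((len(points), 1))])
    b = np.sum(points**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    q, c = sol[:3], sol[3]
    d = np.sqrt(np.dot(q, q) - c)
    return q, d

# Synthetic pivot motion: one ball centre at distance 0.25 m from a fixed tip.
rng = np.random.default_rng(0)
tip, d_true = np.array([0.4, -0.1, 1.3]), 0.25
dirs = rng.normal(size=(8, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
points = tip + d_true * dirs

q, d = fit_fixed_point(points)
print(np.round(q, 6), round(float(d), 6))
```

Running the same fit once per ball yields each ball-to-tip distance d_i used in formula (8.1).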
10. A method for performing motion calibration and data stitching based on the monocular vision system of claim 3, characterized in that: (10.1) first, the ball target is placed at a suitable position such that the camera can see the ball part of the ball target; (10.2) a world coordinate system O_wX_wY_wZ_w is established by selecting three balls of the ball target in sequence: the origin O_w of the world coordinate system is the center of the first ball; the X_w axis of the world coordinate system is along the vector v_21 from the center of the second ball to the center of the first ball; the Z_w axis of the world coordinate system is Z_w = X_w × v_31, wherein v_31 is the vector from the center of the third ball to the center of the first ball; the Y_w axis of the world coordinate system is Y_w = Z_w × X_w; (10.3) from the step (10.2), the relation [R_w-1, T_w-1] between the camera coordinate system at the first position and the world coordinate system is obtained; when the camera moves to any other position i, the relation between the camera coordinate system at that position and the world coordinate system is [R_w-i, T_w-i]; the camera coordinate system at the first position is set as the reference coordinate system, and the data measured at all other positions are unified into the reference coordinate system, thereby realizing data stitching.
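Steps (10.2) and (10.3) can be sketched as follows. The axis direction conventions are assumed (the translated claim is ambiguous about vector orientation; the sketch uses centre-1-to-centre-2 for X_w, and the stitching consistency check below is unaffected by that sign choice). The centres, point, and camera motion are made-up example values.

```python
import numpy as np

def world_frame(c1, c2, c3):
    """Build a right-handed world frame from three ball centres (step 10.2):
    origin at the first centre, X along the line of the first two centres,
    Z perpendicular to the plane of the three centres, Y completing the frame.
    Returns (R, T): rows of R are the world axes in camera coordinates and
    T is the world origin, so p_world = R @ (p_camera - T)."""
    x = c2 - c1
    x /= np.linalg.norm(x)
    z = np.cross(x, c3 - c1)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return np.vstack([x, y, z]), c1

def to_world(R, T, p):
    return R @ (p - T)

# The same physical point, seen from two camera positions, maps to identical
# world coordinates -- which is what makes data stitching possible (step 10.3).
c = np.array([[0.0, 0.0, 1.0], [0.2, 0.0, 1.0], [0.0, 0.15, 1.05]])
p = np.array([0.05, 0.07, 1.1])                   # a measured point, camera 1

th = 0.4                                          # camera 2 = rotated, shifted view
Rm = np.array([[np.cos(th), -np.sin(th), 0.0],
               [np.sin(th), np.cos(th), 0.0],
               [0.0, 0.0, 1.0]])
t = np.array([0.3, -0.2, 0.1])
c2 = c @ Rm.T + t                                 # centres as seen from camera 2
p2 = Rm @ p + t                                   # the point as seen from camera 2

R1, T1 = world_frame(*c)
R2, T2 = world_frame(*c2)
print(np.allclose(to_world(R1, T1, p), to_world(R2, T2, p2)))  # True
```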
CN201310064021.5A 2013-02-28 2013-02-28 The Portable ball target of single camera vision system and use thereof and measuring method thereof Active CN103162622B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310064021.5A CN103162622B (en) 2013-02-28 2013-02-28 The Portable ball target of single camera vision system and use thereof and measuring method thereof

Publications (2)

Publication Number Publication Date
CN103162622A CN103162622A (en) 2013-06-19
CN103162622B true CN103162622B (en) 2016-06-29

Family

ID=48585915

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310064021.5A Active CN103162622B (en) 2013-02-28 2013-02-28 The Portable ball target of single camera vision system and use thereof and measuring method thereof

Country Status (1)

Country Link
CN (1) CN103162622B (en)

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103759669B (en) * 2014-01-03 2016-11-23 河南科技大学 A kind of monocular vision measuring method of heavy parts
CN104101299A (en) * 2014-08-05 2014-10-15 吉林大学 Camera three-dimensional truss calibrating target for automotive visual detection system
CN104101301B (en) * 2014-08-11 2016-08-03 中铁宝桥集团有限公司 A kind of close-range photogrammetry target
CN104807476A (en) * 2015-04-23 2015-07-29 上海大学 Pose estimation-based quick probe calibration device and method
CN105157604B (en) * 2015-08-07 2017-07-11 天津大学 The quick calibrating method of outfield multi beam Vision Measuring System With Structured Light Stripe
CN106426158A (en) * 2015-08-11 2017-02-22 冯黎 Automatic robot operating procedure correcting system applied in combination with three-dimensional measurement
DE202015009460U1 (en) * 2015-08-12 2017-10-12 Jenoptik Industrial Metrology Germany Gmbh Hole inspection apparatus
CN105606025B (en) * 2016-02-01 2017-06-27 西安交通大学 A kind of method that use laser and monocular camera measure sphere-like object geometric parameter
CN106197283B (en) * 2016-09-23 2020-03-10 广州汽车集团股份有限公司 Coordinate recognizer, use method thereof and measurement system
CN106846413A (en) * 2017-01-24 2017-06-13 太原理工大学 The device and method that three-dimensional tree-shaped rectangular coordinate system is built and image space is demarcated
CN106979748A (en) * 2017-03-16 2017-07-25 北京汽车股份有限公司 Three coordinate detection methods and detecting system based on graphics
CN107146254A (en) * 2017-04-05 2017-09-08 西安电子科技大学 The Camera extrinsic number scaling method of multicamera system
US10466027B2 (en) 2017-06-21 2019-11-05 Fujitsu Ten Corp. Of America System and method for marker placement
CN108007332B (en) * 2017-11-07 2021-01-05 中国核工业二三建设有限公司 Multi-pipe-diameter pipe orifice size measurement auxiliary tool
CN107756408B (en) * 2017-11-22 2020-10-23 浙江优迈德智能装备有限公司 Robot track teaching device and method based on active infrared binocular vision
CN108268427B (en) * 2018-01-10 2021-07-09 中国地质大学(武汉) Arbitrary ball goal probability analysis method and device and storage device
CN109079774B (en) * 2018-05-04 2019-07-09 南京航空航天大学 A kind of isotropism visual sensing three-dimensional spherical target and scaling method
CN108489398B (en) * 2018-05-21 2020-03-17 华南农业大学 Method for measuring three-dimensional coordinates by laser and monocular vision under wide-angle scene
CN108917646B (en) * 2018-07-24 2023-08-22 天津市友发德众钢管有限公司 Global calibration device and method for multi-vision sensor
CN109443206B (en) * 2018-11-09 2020-03-10 山东大学 System and method for measuring tail end pose of mechanical arm based on color spherical light source target
CH715610A1 (en) * 2018-12-04 2020-06-15 Watch Out S A System and methods for measuring the profile of a part.
CN110555902B (en) * 2019-09-10 2021-03-16 中国科学院长春光学精密机械与物理研究所 Monocular vision measurement cooperative target vision simulation system
CN110793458B (en) * 2019-10-30 2022-10-21 成都安科泰丰科技有限公司 Coplane adjusting method for two-dimensional laser displacement sensor
CN112815832B (en) * 2019-11-15 2022-06-07 中国科学院长春光学精密机械与物理研究所 Measuring camera coordinate system calculation method based on 3D target
CN111174725B (en) * 2019-12-31 2024-06-04 吉林大学 Coaxial-earth-connection monocular reconstruction system and method for polar line geometry of concentric secondary curve
CN111504202A (en) * 2020-02-29 2020-08-07 深圳市智信精密仪器有限公司 Method for high-precision calibration splicing of multiple line lasers
CN111256591B (en) * 2020-03-13 2021-11-02 易思维(杭州)科技有限公司 External parameter calibration device and method for structured light sensor
CN111256592B (en) * 2020-03-13 2021-12-03 易思维(杭州)科技有限公司 External parameter calibration device and method for structured light sensor
CN112665523B (en) * 2020-11-24 2022-04-19 北京星航机电装备有限公司 Combined measurement method for complex profile
CN112365407B (en) * 2021-01-13 2021-04-20 西南交通大学 Panoramic stitching method for camera with configurable visual angle
CN113119129A (en) * 2021-04-28 2021-07-16 吕若罡 Monocular distance measurement positioning method based on standard ball
CN113446934B (en) * 2021-06-03 2022-05-17 中国人民解放军海军工程大学 Close-range photogrammetry rotatable code calibration equipment and calibration method
CN113409403B (en) * 2021-06-29 2022-09-16 湖南泽塔科技有限公司 Camera calibration frame, single-image camera calibration method and system based on calibration frame
CN114396904A (en) * 2021-11-29 2022-04-26 北京银河方圆科技有限公司 Positioning device and positioning system
CN114353693B (en) * 2021-12-28 2023-11-28 中国航空工业集团公司北京长城航空测控技术研究所 Special handheld vector lever for large-scale three-dimensional space integral measurement positioning instrument
CN114935316B (en) * 2022-05-20 2024-03-12 长春理工大学 Standard depth image generation method based on optical tracking and monocular vision

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101216296A (en) * 2008-01-11 2008-07-09 天津大学 Binocular vision rotating axis calibration method
CN102062578A (en) * 2010-12-13 2011-05-18 西安交通大学 Handheld optical target for measuring visual coordinate and measuring method thereof
CN102288106A (en) * 2010-06-18 2011-12-21 合肥工业大学 Large-space visual tracking six-dimensional measurement system and method
CN102663767A (en) * 2012-05-08 2012-09-12 北京信息科技大学 Method for calibrating and optimizing camera parameters of vision measuring system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012141810A1 (en) * 2011-03-03 2012-10-18 Faro Technologies, Inc. Target apparatus and method
KR101824501B1 (en) * 2011-05-19 2018-02-01 삼성전자 주식회사 Device and method for controlling display of the image in the head mounted display

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A Stratified Approach for Camera Calibration Using Spheres; Kwan-Yee Wong et al.; IEEE Transactions on Image Processing; 2010-08-03; Vol. 20 (No. 2); 305-316 *
Analysis and Correction of Projection Errors of Sphere Targets Used for Camera Calibration; Gu Feifei; Acta Optica Sinica; 2012-12-31; Vol. 32 (No. 12); 1-7 *

Also Published As

Publication number Publication date
CN103162622A (en) 2013-06-19

Similar Documents

Publication Publication Date Title
CN103162622B (en) The Portable ball target of single camera vision system and use thereof and measuring method thereof
CN108844459B (en) Calibration method and device of blade digital sample plate detection system
CN108921901B (en) Large-view-field camera calibration method based on precise two-axis turntable and laser tracker
CN103759670B (en) A kind of object dimensional information getting method based on numeral up short
CN102364299B (en) Calibration technology for multiple structured light projected three-dimensional profile measuring heads
CN103759669B (en) A kind of monocular vision measuring method of heavy parts
CN105783773B (en) A kind of numerical value scaling method of line structured light vision sensor
CN102207371B (en) Three-dimensional point coordinate measuring method and measuring apparatus thereof
CN104990515B (en) Large-sized object three-dimensional shape measure system and its measuring method
CN100562707C (en) Binocular vision rotating axis calibration method
CN104266608B (en) Field calibration device for visual sensor and calibration method
CN103258329B (en) A kind of camera marking method based on ball one-dimensional
CN102944191B (en) Method and device for three-dimensional vision measurement data registration based on planar circle target
CN112985293B (en) Binocular vision measurement system and measurement method for single-camera double-spherical mirror image
CN105957096A (en) Camera extrinsic parameter calibration method for three-dimensional digital image correlation
CN102155909A (en) Nano-scale three-dimensional shape measurement method based on scanning electron microscope
CN107726975A (en) A kind of error analysis method of view-based access control model stitching measure
CN101261738A (en) A camera marking method based on double 1-dimension drone
CN104268876A (en) Camera calibration method based on partitioning
CN107941153B (en) Visual system for optimizing calibration of laser ranging
CN111091599B (en) Multi-camera-projector system calibration method based on sphere calibration object
CN110378969A (en) A kind of convergence type binocular camera scaling method based on 3D geometrical constraint
CN104634248A (en) Revolving shaft calibration method under binocular vision
CN108195314B (en) Reflective striped three dimension profile measurement method based on more field stitchings
CN110044374A (en) A kind of method and odometer of the monocular vision measurement mileage based on characteristics of image

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant