CN103530880A - Camera calibration method based on projected Gaussian grid pattern

Camera calibration method based on projected Gaussian grid pattern

Info

Publication number
CN103530880A
CN103530880A
Authority
CN
China
Prior art keywords
point
camera
gaussian
striation
coordinate
Prior art date
Legal status
Granted
Application number
CN201310482789.4A
Other languages
Chinese (zh)
Other versions
CN103530880B (en)
Inventor
贾振元
刘巍
李明星
刘阳
杨景豪
张驰
Current Assignee
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN201310482789.4A
Publication of CN103530880A
Application granted
Publication of CN103530880B
Status: Active


Abstract

A camera calibration method based on a projected Gaussian grid pattern, disclosed by the invention, belongs to the field of image processing and computer vision inspection, and in particular relates to an on-site calibration method for the intrinsic and extrinsic parameters of the cameras in a large-forging dimension measurement system. Exploiting the property that the grayscale of the horizontal and vertical light stripes of the Gaussian grid pattern follows a Gaussian distribution across the stripe width, the method fits Gaussian curves to accurately locate the image coordinates of points on the stripe centerlines, fits centerline equations for the horizontal and vertical stripes, and takes the intersections of the horizontal and vertical centerlines as the calibration feature points. From the image coordinates of these feature points in the captured images of the Gaussian grid pattern, the intrinsic and extrinsic camera parameters are obtained step by step. The method offers good real-time performance, high robustness, and high calibration accuracy; step-by-step calibration yields high-precision camera parameters and avoids the coupling problem that arises when all camera parameters are solved simultaneously, making the method suitable for on-line camera calibration at the forging site.

Description

Camera calibration method based on projected Gaussian grid pattern
Technical field
The invention belongs to the field of image processing and computer vision inspection, and in particular relates to an on-site calibration method for the intrinsic and extrinsic parameters of the cameras in a large-forging dimension measurement system.
Background art
One of the basic tasks of computer vision is to recover the three-dimensional geometric information of an object from two-dimensional image information. To use image points to determine the corresponding points on the surface of a space object, the geometric imaging model of the camera must be determined; the parameters of this model are called camera parameters. Camera parameters are divided into intrinsic and extrinsic parameters: the intrinsic parameters are related to the geometric and optical characteristics of the camera itself, while the extrinsic parameters describe the three-dimensional position and orientation of the camera with respect to a given world coordinate system. The process of determining the intrinsic and extrinsic parameters is called camera calibration, and the accuracy of the calibration method directly affects the accuracy of computer vision measurement. Research on fast, simple, and accurate camera calibration is therefore undoubtedly significant.
Traditional camera calibration methods can be divided, according to the calibration reference used, into methods based on a 3D stereo target, methods based on a 2D planar target (represented by the checkerboard target method proposed by Zhang Zhengyou), and methods based on a 1D target. All of these traditional methods require a calibration reference, and for calibrating a large-field-of-view camera, whether the feature points of the reference can evenly cover the whole field of view directly affects the calibration accuracy. On the one hand, high-precision large-scale calibration targets are expensive to make and difficult to maintain. On the other hand, such methods are inapplicable where a calibration reference cannot be used on-line or cannot be used at all: in the high-temperature environment of a forging workshop, calibration blocks, calibration boards, and pasted target points cannot be applied. Traditional camera calibration methods therefore cannot meet the requirements of on-line dimensional measurement of large forgings. In addition, although self-calibration methods use no calibration object at all and can estimate the intrinsic parameters solely from the constraints inherent in the camera itself and the correspondences between image points, such methods, while flexible to operate, offer limited accuracy and insufficient robustness.
The above problems can be addressed by projecting a target with a projector. In theory the edges of the projected feature pattern should be step changes, but in practice, owing to diffusion effects, the pattern edges change gradually and shift toward the dark background. Take a projected array of circular feature spots as an example: the centers of the circular spots are the feature points for calibration, but because each spot diffuses outward to a different degree, it is difficult to obtain an accurate spot center by applying the centroid method to the binarized image, and obtaining a high-precision spot center by extracting the circular feature boundary and fitting a circle (or ellipse) with related algorithms is likewise no easy task. Similarly, if a general light-stripe pattern is projected and the stripe centers are extracted as feature lines, the accuracy is equally hard to guarantee.
Summary of the invention
The technical problem to be solved by the invention is to overcome the deficiencies of the prior art: at the forging site, traditional calibration methods suffer from low accuracy and lack of real-time capability, or cannot be applied at all, while self-calibration methods suffer from limited accuracy and insufficient robustness. The invention provides a large-field-of-view camera calibration method based on a projected Gaussian grid pattern. Exploiting the property that the grayscale of the horizontal and vertical light stripes of the Gaussian grid pattern follows a Gaussian distribution across the stripe width, the method fits Gaussian curves to accurately obtain the image coordinates of points on the stripe centerlines and then fits centerline equations for the horizontal and vertical stripes; the intersections of the horizontal and vertical centerlines are the calibration feature points, and from the image coordinates of these feature points in the captured images of the Gaussian grid pattern, the intrinsic and extrinsic camera parameters are obtained step by step.
The technical scheme adopted by the invention is a camera calibration method based on a projected Gaussian grid pattern, characterized in that it exploits the property that the grayscale of the horizontal and vertical light stripes of the Gaussian grid pattern follows a Gaussian distribution across the stripe width: by fitting Gaussian curves, the image coordinates of points on the stripe centerlines are obtained accurately, centerline equations of the horizontal and vertical stripes are then fitted, the intersections of the horizontal and vertical centerlines serve as calibration feature points, and from the image coordinates of these feature points in the captured images of the Gaussian grid pattern, the intrinsic and extrinsic camera parameters are obtained step by step. The concrete steps are as follows:
Step 1: build the camera calibration system. Mount the left four-dimensional electrically controlled platform 2a, the right four-dimensional electrically controlled platform 2b, and the projector 3 on the table of the vibration isolation platform 1; fix the left camera 4a on the left platform 2a and the right camera 4b on the right platform 2b.
Step 2: project the Gaussian grid pattern, capture images, and obtain the intersection coordinates. The projector 3 projects onto a painted flat plate or wall 5 in the workshop a Gaussian grid pattern 6 composed of several parallel horizontal light stripes and several parallel vertical light stripes; the grayscale of each stripe follows a Gaussian distribution across its width. The intersection points A_{i,j} of the horizontal and vertical stripes are the calibration feature points, where i numbers the horizontal stripes from top to bottom and j numbers the vertical stripes from left to right. Because the light intensities of the horizontal and vertical stripes superimpose at the intersections, binarizing the images captured by the left camera 4a and the right camera 4b leaves only the bright spots at the grid intersections, i.e. isolated connected regions. The centroid method gives the centroid coordinates (u0_{i,j}, v0_{i,j}) of each connected region, which serve as the rough position of the feature point A_{i,j}. Taking this rough position as the center, a circular region of radius Δ pixels is used as the search range. Within [u0_{i,j}-Δ, u0_{i,j}+Δ], a cross-section of the horizontal stripe is taken every Δ/n along the width direction and fitted according to the Gaussian distribution characteristic, with the Gaussian peak taken as a point on the horizontal stripe centerline; this yields 2n+1 centerline points P_{i,j,s}, s = 1, 2, 3, ..., 2n+1, from which the straight line l_{h,i,j} is fitted. Similarly, within [v0_{i,j}-Δ, v0_{i,j}+Δ] a cross-section of the vertical stripe is taken every Δ/n along the width direction and fitted according to the Gaussian distribution characteristic, with the Gaussian peak taken as a point on the vertical stripe centerline; this yields 2n+1 centerline points Q_{i,j,t}, t = 1, 2, 3, ..., 2n+1, from which the straight line l_{v,i,j} is fitted. Finally, the intersection of the two fitted lines within the same search range is taken as the calibration feature point A_{i,j}, with coordinates (u_{i,j}, v_{i,j}).
Step 3: obtain the rough coordinates of the principal point. Use the left camera 4a or the right camera 4b to capture the same projected Gaussian grid pattern 6 at two different focal lengths. With the image coordinates of feature point A_{i,j} in the two images denoted (u1_{i,j}, v1_{i,j}) and (u2_{i,j}, v2_{i,j}) and the principal point coordinates denoted (u_0, v_0), we have:
$$\frac{u2_{i,j} - u_0}{u1_{i,j} - u_0} = \frac{v2_{i,j} - v_0}{v1_{i,j} - v_0} \qquad (2)$$
The rough position (u_0, v_0) of the principal point can be obtained from the above formula.
Step 4: solve for the distortion coefficients and the optimized principal point coordinates. According to the distortion model, the relation between the actually captured coordinates p_{i,j} = (u_{i,j}, v_{i,j}, 1)^T of intersection A_{i,j} and the ideal intersection coordinates q_{i,j} = (u'_{i,j}, v'_{i,j}, 1)^T is:
$$\begin{cases} u_{i,j} = u'_{i,j} + (u'_{i,j}-u_0)\left(k_1 r_{i,j}^2 + k_2 r_{i,j}^4\right) + 2p_1 (u'_{i,j}-u_0)(v'_{i,j}-v_0) + p_2\left(r_{i,j}^2 + 2(u'_{i,j}-u_0)^2\right) \\ v_{i,j} = v'_{i,j} + (v'_{i,j}-v_0)\left(k_1 r_{i,j}^2 + k_2 r_{i,j}^4\right) + p_1\left(r_{i,j}^2 + 2(v'_{i,j}-v_0)^2\right) + 2p_2 (u'_{i,j}-u_0)(v'_{i,j}-v_0) \end{cases} \qquad (3)$$

where r_{i,j}^2 = (u'_{i,j}-u_0)^2 + (v'_{i,j}-v_0)^2, k_1 and k_2 are the radial distortion coefficients, and p_1 and p_2 are the tangential distortion coefficients.
Further, taking the case in which the numbers of horizontal and vertical light stripes are equal, with num intersection points in total, each row contains √num intersection points. According to the line-preserving property of imaging, i.e. the collinearity of the points on the same stripe, combined with the necessary and sufficient condition for three points to be collinear, the following objective function can be formulated:
$$\min \sum_{i=1}^{\sqrt{num}} \sum_{j=1}^{\sqrt{num}-2} \left| q_{i,j+1}^{T}\, [q_{i,j}]_{\times}\, q_{i,j+2} \right| \qquad (4)$$
where q_{i,j+1}^T [q_{i,j}]_× q_{i,j+2} = 0 is the necessary and sufficient condition for the three points A_{i,j}, A_{i,j+1}, A_{i,j+2} to be collinear, and [q_{i,j}]_× denotes the antisymmetric matrix of q_{i,j}, that is:
$$[q_{i,j}]_{\times} = \begin{bmatrix} 0 & -1 & v'_{i,j} \\ 1 & 0 & -u'_{i,j} \\ -v'_{i,j} & u'_{i,j} & 0 \end{bmatrix} \qquad (5)$$
Optimizing with the Levenberg-Marquardt nonlinear algorithm to minimize the objective function of formula (4) yields the distortion coefficients k_1, k_2, p_1, and p_2 and the optimized principal point coordinates (u'_0, v'_0). Formula (3) is then used to correct all intersection coordinates to their ideal values.
Step 5: solve for the remaining intrinsic parameters of the cameras. Using the corrected ideal intersections as feature points and adopting an active-vision approach, the left four-dimensional electrically controlled platform 2a drives the left camera 4a through two groups of orthogonal motions; an image of the projected Gaussian grid pattern 6 is captured at the three start and end positions of each group, so that the left camera 4a captures six images in total.
Parallel lines intersect the plane at infinity at the same point at infinity, whose image is the vanishing point. Each group of orthogonal motions contains two translations; for one translation, the lines joining corresponding intersections in the two images captured at the start and end positions form a group of parallel lines in space, and since the two translations are orthogonal, a pair of orthogonal vanishing points e_{i1}(u_{i1}, v_{i1}) and e_{i2}(u_{i2}, v_{i2}) is obtained, where i = 1, 2 denotes the order of the orthogonal motions and Oe_{i1} · Oe_{i2} = 0, O being the principal point of the camera. Using the two pairs of orthogonal vanishing points, the scale factors α_x and α_y in the intrinsic matrix K of the left camera 4a are found by solving the following system of equations:
$$\begin{cases} (u_{11}-u'_0)(u_{12}-u'_0)/\alpha_x^2 + (v_{11}-v'_0)(v_{12}-v'_0)/\alpha_y^2 + 1 = 0 \\ (u_{21}-u'_0)(u_{22}-u'_0)/\alpha_x^2 + (v_{21}-v'_0)(v_{22}-v'_0)/\alpha_y^2 + 1 = 0 \end{cases} \qquad (6)$$
Similarly, the scale factors α_x and α_y in the intrinsic matrix K of the right camera 4b can be obtained.
Step 6: obtain the extrinsic parameters of the cameras. The left camera 4a and the right camera 4b capture the same projected Gaussian grid pattern 6; the world coordinate system is set on the camera coordinate system of the left camera 4a, and the corrected matched points from the images of the left and right cameras 4a and 4b are used to compute the fundamental matrix F. From the intrinsic parameters and the fundamental matrix, the essential matrix E can be computed up to a scale factor s. Decomposing the essential matrix E determines the extrinsic parameters (rotation matrix R' and translation vector t') up to the same scale factor.
The projector 3 then projects parallel Gaussian stripes onto a gauge block whose physical length L_0 has been accurately measured. The Gaussian characteristic of the stripes is used to fit sub-pixel stripe centerlines, and points of abrupt grayscale change are taken as the boundary points of the gauge block. Reconstructing the gauge block length L'_0 with the intrinsic and extrinsic parameters obtained above gives the scale factor s = L_0/L'_0, so the actual extrinsic parameters of the cameras are obtained (rotation matrix R = R' and translation vector t = s·t'). This completes the camera calibration process.
The beneficial effect of the invention is that adopting the intersections of a projected Gaussian grid pattern as calibration feature points avoids the use of calibration blocks, calibration boards, and pasted marker points, making real-time camera calibration practical in complex environments such as the forging site. Because the grayscale across the stripe width follows a Gaussian distribution, the positions of the calibration feature points can be determined accurately, giving high robustness; step-by-step calibration yields high-precision camera parameters while avoiding the coupling problem that arises when all camera parameters are solved simultaneously.
Brief description of the drawings
Fig. 1 is a schematic diagram of the calibration system of the invention. In the figure: 1 — vibration isolation platform; 2a — left four-dimensional electrically controlled platform; 2b — right four-dimensional electrically controlled platform; 3 — projector; 4a — left camera; 4b — right camera; 5 — painted flat plate or wall; 6 — Gaussian grid pattern.
Fig. 2 is an image of the grid pattern captured by a camera according to the invention.
Fig. 3 shows the rough intersection positions obtained by binarizing the image of the grid pattern.
Fig. 4 illustrates reconstructing the gauge block dimension from the Gaussian stripe array to obtain the scale factor.
Embodiment
The specific embodiments of the invention are described further below in conjunction with the accompanying drawings and the technical scheme.
Camera calibration usually adopts the classical pinhole imaging model, whose expression is as follows:
$$\rho_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\,[R\,|\,t] \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \qquad K = \begin{bmatrix} \alpha_x & 0 & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \qquad (1)$$
Here (X_w, Y_w, Z_w, 1)^T are the homogeneous coordinates of a space point in the world coordinate system; (u, v, 1)^T are the homogeneous pixel coordinates of the corresponding image point in the image coordinate system o_0uv; α_x = f/dx is the scale factor along the u axis and α_y = f/dy the scale factor along the v axis, where f is the lens focal length and dx and dy are the horizontal and vertical physical dimensions of a pixel; (u_0, v_0) are the principal point coordinates; ρ_c is a proportionality coefficient; K is the camera intrinsic matrix; and [R|t] is the camera extrinsic matrix, where R is the rotation matrix and t the translation vector.
The camera intrinsic parameters comprise the principal point coordinates (u_0, v_0), the scale factors α_x and α_y, the radial distortion coefficients k_1 and k_2, and the tangential distortion coefficients p_1 and p_2. The camera extrinsic parameters describe the orientation of the camera coordinate system with respect to the world coordinate system and comprise the rotation matrix R and the translation vector t.
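For illustration, model (1) can be exercised directly in code. The following is a minimal sketch in Python with NumPy; the intrinsic values, rotation, and translation below are hypothetical placeholders rather than values taken from the invention:

```python
import numpy as np

def project_point(Xw, K, R, t):
    """Project a 3D world point to pixel coordinates with pinhole model (1)."""
    Xc = R @ Xw + t              # [R|t] applied to (Xw, Yw, Zw)
    uvw = K @ Xc                 # rho_c * (u, v, 1)^T; rho_c is the depth uvw[2]
    return uvw[:2] / uvw[2]      # (u, v) pixel coordinates

# Hypothetical intrinsics: alpha_x = alpha_y = 1200 px, principal point (640, 512)
K = np.array([[1200.0,    0.0, 640.0],
              [   0.0, 1200.0, 512.0],
              [   0.0,    0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)    # world frame coinciding with the camera frame
print(project_point(np.array([0.5, 0.2, 5.0]), K, R, t))  # -> [760. 560.]
```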
Step 1: build the camera calibration system. Mount the left four-dimensional electrically controlled platform 2a, the right four-dimensional electrically controlled platform 2b, and the projector 3 on the table of the vibration isolation platform 1; fix the left camera 4a on the left platform 2a and the right camera 4b on the right platform 2b, as shown in Fig. 1.
Step 2: project the Gaussian grid pattern, capture images, and obtain the intersection coordinates. The projector 3 projects the Gaussian grid pattern 6 onto a painted flat plate or wall 5 in the workshop; the pattern 6 is composed of several parallel horizontal light stripes and several parallel vertical light stripes, and the grayscale of each stripe follows a Gaussian distribution across its width. The intersection points A_{i,j} of the horizontal and vertical stripes are the calibration feature points, where i numbers the horizontal stripes from top to bottom and j numbers the vertical stripes from left to right. An image of the projected Gaussian grid pattern captured by the left camera 4a or the right camera 4b is shown in Fig. 2. Because the light intensities of the horizontal and vertical stripes superimpose at the intersections, binarizing the images captured by the left camera 4a and the right camera 4b leaves only the bright spots at the grid intersections, i.e. isolated connected regions, as shown in Fig. 3. The centroid method gives the centroid coordinates (u0_{i,j}, v0_{i,j}) of each connected region, which serve as the rough position of the feature point A_{i,j}. Taking this rough position as the center, a circular region of radius Δ pixels is used as the search range. Within [u0_{i,j}-Δ, u0_{i,j}+Δ], a cross-section of the horizontal stripe is taken every Δ/n along the width direction and fitted according to the Gaussian distribution characteristic, with the Gaussian peak taken as a point on the horizontal stripe centerline; this yields 2n+1 centerline points P_{i,j,s}, s = 1, 2, 3, ..., 2n+1, from which the straight line l_{h,i,j} is fitted. Similarly, within [v0_{i,j}-Δ, v0_{i,j}+Δ] a cross-section of the vertical stripe is taken every Δ/n along the width direction and fitted according to the Gaussian distribution characteristic, with the Gaussian peak taken as a point on the vertical stripe centerline; this yields 2n+1 centerline points Q_{i,j,t}, t = 1, 2, 3, ..., 2n+1, from which the straight line l_{v,i,j} is fitted. Finally, the intersection of the two fitted lines within the same search range is taken as the calibration feature point A_{i,j}, with coordinates (u_{i,j}, v_{i,j}).
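The heart of step 2 is the Gaussian fit across the stripe width. The sketch below assumes a 1D grayscale profile has already been sampled across a stripe and uses SciPy's curve_fit as a stand-in for whatever fitting routine an implementation would use; the function names and the synthetic profile are illustrative, not part of the invention:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma, b):
    """Gaussian profile: peak height a, center mu, width sigma, background b."""
    return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) + b

def stripe_center(positions, gray):
    """Fit a Gaussian to a cross-width grayscale profile; mu is a centerline point."""
    p0 = [gray.max() - gray.min(), positions[gray.argmax()], 2.0, gray.min()]
    popt, _ = curve_fit(gaussian, positions, gray, p0=p0)
    return popt[1]                       # sub-pixel stripe center

def line_intersection(l1, l2):
    """Intersection of two lines (a, b, c), a*u + b*v + c = 0, via the cross product."""
    p = np.cross(l1, l2)                 # homogeneous intersection point
    return p[:2] / p[2]

# Synthetic profile: a stripe centered at 12.3 px plus noise
x = np.arange(25, dtype=float)
g = gaussian(x, 180.0, 12.3, 2.5, 20.0) + np.random.normal(0.0, 1.0, x.size)
print(stripe_center(x, g))               # close to 12.3
```

Fitting a straight line through the 2n+1 recovered peak points (e.g. with np.polyfit) gives l_{h,i,j} and l_{v,i,j}, and line_intersection then yields the feature point A_{i,j}.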
Step 3: obtain the rough coordinates of the principal point. The principal point is found with the zoom method: the left camera 4a or the right camera 4b captures the same projected Gaussian grid pattern 6 at two different focal lengths. With the image coordinates of feature point A_{i,j} in the two images denoted (u1_{i,j}, v1_{i,j}) and (u2_{i,j}, v2_{i,j}) and the principal point coordinates denoted (u_0, v_0), we have:
$$\frac{u2_{i,j} - u_0}{u1_{i,j} - u_0} = \frac{v2_{i,j} - v_0}{v1_{i,j} - v_0} \qquad (2)$$
Treating the zoom center of the lens as the principal point for the time being, the rough position (u_0, v_0) of the principal point can be obtained from the above formula.
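Formula (2) states that the principal point is collinear with the two images of each feature point taken at the two focal lengths; cross-multiplying turns every feature point into one linear equation in (u_0, v_0). The following least-squares sketch is illustrative, with synthetic data standing in for real zoom-pair measurements:

```python
import numpy as np

def principal_point(pts1, pts2):
    """Least-squares principal point from formula (2).

    pts1, pts2: (N, 2) arrays of matched feature points at two focal lengths.
    Cross-multiplying (2) gives, per point:
        u0*(v2 - v1) + v0*(u1 - u2) = u1*v2 - u2*v1
    """
    u1, v1 = pts1[:, 0], pts1[:, 1]
    u2, v2 = pts2[:, 0], pts2[:, 1]
    A = np.column_stack([v2 - v1, u1 - u2])
    b = u1 * v2 - u2 * v1
    (u0, v0), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u0, v0

# Synthetic check: zooming by 1.4x expands points away from a true principal
# point at (650, 500)
rng = np.random.default_rng(0)
p1 = rng.uniform(100.0, 1100.0, (40, 2))
p2 = np.array([650.0, 500.0]) + 1.4 * (p1 - np.array([650.0, 500.0]))
print(principal_point(p1, p2))           # close to (650, 500)
```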
Step 4: solve for the distortion coefficients and the optimized principal point coordinates. According to the distortion model, the relation between the actually captured coordinates p_{i,j} = (u_{i,j}, v_{i,j}, 1)^T of intersection A_{i,j} and the ideal intersection coordinates q_{i,j} = (u'_{i,j}, v'_{i,j}, 1)^T is:
$$\begin{cases} u_{i,j} = u'_{i,j} + (u'_{i,j}-u_0)\left(k_1 r_{i,j}^2 + k_2 r_{i,j}^4\right) + 2p_1 (u'_{i,j}-u_0)(v'_{i,j}-v_0) + p_2\left(r_{i,j}^2 + 2(u'_{i,j}-u_0)^2\right) \\ v_{i,j} = v'_{i,j} + (v'_{i,j}-v_0)\left(k_1 r_{i,j}^2 + k_2 r_{i,j}^4\right) + p_1\left(r_{i,j}^2 + 2(v'_{i,j}-v_0)^2\right) + 2p_2 (u'_{i,j}-u_0)(v'_{i,j}-v_0) \end{cases} \qquad (3)$$

where r_{i,j}^2 = (u'_{i,j}-u_0)^2 + (v'_{i,j}-v_0)^2, k_1 and k_2 are the radial distortion coefficients, and p_1 and p_2 are the tangential distortion coefficients.
Further, taking the case in which the numbers of horizontal and vertical light stripes are equal, with num intersection points in total, each row contains √num intersection points. According to the line-preserving property of imaging, i.e. the collinearity of the points on the same stripe, combined with the necessary and sufficient condition for three points to be collinear, the following objective function can be formulated:
$$\min \sum_{i=1}^{\sqrt{num}} \sum_{j=1}^{\sqrt{num}-2} \left| q_{i,j+1}^{T}\, [q_{i,j}]_{\times}\, q_{i,j+2} \right| \qquad (4)$$
where q_{i,j+1}^T [q_{i,j}]_× q_{i,j+2} = 0 is the necessary and sufficient condition for the three points A_{i,j}, A_{i,j+1}, A_{i,j+2} to be collinear, and [q_{i,j}]_× denotes the antisymmetric matrix of q_{i,j}, that is:
$$[q_{i,j}]_{\times} = \begin{bmatrix} 0 & -1 & v'_{i,j} \\ 1 & 0 & -u'_{i,j} \\ -v'_{i,j} & u'_{i,j} & 0 \end{bmatrix} \qquad (5)$$
Optimizing with the Levenberg-Marquardt nonlinear algorithm to minimize the objective function of formula (4) yields the distortion coefficients k_1, k_2, p_1, and p_2 and the optimized principal point coordinates (u'_0, v'_0). Formula (3) is then used to correct all intersection coordinates to their ideal values.
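A sketch of step 4 under stated assumptions: the intersections have already been grouped row by row along each stripe, a single correction step approximates the inversion of model (3), and SciPy's Levenberg-Marquardt solver (least_squares with method='lm') stands in for the L-M optimization named in the text; it needs more point triples than the six unknowns. All helper names are illustrative:

```python
import numpy as np
from scipy.optimize import least_squares

def undistort(params, pts):
    """Approximately invert model (3): map measured points to ideal points.

    params = (u0, v0, k1, k2, p1, p2); pts is an (N, 2) array. A single
    correction step, evaluated at the measured points, is used for simplicity.
    """
    u0, v0, k1, k2, p1, p2 = params
    x, y = pts[:, 0] - u0, pts[:, 1] - v0
    r2 = x * x + y * y
    radial = k1 * r2 + k2 * r2 ** 2
    du = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    dv = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.column_stack([pts[:, 0] - du, pts[:, 1] - dv])

def collinearity_residuals(params, rows):
    """Residuals of objective (4): triple products over consecutive point triples."""
    res = []
    for row in rows:                      # each row: (m, 2) intersections on one stripe
        q = undistort(params, row)
        qh = np.column_stack([q, np.ones(len(q))])   # homogeneous coordinates
        for j in range(len(qh) - 2):
            # q_{j+1}^T [q_j]_x q_{j+2}: zero iff the three points are collinear
            res.append(qh[j + 1] @ np.cross(qh[j], qh[j + 2]))
    return np.asarray(res)

def optimize_distortion(rows, u0_init, v0_init):
    """Return the optimized (u0, v0, k1, k2, p1, p2)."""
    x0 = np.array([u0_init, v0_init, 0.0, 0.0, 0.0, 0.0])
    return least_squares(collinearity_residuals, x0, args=(rows,), method="lm").x
```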
Step 5: solve for the remaining intrinsic parameters of the cameras. Using the corrected ideal intersections as feature points and adopting an active-vision approach, the left four-dimensional electrically controlled platform 2a drives the left camera 4a through two groups of orthogonal motions, and an image of the projected Gaussian grid pattern 6 is captured at the three start and end positions of each group, so that the left camera 4a captures six images in total. The concrete flow is as follows: (1) adjust the left platform 2a to a suitable position and start the first group of orthogonal motions, capturing one image of the projected Gaussian grid pattern 6 at each of the three start and end positions of the motion; (2) use the left platform 2a to tilt the left camera 4a downward by a certain angle and start the second group of orthogonal motions, again capturing one image of the projected Gaussian grid pattern 6 at each of the three start and end positions.
The image of the point at infinity of a straight line is called the vanishing point of that line. Since parallel lines intersect the plane at infinity at the same point at infinity, they share a vanishing point. Each group of orthogonal motions contains two translations; for one translation, the lines joining corresponding intersections in the two images captured at the start and end positions form a group of parallel lines in space, and since the two translations are orthogonal, a pair of orthogonal vanishing points e_{i1}(u_{i1}, v_{i1}) and e_{i2}(u_{i2}, v_{i2}) is obtained, where the subscript i = 1, 2 denotes the order of the orthogonal motions and Oe_{i1} · Oe_{i2} = 0, O being the principal point of the camera. Using the two pairs of orthogonal vanishing points, the scale factors α_x and α_y in the intrinsic matrix K of the left camera 4a are found by solving the following system of equations:
$$\begin{cases} (u_{11}-u'_0)(u_{12}-u'_0)/\alpha_x^2 + (v_{11}-v'_0)(v_{12}-v'_0)/\alpha_y^2 + 1 = 0 \\ (u_{21}-u'_0)(u_{22}-u'_0)/\alpha_x^2 + (v_{21}-v'_0)(v_{22}-v'_0)/\alpha_y^2 + 1 = 0 \end{cases} \qquad (6)$$
Similarly, the scale factors α_x and α_y in the intrinsic matrix K of the right camera 4b can be obtained.
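With the substitution X = 1/α_x² and Y = 1/α_y², system (6) is linear in X and Y and can be solved directly. A sketch with illustrative argument names; the synthetic vanishing points in the check are constructed from hypothetical intrinsics (α_x = 1200, α_y = 1150, principal point (640, 512)):

```python
import numpy as np

def scale_factors(e11, e12, e21, e22, u0, v0):
    """Solve system (6) for (alpha_x, alpha_y) from two orthogonal vanishing-point pairs.

    Substituting X = 1/alpha_x^2, Y = 1/alpha_y^2 makes (6) linear:
        a_i * X + b_i * Y = -1,  i = 1, 2.
    """
    pairs = [(e11, e12), (e21, e22)]
    A = np.array([[(p[0] - u0) * (q[0] - u0), (p[1] - v0) * (q[1] - v0)]
                  for p, q in pairs])
    X, Y = np.linalg.solve(A, np.array([-1.0, -1.0]))
    if X <= 0 or Y <= 0:
        raise ValueError("vanishing points inconsistent with orthogonality")
    return 1.0 / np.sqrt(X), 1.0 / np.sqrt(Y)

# Check: vanishing points of two orthogonal direction pairs under the
# hypothetical intrinsics above
print(scale_factors((1840.0, 512.0), (-560.0, 512.0),
                    (640.0, 1662.0), (640.0, -638.0), 640.0, 512.0))
# -> (1200.0, 1150.0)
```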
Step 6: obtain the extrinsic parameters. The left camera 4a and the right camera 4b capture the same projected Gaussian grid pattern 6; the world coordinate system is set on the camera coordinate system of the left camera 4a, and the corrected matched points from the images of the left and right cameras 4a and 4b are used to compute the fundamental matrix F. From the intrinsic parameters and the fundamental matrix, the essential matrix E can be computed up to a scale factor s. Decomposing the essential matrix E determines the extrinsic parameters (rotation matrix R' and translation vector t') up to the same scale factor.
As shown in Fig. 4, the projector 3 then projects parallel Gaussian stripes onto a gauge block whose physical length L_0 has been accurately measured. The Gaussian characteristic of the stripes is used to fit sub-pixel stripe centerlines, and points of abrupt grayscale change are taken as the boundary points of the gauge block. Reconstructing the gauge block length L'_0 with the intrinsic and extrinsic parameters obtained above gives the scale factor s = L_0/L'_0, so the actual extrinsic parameters of the cameras are obtained (rotation matrix R = R' and translation vector t = s·t'). This completes the camera calibration process.
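A sketch of the extrinsic recovery in step 6, with OpenCV routines standing in for the fundamental-matrix, essential-matrix, and decomposition computations described above; the gauge-block lengths are hypothetical inputs, and cv2.recoverPose takes a single camera matrix, which assumes the two cameras have similar intrinsics:

```python
import cv2
import numpy as np

def extrinsics_with_scale(pts_l, pts_r, K_l, K_r, L0, L0_reconstructed):
    """Recover R and t between the cameras and fix the metric scale.

    pts_l, pts_r: (N, 2) float arrays (N >= 8) of corrected matched intersections.
    L0: measured gauge-block length; L0_reconstructed: its length in the
    up-to-scale reconstruction.
    """
    # Fundamental matrix from the corrected matches
    F, _ = cv2.findFundamentalMat(pts_l, pts_r, cv2.FM_8POINT)
    # Essential matrix from the intrinsics and F (defined up to scale)
    E = K_r.T @ F @ K_l
    # Decompose E into rotation R' and a unit-norm translation t'
    _, R_prime, t_prime, _ = cv2.recoverPose(E, pts_l, pts_r, K_l)
    # Gauge block fixes the scale: s = L0 / L0'
    s = L0 / L0_reconstructed
    return R_prime, s * t_prime              # R = R', t = s * t'
```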
The camera calibration method proposed by the invention has good real-time performance, high robustness, and high calibration accuracy, and can be used for on-line calibration of large-field-of-view cameras in complex environments such as the forging site.

Claims (1)

1. A camera calibration method based on a projected Gaussian grid pattern, characterized in that the method exploits the property that the grayscale of the horizontal and vertical light stripes of the Gaussian grid pattern follows a Gaussian distribution across the stripe width: by fitting Gaussian curves, the image coordinates of points on the stripe centerlines are obtained accurately, centerline equations of the horizontal and vertical stripes are then fitted, the intersections of the horizontal and vertical centerlines serve as calibration feature points, and from the image coordinates of these feature points in the captured images of the Gaussian grid pattern, the intrinsic and extrinsic camera parameters are obtained step by step; the concrete steps are as follows:
Step 1: build the camera calibration system; mount the left four-dimensional electrically controlled platform (2a), the right four-dimensional electrically controlled platform (2b), and the projector (3) on the table of the vibration isolation platform (1); fix the left camera (4a) on the left platform (2a) and the right camera (4b) on the right platform (2b);
Step 2: project the Gaussian grid pattern, capture images, and obtain the intersection coordinates; the projector (3) projects onto a painted flat plate or wall (5) in the workshop a Gaussian grid pattern (6) composed of several parallel horizontal light stripes and several parallel vertical light stripes, the grayscale of each stripe following a Gaussian distribution across its width; the intersection points A_{i,j} of the horizontal and vertical stripes are the calibration feature points, where i numbers the horizontal stripes from top to bottom and j numbers the vertical stripes from left to right; because the light intensities of the horizontal and vertical stripes superimpose at the intersections, binarizing the images captured by the left camera (4a) and the right camera (4b) leaves only the bright spots at the grid intersections, i.e. isolated connected regions; the centroid method gives the centroid coordinates (u0_{i,j}, v0_{i,j}) of each connected region, which serve as the rough position of the feature point A_{i,j}; taking this rough position as the center, a circular region of radius Δ pixels is used as the search range; within [u0_{i,j}-Δ, u0_{i,j}+Δ], a cross-section of the horizontal stripe is taken every Δ/n along the width direction and fitted according to the Gaussian distribution characteristic, with the Gaussian peak taken as a point on the horizontal stripe centerline, yielding 2n+1 centerline points P_{i,j,s}, s = 1, 2, 3, ..., 2n+1, from which the straight line l_{h,i,j} is fitted; similarly, within [v0_{i,j}-Δ, v0_{i,j}+Δ] a cross-section of the vertical stripe is taken every Δ/n along the width direction and fitted according to the Gaussian distribution characteristic, with the Gaussian peak taken as a point on the vertical stripe centerline, yielding 2n+1 centerline points Q_{i,j,t}, t = 1, 2, 3, ..., 2n+1, from which the straight line l_{v,i,j} is fitted; finally, the intersection of the two fitted lines within the same search range is taken as the calibration feature point A_{i,j}, with coordinates (u_{i,j}, v_{i,j});
Step 3: obtain the rough coordinates of the principal point; the left camera (4a) or the right camera (4b) captures the same projected Gaussian grid pattern (6) at two different focal lengths; with the image coordinates of feature point A_{i,j} in the two images denoted (u1_{i,j}, v1_{i,j}) and (u2_{i,j}, v2_{i,j}) and the principal point coordinates denoted (u_0, v_0), we have:
$$\frac{u2_{i,j} - u_0}{u1_{i,j} - u_0} = \frac{v2_{i,j} - v_0}{v1_{i,j} - v_0} \qquad (2)$$
the rough position (u_0, v_0) of the principal point can be obtained from the above formula;
Step 4: solve for the distortion coefficients and the optimized principal point coordinates; according to the distortion model, the relation between the actually captured coordinates p_{i,j} = (u_{i,j}, v_{i,j}, 1)^T of intersection A_{i,j} and the ideal intersection coordinates q_{i,j} = (u'_{i,j}, v'_{i,j}, 1)^T is:
$$\begin{cases} u_{i,j} = u'_{i,j} + (u'_{i,j}-u_0)\left(k_1 r_{i,j}^2 + k_2 r_{i,j}^4\right) + 2p_1 (u'_{i,j}-u_0)(v'_{i,j}-v_0) + p_2\left(r_{i,j}^2 + 2(u'_{i,j}-u_0)^2\right) \\ v_{i,j} = v'_{i,j} + (v'_{i,j}-v_0)\left(k_1 r_{i,j}^2 + k_2 r_{i,j}^4\right) + p_1\left(r_{i,j}^2 + 2(v'_{i,j}-v_0)^2\right) + 2p_2 (u'_{i,j}-u_0)(v'_{i,j}-v_0) \end{cases} \qquad (3)$$

where r_{i,j}^2 = (u'_{i,j}-u_0)^2 + (v'_{i,j}-v_0)^2, k_1 and k_2 are the radial distortion coefficients, and p_1 and p_2 are the tangential distortion coefficients;
further, taking the case in which the numbers of horizontal and vertical light stripes are equal, with num intersection points in total, each row contains √num intersection points; according to the line-preserving property of imaging, i.e. the collinearity of the points on the same stripe, combined with the necessary and sufficient condition for three points to be collinear, the following objective function can be formulated:
$$\min \sum_{i=1}^{\sqrt{num}} \sum_{j=1}^{\sqrt{num}-2} \left| q_{i,j+1}^{T}\, [q_{i,j}]_{\times}\, q_{i,j+2} \right| \qquad (4)$$
where q_{i,j+1}^T [q_{i,j}]_× q_{i,j+2} = 0 is the necessary and sufficient condition for the three points A_{i,j}, A_{i,j+1}, A_{i,j+2} to be collinear, and [q_{i,j}]_× denotes the antisymmetric matrix of q_{i,j}, that is:
$$[q_{i,j}]_{\times} = \begin{bmatrix} 0 & -1 & v'_{i,j} \\ 1 & 0 & -u'_{i,j} \\ -v'_{i,j} & u'_{i,j} & 0 \end{bmatrix} \qquad (5)$$
optimizing with the Levenberg-Marquardt nonlinear algorithm to minimize the objective function of formula (4) yields the distortion coefficients k_1, k_2, p_1, and p_2 and the optimized principal point coordinates (u'_0, v'_0); formula (3) is then used to correct all intersection coordinates to their ideal values;
Step 5: solve for the remaining intrinsic parameters of the cameras; using the corrected ideal intersections as feature points and adopting an active-vision approach, the left four-dimensional electrically controlled platform (2a) drives the left camera (4a) through two groups of orthogonal motions, an image of the projected Gaussian grid pattern (6) being captured at the three start and end positions of each group, so that the left camera (4a) captures six images in total;
parallel lines intersect the plane at infinity at the same point at infinity, whose image is the vanishing point; each group of orthogonal motions contains two translations; for one translation, the lines joining corresponding intersections in the two images captured at the start and end positions form a group of parallel lines in space, and since the two translations are orthogonal, a pair of orthogonal vanishing points e_{i1}(u_{i1}, v_{i1}) and e_{i2}(u_{i2}, v_{i2}) is obtained, where the subscript i = 1, 2 denotes the order of the orthogonal motions and Oe_{i1} · Oe_{i2} = 0, O = (u'_0, v'_0) being the principal point of the camera; using the two pairs of orthogonal vanishing points, the scale factors α_x and α_y in the intrinsic matrix K of the left camera (4a) are found by solving the following system of equations:
$$\begin{cases} (u_{11}-u'_0)(u_{12}-u'_0)/\alpha_x^2 + (v_{11}-v'_0)(v_{12}-v'_0)/\alpha_y^2 + 1 = 0 \\ (u_{21}-u'_0)(u_{22}-u'_0)/\alpha_x^2 + (v_{21}-v'_0)(v_{22}-v'_0)/\alpha_y^2 + 1 = 0 \end{cases} \qquad (6)$$
likewise, the scale factors α_x and α_y in the intrinsic matrix K of the right camera (4b) can be obtained;
Step 6: obtain the extrinsic parameters of the cameras; the left camera (4a) and the right camera (4b) capture the same projected Gaussian grid pattern (6); the world coordinate system is set on the camera coordinate system of the left camera (4a), and the corrected matched points from the images of the left and right cameras (4a) and (4b) are used to compute the fundamental matrix F; from the intrinsic parameters and the fundamental matrix, the essential matrix E can be computed up to a scale factor s; decomposing the essential matrix E determines the extrinsic parameters (rotation matrix R' and translation vector t') up to the same scale factor;
the projector (3) then projects parallel Gaussian stripes onto a gauge block whose physical length L_0 has been accurately measured; the Gaussian characteristic of the stripes is used to fit sub-pixel stripe centerlines, and points of abrupt grayscale change are taken as the boundary points of the gauge block; reconstructing the gauge block length L'_0 with the intrinsic and extrinsic parameters obtained above gives the scale factor s = L_0/L'_0; the actual extrinsic parameters of the cameras then follow (rotation matrix R = R' and translation vector t = s·t'), completing the camera calibration process.
CN201310482789.4A 2013-10-16 2013-10-16 Camera calibration method based on projected Gaussian grid pattern Active CN103530880B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310482789.4A CN103530880B (en) 2013-10-16 2013-10-16 Camera calibration method based on projected Gaussian grid pattern

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310482789.4A CN103530880B (en) 2013-10-16 2013-10-16 Camera calibration method based on projected Gaussian grid pattern

Publications (2)

Publication Number Publication Date
CN103530880A true CN103530880A (en) 2014-01-22
CN103530880B CN103530880B (en) 2016-04-06

Family

ID=49932859

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310482789.4A Active CN103530880B (en) Camera calibration method based on projected Gaussian grid pattern

Country Status (1)

Country Link
CN (1) CN103530880B (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104156974A (en) * 2014-09-05 2014-11-19 大连理工大学 Camera distortion calibration method on basis of multiple constraints
CN104167001A (en) * 2014-08-27 2014-11-26 大连理工大学 Large-visual-field camera calibration method based on orthogonal compensation
CN104777327A (en) * 2015-03-17 2015-07-15 河海大学 Time-space image speed measuring system and method based on auxiliary laser calibration
CN104820973A (en) * 2015-05-07 2015-08-05 河海大学 Image correction method for distortion curve radian detection template
CN104933717A (en) * 2015-06-17 2015-09-23 合肥工业大学 Camera intrinsic and extrinsic parameter automatic calibration method based on directional calibration target
CN105716539A (en) * 2016-01-26 2016-06-29 大连理工大学 Rapid high-precision 3D shape measuring method
CN105758337A (en) * 2014-12-19 2016-07-13 宁波舜宇光电信息有限公司 Method for obtaining the included angle between a lens plane and an image sensor plane
CN107464218A (en) * 2016-06-02 2017-12-12 维森软件技术(上海)有限公司 Automobile calibration system and its scaling method
CN107464263A (en) * 2016-06-02 2017-12-12 维森软件技术(上海)有限公司 Automobile calibration system and its scaling method
CN107580203A (en) * 2017-07-18 2018-01-12 长春理工大学 Immersion active stereo projective perspective transformation matrix solving method
CN108198219A (en) * 2017-11-21 2018-06-22 合肥工业大学 Error compensation method for camera calibration parameters for photogrammetry
CN108805936A (en) * 2018-05-24 2018-11-13 北京地平线机器人技术研发有限公司 Join scaling method, device and electronic equipment outside video camera
CN109993799A (en) * 2019-03-08 2019-07-09 贵州电网有限责任公司 Ultraviolet camera calibration method and calibration device
CN110415299A (en) * 2019-08-02 2019-11-05 山东大学 Vehicle position estimation method based on standard guideboards under kinematic constraints
CN111579220A (en) * 2020-05-29 2020-08-25 江苏迪盛智能科技有限公司 Resolution board
CN111968183A (en) * 2020-08-17 2020-11-20 西安交通大学 Gauge block calibration method for calibrating monocular line laser three-dimensional measurement module
US20210035316A1 (en) * 2019-07-29 2021-02-04 Seiko Epson Corporation Control method for projector and projector
CN112427487A (en) * 2019-08-26 2021-03-02 北京机电研究所有限公司 Device for measuring the dimensions of hot free forgings using optical images and a display grid

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5159361A (en) * 1989-03-09 1992-10-27 Par Technology Corporation Method and apparatus for obtaining the topography of an object
US7232990B2 (en) * 2004-06-30 2007-06-19 Siemens Medical Solutions Usa, Inc. Peak detection calibration for gamma camera using non-uniform pinhole aperture grid mask
US20100079598A1 (en) * 2008-09-03 2010-04-01 University Of South Carolina Robust Stereo Calibration System and Method for Accurate Digital Image Correlation Measurements
CN101776437A (en) * 2009-09-30 2010-07-14 江南大学 Calibration technology for vision sub-pixel of embedded type machine with optical path adjustment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5159361A (en) * 1989-03-09 1992-10-27 Par Technology Corporation Method and apparatus for obtaining the topography of an object
US7232990B2 (en) * 2004-06-30 2007-06-19 Siemens Medical Solutions Usa, Inc. Peak detection calibration for gamma camera using non-uniform pinhole aperture grid mask
US20100079598A1 (en) * 2008-09-03 2010-04-01 University Of South Carolina Robust Stereo Calibration System and Method for Accurate Digital Image Correlation Measurements
CN101776437A (en) * 2009-09-30 2010-07-14 江南大学 Calibration technology for vision sub-pixel of embedded type machine with optical path adjustment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张捷 et al.: "Camera calibration method based on a stereo target", Journal of Southeast University (Natural Science Edition) *

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104167001B (en) * 2014-08-27 2017-02-15 大连理工大学 Large-visual-field camera calibration method based on orthogonal compensation
CN104167001A (en) * 2014-08-27 2014-11-26 大连理工大学 Large-visual-field camera calibration method based on orthogonal compensation
CN104156974A (en) * 2014-09-05 2014-11-19 大连理工大学 Camera distortion calibration method on basis of multiple constraints
CN105758337B (en) * 2014-12-19 2018-09-04 宁波舜宇光电信息有限公司 Method for obtaining the included angle between a lens plane and an image sensor plane
CN105758337A (en) * 2014-12-19 2016-07-13 宁波舜宇光电信息有限公司 Method for obtaining included angel between a lens plane and an image sensor plane
CN104777327A (en) * 2015-03-17 2015-07-15 河海大学 Time-space image speed measuring system and method based on auxiliary laser calibration
CN104777327B (en) * 2015-03-17 2018-03-20 河海大学 Time-space image speed measuring system and method based on laser-assisted calibration
CN104820973A (en) * 2015-05-07 2015-08-05 河海大学 Image correction method for distortion curve radian detection template
CN104820973B (en) * 2015-05-07 2017-10-03 河海大学 Image correction method for distortion curve radian detection template
CN104933717B (en) * 2015-06-17 2017-08-11 合肥工业大学 Camera intrinsic and extrinsic parameter automatic calibration method based on directional calibration target
CN104933717A (en) * 2015-06-17 2015-09-23 合肥工业大学 Camera intrinsic and extrinsic parameter automatic calibration method based on directional calibration target
CN105716539A (en) * 2016-01-26 2016-06-29 大连理工大学 Rapid high-precision 3D shape measuring method
CN107464218A (en) * 2016-06-02 2017-12-12 维森软件技术(上海)有限公司 Automobile calibration system and its scaling method
CN107464263A (en) * 2016-06-02 2017-12-12 维森软件技术(上海)有限公司 Automobile calibration system and its scaling method
CN107580203A (en) * 2017-07-18 2018-01-12 长春理工大学 Immersion active stereo projective perspective transformation matrix solving method
CN107580203B (en) * 2017-07-18 2019-01-15 长春理工大学 Immersion active stereo projective perspective transformation matrix solving method
CN108198219A (en) * 2017-11-21 2018-06-22 合肥工业大学 Error compensation method for camera calibration parameters for photogrammetry
CN108805936A (en) * 2018-05-24 2018-11-13 北京地平线机器人技术研发有限公司 Join scaling method, device and electronic equipment outside video camera
CN109993799A (en) * 2019-03-08 2019-07-09 贵州电网有限责任公司 Ultraviolet camera calibration method and calibration device
US20210035316A1 (en) * 2019-07-29 2021-02-04 Seiko Epson Corporation Control method for projector and projector
US11514592B2 (en) * 2019-07-29 2022-11-29 Seiko Epson Corporation Control method for projector and projector
CN110415299A (en) * 2019-08-02 2019-11-05 山东大学 Vehicle location estimation method based on standard guideboard under a kind of kinematic constraint
CN110415299B (en) * 2019-08-02 2023-02-24 山东大学 Vehicle position estimation method based on standard guideboards under kinematic constraints
CN112427487A (en) * 2019-08-26 2021-03-02 北京机电研究所有限公司 Device for measuring the dimensions of hot free forgings using optical images and a display grid
CN111579220A (en) * 2020-05-29 2020-08-25 江苏迪盛智能科技有限公司 Resolution board
CN111579220B (en) * 2020-05-29 2023-02-10 江苏迪盛智能科技有限公司 Resolution board
CN111968183A (en) * 2020-08-17 2020-11-20 西安交通大学 Gauge block calibration method for calibrating monocular line laser three-dimensional measurement module

Also Published As

Publication number Publication date
CN103530880B (en) 2016-04-06

Similar Documents

Publication Publication Date Title
CN103530880B (en) Camera calibration method based on projected Gaussian grid pattern
CN103837869B (en) Calibration method for single-line laser radar and CCD camera based on vector relations
CN104331896B (en) System calibration method based on depth information
CN104266608B (en) Field calibration device for visual sensor and calibration method
CN104567727B (en) Global unified calibration method for linear structured light profile sensor through three-dimensional target
CN109443209A (en) Line-structured light system calibration method based on homography matrix
CN103903260B (en) Target-based method for quickly calibrating camera intrinsic parameters
CN106091983B (en) Complete calibration method for a structured-light stripe vision measuring system incorporating scanning direction information
CN104376558A (en) Cuboid-based intrinsic parameter calibration method for Kinect depth camera
CN104182982A (en) Overall optimizing method of calibration parameter of binocular stereo vision camera
CN104034305B (en) Method for real-time localization based on monocular vision
CN102980528A (en) Calibration method for line-laser monocular-vision three-dimensional measurement sensor parameters without pose constraints
CN105469389B (en) Grid ball target for vision sensor calibration and corresponding calibration method
CN204388802U (en) Line-structured light vision system calibration plate
CN104835144A (en) Solving camera intrinsic parameter by using image of center of sphere and orthogonality
CN104657982A (en) Calibration method for projector
CN104111039A (en) Calibration method for a freely placed fringe-projection three-dimensional measuring system
CN104807405B (en) Three-dimensional coordinate measurement method based on light ray angle calibration
CN108154536A (en) Camera calibration method based on two-dimensional plane iteration
CN102567991A (en) Binocular vision calibration method and system based on concentric circle composite image matching
Li et al. Laser scanning based three dimensional measurement of vegetation canopy structure
CN104167001A (en) Large-visual-field camera calibration method based on orthogonal compensation
CN101882309A (en) Calibration pattern of camera and calibration and detection method based on same
Yang et al. A dual-platform laser scanner for 3D reconstruction of dental pieces
CN104123726B (en) Calibration method for a heavy-forging measuring system based on vanishing points

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant