CN105389808A - Camera self-calibration method based on two vanishing points - Google Patents
Camera self-calibration method based on two vanishing points
- Publication number
- CN105389808A (Application CN201510708196.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- point
- camera
- calibration
- self
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Analysis (AREA)
Abstract
The present invention discloses a camera self-calibration method based on two vanishing points. Using two groups of mutually orthogonal parallel lines as the self-calibration pattern, four or more sequence images are captured from different viewpoints, a Harris-based sub-pixel corner detection method is used to extract the four intersections of the self-calibration pattern in each image, and the image coordinates of the two vanishing points are calculated. From the projective-geometry property that the two lines joining the optical center to the two vanishing points are mutually perpendicular, constraint equations on the camera intrinsic parameters are listed, and the intrinsic parameters, namely the principal point coordinates (u0, v0) of the image plane and the normalized focal lengths fx and fy, are obtained. The method requires neither a calibration plate nor a calibration block, and does not need the camera motion information or the world coordinates of the self-calibration pattern; the calibration process is convenient, the shooting requirements are modest, and the algorithm is simple and efficient, so it can satisfy large-field-of-view, zoom, and long-range vision inspection applications. The calibration accuracy is better than that of the three-vanishing-point self-calibration method, and high-accuracy calibration can be completed even with a low-resolution camera.
Description
Technical field
The present invention relates to a camera self-calibration method based on two vanishing points.
Background technology
Camera calibration is the process of determining the camera parameters by establishing the mapping between the 3D coordinates of spatial points and the 2D coordinates of their projections on the image (see Li Yunxiang. Research on camera calibration and three-dimensional reconstruction [D]. Qingdao University, 2009). Camera calibration is the process by which computer vision quantitatively reconstructs three-dimensional data from two-dimensional images; it is an indispensable and key step in three-dimensional reconstruction and is widely used in three-dimensional modeling, navigation, video surveillance, and so on. Whether in image measurement or in machine vision applications, the accuracy of the calibrated camera parameters and the stability of the algorithm directly affect the accuracy of the results produced by the camera. Camera calibration is therefore a prerequisite for subsequent work, and improving calibration accuracy is a focus of research. Camera calibration methods are mainly divided into calibration-plate methods and camera self-calibration methods (see Qiu Maolin, Ma Songde, Li Yi. Overview of camera calibration in computer vision. Acta Automatica Sinica, 2000, 26(1): 43-55). Calibration-plate methods require a calibration object and achieve high accuracy, but the algorithms are complicated and the calibration process is time-consuming and laborious. Compared with calibration-plate methods, self-calibration methods do not need a precisely machined calibration block and determine the camera intrinsic parameters only from multiple images of the scene. This idea was first proposed by Faugeras, Luong and Maybank (see O. D. Faugeras, Q. T. Luong, S. J. Maybank, "Camera Self-calibration: Theory and Experiments," Lecture Notes in Computer Science, 588, 321-334 (1992)).
The theoretical foundation of self-calibration is the projective properties of the absolute conic (AC) or the absolute quadric (AQ) (see B. Triggs, "Autocalibration and the Absolute Quadric," IEEE CVPR 97, pp. 609-614, 1997). The main self-calibration approaches are as follows. Faugeras, Luong, Maybank et al. proposed self-calibration by directly solving the Kruppa equations (see O. D. Faugeras, Q. T. Luong, S. J. Maybank, "Camera Self-calibration: Theory and Experiments," Lecture Notes in Computer Science, 588, 321-334 (1992)): the Kruppa equations derived from the epipolar geometry provide constraints on the image of the absolute conic (IAC), and the camera intrinsic parameters are calibrated by determining the IAC. Stratified (step-by-step) self-calibration, such as the direct self-calibration method proposed in (Zhaosheng Tao, Dawei Tu, Saisai He, Jinjie Ye. A Camera Self-Calibration for Machine Vision Based on Kruppa's Equation. Applied Mechanics and Materials, 2013, 389(4): 1003-1007), performs a QR decomposition of the camera projection matrix; it first requires a projective reconstruction of the image sequence and then imposes constraints through the absolute conic to recover the affine parameters and the camera intrinsic parameters. Self-calibration based on the absolute conic connects the epipolar transformation with the image of the absolute conic through the Kruppa equations; because that image is a conic in the camera image plane, it determines the intrinsic parameters of the camera (see Jing Jin and Xiaofeng Li. Efficient camera self-calibration method based on the absolute dual quadric. J. Opt. Soc. Am. A, 2013, 30(3): 287-292). Camera self-calibration based on active vision controls the motion of the camera, acquires multiple images, and then uses the corresponding image points to calibrate and solve for the camera intrinsic parameters (see Basu, A., "Active calibration: Alternative strategy and analysis," in Proc. IEEE Conf. on Computer Vision and Pattern Recognition, 495-500 (1993); Du, F., Brady, M., "Self-calibration of the intrinsic parameters of cameras for active vision systems," in Proc. IEEE Conf. on Computer Vision and Pattern Recognition, 477-482 (1992)). There are also camera self-calibration methods based on the geometric relationships of spatial lines; Hartley proposed a tensor whose role is analogous to that of the fundamental matrix in two-view scene analysis, and this method uses the line correspondences across multiple images for camera self-calibration. A comparison of the various self-calibration methods is given in Table 1.
Table 1 Comparison of camera self-calibration methods

Self-calibration method | Algorithm complexity | Robustness | Flexibility
---|---|---|---
Using the essential matrix and fundamental matrix | Complicated | Not high | No constraint on camera motion or scene; relatively flexible
Using the absolute conic | Complicated | Sensitive to initial values, poor robustness | No constraint on camera motion or scene; relatively flexible
Based on active vision | Simple | Higher | High requirements on experimental equipment and camera motion
Using the line correspondences of images | Simple | Higher | Places certain constraints on the scene
As can be seen from Table 1, the methods that use line correspondences of images and those based on active vision are simple, have higher robustness, and require only simple experimental equipment. In recent years, scholars at home and abroad have done a large amount of research on such methods. Yan Yan et al. proposed a camera self-calibration method based on linear features (see Yan Yan, Rongchun Zhao, "A new kind of camera self-calibration method based on linear feature," Computer Application Research, 3, 170-171 (2006)); the method requires the scene to contain parallel lines in six groups of mutually perpendicular directions, which places a heavy constraint on the scene. Liu Ying et al. proposed a camera calibration method based on coplanar vanishing points (see Liu Y, Wu Y X, Wu M P, "Planar Vanishing Points Based Camera Calibration," IEEE Computer Society, 460-463 (2004)); the method requires a precisely positioned dot-matrix template and strict matching of image points to template points, so the process is rather complicated. Hu Zhaozheng et al. proposed an algorithm that calibrates the camera using only the vanishing points of one group of mutually orthogonal directions (see Hu Zhaozheng, Tan Zheng. A new camera self-calibration algorithm based on two vanishing points. Space Electronic Technology, 2005, 23(1): 42-46); that method obtains only one pair of orthogonal vanishing points from each template image, so at least six images must be captured, and its accuracy is not high.
Summary of the invention
In view of the deficiencies of the prior art, the technical problem to be solved by the present invention is to provide a camera self-calibration method based on two vanishing points. The method needs neither a calibration plate nor a calibration block, and does not require camera motion information or the world coordinates of the straight lines in the scene; the calibration process is convenient, the shooting requirements are modest, and the algorithm is concise and efficient, so it can meet the needs of large-field-of-view, zoom, and long-range vision inspection applications. The calibration accuracy can be better than that of three-vanishing-point self-calibration methods, and calibration of fairly high accuracy can be completed even with a low-resolution camera.
To solve the above technical problems, the present invention adopts the following technical solution:
A camera self-calibration method based on two vanishing points, characterized by comprising the following steps:
Step 1: obtain sequence images of the two-vanishing-point self-calibration pattern. The two-vanishing-point self-calibration pattern comprises at least two groups of mutually orthogonal parallel lines; N sequence images containing this pattern are captured from different viewing angles, i.e. a sequence of perspective projection images of the two-vanishing-point self-calibration pattern, where N ≥ 4.
Step 2: pre-process the sequence images captured in Step 1. The pre-processing of the sequence images comprises color-image graying, image filtering, histogram equalization, edge sharpening, and image denoising.
Step 3: corner detection. In the two-vanishing-point self-calibration pattern, the two groups of mutually orthogonal parallel lines intersect at four points; the sequence images are processed with the Harris-based sub-pixel corner detection method to extract the image coordinates of the four intersection points in each image.
Step 4: calculate the image coordinates of the two vanishing points. For each sequence image, equations are listed according to the definition of the vanishing point and the image coordinates of the four intersection points obtained in Step 3, and the image coordinates of the two vanishing points produced by projecting the two groups of mutually orthogonal parallel lines of the two-vanishing-point self-calibration pattern are calculated.
Step 5: compute the camera intrinsic parameters, which comprise the principal point coordinates of the image plane and the scale factors of the focal length along the image coordinate axes. Let the coordinates of the two vanishing points in the image coordinate system be M(um, vm) and N(un, vn); their coordinates in the camera coordinate system are then M[(um - u0)dx, (vm - v0)dy, f] and N[(un - u0)dx, (vn - v0)dy, f], where f is the camera focal length.
From the perspective-geometry property of orthogonal parallel lines, the lines joining the optical center to the two vanishing points are orthogonal, i.e. the optical center lies on the sphere whose diameter is the segment joining the two vanishing points. This yields the constraint equation on the camera intrinsic parameters u0, v0, fx, fy:
(um - u0)(un - u0) + (fx/fy)^2 (vm - v0)(vn - v0) + fx^2 = 0   (a)
where (u0, v0) are the principal point coordinates of the image plane, and fx, fy are the scale factors along the u and v axes of the image coordinate system, respectively. Let the two vanishing points determined from the i-th image be M(uim, vim) and N(uin, vin). Substituting the vanishing-point coordinates of the i-th and j-th images, determined through Step 3, into formula (a) and subtracting gives:
(ujm + ujn - uim - uin)a + (vjm + vjn - vim - vin)b + (vim·vin - vjm·vjn)c = ujm·ujn - uim·uin   (b)
where a = u0, b = (fx/fy)^2·v0, c = (fx/fy)^2. When N ≥ 4, solving the system of equations formed by formula (b) gives a, b, c, and hence the camera intrinsic parameters u0 = a, v0 = b/c; substituting u0, v0 and c into formula (a) yields fx and fy.
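For reference, the algebra that leads from constraint (a) to the linear relation (b) can be sketched as follows, by expanding (a) for two images and subtracting:

```latex
\begin{aligned}
&\text{Expanding (a) for image } i \text{ with } c=(f_x/f_y)^2:\\
&\quad u_{im}u_{in}-u_0(u_{im}+u_{in})+u_0^2
  + c\bigl[v_{im}v_{in}-v_0(v_{im}+v_{in})+v_0^2\bigr]+f_x^2=0 .\\
&\text{Writing the same equation for image } j \text{ and subtracting, the terms } u_0^2,\ c\,v_0^2,\ f_x^2 \text{ cancel:}\\
&\quad (u_{jm}+u_{jn}-u_{im}-u_{in})\,u_0
  +(v_{jm}+v_{jn}-v_{im}-v_{in})\,c\,v_0
  +(v_{im}v_{in}-v_{jm}v_{jn})\,c
  = u_{jm}u_{jn}-u_{im}u_{in},\\
&\text{which is formula (b) with } a=u_0,\ b=c\,v_0,\ c=(f_x/f_y)^2 .
\end{aligned}
```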
In Step 1, the two-vanishing-point self-calibration pattern adopts a checkerboard pattern; the checkerboard contains at least two groups of orthogonal parallel straight lines, and the two groups of orthogonal parallel straight lines form a rectangular pattern.
The Harris-based sub-pixel corner detection method is as follows. The two-vanishing-point self-calibration pattern adopts a checkerboard pattern, and the points near a checkerboard corner are divided into two classes: points R lying on an edge and points P not lying on an edge. The gray gradient at a point P is zero, while the gray gradient at a point R is perpendicular to the line joining that point to the corner; hence the gray gradient at every point near the corner is perpendicular to the line from that point to the corner. Let gi be the gray-gradient vector, q the vector from the image origin to the corner, and pi the vector from the image origin to the i-th point near the corner. A real image may be affected by noise, so in general gi^T(q - pi) ≠ 0; denoting this error by εi, the sum Φ of the errors over all points in a neighborhood centered on the corner is:
Φ = Σi εi^2 = Σi [gi^T(q - pi)]^2   (c)
The problem of accurately locating the corner is thus converted into finding the point that minimizes the error sum Φ. Solving this optimization problem iteratively gives:
q = (Σi gi·gi^T)^(-1) Σi gi·gi^T·pi   (d)
Substituting the points pi of the neighborhood of the current corner estimate into formula (d) gives the iterative formula for the corner:
q(k+1) = (Σi gi·gi^T)^(-1) Σi gi·gi^T·pi, with gi and pi taken from a neighborhood of q(k)   (e)
Applying formula (e) to iteratively refine the initial Harris corners yields corner coordinates at sub-pixel accuracy.
The two-vanishing-point self-calibration pattern adopts a rectangular pattern whose four edges form two groups of mutually orthogonal parallel lines; the camera optical center then lies on the sphere whose diameter is the segment joining the two vanishing points, the lines from the optical center to the two vanishing points are orthogonal, and the image coordinates of the two vanishing points are calculated from equations listed according to the definition of the vanishing point.
The image denoising process uses a median filtering algorithm to remove noise.
The camera self-calibration method based on two vanishing points proposed by the present invention uses the geometric properties of the vanishing points to establish the constraint equations on the camera intrinsic parameters and so obtains the intrinsic parameters of the camera; the algorithm is simple. In the experiments, only four images containing two groups of orthogonal parallel straight lines need to be captured from different viewpoints, without their three-dimensional coordinate information, so the operation is simple and the scene is easy to set up; the method is also applicable to large-field-of-view and zoom situations. The calibration results of the two-vanishing-point method were compared with the results of the traditional calibration-plate method and of the machine-vision Halcon software calibration-plate method, and the comparison verifies the validity and practicality of the method.
Description of the drawings
Fig. 1 is a schematic diagram of the pinhole camera model of the present invention.
Fig. 2 is a schematic diagram of the vanishing points formed on the image plane by a rectangular pattern in the scene of the present invention.
Fig. 3 is the corner detection analysis diagram of the present invention.
Fig. 4 is a comparison of the detection results of the Harris-based sub-pixel corner detection of the present invention and the Harris corner detection algorithm.
Fig. 5 is a group of planar checkerboard image sequences captured with an SLR camera according to the present invention.
Fig. 6 shows the corner detection results for the checkerboard image of the present invention, where Fig. 6(a) is the corner detection result for the checkerboard image and Fig. 6(b) is the corner detection result for the rectangular pattern in the middle of the checkerboard image.
Fig. 7 is the curve of the mean square deviation between the camera intrinsic parameters u0, v0 and their true values as the noise level changes.
Fig. 8 is the curve of the mean square deviation between the camera intrinsic parameters fx, fy and their true values as the noise level changes.
Embodiment
The present invention is described in further detail below in conjunction with the drawings and specific embodiments.
The present invention provides a camera self-calibration method based on two vanishing points. The two so-called vanishing points are the two projections onto the image plane of the two points at infinity in projective space at which two groups of orthogonal parallel lines in the scene meet (see Caprile, B., Torre, V., "Using Vanishing Points for Camera Calibration," International Journal of Computer Vision, 4(2), 127-140 (1990)). From the perspective-geometry property of orthogonal parallel lines, the lines joining the optical center to the two vanishing points are orthogonal, i.e. the optical center lies on the sphere whose diameter is the segment joining the two vanishing points; the constraint equations on the camera intrinsic parameters are established from this, and the camera intrinsic parameters are solved. The method of the present invention only requires the scene to contain two groups of orthogonal parallel straight lines. Four or more images of the scene are captured from different viewpoints, the image coordinates of the four intersection points of the orthogonal parallel lines in each image are extracted with the Harris-based sub-pixel corner detection method, and the image coordinates of the two vanishing points are obtained. According to the perspective-geometry property of orthogonal parallel lines, namely that the lines from the optical center to the two vanishing points are perpendicular to each other, the constraint equations on the camera intrinsic parameters are listed, and the four intrinsic parameters, the principal point coordinates of the image plane and the normalized focal lengths, are obtained.
1. Camera imaging model
The camera adopts the pinhole imaging model, as shown in Fig. 1. A point X' in the world coordinate system is projected through the optical center O of the camera lens to the point x' on the image plane; the transformation between the point coordinates in the image coordinate system and the world coordinate system is:
αx' = K[R|t]X'   (1)
where α is a scale factor, K is the camera intrinsic matrix, and [R|t] is the camera extrinsic parameter matrix; the present invention calibrates only the camera intrinsic matrix K. K follows the four-parameter model, that is:
K = | fu  0   u0 |
    | 0   fv  v0 |   (2)
    | 0   0   1  |
where fu and fv are the scale factors along the u and v axes of the image coordinate system, respectively, in pixels, and (u0, v0) are the principal point coordinates of the image plane.
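As an illustration of model (1) and (2), the following NumPy sketch projects one world point to pixel coordinates; the values of K, R, t and the world point are hypothetical (the intrinsic values are only of the same order as those listed later in Table 3):

```python
import numpy as np

# Hypothetical intrinsic matrix K (four-parameter model: fu, fv, u0, v0, no skew)
fu, fv, u0, v0 = 5750.0, 4850.0, 1224.0, 688.0     # example values, in pixels
K = np.array([[fu, 0.0, u0],
              [0.0, fv, v0],
              [0.0, 0.0, 1.0]])

# Hypothetical extrinsic parameters [R|t]: identity rotation, small translation
R = np.eye(3)
t = np.array([[0.1], [0.0], [2.0]])                 # point lies about 2 m in front of the camera
Rt = np.hstack([R, t])                              # 3x4 matrix [R|t]

# World point X' in homogeneous coordinates
X_w = np.array([[0.05], [0.02], [0.0], [1.0]])

# Projection: alpha * x' = K [R|t] X'
x_h = K @ Rt @ X_w                                  # homogeneous image point
x_img = (x_h[:2] / x_h[2]).ravel()                  # divide by alpha to get (u, v)
print("pixel coordinates:", x_img)
```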
2. Camera self-calibration based on two vanishing points
2.1 Determining the vanishing points
When two vanishing points are used for camera self-calibration, the scene is required to contain two groups of mutually orthogonal parallel lines. The parameter-solving process is described here for a rectangular pattern; this rectangular pattern may be the rectangle within a planar black-and-white checkerboard. Let the rectangular pattern in the scene be ABCD, whose image captured by the camera is EFGH on the image plane, and let O be the optical center of the camera. Suppose the vanishing point in the direction of EF and GH is M, and the vanishing point in the direction of EH and FG is N. Using the property of the rectangular template ABCD and the nature of vanishing points, namely AB ⊥ BC, OM // AB, ON // BC, it follows that OM ⊥ ON, and the following conclusion is obtained: given two groups of mutually orthogonal parallel lines, the camera optical center lies on the sphere whose diameter is the segment joining the two vanishing points obtained by projecting these two groups of parallel lines, as shown in Fig. 2.
The coordinates of a vanishing point can be calculated from the equations of two parallel lines; for example, the vanishing point M is obtained from the equations of the parallel lines EF and HG:
(vm - ve)(uf - ue) = (um - ue)(vf - ve)
(vm - vh)(ug - uh) = (um - uh)(vg - vh)   (3)
In formula (3), (ue, ve), (uf, vf), (ug, vg), (uh, vh), (um, vm) are the image coordinates of E, F, G, H, M, respectively. Solving formula (3) gives the coordinates of the vanishing point M. Similarly, the coordinates of the vanishing point N are calculated using the straight lines EH and FG.
Therefore, once the image-plane coordinates of the four vertices of the projected rectangle in the scene are detected, the coordinates of the vanishing points can be obtained. Correct detection and accurate localization of the corner points E, F, G, H on the image plane is the prerequisite for the camera calibration to be carried out correctly, and it is also the key step that determines the calibration accuracy.
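A minimal NumPy sketch of this computation intersects the two image lines in homogeneous coordinates, which is equivalent to solving formula (3); the helper name and the corner values below are placeholders (in practice the corners come from the sub-pixel detection of Section 2.2):

```python
import numpy as np

def vanishing_point(p1, p2, p3, p4):
    """Intersection of image line p1-p2 with image line p3-p4, via homogeneous cross products."""
    def h(p):                        # lift a 2D point to homogeneous coordinates
        return np.array([p[0], p[1], 1.0])
    l1 = np.cross(h(p1), h(p2))      # line through p1 and p2
    l2 = np.cross(h(p3), h(p4))      # line through p3 and p4
    v = np.cross(l1, l2)             # intersection of the two lines
    return v[:2] / v[2]              # back to inhomogeneous image coordinates

# Placeholder sub-pixel corners E, F, G, H of the projected rectangle (pixels)
E, F, G, H = (612.3, 402.8), (1835.1, 418.6), (1821.4, 991.2), (598.7, 975.5)

M = vanishing_point(E, F, H, G)      # vanishing point of the EF / HG direction
N = vanishing_point(E, H, F, G)      # vanishing point of the EH / FG direction
print("M =", M, " N =", N)
```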
2.2 Harris-based sub-pixel corner detection
The Harris operator, proposed by Harris et al., extracts corners from the gray-level information of an image. For any image, a corner is related to the curvature of the autocorrelation function of the image gray levels: a point is considered a corner if both its horizontal and vertical curvature values are higher than those of the other points in its local neighborhood. The Harris corner detection algorithm is simple, highly stable, robust, and resistant to noise, but its detection accuracy is only pixel-level. In order to improve the camera calibration accuracy, the present invention adopts the sub-pixel corner detection method of Yao Yao (see Yao Yao, Zhongxiang Hu, Xiaojun Shi, Yibo Wang, "Harris corner extraction to realize sub-pixel level camera self-calibration [J]," Electronic Design Engineering, 05, 61-65 (2009)), which introduces an iterative algorithm to refine the Harris corners and obtain corner coordinates at sub-pixel accuracy.
The points near a black-and-white checkerboard corner can be divided into two classes: points R lying on an edge and points P not lying on an edge. The gray gradient at a point P is zero, while the gray gradient at a point R is perpendicular to the line joining that point to the corner (i.e. OR); hence the gray gradient at every point near the corner is perpendicular to the line from that point to the corner. Let gi be the gray-gradient vector, q the vector from the image origin to the corner, and pi the vector from the image origin to the i-th point near the corner. A real image may be affected by noise, so in general gi^T(q - pi) ≠ 0; denoting this error by εi, the sum Φ of the errors over all points in a neighborhood centered on the corner is:
Φ = Σi εi^2 = Σi [gi^T(q - pi)]^2   (4)
The problem of accurately locating the corner is thus converted into finding the point that minimizes the error sum Φ. Solving this optimization problem iteratively gives:
q = (Σi gi·gi^T)^(-1) Σi gi·gi^T·pi   (5)
Substituting the points pi of the neighborhood of the current corner estimate into formula (5) gives the iterative formula for the corner:
q(k+1) = (Σi gi·gi^T)^(-1) Σi gi·gi^T·pi, with gi and pi taken from a neighborhood of q(k)   (6)
Applying formula (6) to iteratively refine the initial Harris corners yields corner coordinates at sub-pixel accuracy.
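The following Python sketch implements this gradient-based iteration; the helper name refine_corner, the window size, and the termination criteria are illustrative choices, and in practice the equivalent OpenCV routine cv2.cornerSubPix can be used instead:

```python
import numpy as np
import cv2

def refine_corner(gray, q0, half_win=5, iters=30, eps=1e-3):
    """Iteratively refine one corner to sub-pixel accuracy following formula (6)."""
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)      # gray-gradient images
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    h, w = gray.shape
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        x0, y0 = int(round(q[0])), int(round(q[1]))
        A = np.zeros((2, 2))
        b = np.zeros(2)
        for dy in range(-half_win, half_win + 1):
            for dx in range(-half_win, half_win + 1):
                x, y = x0 + dx, y0 + dy
                if 0 <= x < w and 0 <= y < h:
                    g = np.array([gx[y, x], gy[y, x]])   # gradient vector g_i
                    p = np.array([x, y], dtype=float)    # neighborhood point p_i
                    A += np.outer(g, g)                  # accumulate sum of g_i g_i^T
                    b += np.outer(g, g) @ p              # accumulate sum of g_i g_i^T p_i
        q_new = np.linalg.solve(A, b)                    # one step of formula (5)/(6)
        if np.linalg.norm(q_new - q) < eps:
            return q_new
        q = q_new
    return q

# Usage sketch (image path and initial pixel-level corner are placeholders):
# gray = cv2.imread("checkerboard.png", cv2.IMREAD_GRAYSCALE)
# q_sub = refine_corner(gray, q0=(613, 403))
```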
Fig. 4 compares the results of the two corner detection algorithms: the "." indicated by mark 1 represents the corner position detected by the present invention, the "+" indicated by mark 2 represents the corner position detected by the Harris algorithm, and the "+" indicated by mark 3 represents the actual corner position. As can be seen from Fig. 4, the sub-pixel corner detection algorithm is closer to the actual corner location. The present invention therefore adopts the Harris-based sub-pixel corner detection method, which not only improves the calibration accuracy of the camera but also improves the corner detection speed.
2.3 Linear solution of the camera intrinsic parameters
The camera intrinsic parameters comprise the principal point coordinates of the image plane and the scale factors of the focal length along the image coordinate axes (also known as the normalized focal lengths).
Let the coordinates in the image coordinate system of the two vanishing points obtained from the two-vanishing-point self-calibration pattern be M(um, vm) and N(un, vn); their coordinates in the camera coordinate system are then M[(um - u0)dx, (vm - v0)dy, f] and N[(un - u0)dx, (vn - v0)dy, f], where f is the camera focal length. From the perspective-geometry property of orthogonal parallel lines, the lines joining the optical center to the two vanishing points are orthogonal, i.e. the optical center lies on the sphere whose diameter is the segment joining the two vanishing points. This yields the constraint equation on the camera intrinsic parameters u0, v0, fx, fy:
(um - u0)(un - u0) + (fx/fy)^2 (vm - v0)(vn - v0) + fx^2 = 0   (7)
where (u0, v0) are the principal point coordinates of the image plane, and fx, fy are the scale factors along the u and v axes of the image coordinate system, respectively. Formula (7) contains four unknowns fx, fy, u0, v0, so it cannot be solved from a single image. Suppose N images are captured from different positions, and let the two vanishing points determined from the i-th image be M(uim, vim) and N(uin, vin). Substituting the vanishing-point coordinates of the i-th and j-th images, determined by the Harris-based sub-pixel corner detection method, into formula (7) and subtracting gives:
(ujm + ujn - uim - uin)a + (vjm + vjn - vim - vin)b + (vim·vin - vjm·vjn)c = ujm·ujn - uim·uin   (8)
where a = u0, b = (fx/fy)^2·v0, c = (fx/fy)^2. When N ≥ 4, solving the system of equations formed by formula (8) gives a, b, c, and hence the camera intrinsic parameters u0 = a, v0 = b/c; substituting u0, v0 and c into formula (7) yields fx and fy. Thus, when distortion is not considered, capturing four pictures from different viewpoints is sufficient to solve linearly for the four intrinsic parameters of the camera.
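A least-squares sketch of this linear solution is given below (NumPy); the helper name calibrate_from_vanishing_points is illustrative, and the synthetic camera used to generate test vanishing points is an assumption chosen near the theoretical values of Table 3. It builds one instance of equation (8) for each pair of images and recovers u0, v0, fx, fy:

```python
import numpy as np
from itertools import combinations

def calibrate_from_vanishing_points(vps):
    """vps: array of shape (N, 2, 2); vps[i] = [[uim, vim], [uin, vin]] for image i (N >= 4)."""
    rows, rhs = [], []
    for i, j in combinations(range(len(vps)), 2):
        (uim, vim), (uin, vin) = vps[i]
        (ujm, vjm), (ujn, vjn) = vps[j]
        # One instance of equation (8): [...]*a + [...]*b + [...]*c = [...]
        rows.append([ujm + ujn - uim - uin,
                     vjm + vjn - vim - vin,
                     vim * vin - vjm * vjn])
        rhs.append(ujm * ujn - uim * uin)
    a, b, c = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
    u0, v0 = a, b / c                            # principal point, since b = c*v0
    # Substitute u0, v0, c back into equation (7) to recover fx, then fy = fx / sqrt(c)
    (um, vm), (un, vn) = vps[0]
    fx = np.sqrt(-((um - u0) * (un - u0) + c * (vm - v0) * (vn - v0)))
    fy = fx / np.sqrt(c)
    return u0, v0, fx, fy

# Synthetic check: vanishing points generated from an assumed camera (u0, v0, fx, fy)
u0_t, v0_t, fx_t, fy_t = 1224.0, 688.0, 5750.0, 4850.0
rng = np.random.default_rng(0)
vps = []
for _ in range(4):
    while True:
        d1 = rng.normal(size=3)                  # direction of one line group
        d2 = np.cross(d1, rng.normal(size=3))    # orthogonal direction
        if abs(d1[2]) > 0.3 * np.linalg.norm(d1) and abs(d2[2]) > 0.3 * np.linalg.norm(d2):
            break
    vps.append([[u0_t + fx_t * d1[0] / d1[2], v0_t + fy_t * d1[1] / d1[2]],
                [u0_t + fx_t * d2[0] / d2[2], v0_t + fy_t * d2[1] / d2[2]]])
print(calibrate_from_vanishing_points(np.array(vps)))   # approximately (1224, 688, 5750, 4850)
```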
3. Calibration experiment and analysis of results
The experiment uses a Sony SLR camera produced by Sony Corporation of Japan, model NEX-5N. The sensor size is 23.4 × 15.6 mm, and the available image resolutions are 4912 × 2760, 3568 × 2000 and 2448 × 1376. The image resolution used in this experiment is 2448 × 1376.
4. Experimental procedure
Step 1: obtain sequence images of the two-vanishing-point self-calibration pattern. The two-vanishing-point self-calibration pattern comprises at least two groups of mutually orthogonal parallel lines; N sequence images containing this pattern are captured from different viewing angles, i.e. a sequence of perspective projection images of the two-vanishing-point self-calibration pattern, where N ≥ 4.
The experiment of the present invention adopts a planar black-and-white checkerboard pattern as the two-vanishing-point self-calibration pattern; this calibration pattern comprises at least two groups of mutually orthogonal parallel lines, and the two groups of orthogonal parallel straight lines form a rectangular pattern. Twenty sequence images of the planar black-and-white checkerboard pattern were captured from different viewing angles; every four images form a group for one calibration experiment, giving five groups of experimental results, which are then averaged. Fig. 5 shows one captured group of checkerboard image sequences containing the rectangular pattern.
Step 2: pre-process the sequence images captured in Step 1. The pre-processing of the images mainly comprises color-image graying, image filtering, histogram equalization, edge sharpening, and image denoising (see Zhang Yujin. Computer Vision Course. Beijing: Posts and Telecom Press, March 2011, pp. 57-74).
For the pre-processing of the planar checkerboard pattern of the present invention, the noise that mainly needs to be removed is impulse noise. Salt-and-pepper noise is the most common type of impulse noise; it appears as pixels whose gray values differ greatly from those of the other pixels in their neighborhood, showing up as bright and dark black-and-white spots in the image. Salt-and-pepper noise seriously degrades image quality and greatly complicates the subsequent image processing. Considering the strong correlation between neighboring pixels, the present invention uses a median filtering algorithm to remove the noise.
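A short OpenCV sketch of this pre-processing step is given below; the file names and the 3 × 3 kernel size are illustrative assumptions:

```python
import cv2

# Illustrative pre-processing of one sequence image (the file name is a placeholder)
img = cv2.imread("sequence_01.jpg")                  # color image captured in Step 1
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)         # color-image graying
denoised = cv2.medianBlur(gray, 3)                   # 3x3 median filter suppresses salt-and-pepper noise
equalized = cv2.equalizeHist(denoised)               # histogram equalization
cv2.imwrite("sequence_01_pre.png", equalized)
```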
Step 3: corner detection. In the two-vanishing-point self-calibration pattern, the two groups of mutually orthogonal parallel lines intersect at four points; the sequence images are processed with the Harris-based sub-pixel corner detection method to extract the image coordinates of the four intersection points in each image.
Corner detection is performed on each pre-processed image using the Harris-based sub-pixel corner detection method of the present invention; the detection result for the black-and-white checkerboard pattern is shown in Fig. 6(a). In order to reduce the influence of lens distortion on the experimental results, only the rectangle in the middle of the self-calibration pattern is used for calibration, giving the coordinates of its four corners, i.e. the image-plane coordinates of the four points at which the two groups of mutually orthogonal parallel lines of the self-calibration pattern intersect; the detection result for the rectangular pattern in the middle of the black-and-white checkerboard is shown in Fig. 6(b).
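As an illustration of this step, the sketch below locates the checkerboard corners with OpenCV, refines them to sub-pixel accuracy, and selects the four corners of a rectangle near the middle of the board; the board size, file name, and window parameters are assumptions, not values fixed by the present invention:

```python
import cv2

# Assumed inner-corner grid of the checkerboard and a placeholder file name
pattern_size = (9, 6)                                 # (columns, rows) of inner corners
gray = cv2.imread("sequence_01_pre.png", cv2.IMREAD_GRAYSCALE)

found, corners = cv2.findChessboardCorners(gray, pattern_size)
if found:
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners = cv2.cornerSubPix(gray, corners, (5, 5), (-1, -1), criteria)   # sub-pixel refinement
    grid = corners.reshape(pattern_size[1], pattern_size[0], 2)
    # Four corners E, F, G, H of a rectangle near the middle of the board
    r0, r1 = pattern_size[1] // 2 - 1, pattern_size[1] // 2 + 1
    c0, c1 = pattern_size[0] // 2 - 1, pattern_size[0] // 2 + 1
    E, F, G, H = grid[r0, c0], grid[r0, c1], grid[r1, c1], grid[r1, c0]
    print("E, F, G, H =", E, F, G, H)
```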
Step 4: calculate the image coordinates of the two vanishing points. For each sequence image, the method of determining the vanishing points of the present invention is applied: equations are listed according to the definition of the vanishing point and the image coordinates of the four intersection points obtained in Step 3, and the image coordinates of the two vanishing points produced by projecting the two groups of mutually orthogonal parallel lines of the two-vanishing-point self-calibration pattern are calculated.
Step 5: solve for the camera intrinsic parameters. After the image coordinates of the two vanishing points produced by projecting the two groups of mutually orthogonal parallel lines of the two-vanishing-point self-calibration pattern have been obtained in Step 4, the linear solution method for the camera intrinsic parameters of the present invention is applied, using formulas (7) and (8), to solve for the intrinsic parameters of the camera.
5. Experimental results and analysis
Twenty image sequences of the planar black-and-white checkerboard pattern were captured from different viewing angles; every four images form a group for one calibration experiment, yielding five groups of experimental results, which are averaged to obtain the calibrated parameters of the SLR camera. The results are given in Table 2.
Table 2 Calibration results for the camera intrinsic parameters
Intrinsic parameter (pixel) | First group | Second group | Third group | Fourth group | Fifth group | Mean value
---|---|---|---|---|---|---
fx | 5742.80 | 5739.23 | 5734.72 | 5738.34 | 5741.75 | 5739.37
fy | 4869.28 | 4861.72 | 4864.62 | 4858.33 | 4862.14 | 4863.22
u0 | 1247.52 | 1246.49 | 1249.61 | 1245.44 | 1248.98 | 1247.61
v0 | 679.25 | 674.32 | 672.02 | 675.94 | 672.33 | 674.47
In order to verify the accuracy of the calibration results of the present invention, the experimental results of the method of the invention were compared with the results of calibration using a traditional calibration plate and with calibration using the Halcon software calibration plate, and the theoretical values of the camera intrinsic parameters (assuming a camera focal length of f = 55 mm) are also listed, as shown in Table 3.
Table 3 Comparison of calibration experiment results
Intrinsic parameter | fx | fy | u0 | v0
---|---|---|---|---
Theoretical value | 5753.85 | 4851.23 | 1224.00 | 688.00
Calibration-plate method | 5750.22 | 4856.02 | 1233.93 | 686.20
Halcon calibration-plate method | 5754.62 | 4850.75 | 1224.13 | 690.40
Two-vanishing-point method | 5739.37 | 4863.22 | 1247.61 | 674.47
Calibration-plate method relative error /% | 0.063 | 0.098 | 0.810 | 0.260
Halcon calibration-plate method relative error /% | 0.013 | 0.001 | 0.010 | 0.349
Two-vanishing-point method relative error /% | 0.250 | 0.240 | 1.930 | 1.970
As can be seen from Table 3, the calibration accuracy of the camera self-calibration algorithm based on two vanishing points can satisfy applications such as large-field-of-view and zoom scenes.
The robustness of the two-vanishing-point calibration method was verified by superimposing random noise. Random noise with an amplitude of 0-3 pixels was added to the image points, and 20 independent experiments were carried out at each noise level; the curves of the mean square deviation between the camera intrinsic parameters and their true values are shown in Fig. 7 and Fig. 8.
As can be seen from Fig. 7 and Fig. 8, when the noise is small the error in the camera intrinsic parameters is small; as the noise increases, the intrinsic-parameter error increases approximately linearly, but the error remains much smaller than that reported in (Li Hua, Wu Fuchao, Hu Zhanyi. A new linear camera self-calibration method. Chinese Journal of Computers, 2000, 23(11): 1122-1128), so good calibration is still possible, showing that the algorithm of the present invention has good robustness.
Claims (5)
1. A camera self-calibration method based on two vanishing points, characterized by comprising the following steps:
Step 1: obtain sequence images of the two-vanishing-point self-calibration pattern; the two-vanishing-point self-calibration pattern comprises at least two groups of mutually orthogonal parallel lines, and N sequence images containing this pattern are captured from different viewing angles, i.e. a sequence of perspective projection images of the two-vanishing-point self-calibration pattern, where N ≥ 4;
Step 2: pre-process the sequence images captured in Step 1; the pre-processing of the sequence images comprises color-image graying, image filtering, histogram equalization, edge sharpening, and image denoising;
Step 3: corner detection; in the two-vanishing-point self-calibration pattern, the two groups of mutually orthogonal parallel lines intersect at four points, and the sequence images are processed with the Harris-based sub-pixel corner detection method to extract the image coordinates of the four intersection points in each image;
Step 4: calculate the image coordinates of the two vanishing points; for each sequence image, equations are listed according to the definition of the vanishing point and the image coordinates of the four intersection points obtained in Step 3, and the image coordinates of the two vanishing points produced by projecting the two groups of mutually orthogonal parallel lines of the two-vanishing-point self-calibration pattern are calculated;
Step 5: compute the camera intrinsic parameters, which comprise the principal point coordinates of the image plane and the scale factors of the focal length along the image coordinate axes; let the coordinates of the two vanishing points in the image coordinate system be M(um, vm) and N(un, vn); their coordinates in the camera coordinate system are then M[(um - u0)dx, (vm - v0)dy, f] and N[(un - u0)dx, (vn - v0)dy, f], where f is the camera focal length;
from the perspective-geometry property of orthogonal parallel lines, the lines joining the optical center to the two vanishing points are orthogonal, i.e. the optical center lies on the sphere whose diameter is the segment joining the two vanishing points, which yields the constraint equation on the camera intrinsic parameters u0, v0, fx, fy:
(um - u0)(un - u0) + (fx/fy)^2 (vm - v0)(vn - v0) + fx^2 = 0   (a)
where (u0, v0) are the principal point coordinates of the image plane, and fx, fy are the scale factors along the u and v axes of the image coordinate system, respectively; let the two vanishing points determined from the i-th image be M(uim, vim) and N(uin, vin); substituting the vanishing-point coordinates of the i-th and j-th images, determined through Step 3, into formula (a) and subtracting gives:
(ujm + ujn - uim - uin)a + (vjm + vjn - vim - vin)b + (vim·vin - vjm·vjn)c = ujm·ujn - uim·uin   (b)
where a = u0, b = (fx/fy)^2·v0, c = (fx/fy)^2; when N ≥ 4, solving the system of equations formed by formula (b) gives a, b, c, and hence the camera intrinsic parameters u0 = a, v0 = b/c; substituting u0, v0 and c into formula (a) yields fx and fy.
2. The camera self-calibration method based on two vanishing points according to claim 1, characterized in that: in Step 1, the two-vanishing-point self-calibration pattern adopts a checkerboard pattern; the checkerboard contains at least two groups of orthogonal parallel straight lines, and the two groups of orthogonal parallel straight lines form a rectangular pattern.
3. The camera self-calibration method based on two vanishing points according to claim 1, characterized in that the Harris-based sub-pixel corner detection method is as follows: the two-vanishing-point self-calibration pattern adopts a checkerboard pattern, and the points near a checkerboard corner are divided into two classes, points R lying on an edge and points P not lying on an edge; the gray gradient at a point P is zero, while the gray gradient at a point R is perpendicular to the line joining that point to the corner, so the gray gradient at every point near the corner is perpendicular to the line from that point to the corner; let gi be the gray-gradient vector, q the vector from the image origin to the corner, and pi the vector from the image origin to the i-th point near the corner; a real image may be affected by noise, so in general gi^T(q - pi) ≠ 0; denoting this error by εi, the sum Φ of the errors over all points in a neighborhood centered on the corner is:
Φ = Σi εi^2 = Σi [gi^T(q - pi)]^2   (c)
the problem of accurately locating the corner is thus converted into finding the point that minimizes the error sum Φ; solving this optimization problem iteratively gives:
q = (Σi gi·gi^T)^(-1) Σi gi·gi^T·pi   (d)
substituting the points pi of the neighborhood of the current corner estimate into formula (d) gives the iterative formula for the corner:
q(k+1) = (Σi gi·gi^T)^(-1) Σi gi·gi^T·pi, with gi and pi taken from a neighborhood of q(k)   (e)
applying formula (e) to iteratively refine the initial Harris corners yields corner coordinates at sub-pixel accuracy.
4. The camera self-calibration method based on two vanishing points according to claim 1, characterized in that: the two-vanishing-point self-calibration pattern adopts a rectangular pattern whose four edges form two groups of mutually orthogonal parallel lines; the camera optical center then lies on the sphere whose diameter is the segment joining the two vanishing points, the lines from the optical center to the two vanishing points are orthogonal, and the image coordinates of the two vanishing points are calculated from equations listed according to the definition of the vanishing point.
5. The camera self-calibration method based on two vanishing points according to claim 1, characterized in that: the image denoising process uses a median filtering algorithm to remove noise.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510708196.4A CN105389808A (en) | 2015-10-27 | 2015-10-27 | Camera self-calibration method based on two vanishing points |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510708196.4A CN105389808A (en) | 2015-10-27 | 2015-10-27 | Camera self-calibration method based on two vanishing points |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105389808A true CN105389808A (en) | 2016-03-09 |
Family
ID=55422057
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510708196.4A Pending CN105389808A (en) | 2015-10-27 | 2015-10-27 | Camera self-calibration method based on two vanishing points |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105389808A (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106327532A (en) * | 2016-08-31 | 2017-01-11 | 北京天睿空间科技股份有限公司 | Three-dimensional registering method for single image |
CN106600641A (en) * | 2016-12-07 | 2017-04-26 | 常州工学院 | Embedded type visual servo control method based on multi-feature fusion |
CN107492123A (en) * | 2017-07-07 | 2017-12-19 | 长安大学 | A kind of road monitoring camera self-calibrating method using information of road surface |
CN107622514A (en) * | 2017-09-30 | 2018-01-23 | 常州工学院 | Video camera convex lens model Auto-calibration method |
CN108010086A (en) * | 2017-12-04 | 2018-05-08 | 深圳市赢世体育科技有限公司 | Camera marking method, device and medium based on tennis court markings intersection point |
WO2018113846A1 (en) * | 2016-12-23 | 2018-06-28 | Wipotec Gmbh | Testing and/or calibrating of a camera, in particular a digital camera, by means of an optical test standard |
CN108346164A (en) * | 2018-01-09 | 2018-07-31 | 云南大学 | Utilize the method for the property calibration axicon lens face catadioptric video camera of essential matrix |
CN108470361A (en) * | 2017-02-23 | 2018-08-31 | 南宁市富久信息技术有限公司 | A kind of angle point automatic identification camera calibration method |
CN108645426A (en) * | 2018-04-09 | 2018-10-12 | 北京空间飞行器总体设计部 | A kind of in-orbit self-calibrating method of extraterrestrial target Relative Navigation vision measurement system |
CN109685851A (en) * | 2018-10-08 | 2019-04-26 | 上海肇观电子科技有限公司 | Hand and eye calibrating method, system, equipment and the storage medium of walking robot |
CN109903341A (en) * | 2019-01-25 | 2019-06-18 | 东南大学 | Join dynamic self-calibration method outside a kind of vehicle-mounted vidicon |
CN110349219A (en) * | 2018-04-04 | 2019-10-18 | 杭州海康威视数字技术股份有限公司 | A kind of Camera extrinsic scaling method and device |
CN110807815A (en) * | 2019-10-30 | 2020-02-18 | 扬州大学 | Rapid underwater calibration method based on two groups of mutually orthogonal parallel lines corresponding vanishing points |
CN111223150A (en) * | 2020-01-15 | 2020-06-02 | 电子科技大学 | Vehicle-mounted camera external parameter calibration method based on double vanishing points |
CN111260736A (en) * | 2020-01-16 | 2020-06-09 | 中国科学院西安光学精密机械研究所 | In-orbit real-time calibration method for internal parameters of space camera |
CN112135125A (en) * | 2020-10-28 | 2020-12-25 | 歌尔光学科技有限公司 | Camera internal reference testing method, device, equipment and computer readable storage medium |
CN112785570A (en) * | 2021-01-19 | 2021-05-11 | 浙江商汤科技开发有限公司 | Method, device, equipment and storage medium for detecting image registration correctness |
CN113012226A (en) * | 2021-03-22 | 2021-06-22 | 浙江商汤科技开发有限公司 | Camera pose estimation method and device, electronic equipment and computer storage medium |
CN113313768A (en) * | 2021-05-27 | 2021-08-27 | 华南理工大学 | Global space scale measurement method based on two vanishing points and local object scale |
CN113362398A (en) * | 2021-06-30 | 2021-09-07 | 广州文远知行科技有限公司 | Method, system, device and storage medium for determining camera reference error |
CN116342712A (en) * | 2023-03-30 | 2023-06-27 | 中国科学院西安光学精密机械研究所 | Method, medium and equipment for calibrating on-orbit distortion coefficient of space camera based on vanishing point consistency |
CN118037863A (en) * | 2024-04-11 | 2024-05-14 | 四川大学 | Neural network optimization automatic zooming camera internal parameter calibration method based on visual field constraint |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103492899A (en) * | 2011-04-20 | 2014-01-01 | 高通股份有限公司 | Online reference patch generation and pose estimation for augmented reality |
- 2015-10-27: CN CN201510708196.4A patent/CN105389808A/en active Pending
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103492899A (en) * | 2011-04-20 | 2014-01-01 | 高通股份有限公司 | Online reference patch generation and pose estimation for augmented reality |
Non-Patent Citations (5)
Title |
---|
姚耀 等 (Yao Yao et al.): "Harris corner extraction to realize sub-pixel level camera self-calibration", Electronic Design Engineering *
朱水娟 (Zhu Shuijuan): "Research on camera self-calibration technology for binocular systems", China Master's Theses Full-text Database, Engineering Science and Technology II *
章毓晋 (Zhang Yujin): "Computer Vision Course", 31 March 2011 *
胡钊政 等 (Hu Zhaozheng et al.): "A new camera self-calibration algorithm based on two vanishing points", Space Electronic Technology *
霍炬 等 (Huo Ju et al.): "Camera self-calibration method based on the geometric properties of vanishing points", Acta Optica Sinica *
Cited By (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106327532A (en) * | 2016-08-31 | 2017-01-11 | 北京天睿空间科技股份有限公司 | Three-dimensional registering method for single image |
CN106327532B (en) * | 2016-08-31 | 2019-06-11 | 北京天睿空间科技股份有限公司 | A kind of three-dimensional registration method of single image |
CN106600641A (en) * | 2016-12-07 | 2017-04-26 | 常州工学院 | Embedded type visual servo control method based on multi-feature fusion |
CN106600641B (en) * | 2016-12-07 | 2020-02-11 | 常州工学院 | Embedded visual servo control method based on multi-feature fusion |
WO2018113846A1 (en) * | 2016-12-23 | 2018-06-28 | Wipotec Gmbh | Testing and/or calibrating of a camera, in particular a digital camera, by means of an optical test standard |
US11218692B2 (en) | 2016-12-23 | 2022-01-04 | Wipotec Gmbh | Testing and/or calibrating of a camera, in particular a digital camera, by means of an optical test standard |
CN108470361A (en) * | 2017-02-23 | 2018-08-31 | 南宁市富久信息技术有限公司 | A kind of angle point automatic identification camera calibration method |
CN107492123A (en) * | 2017-07-07 | 2017-12-19 | 长安大学 | A kind of road monitoring camera self-calibrating method using information of road surface |
CN107492123B (en) * | 2017-07-07 | 2020-01-14 | 长安大学 | Road monitoring camera self-calibration method using road surface information |
CN107622514A (en) * | 2017-09-30 | 2018-01-23 | 常州工学院 | Video camera convex lens model Auto-calibration method |
CN108010086A (en) * | 2017-12-04 | 2018-05-08 | 深圳市赢世体育科技有限公司 | Camera marking method, device and medium based on tennis court markings intersection point |
CN108346164A (en) * | 2018-01-09 | 2018-07-31 | 云南大学 | Utilize the method for the property calibration axicon lens face catadioptric video camera of essential matrix |
CN110349219A (en) * | 2018-04-04 | 2019-10-18 | 杭州海康威视数字技术股份有限公司 | A kind of Camera extrinsic scaling method and device |
CN108645426B (en) * | 2018-04-09 | 2020-04-10 | 北京空间飞行器总体设计部 | On-orbit self-calibration method for space target relative navigation vision measurement system |
CN108645426A (en) * | 2018-04-09 | 2018-10-12 | 北京空间飞行器总体设计部 | A kind of in-orbit self-calibrating method of extraterrestrial target Relative Navigation vision measurement system |
CN109685851A (en) * | 2018-10-08 | 2019-04-26 | 上海肇观电子科技有限公司 | Hand and eye calibrating method, system, equipment and the storage medium of walking robot |
CN109685851B (en) * | 2018-10-08 | 2021-02-12 | 上海肇观电子科技有限公司 | Hand-eye calibration method, system, equipment and storage medium of walking robot |
CN109903341B (en) * | 2019-01-25 | 2023-09-08 | 东南大学 | Vehicle-mounted camera external parameter dynamic self-calibration method |
CN109903341A (en) * | 2019-01-25 | 2019-06-18 | 东南大学 | Join dynamic self-calibration method outside a kind of vehicle-mounted vidicon |
CN110807815A (en) * | 2019-10-30 | 2020-02-18 | 扬州大学 | Rapid underwater calibration method based on two groups of mutually orthogonal parallel lines corresponding vanishing points |
CN111223150A (en) * | 2020-01-15 | 2020-06-02 | 电子科技大学 | Vehicle-mounted camera external parameter calibration method based on double vanishing points |
CN111260736B (en) * | 2020-01-16 | 2023-04-11 | 中国科学院西安光学精密机械研究所 | In-orbit real-time calibration method for internal parameters of space camera |
CN111260736A (en) * | 2020-01-16 | 2020-06-09 | 中国科学院西安光学精密机械研究所 | In-orbit real-time calibration method for internal parameters of space camera |
CN112135125A (en) * | 2020-10-28 | 2020-12-25 | 歌尔光学科技有限公司 | Camera internal reference testing method, device, equipment and computer readable storage medium |
CN112135125B (en) * | 2020-10-28 | 2024-07-30 | 歌尔光学科技有限公司 | Camera internal reference testing method, device, equipment and computer readable storage medium |
CN112785570A (en) * | 2021-01-19 | 2021-05-11 | 浙江商汤科技开发有限公司 | Method, device, equipment and storage medium for detecting image registration correctness |
CN113012226A (en) * | 2021-03-22 | 2021-06-22 | 浙江商汤科技开发有限公司 | Camera pose estimation method and device, electronic equipment and computer storage medium |
CN113313768A (en) * | 2021-05-27 | 2021-08-27 | 华南理工大学 | Global space scale measurement method based on two vanishing points and local object scale |
CN113362398A (en) * | 2021-06-30 | 2021-09-07 | 广州文远知行科技有限公司 | Method, system, device and storage medium for determining camera reference error |
CN116342712A (en) * | 2023-03-30 | 2023-06-27 | 中国科学院西安光学精密机械研究所 | Method, medium and equipment for calibrating on-orbit distortion coefficient of space camera based on vanishing point consistency |
CN116342712B (en) * | 2023-03-30 | 2024-04-05 | 中国科学院西安光学精密机械研究所 | Method, medium and equipment for calibrating on-orbit distortion coefficient of space camera based on vanishing point consistency |
CN118037863A (en) * | 2024-04-11 | 2024-05-14 | 四川大学 | Neural network optimization automatic zooming camera internal parameter calibration method based on visual field constraint |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105389808A (en) | Camera self-calibration method based on two vanishing points | |
US11521311B1 (en) | Collaborative disparity decomposition | |
Wang et al. | Automatic registration of mobile LiDAR and spherical panoramas | |
CN109544628B (en) | Accurate reading identification system and method for pointer instrument | |
Wan et al. | Stereo vision using two PTZ cameras | |
Taneja et al. | Geometric change detection in urban environments using images | |
CN105096317A (en) | Fully automatic calibration method for high performance camera under complicated background | |
KR101759798B1 (en) | Method, device and system for generating an indoor two dimensional plan view image | |
CN104574393A (en) | Three-dimensional pavement crack image generation system and method | |
CN104748750A (en) | Model constraint-based on-orbit 3D space target attitude estimation method and system | |
CN103473771A (en) | Method for calibrating camera | |
CN104463899A (en) | Target object detecting and monitoring method and device | |
CN104361603B (en) | Gun camera image target designating method and system | |
CN107421473A (en) | The two beam laser coaxial degree detection methods based on image procossing | |
CN106971408A (en) | A kind of camera marking method based on space-time conversion thought | |
CN114998448B (en) | Multi-constraint binocular fisheye camera calibration and space point positioning method | |
Chalom et al. | Measuring image similarity: an overview of some useful applications | |
Horanyi et al. | Generalized pose estimation from line correspondences with known vertical direction | |
CN114372992A (en) | Edge corner point detection four-eye vision algorithm based on moving platform | |
Liu et al. | Infrared-visible image registration for augmented reality-based thermographic building diagnostics | |
Song et al. | Modeling deviations of rgb-d cameras for accurate depth map and color image registration | |
Ye et al. | Extrinsic calibration of a monocular camera and a single line scanning Lidar | |
Christie et al. | Single view geocentric pose in the wild | |
Lin et al. | Effective and efficient line segment detection for visual measurement guided by level lines | |
Kim et al. | Automatic registration of LiDAR and optical imagery using depth map stereo |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20160309 |
|
RJ01 | Rejection of invention patent application after publication |