CN105046691B - A camera self-calibration method based on orthogonal vanishing points - Google Patents

A camera self-calibration method based on orthogonal vanishing points

Info

Publication number
CN105046691B
CN105046691B (application CN201510372406.7A)
Authority
CN
China
Prior art keywords
vanishing point
orthogonal vanishing point
camera
calibration method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510372406.7A
Other languages
Chinese (zh)
Other versions
CN105046691A (en)
Inventor
吕松 (Lü Song)
刁常宇 (Diao Changyu)
邢卫 (Xing Wei)
鲁东明 (Lu Dongming)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201510372406.7A priority Critical patent/CN105046691B/en
Publication of CN105046691A publication Critical patent/CN105046691A/en
Application granted granted Critical
Publication of CN105046691B publication Critical patent/CN105046691B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses a camera self-calibration method based on orthogonal vanishing points, comprising the following steps: performing line detection on a calibration scene image to obtain a number of parallel line segments; for any n of these line segments, n >= 4, carrying out the following operations: computing the orthogonal vanishing points and the camera focal length from the n line segments; computing the corresponding intrinsic parameter matrix from the camera focal length and the orthogonal vanishing points; and obtaining the final intrinsic parameter matrix from all candidate intrinsic matrices by a method based on random sample consensus (RANSAC). The camera self-calibration method of the present invention groups the line segments obtained by line detection; processing multiple groups effectively excludes the influence of erroneous line detections and improves the precision of the self-calibration result, and the grouping makes parallel computation easy to implement, which helps increase the self-calibration speed.

Description

A camera self-calibration method based on orthogonal vanishing points
Technical field
The present invention relates to the field of high-resolution three-dimensional reconstruction, and in particular to a camera self-calibration method based on orthogonal vanishing points, applied to the camera calibration step in high-resolution three-dimensional reconstruction based on photometric stereo.
Background art
In image-based three-dimensional reconstruction, the parameters of the camera itself directly affect the accuracy of the reconstruction result. Obtaining an accurate camera calibration result therefore has a decisive influence on the output of the three-dimensional reconstruction algorithm.
Currently used camera calibration techniques fall broadly into three classes: traditional camera calibration, camera self-calibration, and camera calibration based on active vision.
Traditional camera calibration applies mathematical transformations from the image processing pipeline and performs the computation under a given camera model to obtain the calibration result. This approach can exploit known information such as the shape and size of the calibration object, so that as many parameters as possible are used for calibration, and it yields the highest-precision results. However, it is constrained by the information about the calibration object: if the object itself is inaccurate or the manual interaction introduces errors, the calibration result is strongly affected. In practical engineering, working conditions make it difficult to control the environment and human interference precisely, so the calibration result is easily degraded.
Camera calibration based on active vision captures images while precisely controlling the motion of the camera, and uses the precisely known displacement of the camera to simplify the computation. This greatly reduces the complexity of the calibration process, so results are easy to obtain. However, precisely controlling the camera motion requires special equipment with very high precision; such equipment is expensive and not readily available, so this approach is unsuitable for ordinary situations. It also cannot be used for field work and is generally limited to precise indoor scanning.
Camera self-calibration is not affected by the photographed object; it considers only how environmental variables change with the camera and the influence of the camera's internal parameters. Camera self-calibration in effect means recovering the camera's intrinsic parameter matrix; a complete calibration result may additionally include the rotation matrix, the translation vector, and the distortion parameters (which quantify the amount of lens deformation).
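For reference, a minimal sketch of the intrinsic parameter matrix in the standard pinhole model; the exact parameterization is not spelled out at this point in the patent, but the embodiment below assumes square pixels, zero skew and a principal point at the origin, which reduces K to a single unknown focal length f:

```latex
K =
\begin{bmatrix}
f_x & s & c_x \\
0   & f_y & c_y \\
0   & 0   & 1
\end{bmatrix}
\;\xrightarrow{\;f_x=f_y=f,\;\; s=0,\;\; (c_x,c_y)=(0,0)\;}\;
K = \mathrm{diag}(f,\, f,\, 1)
```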
The camera self-calibration techniques in current use are convenient, flexible, generally applicable, and unaffected by the external environment, but they also have many defects. The biggest defect is low robustness: the computation is complicated and unstable, and the accuracy of the result is hard to guarantee.
Content of the invention
To address the shortcomings of the prior art, the invention provides a camera self-calibration method based on orthogonal vanishing points. It can handle the three-dimensional scanning process in common situations, overcomes the difficulty current camera self-calibration techniques have in obtaining the camera parameters accurately, and helps improve the robustness of camera self-calibration; it is generally applicable, easy to solve, and highly robust.
A camera self-calibration method based on orthogonal vanishing points comprises the following steps:
(1) performing line detection on a calibration scene image to obtain a number of parallel line segments;
(2) for any n of these line segments, n >= 4, performing the following operations:
(2-1) computing the orthogonal vanishing points and the camera focal length of the n line segments;
(2-2) computing the corresponding intrinsic parameter matrix from the camera focal length and the orthogonal vanishing points;
(3) obtaining the final intrinsic parameter matrix from all candidate intrinsic matrices by a method based on random sample consensus (RANSAC).
In the present invention, line detection is performed on the calibration scene image using a feature point extraction method. The calibration scene image (i.e. the image of the calibration object) should contain enough straight line segments (at least four parallel line segments must be obtainable); the calibration object should mainly be a man-made object, or a natural scene with obvious straight edges.
In general, the more line segments the calibration object image contains, the more accurate the result, but the larger the amount of computation.
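As an illustration only (the patent specifies a feature-point-based line detector but gives no code), the following sketch shows one common way to obtain candidate line segments from a calibration scene image with OpenCV; the function names and thresholds are assumptions, not part of the patent:

```python
import cv2
import numpy as np

def detect_line_segments(image_path, min_length=50):
    """Return candidate line segments as (x1, y1, x2, y2) pixel tuples."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(img, 50, 150)  # edge map feeding the line detector
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=min_length, maxLineGap=5)
    return [] if lines is None else [tuple(int(v) for v in l[0]) for l in lines]
```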
In step (2), the line segments detected in step (1) are in fact enumerated in groups of n according to the principle of permutations and combinations, and the corresponding intrinsic parameter matrix is computed for each combination.
To increase the computation speed, each combination (containing n line segments) can be processed in parallel in practical applications.
With each group processed in parallel, the computation is necessarily faster; processing multiple groups effectively excludes the influence of erroneous line detections and improves the precision of the self-calibration result; and obtaining the final intrinsic parameter matrix by random sample consensus (i.e. selecting one of all the intrinsic matrices actually obtained in step (2) as the final intrinsic parameter matrix) greatly improves the robustness of the camera self-calibration method.
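A minimal sketch of this grouping and parallel processing, assuming a helper `estimate_intrinsics_from_group` that realizes steps (2-1) and (2-2) for one group of n segments; the helper name and the use of Python's multiprocessing are illustrative assumptions, not the patent's prescribed implementation:

```python
from itertools import combinations
from multiprocessing import Pool

def calibrate_in_groups(segments, estimate_intrinsics_from_group, n=4, workers=4):
    """Enumerate groups of n segments and compute one candidate K per group."""
    groups = list(combinations(segments, n))   # step (2): n segments per group
    with Pool(workers) as pool:                # each group processed in parallel
        candidate_Ks = pool.map(estimate_intrinsics_from_group, groups)
    return candidate_Ks
```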
In addition, the camera self-calibration method based on orthogonal vanishing points of the present invention can greatly reduce the amount of interaction, reduce the errors introduced by manual operation, avoid uncertainty in the calibration process, and make the calibration convenient; it also lowers the requirements that camera calibration places on the equipment and environment, reduces the time and monetary cost of the self-calibration process, and yields calibration results quickly. No standard equipment such as a calibration board is needed, so the calibration is flexible and convenient and suitable for a variety of environments, while avoiding the influence of equipment errors on the calibration result.
Preferably, the value of n is 4 to 10. The larger n is, the more accurate the result, but the larger the amount of computation. More preferably, n = 4. Using four line segments further increases the computation speed, and the resulting loss of precision is compensated because the final intrinsic parameter matrix is obtained by the random-sample-consensus-based method, so that speed and precision are balanced.
Because line detection has errors, each detected line segment generally consists of a true value plus a detection error (noise). To eliminate the influence of noise, step (2-1) comprises the following steps:
(2-11) establishing an effective and robust mixture model for each line segment, the mixture model including an estimated noise term obeying a Cauchy distribution;
(2-12) based on the mixture model of each line segment, performing the following operations:
(a) iteratively performing vanishing point estimation on the corresponding mixture model to obtain the orthogonal vanishing points, stopping when the termination condition is met, and taking the orthogonal vanishing points obtained in the last iteration as the final result;
(b) computing the corresponding camera focal length from the final result.
By building a mixture model that includes the estimated noise and then performing vanishing point estimation iteratively on the mixture model, the influence of noise is gradually reduced over the iterations, which ensures the accuracy of the obtained orthogonal vanishing points; the result is then refined with a robust maximum-likelihood estimation based on the mixture model, which greatly improves the precision of the calibration result.
For ease of computation, the estimated noise of all line segments in the present invention is set to obey an independent, identically distributed zero-mean Gaussian distribution.
Vanishing point estimation is conventionally performed with the maximum-likelihood method; expectation maximization (EM) estimation could also be used, but maximum-likelihood estimation has a wider range of applicability. Therefore, preferably, the vanishing point estimation in step (a) is based on the maximum-likelihood estimation method.
The termination condition in step (a) is as follows:
the mean square deviation of the Euclidean distance between the orthogonal vanishing points obtained in two adjacent iterations is less than a preset threshold, or the maximum number of iterations is reached. The threshold is 0.1 to 1 pixel, typically less than 0.5 pixel; the maximum number of iterations is 20 to 50 (convergence is generally reached within 30 iterations, i.e. the mean square deviation of the Euclidean distance between the orthogonal vanishing points of two adjacent iterations falls below the preset threshold).
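A minimal sketch of step (a) under simplifying assumptions: each segment is represented by its homogeneous line coefficients, the maximum-likelihood step is approximated by a robustly re-weighted least-squares fit, and the stopping rule compares successive vanishing point estimates against the thresholds stated above. The exact likelihood model used by the patent may differ.

```python
import numpy as np

def estimate_vanishing_point(lines, max_iter=50, tol=0.5):
    """Iteratively estimate one vanishing point from homogeneous lines (m x 3 array)."""
    lines = np.asarray(lines, dtype=float)
    weights = np.ones(len(lines))
    v_prev = None
    for _ in range(max_iter):
        # Weighted least squares: v is the right singular vector with the smallest
        # singular value, minimizing sum_i w_i * (l_i . v)^2.
        _, _, vt = np.linalg.svd(lines * weights[:, None])
        v = vt[-1]
        v_img = v[:2] / v[2] if abs(v[2]) > 1e-12 else v[:2] * 1e12
        if v_prev is not None and np.linalg.norm(v_img - v_prev) < tol:
            break  # change between adjacent iterations below the pixel threshold
        v_prev = v_img
        # Cauchy-style down-weighting of segments with large residuals.
        r = lines @ v
        scale = np.median(np.abs(r)) + 1e-9
        weights = 1.0 / (1.0 + (r / scale) ** 2)
    return v / np.linalg.norm(v)
```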
Compared with the prior art, the invention has the following advantages:
the line segments obtained by line detection are grouped, and processing multiple groups effectively excludes the influence of erroneous line detections and improves the precision of the self-calibration result; the grouping also makes parallel computation easy to implement, which helps increase the self-calibration speed;
the camera self-calibration method based on orthogonal vanishing points can greatly reduce the amount of interaction, reduce the errors introduced by manual operation, avoid uncertainty in the calibration process, and make the calibration convenient; it also lowers the requirements that camera calibration places on the equipment and environment, reduces the time and monetary cost of the calibration process, and yields calibration results quickly;
no standard equipment such as a calibration board is needed, so the calibration is flexible and convenient and suitable for a variety of environments, while avoiding the influence of equipment errors on the calibration result.
Embodiment
The present invention is described in detail below with reference to a specific embodiment.
A camera self-calibration method based on orthogonal vanishing points comprises the following steps:
(1) performing line detection on a calibration scene image to obtain a number of parallel line segments;
In this embodiment, line detection is performed on the calibration scene image using a feature point extraction method. The calibration scene image (the image of the calibration object) should contain enough straight line segments (at least four parallel line segments must be obtainable); the calibration object should mainly be a man-made object, or a natural scene with obvious straight edges.
(2) for any four line segments, performing the following operations:
(2-1) computing the orthogonal vanishing points and the camera focal length of the four line segments;
Because line detection has errors, each detected line segment generally consists of a true value plus a detection error (noise). To eliminate the influence of noise, step (2-1) is realized in this embodiment by the following steps:
(2-11) establishing an effective and robust mixture model for each line segment, the mixture model including an estimated noise term obeying a Cauchy distribution;
Establishing the mixture model of a line in fact means representing each line segment obtained by line detection as the sum of a true value and noise. This step can be carried out before the orthogonal vanishing points are determined, or directly after line detection. For ease of computation, the estimated noise of all line segments in this embodiment is set to obey an independent, identically distributed zero-mean Gaussian distribution.
(2-12) based on the mixture model of each line segment, performing the following operations:
(a) iteratively performing vanishing point estimation on the corresponding mixture model using the maximum-likelihood estimation method to obtain the orthogonal vanishing points, stopping when the mean square deviation of the Euclidean distance between the orthogonal vanishing points obtained in two adjacent iterations is less than a preset threshold (0.5 pixel in this embodiment), or when the maximum number of iterations (50) is reached, and taking the orthogonal vanishing points obtained in the last iteration as the final result;
(b) computing the corresponding camera focal length from the final result.
(2-2) computing the corresponding intrinsic parameter matrix from the camera focal length and the orthogonal vanishing points;
In fact, a system of equations containing an unknown quantity, the camera focal length, is first set up, and the equations are then solved to obtain the vanishing points and the camera focal length.
Taking four line segments l1, l2, l3 and l4 as an example, the method for determining the corresponding orthogonal vanishing points and the camera focal length, i.e. the camera intrinsic parameter matrix, is illustrated. First, under the constraint that the orthogonal vanishing points satisfy vi^T ω vj = 0 for i ≠ j,
the three orthogonal vanishing points vi, i = 1, 2, 3, are obtained from the following equations:
v1 = l1 × l2
v2 = l3 × l4
v3 = K((K^-1 v1) × (K^-1 v2))
where K is the camera intrinsic parameter matrix, ω = K^-T K^-1 is the image of the absolute conic, and K^-T denotes the inverse of the transpose of K.
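A worked sketch of these equations, assuming the simplified model of this embodiment (square pixels, zero skew, principal point at the origin, so K = diag(f, f, 1)); with that K, the single unknown f follows from the orthogonality constraint v1^T ω v2 = 0:

```python
import numpy as np

def intrinsics_from_four_lines(l1, l2, l3, l4):
    """Compute K and the three orthogonal vanishing points from four homogeneous lines."""
    v1 = np.cross(l1, l2)                  # v1 = l1 x l2
    v2 = np.cross(l3, l4)                  # v2 = l3 x l4
    # v1^T K^-T K^-1 v2 = 0 with K = diag(f, f, 1) reduces to
    # (x1*x2 + y1*y2)/f^2 + z1*z2 = 0, assuming the quotient below is positive.
    f = np.sqrt(-(v1[0] * v2[0] + v1[1] * v2[1]) / (v1[2] * v2[2]))
    K = np.diag([f, f, 1.0])
    K_inv = np.diag([1.0 / f, 1.0 / f, 1.0])
    v3 = K @ np.cross(K_inv @ v1, K_inv @ v2)   # v3 = K((K^-1 v1) x (K^-1 v2))
    return K, (v1, v2, v3)
```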
In the intrinsic parameter matrix obtained in this embodiment, the coordinates of the principal point default to 0; the full intrinsic parameter matrix is obtained by setting the principal point coordinates of the image coordinate system according to the application requirements at shooting time.
(3) obtaining the final intrinsic parameter matrix from all candidate intrinsic matrices by a method based on random sample consensus (RANSAC).
To increase the computation speed, each combination (containing four line segments) can be processed in parallel in practical applications.
With each group processed in parallel, the computation is necessarily faster and the grouped computation is easy to implement; processing multiple groups then effectively excludes the influence of erroneous line detections and improves the precision of the self-calibration result.
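A minimal consensus-style sketch for step (3), assuming each candidate intrinsic matrix can be summarized by its focal length K[0, 0] and that the candidate supported by the most other candidates (within a relative tolerance) is kept; the tolerance value is an illustrative assumption:

```python
import numpy as np

def select_final_intrinsics(candidate_Ks, rel_tol=0.05):
    """Pick the candidate K whose focal length agrees with the most candidates."""
    focals = np.array([K[0, 0] for K in candidate_Ks])
    best_idx, best_support = 0, -1
    for i, f in enumerate(focals):
        support = int(np.sum(np.abs(focals - f) <= rel_tol * f))  # consensus size
        if support > best_support:
            best_idx, best_support = i, support
    return candidate_Ks[best_idx]
```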
The above embodiment describes the technical solution and beneficial effects of the present invention in detail. It should be understood that the above is only the most preferred embodiment of the present invention and is not intended to limit the invention; any modification, supplement or equivalent substitution made within the scope of the principles of the present invention shall be included within the scope of protection of the present invention.

Claims (8)

  1. A camera self-calibration method based on orthogonal vanishing points, characterized in that it comprises the following steps:
    (1) performing line detection on a calibration scene image to obtain a number of parallel line segments;
    (2) for any n of these line segments, n >= 4, performing the following operations:
    (2-1) computing the orthogonal vanishing points and the camera focal length of the n line segments;
    (2-2) computing the corresponding intrinsic parameter matrix from the camera focal length and the orthogonal vanishing points;
    in step (2), the line segments detected in step (1) are in fact enumerated in groups of n according to the principle of permutations and combinations, and the corresponding intrinsic parameter matrix is computed for each combination;
    (3) obtaining the final intrinsic parameter matrix from all candidate intrinsic matrices by a method based on random sample consensus (RANSAC).
  2. The camera self-calibration method based on orthogonal vanishing points according to claim 1, characterized in that the value of n is 4 to 10.
  3. The camera self-calibration method based on orthogonal vanishing points according to claim 1, characterized in that step (2-1) comprises the following steps:
    (2-11) establishing an effective and robust mixture model for each line segment, the mixture model including an estimated noise term obeying a Cauchy distribution;
    (2-12) based on the mixture model of each line segment, performing the following operations:
    (a) iteratively performing vanishing point estimation on the corresponding mixture model to obtain the orthogonal vanishing points, stopping when the termination condition is met, and taking the orthogonal vanishing points obtained in the last iteration as the final result;
    (b) computing the corresponding camera focal length from the final result.
  4. The camera self-calibration method based on orthogonal vanishing points according to claim 3, characterized in that the estimated noise of all line segments obeys an independent, identically distributed zero-mean Gaussian distribution.
  5. The camera self-calibration method based on orthogonal vanishing points according to claim 3, characterized in that the vanishing point estimation in step (a) is performed based on the maximum-likelihood estimation method.
  6. The camera self-calibration method based on orthogonal vanishing points according to claim 3, characterized in that the termination condition in step (a) is as follows:
    the mean square deviation of the Euclidean distance between the orthogonal vanishing points obtained in two adjacent iterations is less than a preset threshold;
    or the maximum number of iterations is reached.
  7. The camera self-calibration method based on orthogonal vanishing points according to claim 6, characterized in that the threshold is 0.1 to 1 pixel.
  8. The camera self-calibration method based on orthogonal vanishing points according to claim 6, characterized in that the maximum number of iterations is 20 to 50.
CN201510372406.7A 2015-06-26 2015-06-26 A camera self-calibration method based on orthogonal vanishing points Active CN105046691B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510372406.7A CN105046691B (en) 2015-06-26 2015-06-26 A camera self-calibration method based on orthogonal vanishing points

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510372406.7A CN105046691B (en) 2015-06-26 2015-06-26 A camera self-calibration method based on orthogonal vanishing points

Publications (2)

Publication Number Publication Date
CN105046691A CN105046691A (en) 2015-11-11
CN105046691B true CN105046691B (en) 2018-04-10

Family

ID=54453209

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510372406.7A Active CN105046691B (en) 2015-06-26 2015-06-26 A camera self-calibration method based on orthogonal vanishing points

Country Status (1)

Country Link
CN (1) CN105046691B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106327532B (en) * 2016-08-31 2019-06-11 北京天睿空间科技股份有限公司 A kind of three-dimensional registration method of single image
CN110033492B (en) * 2019-04-17 2021-05-11 深圳金三立视频科技股份有限公司 Camera calibration method and terminal
CN110033493B (en) * 2019-04-17 2021-05-11 深圳金三立视频科技股份有限公司 Camera 3D calibration method and terminal
CN113016007B (en) * 2020-08-12 2023-11-10 香港应用科技研究院有限公司 Apparatus and method for estimating the orientation of a camera relative to the ground

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101488222A (en) * 2008-01-16 2009-07-22 中国科学院自动化研究所 Camera self-calibration method based on movement target image and movement information
CN101814184A (en) * 2010-01-15 2010-08-25 北京智安邦科技有限公司 Calibration method and device based on line segments
CN101980292A (en) * 2010-01-25 2011-02-23 北京工业大学 Regular octagonal template-based board camera intrinsic parameter calibration method
EP2927870A1 (en) * 2014-04-02 2015-10-07 Panasonic Intellectual Property Management Co., Ltd. Calibration apparatus, calibration method, and calibration program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101488222A (en) * 2008-01-16 2009-07-22 中国科学院自动化研究所 Camera self-calibration method based on movement target image and movement information
CN101814184A (en) * 2010-01-15 2010-08-25 北京智安邦科技有限公司 Calibration method and device based on line segments
CN101980292A (en) * 2010-01-25 2011-02-23 北京工业大学 Regular octagonal template-based board camera intrinsic parameter calibration method
EP2927870A1 (en) * 2014-04-02 2015-10-07 Panasonic Intellectual Property Management Co., Ltd. Calibration apparatus, calibration method, and calibration program

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Robust Camera Calibration with Vanishing Points; Xiaopuan Xu et al.; 2012 5th International Congress on Image and Signal Processing; 2012-12-31; full text *
A camera calibration method based on vanishing points (一种基于消失点的摄像机标定方法); Liu Peng, Jia Yunde (刘鹏、贾云得); Journal of Image and Graphics (中国图象图形学报); 2003-09-30; vol. 8; abstract, section 0 (pp. 134-135), section 1.1 (pp. 135-136), sections 2.1-2.3 *
Research on key techniques of three-dimensional reconstruction from uncalibrated image sequences (基于非定标图像序列的三维重建关键技术研究); Chen Fuxing (陈付幸); China Doctoral Dissertations Full-text Database, Information Science and Technology (中国博士学位论文全文数据库 信息科技辑); 2006-11-15 (No. 11); section 5.4.2 (p. 75), section 5.6 (p. 80), section 5.6.1 (p. 81), sections 5.6.3 and 5.6.4 (p. 82) *

Also Published As

Publication number Publication date
CN105046691A (en) 2015-11-11

Similar Documents

Publication Publication Date Title
CN105046691B (en) A camera self-calibration method based on orthogonal vanishing points
CN107705333B (en) Space positioning method and device based on binocular camera
CN104154875B (en) Three-dimensional data acquisition system and acquisition method based on two-axis rotation platform
CN104333675B (en) A kind of panorama electronic image stabilization method based on spherical projection
CN105654476B (en) Binocular calibration method based on Chaos particle swarm optimization algorithm
CN107633536A (en) A kind of camera calibration method and system based on two-dimensional planar template
CN103903260B (en) Target method for quickly calibrating intrinsic parameters of vidicon
CN107358633A (en) Join scaling method inside and outside a kind of polyphaser based on 3 points of demarcation things
CN102750697A (en) Parameter calibration method and device
CN102609941A (en) Three-dimensional registering method based on ToF (Time-of-Flight) depth camera
CN108053373A (en) One kind is based on deep learning model fisheye image correcting method
CN110443879B (en) Perspective error compensation method based on neural network
CN104981846A (en) Apparatus and method for sensing ball in motion
CN110246124A (en) Target size measurement method and system based on deep learning
CN106570907A (en) Camera calibrating method and device
CN103903263B (en) A kind of 360 degrees omnidirection distance-finding method based on Ladybug panorama camera image
CN111062131A (en) Power transmission line sag calculation method and related device
CN107341844A (en) A kind of real-time three-dimensional people's object plotting method based on more Kinect
CN106971408A (en) A kind of camera marking method based on space-time conversion thought
US9830712B2 (en) Method and apparatus for sensing moving ball
CN103247048A (en) Camera mixing calibration method based on quadratic curve and straight lines
CN107339938A (en) A kind of special-shaped calibrating block and scaling method for single eye stereo vision self-calibration
CN104123725B (en) A kind of computational methods of single line array camera homography matrix H
CN103994779A (en) Panorama camera calibrating method based on three-dimensional laser-point cloud
CN105405135B (en) Two-step method photography object point, picture point automatic matching method based on basic configuration point

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant