CN114663527A - Camera self-calibration method under general motion - Google Patents

Camera self-calibration method under general motion

Info

Publication number
CN114663527A
Authority
CN
China
Prior art keywords
camera
calibration
curve
obtaining
absolute
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210284800.5A
Other languages
Chinese (zh)
Inventor
刘玉
张慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202210284800.5A priority Critical patent/CN114663527A/en
Priority to PCT/CN2022/083071 priority patent/WO2023178658A1/en
Publication of CN114663527A publication Critical patent/CN114663527A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/248 Analysis of motion using feature-based methods involving reference images or patches
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/66 Analysis of geometric attributes of image moments or centre of gravity
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a camera self-calibration method under general motion, which comprises seven steps. Compared with the prior art, the invention has the following advantages: by exploiting the geometric relation between the Steiner conic (the symmetric part of the fundamental matrix) and the image of the absolute conic under general motion, the five intrinsic parameters and six extrinsic parameters of the camera can be calibrated completely and automatically; the calibration procedure is simple to operate, since a single calibration only requires at least three images taken under general motion; the time needed for camera calibration is reduced; and the scheme is simple, easy to compute, and yields accurate results.

Description

Camera self-calibration method under general motion
Technical Field
The invention relates to the technical field of three-dimensional vision, in particular to a camera self-calibration method under general motion.
Background
In the field of three-dimensional vision, the intrinsic and extrinsic parameters of a camera often need to be calibrated accurately so that subsequent operations such as photogrammetry, vision-based autonomous navigation, motion estimation and three-dimensional reconstruction can be carried out precisely. However, existing methods for camera self-calibration under general motion neglect the inherent constraints of the fundamental matrix: with the Kruppa equations only part of the camera parameters can be computed, and for the five unknown camera parameters the five quadratic equations still admit up to 2^5 possible solutions, so the computed results are not accurate enough. Other methods attempt to simplify the Kruppa equations by eliminating the scale factor through specific manipulations, but the resulting self-calibration constraints still contain ambiguities and the results remain insufficiently accurate.
Disclosure of Invention
In order to solve the above technical problems, the technical scheme provided by the invention is as follows: a camera self-calibration method under general motion, comprising the following steps:
step one, obtaining the fundamental matrix from feature point correspondences;
step two, decomposing the fundamental matrix to obtain the Steiner conic Fs and the fixed point xa;
step three, obtaining the generalized eigenvector constraints between the image of the absolute conic and Fs;
step four, obtaining from the generalized eigenvector constraints an initial estimate of three camera intrinsic parameters and of the image of the absolute conic;
step five, obtaining the circular points from the intersections of the image of the absolute conic with Fs, and then obtaining initial solutions for the two vanishing lines and the two circle centres;
step six, establishing an objective function to optimize the two circle centres;
step seven, obtaining the camera intrinsic parameters from the optimal solution of the two circle centres.
Compared with the prior art, the invention has the following advantages: by exploiting the geometric relation between the Steiner conic (the symmetric part of the fundamental matrix) and the image of the absolute conic under general motion, the five intrinsic parameters and six extrinsic parameters of the camera can be calibrated completely and automatically; the calibration procedure is simple to operate, since a single calibration only requires at least three images taken under general motion; the time needed for camera calibration is reduced; and the scheme is simple, easy to compute, and yields accurate results.
Drawings
Fig. 1 is a schematic diagram of step two of the camera self-calibration method under general motion.
Fig. 2 is a schematic diagram of step three of the camera self-calibration method under general motion.
Fig. 3 is a schematic diagram of step five of the camera self-calibration method under general motion.
Fig. 4 is a schematic diagram of step six of the camera self-calibration method under general motion.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
In the embodiment shown in figs. 1 to 4, the present invention provides a camera self-calibration method under general motion, which comprises the following steps:
step one, obtaining the fundamental matrix from feature point correspondences;
step two, decomposing the fundamental matrix to obtain the Steiner conic Fs and the fixed point xa;
step three, obtaining the generalized eigenvector constraints between the image of the absolute conic and Fs;
step four, obtaining from the generalized eigenvector constraints an initial estimate of three camera intrinsic parameters and of the image of the absolute conic;
step five, obtaining the circular points from the intersections of the image of the absolute conic with Fs, and then obtaining initial solutions for the two vanishing lines and the two circle centres;
step six, establishing an objective function to optimize the two circle centres;
step seven, obtaining the camera intrinsic parameters from the optimal solution of the two circle centres.
Wherein:
1. At least three images with overlapping regions are taken of the scene to be measured. Two adjacent images are selected as an image pair, and high-precision feature points are extracted and matched using a feature extraction and matching method such as SIFT. From the matched feature points of the image pair, an accurate fundamental matrix F is estimated using the eight-point algorithm (or another method) together with RANSAC, as in the sketch below.
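The following minimal sketch, assuming OpenCV with SIFT support and numpy, shows one common way to carry out this step; the function name estimate_fundamental and the ratio-test threshold are illustrative choices, not part of the patent.

```python
import cv2
import numpy as np

def estimate_fundamental(img_path_1, img_path_2):
    """Estimate the fundamental matrix F of an image pair from SIFT matches (illustrative sketch)."""
    img1 = cv2.imread(img_path_1, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(img_path_2, cv2.IMREAD_GRAYSCALE)

    # Detect and describe feature points with SIFT
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Match descriptors and keep good matches via Lowe's ratio test
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in knn if m.distance < 0.75 * n.distance]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Eight-point estimation wrapped in RANSAC
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    inliers = mask.ravel() == 1
    return F, pts1[inliers], pts2[inliers]
```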
2. In the embodiment shown in fig. 1, the fundamental matrix F is decomposed into a symmetric part Fs = (F + F^T)/2 and an antisymmetric part Fa = (F - F^T)/2. The symmetric part Fs is a conic, the Steiner conic; the antisymmetric part Fa is an antisymmetric matrix, which can be written as Fa = [xa]×, so the point xa is the null vector of Fa. Meanwhile, the null vectors of the fundamental matrix F give the epipole pair {e, e'}, which lies on the conic Fs, and the line la joining the epipoles satisfies the pole-polar constraint la = Fs·xa with respect to Fs. To summarize, in the geometric representation of the fundamental matrix F, the symmetric part Fs is a conic, the point xa is the null vector of the antisymmetric part Fa, the epipole pair {e, e'} (the null vectors of F) lies on the conic Fs, and the line la joining them satisfies la = Fs·xa. A numeric sketch of this decomposition is given below.
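A short sketch of the decomposition with numpy, assuming F has already been estimated; the helper name decompose_fundamental is illustrative.

```python
import numpy as np

def decompose_fundamental(F):
    """Split F into the Steiner conic Fs and the antisymmetric part Fa, and recover xa, the epipoles and la (sketch)."""
    Fs = 0.5 * (F + F.T)          # Steiner conic  Fs = (F + F^T) / 2
    Fa = 0.5 * (F - F.T)          # antisymmetric part Fa = (F - F^T) / 2 = [xa]_x

    # xa is the null vector of Fa (singular vector of the smallest singular value)
    _, _, Vt = np.linalg.svd(Fa)
    xa = Vt[-1]

    # Epipoles: null vectors of F and F^T
    _, _, Vt_F = np.linalg.svd(F)
    e = Vt_F[-1]
    _, _, Vt_Ft = np.linalg.svd(F.T)
    e_prime = Vt_Ft[-1]

    # The line joining the epipoles satisfies la = Fs xa
    la = Fs @ xa
    return Fs, Fa, xa, e, e_prime, la
```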
3. In the embodiment shown in fig. 2, let the image of the absolute conic be ω. Then v∞ = ω*·la, where the line la and the point v∞ form a polar-pole pair with respect to the image of the absolute conic ω (ω* denotes its dual). Joining the point v∞ and the point xa gives a fixed axis ls. The image of the absolute conic ω and the symmetric part Fs of the fundamental matrix form the matrix ω*·Fs, whose generalized eigenvector corresponding to the largest eigenvalue is the point v1, which lies on the line la; the other two generalized eigenvectors are the points v2 and v3, which lie on ls. The common eigenvectors of ω and Fs therefore form a common self-polar triangle Δv1v2v3: the eigenvector v1 corresponding to the largest eigenvalue lies on the line la, the other two eigenvectors v2 and v3 lie on the axis ls, and ls and la are conjugate (orthogonal) with respect to ω, i.e. ls^T·ω*·la = 0. A sketch of this eigen-decomposition follows.
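A minimal sketch with numpy, assuming omega is the current estimate of the image of the absolute conic; computing the eigenvectors of the product ω*·Fs (rather than setting up a formal generalized eigenproblem) and the function name are implementation assumptions.

```python
import numpy as np

def common_self_polar_triangle(omega, Fs):
    """Eigenvectors of omega* . Fs, ordered by eigenvalue (illustrative sketch).

    omega : 3x3 image of the absolute conic (IAC), symmetric positive definite
    Fs    : 3x3 Steiner conic (symmetric part of the fundamental matrix)
    Returns the points v1, v2, v3; v1 corresponds to the largest eigenvalue and lies on la.
    """
    omega_star = np.linalg.inv(omega)        # dual of the IAC (up to scale)
    vals, vecs = np.linalg.eig(omega_star @ Fs)
    order = np.argsort(-vals.real)           # v1 <-> largest eigenvalue
    v1, v2, v3 = (vecs[:, i].real for i in order)
    return v1, v2, v3
```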
4. Since the generalized eigenvector v1 of ω*·Fs corresponding to the largest eigenvalue lies on the line la, a constraint equation containing three independent constraints can be obtained. When the principal point (u0, v0) is known, three unknown parameters, namely f and (v1x, v1y), can be recovered from this constraint, where f is the focal length of a natural camera and (v1x, v1y) are the coordinates of the generalized eigenvector v1. Using at least three images, which yield three image pairs, and using (but not limited to) the above constraints, three camera parameters can be recovered for a camera with known principal point, namely the focal lengths fx, fy and the skew parameter s, and an initial estimate of the image of the absolute conic ω can be obtained. The standard relation between the intrinsic matrix and the image of the absolute conic used for this initial estimate is sketched below.
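Once fx, fy, s and the known principal point (u0, v0) are available, the initial estimate of the image of the absolute conic follows from the standard relation ω = K^{-T} K^{-1}. A short sketch, with an illustrative helper name:

```python
import numpy as np

def iac_from_intrinsics(fx, fy, s, u0, v0):
    """Build the image of the absolute conic omega = K^{-T} K^{-1} from the intrinsic parameters."""
    K = np.array([[fx,  s,   u0],
                  [0.0, fy,  v0],
                  [0.0, 0.0, 1.0]])
    K_inv = np.linalg.inv(K)
    omega = K_inv.T @ K_inv
    return omega / omega[2, 2]   # normalize the arbitrary scale for convenience
```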
5. In the embodiment shown in fig. 3, the initial estimate of the image of the absolute conic and the symmetric part Fs of the fundamental matrix F of an image pair have two pairs of imaginary intersection points, which are the circular points ii, ji, i = 1, 2. Suppose Fs is the projection of a circle lying on a plane in space; because of the ambiguity of the plane normal there are two corresponding vanishing lines lhi, i = 1, 2. Their intersection v1 lies on the line la, and the two pairs of circular points lie on the two vanishing lines respectively. Since each vanishing line lhi satisfies a pole-polar constraint with respect to Fs, with the projection of the circle centre as its pole, the two projected circle centres oi, i = 1, 2, can be recovered, and these centres lie on the axis ls. In short, ω and the symmetric part Fs of the fundamental matrix F of an image pair intersect in two pairs of imaginary points, the circular points, which lie respectively on the two vanishing lines arising from the plane-normal ambiguity. The pole-polar relation and the line-conic intersection used here are sketched below.
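Two helpers, hedged as a sketch: the polar of a point with respect to a conic (e.g. the vanishing line as the polar of a projected circle centre with respect to Fs) and the intersection of a line with a conic, which in general yields a complex-conjugate pair such as the circular points. The names are illustrative and degenerate configurations are not handled.

```python
import numpy as np

def polar_line(C, x):
    """Polar line of the point x with respect to the conic C (l = C x)."""
    return C @ x

def line_conic_intersections(l, C):
    """Return the two (generally complex-conjugate) intersection points of the line l with the conic C."""
    # Two points p, d spanning the line l (orthogonal complement of l)
    _, _, Vt = np.linalg.svd(l.reshape(1, 3))
    p, d = Vt[1], Vt[2]
    # Points on l: x(t) = p + t d; substituting into x^T C x = 0 gives a quadratic in t
    a = d @ C @ d
    b = 2.0 * (p @ C @ d)
    c = p @ C @ p
    t1, t2 = np.roots([a, b, c])
    return p + t1 * d, p + t2 * d
```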
6. In the embodiment shown in fig. 4, the two circle centres of Fs are optimized as follows. Around the initial estimate of each of the two centres, a number of sample points are chosen uniformly in a neighbourhood. Joining a pair of sampled centres gives a candidate fixed-axis image ls. From each sampled centre, the pole-polar constraint with respect to Fs gives the corresponding vanishing (horizon) line lhi, and intersecting each horizon line with Fs yields two pairs of imaginary intersection points, i.e. two pairs of circular points. These circular points lie on the image of the absolute conic, and this incidence provides 4 independent constraints for computing ω. Using at least three images, more circular points are obtained in the same way and ω is re-estimated; the point v1 and the pole of la with respect to ω are then recomputed by the method of step 4. An objective function is constructed, which may include, but is not limited to, the following constraints: 1) xa and the pole of la with respect to ω lie on the axis ls; 2) the point v1 lies on the line la; 3) ls and la are conjugate with respect to ω, and the two vanishing lines (denoted ll and lr) are conjugate with respect to ω. The objective function assembled from these constraints is optimized to obtain an optimal solution for the centres oi (i = 1, 2). A minimal optimization sketch is given after this paragraph.
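A minimal optimization sketch, assuming numpy and SciPy and the quantities Fs, xa, la and ω computed in the earlier steps. The residual below is assembled only from constraints 1) and 3) as an illustration; the exact objective function of the patent may weight or extend these terms differently, and all helper names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def unit(v):
    """Normalize a homogeneous line or point so that residuals stay comparable in scale."""
    return v / np.linalg.norm(v)

def conjugacy(l1, l2, omega):
    """Algebraic conjugacy residual of two lines with respect to the IAC: l1^T omega* l2."""
    return unit(l1) @ np.linalg.inv(omega) @ unit(l2)

def centre_objective(params, Fs, xa, la, omega):
    """Algebraic cost over the two circle centres o1, o2 (illustrative residual)."""
    o1 = np.array([params[0], params[1], 1.0])
    o2 = np.array([params[2], params[3], 1.0])

    ls = np.cross(o1, o2)      # candidate fixed-axis image joining the two centres
    lh1 = Fs @ o1              # horizon (vanishing) lines as polars of the centres w.r.t. Fs
    lh2 = Fs @ o2

    r = [
        unit(xa) @ unit(ls),         # 1) xa lies on the axis ls
        conjugacy(ls, la, omega),    # 3) ls and la conjugate w.r.t. omega
        conjugacy(lh1, lh2, omega),  # 3) the two vanishing lines conjugate w.r.t. omega
    ]
    return float(np.sum(np.square(r)))

def optimize_centres(o1_init, o2_init, Fs, xa, la, omega):
    """Refine the two circle centres from their initial estimates (Nelder-Mead, illustrative)."""
    x0 = np.hstack([o1_init[:2] / o1_init[2], o2_init[:2] / o2_init[2]])
    res = minimize(centre_objective, x0, args=(Fs, xa, la, omega), method="Nelder-Mead")
    o1 = np.array([res.x[0], res.x[1], 1.0])
    o2 = np.array([res.x[2], res.x[3], 1.0])
    return o1, o2
```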
7. From the optimal solution of the centres oi (i = 1, 2), the corresponding horizon lines lhi are computed from the pole-polar constraint with respect to Fs. The two pairs of imaginary intersections of the horizon lines lhi with Fs then give the circular points ii, ji, i = 1, 2. These circular points lie on the image of the absolute conic ω, which provides 4 independent constraints for computing ω. Thus at least 3 images, forming 3 image pairs, contain 6 pairs of circular points, which provide 12 independent constraints for computing the image of the absolute conic ω. The 5 intrinsic parameters of the camera are then obtained by Cholesky decomposition of ω, as sketched below, and the camera extrinsic parameters can be obtained from the decomposed essential matrix. Using more images further improves the accuracy of the camera self-calibration algorithm.
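The Cholesky-based recovery of the intrinsic matrix rests on the standard relation ω = K^{-T} K^{-1}. A short sketch, assuming the estimated ω is positive definite after the scale normalization; the helper name is illustrative.

```python
import numpy as np

def intrinsics_from_iac(omega):
    """Recover the upper-triangular intrinsic matrix K from the image of the absolute conic."""
    omega = omega / omega[2, 2]    # fix the arbitrary scale (and sign) of the conic
    # omega = U^T U with U = K^{-1} upper triangular; np.linalg.cholesky gives omega = L L^T, so U = L^T
    L = np.linalg.cholesky(omega)
    K = np.linalg.inv(L.T)
    return K / K[2, 2]             # normalize so that K[2, 2] = 1
```

The five intrinsic parameters can then be read off as fx = K[0, 0], s = K[0, 1], u0 = K[0, 2], fy = K[1, 1] and v0 = K[1, 2].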
While the fundamental principles, principal features and advantages of the invention have been shown and described, it will be understood by those skilled in the art that the invention is not limited to the embodiments described above, which merely illustrate its principles; various changes and modifications may be made without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.

Claims (1)

1. A camera self-calibration method under general motion, characterized by comprising the following steps:
step one, obtaining the fundamental matrix from feature point correspondences;
step two, decomposing the fundamental matrix to obtain the Steiner conic Fs and the fixed point xa;
step three, obtaining the generalized eigenvector constraints between the image of the absolute conic and Fs;
step four, obtaining from the generalized eigenvector constraints an initial estimate of three camera intrinsic parameters and of the image of the absolute conic;
step five, obtaining the circular points from the intersections of the image of the absolute conic with Fs, and then obtaining initial solutions for the two vanishing lines and the two circle centres;
step six, establishing an objective function to optimize the two circle centres;
step seven, obtaining the camera intrinsic parameters from the optimal solution of the two circle centres.
CN202210284800.5A 2022-03-22 2022-03-22 Camera self-calibration method under general motion Pending CN114663527A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210284800.5A CN114663527A (en) 2022-03-22 2022-03-22 Camera self-calibration method under general motion
PCT/CN2022/083071 WO2023178658A1 (en) 2022-03-22 2022-03-25 Camera self-calibration method under general motion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210284800.5A CN114663527A (en) 2022-03-22 2022-03-22 Camera self-calibration method under general motion

Publications (1)

Publication Number Publication Date
CN114663527A true CN114663527A (en) 2022-06-24

Family

ID=82031880

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210284800.5A Pending CN114663527A (en) 2022-03-22 2022-03-22 Camera self-calibration method under general motion

Country Status (2)

Country Link
CN (1) CN114663527A (en)
WO (1) WO2023178658A1 (en)

Also Published As

Publication number Publication date
WO2023178658A1 (en) 2023-09-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination