CN117036506B - Binocular camera calibration method - Google Patents


Info

Publication number
CN117036506B
CN117036506B (application number CN202311077325.5A)
Authority
CN
China
Prior art keywords
camera
rotation
angle
image
equation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311077325.5A
Other languages
Chinese (zh)
Other versions
CN117036506A (en)
Inventor
贺治国
叶一群
林颖典
王立忠
王鑫宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hainan Research Institute Of Zhejiang University
Original Assignee
Hainan Research Institute Of Zhejiang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hainan Research Institute Of Zhejiang University filed Critical Hainan Research Institute Of Zhejiang University
Priority to CN202311077325.5A priority Critical patent/CN117036506B/en
Publication of CN117036506A publication Critical patent/CN117036506A/en
Application granted granted Critical
Publication of CN117036506B publication Critical patent/CN117036506B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/30 Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a binocular camera calibration method. The method comprises the following steps: rotating the observation areas of the left and right cameras from an area, such as a water surface, where sufficient ground control points cannot be obtained to an area where a large number of accurate ground control points can conveniently be obtained, and recording the rotation calibration angle of each camera; correcting the optical distortion of the two cameras; laying out ground control points and solving the exterior orientation elements of each camera from the collinearity equation; subtracting the value of the rotation calibration angle from the computed azimuth angle to obtain the calibrated rotation matrix of each camera; linearizing the calibrated collinearity equation into a direct linear transformation equation to obtain the direct linear transformation coefficients of the two cameras; and finally, based on image matching and an internal interpolation algorithm, obtaining the three-dimensional space coordinates of any point in the image. The invention well solves the difficulty of laying control points on the water surface and helps achieve higher-precision measurement of sea-wave parameters, water-depth topography and the like in offshore areas.

Description

Binocular camera calibration method
Technical Field
The invention relates to camera calibration methods, and in particular to a rotation-based binocular camera calibration method. It belongs to the technical field of computer vision and can also be applied to other fields such as ocean engineering.
Background
In the coastal zone, the inversion of sea-wave parameters and nearshore water-depth topography has long been a focus of research by scholars at home and abroad, and is also a key difficulty in the field of coastal-zone observation; with the continuous development of binocular vision technology, this problem is gradually being solved. Binocular vision technology comprises image acquisition, camera calibration, image matching, three-dimensional reconstruction, image post-processing and the like. Among these, camera calibration, that is, the process of establishing the transformation between two-dimensional image coordinates and three-dimensional space coordinates, is extremely critical, and its precision has a very important influence on the subsequent image-processing results. This process usually requires valuable prior data, namely accurate ground control points, whose position selection, elevation distribution, number and measurement precision directly influence the final result.
It is well known that in coastal-zone areas, ground control points on the beach are relatively easy to measure, while accurate ground control points on the sea surface are much harder to obtain. There is therefore a need for a binocular camera calibration method that does not require sea-surface control points.
In recent years, camera calibration methods have become a research hotspot. For example, Chinese patent application No. CN202210935362.4, filed on August 5, 2022, discloses a binocular camera calibration method and system in which the intrinsic and extrinsic parameters of the binocular camera are updated, based on the Zhang Zhengyou calibration method, from the difference of the vertical coordinates of corresponding points and the difference between actual ranging data and the computed distance, so as to achieve calibration and ranging. That method can measure distance without photographing a calibration plate in the use scene, but it still needs the coordinate values of several calibration points in the scene, which is not easy to obtain in coastal-zone areas, especially on the sea surface.
In general, acquiring ground control points on the sea surface remains a difficulty in the current field of coastal-zone video observation, and related technical solutions are still lacking. How to calibrate the cameras of a coastal-zone observation system accurately and simply is a problem to be solved.
Disclosure of Invention
In order to solve the defects in the prior art, the invention provides a binocular camera calibration method.
The method comprises the following specific steps:
(1) Video camera rotation
The left and right cameras are rotated from an area where sufficient ground control points cannot be obtained to an area where a large number of accurate ground control points can conveniently be obtained, and the rotation calibration angle of each camera is recorded;
(2) Lens optical distortion correction
(3) External orientation element solution
Laying out a plurality of ground control points distributed across the image frame, solving the line elements and angle elements of each of the left and right cameras from the collinearity equation, and obtaining the rotation matrix from the angle elements;
(4) Calculating a rotation matrix after camera calibration
Subtracting the value of the rotation calibration angle from the azimuth angle among the angle elements to obtain the calibrated rotation matrices of the left and right cameras respectively;
(5) Collineation equation linearization
Substituting the calibrated rotation matrix into the collinearity equation and linearizing it into a direct linear transformation equation, obtaining the direct linear transformation coefficients of the left and right cameras respectively;
(6) Three-dimensional information reconstruction
Based on an image matching technology and an internal interpolation algorithm, three-dimensional space coordinates of any point position in the image are obtained.
The invention also provides the application of this binocular camera calibration method to the measurement of sea-wave parameters and water-depth topography in offshore areas.
Compared with existing camera calibration methods, the above technical scheme has the following beneficial effects:
(1) The difficulty of establishing ground control points on the sea surface is effectively avoided;
(2) The method operates by rotating the cameras, which effectively avoids a complex calculation process;
(3) The method lends itself to popularization and application in scenes such as lakes and rivers.
Drawings
FIG. 1 is a schematic diagram of a binocular vision system;
FIG. 2 is a schematic diagram of the basic unit composition of the system;
FIG. 3 is a schematic diagram of binocular vision determination of three-dimensional information of an image;
FIG. 4 is a technical roadmap for the implementation of the method;
Fig. 5 is a schematic diagram showing the camera imaging model of the collinear relationship and the positions of the external azimuth elements.
Detailed Description
A binocular camera calibration method according to the present invention will be described in further detail below with reference to the accompanying drawings of the embodiments; the description is illustrative and is not intended to limit the invention.
As shown in fig. 1, this embodiment discloses a binocular video observation system comprising cameras 1, a bracket 2, a distribution box 3 and a base 4. Two cameras 1 are provided; TP-LINK high-definition waterproof cameras are adopted and fixed on the bracket 2 so as to remain level. The bracket 2 can be an existing facility in the coastal-zone area, such as a watchtower, or can be built from self-selected materials and installed at a designated position with fixing tools such as expansion screws. The distribution box 3 is fixedly mounted on the bracket 2; it houses the power supply, a wireless WiFi transmitter and a wiring board, and has good waterproofing. If the bracket is self-built, the base 4 can be connected to the bracket 2 with connecting pieces such as bolts, or by welding; the base 4 must be sufficiently stable, and is fixed at a suitable position with expansion screws.
Fig. 2 is a schematic diagram of basic modules of the observation system, and mainly includes a control module, a power supply module, a wireless transmission module, a data storage module and an image acquisition module.
The high-definition waterproof cameras 1 in this embodiment are mounted on the bracket 2, powered from the distribution box 3 and networked through wireless WiFi; through the TP-LINK security-system app on a computer or mobile phone, they are remotely steered to a suitable area for real-time, long-term, continuous observation of sea-wave motion and the like.
Based on the above system, as shown in fig. 1 and fig. 4, this embodiment also discloses a binocular camera calibration method, which includes the steps of:
(1) Firstly, the observation areas of the left and right cameras are rotated from an area where sufficient ground control points cannot be obtained, such as the water surface, to an area where a large number of accurate ground control points can conveniently be obtained, and the rotation calibration angles Δα1 and Δα2 are recorded respectively.
In this embodiment, the rotation calibration angle is obtained by rotating a pan-tilt head, and the rotation angle value is measured with an angle-measuring instrument.
(2) Optical distortion correction is performed for both cameras.
In this embodiment, specifically, a checkerboard is printed, and the left and right cameras each shoot about 15-20 images of the checkerboard, whose physical square size is known, from different angles and directions, ensuring that the checkerboard appears at every position in the frame. All the images are then imported into the Camera Calibration toolbox of Matlab for correction; Matlab automatically identifies the black-white corner points of the checkerboard as calibration points, and the images judged satisfactory are selected for correction, eliminating the optical distortion of the images. The output includes the camera principal-point position, the distortion coefficients and the camera intrinsic parameter matrix, as well as the rotation matrix and translation matrix corresponding to each image.
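As a minimal illustration of the radial distortion model that such a calibration estimates, the following sketch distorts and then iteratively undistorts normalized image coordinates. The coefficient values k1, k2, the point values and the fixed-point inversion scheme are illustrative assumptions, not the toolbox's actual implementation.

```python
import numpy as np

def distort(xy, k1, k2):
    """Apply the radial distortion model x_d = x * (1 + k1*r^2 + k2*r^4)
    to normalized image coordinates xy of shape (..., 2)."""
    r2 = np.sum(xy ** 2, axis=-1, keepdims=True)
    return xy * (1.0 + k1 * r2 + k2 * r2 ** 2)

def undistort(xy_d, k1, k2, iters=20):
    """Invert the radial model by fixed-point iteration: repeatedly divide
    the distorted point by the factor evaluated at the current estimate."""
    xy = xy_d.copy()
    for _ in range(iters):
        r2 = np.sum(xy ** 2, axis=-1, keepdims=True)
        xy = xy_d / (1.0 + k1 * r2 + k2 * r2 ** 2)
    return xy

# Example: distort a few points, then recover them.
pts = np.array([[0.1, -0.2], [0.3, 0.25], [-0.15, 0.05]])
k1, k2 = -0.12, 0.03          # hypothetical coefficients
recovered = undistort(distort(pts, k1, k2), k1, k2)
print(np.max(np.abs(recovered - pts)))  # near machine precision
```

For mild distortion the fixed-point iteration converges quickly; stronger distortion would call for a Newton-type solver.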
(3) 15-25 scattered ground control points are laid out within the area covered by the image, and their three-dimensional space coordinates (X, Y, Z) are measured with surveying instruments such as a total station. Fig. 5 shows the camera imaging model expressing the collinear relation between a target point and its image point, together with the positions of the exterior orientation elements. The 6 exterior orientation elements XS, YS, ZS, α, τ, θ of each of the left and right cameras are solved from the following collinearity equations:
x = -f[a1(X-XS) + b1(Y-YS) + c1(Z-ZS)] / [a3(X-XS) + b3(Y-YS) + c3(Z-ZS)]
y = -f[a2(X-XS) + b2(Y-YS) + c2(Z-ZS)] / [a3(X-XS) + b3(Y-YS) + c3(Z-ZS)]
Wherein (x, y) are the image-plane coordinates of the image point, f is the principal distance, and XS, YS, ZS are the coordinates of the camera in the three-dimensional space coordinate system established from the ground control points; ai, bi, ci (i = 1, 2, 3) are the elements of the rotation matrix R, and are determined by the azimuth angle α, the inclination angle τ and the rotation angle θ, specifically as follows:
a1=cosαcosθ-sinαsinτsinθ
b1=-cosαsinθ-sinαsinτcosθ
c1=-sinαcosτ
a2=cosτsinθ
b2=cosτcosθ
c2=-sinτ
a3=sinαcosθ+cosαsinτsinθ
b3=-sinαsinθ+cosαsinτcosθ
c3=cosαcosτ
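Assuming the standard photogrammetric factorization R = Rφ(α)·Rω(τ)·Rκ(θ), which these element formulas follow, a short sketch can build R and confirm that it is a proper rotation. The function name and the test angles are illustrative.

```python
import numpy as np

def rotation_matrix(alpha, tau, theta):
    """Build R = [[a1,b1,c1],[a2,b2,c2],[a3,b3,c3]] from the azimuth
    angle alpha, inclination angle tau and rotation angle theta."""
    sa, ca = np.sin(alpha), np.cos(alpha)
    st, ct = np.sin(tau), np.cos(tau)
    sk, ck = np.sin(theta), np.cos(theta)
    return np.array([
        [ca*ck - sa*st*sk, -ca*sk - sa*st*ck, -sa*ct],
        [ct*sk,             ct*ck,            -st   ],
        [sa*ck + ca*st*sk, -sa*sk + ca*st*ck,  ca*ct]])

R = rotation_matrix(0.4, 0.1, -0.25)
# A valid rotation matrix is orthonormal with determinant +1:
print(np.allclose(R @ R.T, np.eye(3)), np.isclose(np.linalg.det(R), 1.0))
```

The orthonormality check is a cheap sanity test worth running on any hand-entered rotation formulas.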
(4) The values of the rotation calibration angles Δα1 and Δα2 are subtracted from the computed azimuth angles to obtain the calibrated rotation matrices of the left and right cameras respectively, whose elements are a′i, b′i, c′i:
a′1=cos(α-Δα)cosθ-sin(α-Δα)sinτsinθ
b′1=-cos(α-Δα)sinθ-sin(α-Δα)sinτcosθ
c′1=-sin(α-Δα)cosτ
a′2=cosτsinθ
b′2=cosτcosθ
c′2=-sinτ
a′3=sin(α-Δα)cosθ+cos(α-Δα)sinτsinθ
b′3=-sin(α-Δα)sinθ+cos(α-Δα)sinτcosθ
c′3=cos(α-Δα)cosτ
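Under the assumed factorization R = Rφ(α)·Rω(τ)·Rκ(θ), the azimuth is the outermost rotation, which is why simply subtracting Δα from α yields the calibrated matrix: it is exactly the inverse pan rotation applied on top of the solved matrix. The sketch below checks this identity; all names and angle values are illustrative.

```python
import numpy as np

def rotation_matrix(alpha, tau, theta):
    # Same element formulas as in step (3): R = R_phi @ R_omega @ R_kappa.
    sa, ca = np.sin(alpha), np.cos(alpha)
    st, ct = np.sin(tau), np.cos(tau)
    sk, ck = np.sin(theta), np.cos(theta)
    return np.array([
        [ca*ck - sa*st*sk, -ca*sk - sa*st*ck, -sa*ct],
        [ct*sk,             ct*ck,            -st   ],
        [sa*ck + ca*st*sk, -sa*sk + ca*st*ck,  ca*ct]])

def r_phi(phi):
    # Pure azimuth (pan) rotation: inclination and rotation angles zero.
    return rotation_matrix(phi, 0.0, 0.0)

alpha, tau, theta = 0.8, 0.12, -0.3   # exterior angles solved in step (3)
d_alpha = 0.55                        # recorded rotation calibration angle

R_solved = rotation_matrix(alpha, tau, theta)
R_cal = rotation_matrix(alpha - d_alpha, tau, theta)

# Subtracting d_alpha from the azimuth equals pre-multiplying the solved
# matrix by the inverse pan rotation:
print(np.allclose(R_cal, r_phi(-d_alpha) @ R_solved))  # True
```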
(5) The calibrated rotation matrices are substituted into the collinearity equation, which is linearized into the direct linear transformation (DLT) equation, yielding the direct linear transformation coefficients LL and LR of the left and right cameras respectively. The DLT equations are as follows:
x = (L1X + L2Y + L3Z + L4) / (L9X + L10Y + L11Z + 1)
y = (L5X + L6Y + L7Z + L8) / (L9X + L10Y + L11Z + 1)
where L1-L11 are the direct linear transformation (DLT) coefficients; two sets of coefficients are obtained in total, one for each of the left and right cameras.
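As an illustrative sketch (not the patent's implementation), the 11 DLT coefficients can be fitted from known control points by linear least squares, since each point contributes two equations that are linear in L1-L11. The synthetic camera parameters below are assumptions used only to verify the fit.

```python
import numpy as np

def solve_dlt(obj_pts, img_pts):
    """Fit the 11 DLT coefficients from >= 6 control points by stacking,
    per point, the two linear equations obtained by clearing the DLT
    denominator (L9*X + L10*Y + L11*Z + 1)."""
    A, b = [], []
    for (X, Y, Z), (x, y) in zip(obj_pts, img_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x*X, -x*Y, -x*Z])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y*X, -y*Y, -y*Z])
        b += [x, y]
    L, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return L

def project(L, pt):
    """Project a 3-D point through the fitted DLT coefficients."""
    X, Y, Z = pt
    w = L[8]*X + L[9]*Y + L[10]*Z + 1.0
    return np.array([(L[0]*X + L[1]*Y + L[2]*Z + L[3]) / w,
                     (L[4]*X + L[5]*Y + L[6]*Z + L[7]) / w])

# Synthetic check with a hypothetical pinhole camera K[R|t]:
rng = np.random.default_rng(0)
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
t = np.array([0.0, 0.0, 10.0])
obj = rng.uniform(-2, 2, size=(15, 3))          # non-coplanar points
uvw = (K @ (obj.T + t[:, None])).T
img = uvw[:, :2] / uvw[:, 2:]
L = solve_dlt(obj, img)
err = max(np.linalg.norm(project(L, p) - q) for p, q in zip(obj, img))
print(err)  # essentially zero for noise-free data
```

Note that the control points must not be coplanar, or the linear system becomes rank-deficient.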
(6) And finally, based on an image matching technology and an internal interpolation algorithm, obtaining the three-dimensional space coordinates of any point position in the image. Specifically:
Firstly, matching left and right images to obtain a series of matching point pairs;
Secondly, computing the three-dimensional coordinates of the space points from the matched point pairs and the DLT coefficients obtained for the left and right images, yielding coordinate point-cloud data;
And finally, based on an internal interpolation algorithm, obtaining the three-dimensional space coordinates of any point position in the image.
The reconstruction principle in this embodiment is as follows. For a matched pair of image points, the DLT equations of each camera can be rearranged into two equations linear in the unknown space coordinates, giving four equations in total, of the form:
(L1 - xL·L9)X + (L2 - xL·L10)Y + (L3 - xL·L11)Z = xL - L4 (and similarly for yL, xR, yR)
Wherein (xL, yL) and (xR, yR) are the pixel coordinates in the left and right images, each used with its own camera's coefficients L1-L11, and (X, Y, Z) are the coordinates of the three-dimensional space point, solved from the four equations by least squares. Establishing the accurate correspondence between (xL, yL), (xR, yR) and the same three-dimensional space point (X, Y, Z) is the most complex and important step in the binocular vision technique.
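Per matched point pair, the left and right DLT equations thus reduce to a small least-squares problem in (X, Y, Z). The sketch below triangulates a synthetic point through two hypothetical DLT cameras; all parameter values are assumptions for illustration.

```python
import numpy as np

def dlt_from_pose(K, R, t):
    """Pack a pinhole camera K[R|t] into 11 DLT coefficients
    (valid whenever the normalizing entry P[2,3] is nonzero)."""
    P = K @ np.hstack([R, t[:, None]])
    P = P / P[2, 3]
    return np.concatenate([P[0], P[1], P[2, :3]])

def project(L, pt):
    X, Y, Z = pt
    w = L[8]*X + L[9]*Y + L[10]*Z + 1.0
    return np.array([(L[0]*X + L[1]*Y + L[2]*Z + L[3]) / w,
                     (L[4]*X + L[5]*Y + L[6]*Z + L[7]) / w])

def triangulate(L_left, L_right, xy_left, xy_right):
    """Least-squares solution of the four linear equations that the two
    cameras' DLT equations give for one matched point pair."""
    rows, rhs = [], []
    for L, (x, y) in ((L_left, xy_left), (L_right, xy_right)):
        rows.append([L[0] - x*L[8], L[1] - x*L[9], L[2] - x*L[10]])
        rows.append([L[4] - y*L[8], L[5] - y*L[9], L[6] - y*L[10]])
        rhs += [x - L[3], y - L[7]]
    XYZ, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return XYZ

# Two hypothetical cameras with a 1 m baseline along X:
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
L_l = dlt_from_pose(K, np.eye(3), np.array([0.0, 0.0, 10.0]))
L_r = dlt_from_pose(K, np.eye(3), np.array([-1.0, 0.0, 10.0]))
P_true = np.array([0.5, -0.3, 2.0])
P_rec = triangulate(L_l, L_r, project(L_l, P_true), project(L_r, P_true))
print(np.allclose(P_rec, P_true))  # True for exact matches
```

With noisy matches, the same 4x3 least-squares system simply returns the best-fit point instead of the exact one.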
A schematic diagram of binocular vision determination of three-dimensional information of an image is shown in fig. 3.
Specifically, the optical distortion correction of the lens is mainly radial distortion correction.
Specifically, the theoretical basis of the binocular camera calibration method is a collineation equation.
Specifically, the binocular camera calibration method is mainly accomplished by rotating the camera view to an area where a large number of precise ground control points are easily acquired.
Specifically, both cameras need to establish a conversion relationship between two-dimensional image points and three-dimensional space points based on laid ground control points scattered in the image.
Specifically, the image matching algorithm may be a traditional matching algorithm based on image feature extraction, which solves for corresponding points through the detection, description and matching of feature points, i.e., known points. A deep-learning-based image matching algorithm may also be applied: feature detection is avoided, and a Transformer with self-attention and cross-attention layers is used to obtain the feature descriptors of the two images, so that dense matches can be generated even in regions with little texture. Since the matching algorithm must suit sea waves in motion, the deep-learning-based LoFTR matching algorithm is used in particular.
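The embodiment prefers LoFTR; purely as a minimal, classical stand-in for area-based matching (not the patent's method), the sketch below finds a horizontal correspondence by zero-mean normalized cross-correlation on synthetic data. All sizes and the test displacement are made-up values.

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_patch(left, right, row, col, size=9, search=15):
    """Find the column in `right` whose patch best matches the patch at
    (row, col) in `left`, scanning +/- `search` pixels along the same row
    (a horizontal search, as for near-rectified stereo pairs)."""
    h = size // 2
    tpl = left[row-h:row+h+1, col-h:col+h+1]
    best, best_c = -2.0, col
    for c in range(max(h, col - search), min(right.shape[1] - h, col + search + 1)):
        score = ncc(tpl, right[row-h:row+h+1, c-h:c+h+1])
        if score > best:
            best, best_c = score, c
    return best_c

rng = np.random.default_rng(1)
left = rng.random((60, 80))
shift = 6                         # simulate a 6-pixel horizontal disparity
right = np.roll(left, -shift, axis=1)
print(40 - match_patch(left, right, 30, 40))  # 6, the simulated disparity
```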
Specifically, the binocular vision technique calculates the positional deviation (parallax) between image pixels based on the principle of triangulation in order to acquire the three-dimensional information of the target point.
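In the special case of rectified cameras, this triangulation reduces to the familiar depth-from-disparity relation; the focal length and baseline values below are hypothetical, and the patent's general DLT setup does not require rectification.

```python
# Rectified-stereo special case: Z = f * B / d, where f is the focal
# length in pixels, B the baseline in metres and d = xL - xR the
# parallax (disparity) in pixels.
def depth_from_disparity(f_px, baseline_m, disparity_px):
    return f_px * baseline_m / disparity_px

# Hypothetical values: f = 800 px, B = 0.5 m, d = 20 px.
print(depth_from_disparity(800.0, 0.5, 20.0))  # 20.0 (metres)
```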
The embodiments in this specification are described in terms of hardware plus software: the hardware aspect is mainly the binocular vision observation system composed of two cameras, and the software aspect is mainly the rotation calibration technique of the cameras; the embodiment is thus a preferred implementation of the invention.
In summary, in this embodiment the cameras are rotated toward the beach, where a large number of accurate control points can be obtained, and the rotation angles of the left and right cameras are recorded respectively; after the optical distortion of the lenses is removed, the calibration parameters before rotation of the left and right cameras are solved from the collinearity equation, and the three-dimensional information of the sea waves can then be reconstructed through image matching and internal interpolation. This calibration method well solves the difficulty of placing ground control points on the sea surface and helps improve the inversion accuracy of sea-wave parameters.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to the embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (6)

1. The binocular camera calibration method is characterized by comprising the following steps of:
(1) Video camera rotation
The left and right cameras are rotated from an area where sufficient ground control points cannot be obtained to an area where a large number of accurate ground control points can conveniently be obtained, and the rotation calibration angle of each camera is recorded;
(2) Lens optical distortion correction
(3) External orientation element solution
Laying out a plurality of ground control points distributed across the image frame, solving the line elements and angle elements of each of the left and right cameras from the collinearity equation, and obtaining the rotation matrix from the angle elements;
(4) Calculating a rotation matrix after camera calibration
Subtracting the value of the rotation calibration angle from the azimuth angle among the angle elements to obtain the calibrated rotation matrices of the left and right cameras respectively;
(5) Collineation equation linearization
Substituting the calibrated rotation matrix into the collinearity equation and linearizing it into a direct linear transformation equation, obtaining the direct linear transformation coefficients of the left and right cameras respectively;
(6) Three-dimensional information reconstruction
Based on an image matching technology and an internal interpolation algorithm, three-dimensional space coordinates of any point position in the image are obtained.
2. The binocular camera calibration method of claim 1, wherein: the rotation calibration angle is obtained by rotating a cradle head, and the rotation angle value is measured by an angle measuring instrument.
3. The binocular camera calibration method of claim 1, wherein: the optical distortion correction of the lens is mainly radial distortion correction.
4. The binocular camera calibration method of claim 1, wherein: the step (3) is specifically as follows:
Firstly, laying out 15-25 scattered ground control points within the area covered by the image, and measuring the three-dimensional space coordinates (X, Y, Z) of the ground control points with a total station;
then, solving the line elements XS, YS, ZS and the angle elements τ, θ, α of each of the left and right cameras from the collinearity equation.
5. The binocular camera calibration method of claim 4, wherein: the rotation matrix described in step (3) consists of ai, bi, ci, where i = 1, 2, 3, calculated as follows:
a1=cosαcosθ-sinαsinτsinθ
b1=-cosαsinθ-sinαsinτcosθ
c1=-sinαcosτ
a2=cosτsinθ
b2=cosτcosθ
c2=-sinτ
a3=sinαcosθ+cosαsinτsinθ
b3=-sinαsinθ+cosαsinτcosθ
c3=cosαcosτ
where α is the azimuth angle, τ is the inclination angle, and θ is the rotation angle.
6. A binocular camera calibration method according to any one of claims 1 to 5, characterized in that: the step (6) is specifically as follows:
Firstly, matching left and right images to obtain a series of matching point pairs;
secondly, computing the three-dimensional coordinates of the space points from the matched point pairs and the direct linear transformation coefficients obtained for the left and right images, yielding coordinate point-cloud data;
And finally, based on an internal interpolation algorithm, obtaining the three-dimensional space coordinates of any point position in the image.
CN202311077325.5A 2023-08-25 2023-08-25 Binocular camera calibration method Active CN117036506B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311077325.5A CN117036506B (en) 2023-08-25 2023-08-25 Binocular camera calibration method


Publications (2)

Publication Number Publication Date
CN117036506A CN117036506A (en) 2023-11-10
CN117036506B true CN117036506B (en) 2024-05-10

Family

ID=88628070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311077325.5A Active CN117036506B (en) 2023-08-25 2023-08-25 Binocular camera calibration method

Country Status (1)

Country Link
CN (1) CN117036506B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015087315A1 (en) * 2013-12-10 2015-06-18 L.M.Y. Research & Development Ltd. Methods and systems for remotely guiding a camera for self-taken photographs
CN110969668A (en) * 2019-11-22 2020-04-07 大连理工大学 Stereoscopic calibration algorithm of long-focus binocular camera
CN112541951A (en) * 2020-11-13 2021-03-23 国网浙江省电力有限公司舟山供电公司 Monitoring system and monitoring method for preventing ship from hooking off cross-sea overhead power line
CN113129385A (en) * 2020-12-23 2021-07-16 合肥工业大学 Binocular camera internal and external parameter calibration method based on multi-coding plane target in space
CN115797461A (en) * 2022-11-11 2023-03-14 中国消防救援学院 Flame space positioning system calibration and correction method based on binocular vision


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GPS binocular camera calibration and spatial coordinate reconstruction; Kong Xiaofang et al.; Optics and Precision Engineering; 2017-02-28; Vol. 25, No. 2; pp. 486-492 *

Also Published As

Publication number Publication date
CN117036506A (en) 2023-11-10

Similar Documents

Publication Publication Date Title
CN102364299B (en) Calibration technology for multiple structured light projected three-dimensional profile measuring heads
CN110021046A (en) The external parameters calibration method and system of camera and laser radar combination sensor
CN107358631B (en) Binocular vision reconstruction method considering three-dimensional distortion
CN105157566B (en) The method of 3 D stereo colour point clouds scanning
CN102032878B (en) Accurate on-line measurement method based on binocular stereo vision measurement system
CN108646259A (en) A kind of three-dimensional laser scanner, which is set, stands firm to device and method
CN107886547B (en) Fisheye camera calibration method and system
CN106204731A (en) A kind of multi-view angle three-dimensional method for reconstructing based on Binocular Stereo Vision System
CN110645917B (en) Array camera-based high-spatial-resolution three-dimensional digital image measuring method
CN106123798B (en) A kind of digital photography laser scanning device
CN109297436B (en) Binocular line laser stereo measurement reference calibration method
CN109099883A (en) The big visual field machine vision metrology of high-precision and caliberating device and method
CN105066962B (en) A kind of high-precision photogrammetric apparatus of the big angle of visual field of multiresolution
CN105157602A (en) Remote three-dimensional scanning system and method based on machine vision
CN110146030A (en) Side slope surface DEFORMATION MONITORING SYSTEM and method based on gridiron pattern notation
CN109141226A (en) The spatial point coordinate measuring method of one camera multi-angle
CN110378969A (en) A kind of convergence type binocular camera scaling method based on 3D geometrical constraint
CN116051659B (en) Linear array camera and 2D laser scanner combined calibration method
CN113724337B (en) Camera dynamic external parameter calibration method and device without depending on tripod head angle
CN112365545B (en) Calibration method of laser radar and visible light camera based on large-plane composite target
CN112212788A (en) Visual space point three-dimensional coordinate measuring method based on multiple mobile phones
CN106500625A (en) A kind of telecentricity stereo vision measuring apparatus and its method for being applied to the measurement of object dimensional pattern micron accuracies
CN106780632A (en) A kind of demarcation target group demarcated target and demarcated for four road fish eye lenses
CN103561257A (en) Interference-free light-encoded depth extraction method based on depth reference planes
CN113450416B (en) TCSC method applied to three-dimensional calibration of three-dimensional camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant