CN111415391B - External azimuth parameter calibration method for multi-camera by adopting mutual shooting method - Google Patents


Info

Publication number
CN111415391B
CN111415391B
Authority
CN
China
Prior art keywords
camera
measuring
target
auxiliary
external
Prior art date
Legal status
Active
Application number
CN202010131077.8A
Other languages
Chinese (zh)
Other versions
CN111415391A
Inventor
吴军
李泽川
郭润夏
徐鋆
李雁玲
李鑫
Current Assignee
Civil Aviation University of China
Original Assignee
Civil Aviation University of China
Priority date
Filing date
Publication date
Application filed by Civil Aviation University of China
Priority to CN202010131077.8A
Publication of CN111415391A
Application granted
Publication of CN111415391B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method for calibrating the external azimuth parameters of multiple cameras by a mutual shooting method. The method comprises the steps of: establishing a calibration system for the multi-camera space measurement system; calibrating the intrinsic parameters of each monocular camera; combining the auxiliary camera with each measuring camera to calibrate the external parameters of the binocular vision systems; calculating the pose of the target in the auxiliary camera coordinate system; calculating the pose of the target in its own measuring camera coordinate system; calibrating the external parameters by mutual shooting; and rotating the measuring cameras to the working posture and measuring. The method can quickly and accurately calibrate the external parameters between the measuring cameras in a multi-camera measurement system, and can flexibly adapt to various working environments, so that measurement can be carried out efficiently. In addition, rapid calibration of the inter-camera external parameters provides accurate parameters for subsequent measurement, improving working efficiency and precision while reducing measurement cost.

Description

External azimuth parameter calibration method for multi-camera by adopting mutual shooting method
Technical Field
The invention belongs to the technical field of machine vision three-dimensional space coordinate measurement, and particularly relates to a method for calibrating external azimuth parameters of a multi-camera by adopting a mutual shooting method.
Background
With the continuous development of fields such as machine manufacturing, aerospace and robotics in China, and in particular the advance of domestic large-aircraft production technology, the requirements on spatial measurement accuracy in manufacturing and assembly keep rising. The accuracy of the measurement technique directly determines the accuracy of industrial manufacturing and assembly. The traditional three-coordinate measuring machine is bulky and computationally inefficient, and traditional measuring methods often need to contact the measured object, easily damaging it; they can no longer meet the accuracy and efficiency requirements that modern industry places on a measurement system.
With improving hardware, computer calculation speed has also risen rapidly, allowing computer vision and photogrammetry techniques to be widely applied across industries, greatly improving the working efficiency and precision of measurement systems while reducing labor and time costs. In the field of computer-vision three-coordinate spatial measurement, multi-camera vision measurement systems are widely applied by virtue of advantages such as high measurement precision, strong adaptability, high measurement efficiency and low cost. Compared with the traditional three-coordinate measuring method, they offer high precision, no damage to the measured object, and other advantages. It is therefore of great practical importance to study multi-camera vision measurement systems.
In a multi-camera measurement system, the measurement method mainly adopted is triangulation, in which the calibration accuracy of the cameras' internal and external parameters directly influences the measurement accuracy of the space coordinates; the internal parameters of a camera can be calibrated accurately with mature tools such as Zhang's calibration method. However, as the working scene changes, the parameters between the cameras in the system change, and the conventional binocular-vision external parameter calibration method cannot calibrate by means of a checkerboard because of the limitations of the working environment. Research on a quick, accurate and flexible camera external parameter calibration method is therefore a key problem in the development of multi-camera spatial three-coordinate measurement systems.
Disclosure of Invention
In order to solve the problems, the invention aims to provide a calibration method for external azimuth parameters of a multi-camera by adopting a mutual shooting method.
In order to achieve the above object, the method for calibrating external azimuth parameters of a multi-camera by using a mutual shooting method provided by the invention comprises the following steps in sequence:
step 1) establishing a calibration system of the multi-camera space measurement system: the system comprises a first measuring camera, a second measuring camera, a target, an auxiliary camera and a precise rotating platform; the lower ends of the first measuring camera, the second measuring camera and the auxiliary camera are respectively provided with a precise rotary table, and the three precise rotary tables are arranged in a triangle; a target is respectively arranged on the first measuring camera and the second measuring camera;
step 2) calibrating parameters in the monocular camera: respectively establishing mathematical models of a first measuring camera, a second measuring camera and an auxiliary camera according to the small-hole imaging model; calibrating an internal reference matrix and distortion parameters of the first measuring camera, the second measuring camera and the auxiliary camera by means of a checkerboard calibration plate according to a homography matrix mapping principle and a nonlinear optimization principle;
step 3) auxiliary cameras are combined with a measurement camera to calibrate external parameters of the binocular vision system: the first measuring camera and the second measuring camera respectively form a binocular vision system with the auxiliary camera, and then the external parameter matrixes of the two binocular vision systems are calibrated according to the binocular vision external parameter calibration principle;
step 4) calculating the pose of the target under an auxiliary camera coordinate system: rotating the auxiliary camera, and obtaining an external parameter matrix between the rotated auxiliary camera and the first or second measuring camera by using the external parameter matrix of the calibrated binocular vision system: shooting a target by using an auxiliary camera, and calculating the accurate pose of the target under an auxiliary camera coordinate system;
step 5) calculating the pose of the target under the self-measuring camera coordinate system: calculating the pose of the target under the coordinate system of the self-measuring camera through simple coordinate conversion by utilizing the external parameter matrix between the rotated auxiliary camera and the first measuring camera or the second measuring camera obtained in the step 4) and the calculated pose of the target under the coordinate system of the auxiliary camera;
and 6) calibrating external parameters of mutual shooting: shooting targets on the opposite side measuring cameras by using the two measuring cameras, calculating the pose of the opposite side targets, and calculating the current external parameters of the two measuring cameras through conversion;
and 7) rotating the measuring camera to a working posture by using the precision rotating table and measuring.
In the step 2), the parameter calibration in the monocular camera is realized by a calibration tool box in Matlab or a calibration function in OpenCV.
In step 3), the method for calibrating the external parameters of the binocular vision system by the auxiliary camera combined measurement camera comprises the following steps: firstly, a binocular vision system is formed by a first measuring camera and a second measuring camera and an auxiliary camera respectively, then, the first measuring camera and the auxiliary camera as well as the second measuring camera and the auxiliary camera are utilized to shoot the checkerboard calibration plate at the same time, space feature point matching is carried out after shooting, an essential matrix is calculated after paired space feature points are obtained, and a rotation matrix R and a translation vector T in an external parameter matrix can be decomposed through the essential matrix.
In step 4), the method for calculating the pose of the target under the auxiliary camera coordinate system is as follows: the method comprises the steps of rotating an auxiliary camera to be opposite to a target on a first measuring camera or a second measuring camera by using a precise rotating table, then imaging the target by using the auxiliary camera, obtaining a pixel coordinate of the target after obtaining an image, wherein the world coordinate of the target is known, so that the pose of the target under an auxiliary camera coordinate system can be obtained by using the association between the world coordinate of the target and the pixel coordinate.
In step 5), the method for calculating the pose of the target under the self-measurement camera coordinate system is as follows: multiplying the external parameter matrix between the rotated auxiliary camera obtained in the step 4) and the first measuring camera or the second measuring camera by the calculated pose of the target under the auxiliary camera coordinate system, namely, the coordinate of the target under the self measuring camera coordinate system can be obtained.
In step 6), the method for calibrating the external parameters of mutual shooting is as follows: the pose of the target under the coordinate system of the self-measuring camera is calculated through the step 5), the target on the second measuring camera is shot by utilizing the first measuring camera, or the target on the first measuring camera is shot by utilizing the second measuring camera, then the pose of the opposite target is calculated by utilizing a PnP algorithm, and the current external parameters of the two measuring cameras are calculated through conversion.
In step 7), the method for rotating the measuring camera to the working posture and measuring by using the precision rotating table is as follows: before the measurement work is carried out, the first measurement camera and the second measurement camera are rotated to an angle opposite to the measured object; the rotation angles of the first measuring camera and the second measuring camera from the calibration posture to the working posture are recorded by using the precise rotating table, and the external parameters of the two measuring cameras in the working posture are obtained through calculation, so that the measuring work can be carried out.
The external azimuth parameter calibration method for multiple cameras by the mutual shooting method can quickly and accurately calibrate the external parameters between the measuring cameras in a multi-camera measurement system, and can flexibly adapt to various working environments, so that measurement can be carried out efficiently. In addition, rapid calibration of the inter-camera external parameters provides accurate parameters for subsequent measurement, improving working efficiency and precision while reducing measurement cost.
Drawings
FIG. 1 is a flow chart of a method for calibrating external azimuth parameters of a multi-camera by using a mutual shooting method.
FIG. 2 is a schematic diagram of the external parameter calibration process of the binocular vision system in the present invention.
Fig. 3 is a schematic diagram of coordinate transformation in the present invention.
Detailed Description
The external azimuth parameter calibration method of the multi-camera adopting the mutual shooting method provided by the invention is described in detail below with reference to the accompanying drawings and specific embodiments. The drawings are for reference and description only and are not intended to limit the scope of the invention.
As shown in fig. 1, 2 and 3, the method for calibrating external azimuth parameters of a multi-camera by adopting a mutual shooting method provided by the invention comprises the following steps in sequence:
step 1) establishing a calibration system of the multi-camera space measurement system: the system comprises a first measuring camera 1, a second measuring camera 2, a target L, an auxiliary camera 3 and a precision rotating table 4; wherein, the lower ends of the first measuring camera 1, the second measuring camera 2 and the auxiliary camera 3 are respectively provided with a precise rotary table 4, and the three precise rotary tables 4 are arranged in a triangle; a target L is respectively arranged on the first measuring camera 1 and the second measuring camera 2; the first measuring camera 1 and the second measuring camera 2 are image acquisition devices; the target L is a mutual shooting calibration auxiliary tool for realizing rapid external parameter calibration before measurement work; the auxiliary camera 3 is used to calculate the coordinates of each target L in the measurement camera coordinate system; the precision rotary table 4 is used for rotating the camera thereon to a proper measuring angle;
step 2) calibrating parameters in the monocular camera: respectively establishing mathematical models of the first measuring camera 1, the second measuring camera 2 and the auxiliary camera 3 according to the small-hole imaging model; calibrating an internal reference matrix and distortion parameters of the first measuring camera 1, the second measuring camera 2 and the auxiliary camera 3 by means of a checkerboard calibration plate according to a homography matrix mapping principle and a nonlinear optimization principle;
and the parameter calibration in the monocular camera is realized by a calibration tool box in Matlab or a calibration function in OpenCV.
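The intrinsic parameters calibrated in this step enter all later computations through the pinhole (small-hole) imaging model. As a minimal sketch (not part of the patent; the intrinsic matrix K below is hypothetical), the following shows how the internal reference matrix maps a camera-frame 3-D point to pixel coordinates:

```python
def project_point(K, point_cam):
    """Pinhole (small-hole) model: s * [u, v, 1]^T = K * [X, Y, Z]^T,
    so u, v are obtained by dividing through by the depth Z."""
    X, Y, Z = point_cam
    fx, skew, cx = K[0]
    fy, cy = K[1][1], K[1][2]
    u = (fx * X + skew * Y) / Z + cx
    v = fy * Y / Z + cy
    return u, v

# Hypothetical intrinsic matrix (focal lengths in pixels, principal point at image centre)
K = [[1000.0, 0.0, 640.0],
     [0.0, 1000.0, 480.0],
     [0.0, 0.0, 1.0]]

# A point 2 m in front of the camera projects near the principal point
print(project_point(K, (0.1, -0.05, 2.0)))  # (690.0, 455.0)
```

In practice this model is fitted, together with the distortion parameters, by the Matlab calibration toolbox or OpenCV calibration functions mentioned above.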
Step 3) auxiliary cameras are combined with a measurement camera to calibrate external parameters of the binocular vision system: the first measuring camera 1, the second measuring camera 2 and the auxiliary camera 3 respectively form a binocular vision system, and then the external parameter matrixes of the two binocular vision systems are calibrated according to the binocular vision external parameter calibration principle;
FIG. 2 is a schematic diagram of the external parameter calibration process of the binocular vision system in the present invention. As shown in FIG. 2, $O_L X_L Y_L Z_L$ is the measuring camera coordinate system, with imaging plane coordinate system $o_l x_l y_l$ and optical axis direction $Z_L$; similarly, $O_R X_R Y_R Z_R$ is the auxiliary camera coordinate system, with imaging plane coordinate system $o_R x_R y_R$ and optical axis direction $Z_R$. $P_L$ and $P_R$ are the pixel coordinates of a space feature point at its imaging points on the measuring-camera and auxiliary-camera image planes, and the intersection point $P_W$ of the two rays in the figure is the coordinate of the space feature point in the world coordinate system $X_W Y_W Z_W$. The coordinate conversion relation between any two coordinate systems is:
$$
\begin{bmatrix} X_L \\ Y_L \\ Z_L \\ 1 \end{bmatrix}
=
\begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix}
\begin{bmatrix} X_R \\ Y_R \\ Z_R \\ 1 \end{bmatrix}
$$

where

$$
R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}
$$

represents the rotation matrix from the coordinate system $O_R X_R Y_R Z_R$ to the coordinate system $O_L X_L Y_L Z_L$, and $T = (t_1\ t_2\ t_3)^T$ represents the translation vector corresponding to the rotation matrix $R$.
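The homogeneous transform described here can be applied directly in code. A minimal sketch (illustrative values, not from the patent) of $p_L = R\,p_R + T$:

```python
def transform(R, T, p):
    """Apply the rigid transform of the text: p_L = R @ p_R + T
    (the homogeneous form [R T; 0 1] acting on [X, Y, Z, 1]^T)."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + T[i] for i in range(3))

# Identity rotation with a pure translation as a check
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
T = [10.0, 0.0, -5.0]
print(transform(R, T, (1.0, 2.0, 3.0)))  # (11.0, 2.0, -2.0)
```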
Taking the measuring camera coordinate system $O_1 X_1 Y_1 Z_1$ and the auxiliary camera coordinate system $O_3 X_3 Y_3 Z_3$ in FIG. 3 as an example, the coordinate conversion relation is:

$$
\begin{bmatrix} X_3 \\ Y_3 \\ Z_3 \\ 1 \end{bmatrix}
=
\begin{bmatrix} R & T \\ 0^T & 1 \end{bmatrix}
\begin{bmatrix} X_1 \\ Y_1 \\ Z_1 \\ 1 \end{bmatrix}
$$

where $R = (r_{ij})_{3\times 3}$ represents the rotation matrix from the coordinate system $O_1 X_1 Y_1 Z_1$ to the coordinate system $O_3 X_3 Y_3 Z_3$, and $T = (t_1\ t_2\ t_3)^T$ represents the translation vector corresponding to the rotation matrix $R$.
The external parameter calibration of the binocular vision system is the process of solving the rotation matrix $R$ and the translation vector $T$. The external parameter matrices of the two binocular vision systems are $R_{13}, T_{13}$ and $R_{23}, T_{23}$ in FIG. 3.
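Step 3) recovers $R$ and $T$ by decomposing the essential matrix computed from matched feature points. A minimal sketch (synthetic pose and point, not from the patent) of the underlying relation $E = [T]_\times R$ and the epipolar constraint it satisfies:

```python
import math

def skew(t):
    """Cross-product matrix [t]_x, so that skew(t) applied to v gives t x v."""
    return [[0.0, -t[2], t[1]],
            [t[2], 0.0, -t[0]],
            [-t[1], t[0], 0.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

# Synthetic relative pose between the two cameras (10-degree yaw plus a baseline)
th = math.radians(10.0)
R = [[math.cos(th), -math.sin(th), 0.0],
     [math.sin(th), math.cos(th), 0.0],
     [0.0, 0.0, 1.0]]
T = [0.5, 0.0, 0.1]

E = matmul(skew(T), R)  # essential matrix E = [T]_x R

p1 = [0.2, -0.1, 2.0]  # a space point in camera-1 coordinates
p2 = [sum(R[i][j] * p1[j] for j in range(3)) + T[i] for i in range(3)]  # same point in camera 2

Ep1 = matvec(E, p1)
residual = sum(p2[i] * Ep1[i] for i in range(3))  # epipolar constraint: p2^T E p1 = 0
print(abs(residual) < 1e-12)  # True
```

In the calibration itself the direction is reversed: $E$ is estimated from paired feature points and then decomposed into $R$ and $T$ (e.g. via SVD, as OpenCV's essential-matrix routines do).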
Step 4) calculating the pose of the target under an auxiliary camera coordinate system: rotating the auxiliary camera, and obtaining an external parameter matrix between the rotated auxiliary camera and the first or second measuring camera by using the external parameter matrix of the calibrated binocular vision system: shooting a target by using an auxiliary camera, and calculating the accurate pose of the target under an auxiliary camera coordinate system;
In step 3), after the rotation matrix $R$ and the translation vector $T$ are obtained, the auxiliary camera 3 is rotated by a certain angle using the precision rotary table 4 so that it can shoot the target L fixed on the first measuring camera 1 or the second measuring camera 2. The rotation of the auxiliary camera 3 is performed around the Z axis of the measuring camera coordinate system, and the relation between the rotation angle $\theta$ and the rotation matrix $R_z(\theta)$ is:

$$
R_z(\theta) = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}
$$
the rotation matrix R marked in the step 3) is rotated by the precision rotating table 4 together with the translation vector T and the rotation matrix R obtained by rotating the precision rotating table z And (theta) multiplying to obtain the external parameter matrix between the current auxiliary camera 3 and the first measuring camera 1 or the second measuring camera 2 after rotating.
Rotating the precise rotating table 4 under the auxiliary camera 3 until the auxiliary camera 3 can shoot the target L on the first measuring camera 1 or the second measuring camera 2, shooting the target L, calculating the pose of the target L under the auxiliary camera coordinate system according to a PnP algorithm, and optimizing the re-projection error of the characteristic points on the target by using a nonlinear optimization algorithm so as to obtain the precise pose of the target L under the auxiliary camera coordinate system;
because the world coordinates of the feature points on the target L are known, the pixel coordinates on the imaging plane can be obtained from the image. Shooting the target L by using the auxiliary camera 3 is to prepare for solving the pose of the target L under the coordinate system of the self-measuring camera;
the principle of the PnP algorithm is as follows:
PnP (Perspective-n-Point) is an algorithm for solving the camera pose from 3D-to-2D point correspondences; it describes how the camera pose is solved when there are N pairs of matched space and image points.
The target L has N three-dimensional space feature points P, which have N projection points p on the imaging plane; the pose $R$, $T$ of the target L is to be solved, with its Lie-algebra representation denoted $\xi$. Let the coordinates of a space feature point on the target L be $P_i = [X_i\ Y_i\ Z_i]^T$ and the corresponding pixel coordinate on the imaging plane be $U_i = [u_i\ v_i]^T$. The relation between the pixel position and the space feature point is:

$$
s_i \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = K \exp(\xi^{\wedge}) \begin{bmatrix} X_i \\ Y_i \\ Z_i \\ 1 \end{bmatrix}
$$

where $K$ is the camera internal reference matrix calibrated in step 2) and $\xi$ is the pose of the target L expressed as a Lie algebra element; in matrix form this is $s_i U_i = K \exp(\xi^{\wedge}) P_i$. Because of factors such as the pose of the target L and imaging noise, the equation contains errors; summing all the errors yields a nonlinear least-squares problem, and iterative solution gives the accurate pose:

$$
\xi^{*} = \arg\min_{\xi} \frac{1}{2} \sum_{i=1}^{N} \left\| U_i - \frac{1}{s_i} K \exp(\xi^{\wedge}) P_i \right\|_2^2
$$
the error term in the PnP problem is an error obtained by comparing the observed pixel coordinates with the position of the 3D point projected onto the imaging plane according to the currently estimated pose, and is called a reprojection error. There are many methods for solving the nonlinear least squares problem, such as first-order steepest descent, second-order gaussian newton, and linear optimization algorithms including levenberg marquark.
Step 5) calculating the pose of the target under the self-measuring camera coordinate system: calculating the pose of the target under the coordinate system of the self-measuring camera through simple coordinate conversion by utilizing the external parameter matrix between the rotated auxiliary camera and the first measuring camera or the second measuring camera obtained in the step 4) and the calculated pose of the target under the coordinate system of the auxiliary camera;
because the target L is fixedly arranged on the first measuring camera 1 and the second measuring camera 2, i.e. the target L is under the coordinate system of the measuring cameraThe pose of the calculation target is unchanged, and a schematic diagram of the pose of the calculation target under the self-measurement camera coordinate system is shown in fig. 3. The external parameter matrix R of two binocular vision systems can be known through the external parameter calibration of the step 3) 13 T 13 And R is 23 T 23 Is known and the external matrix and target L between the rotated auxiliary camera 3 and the first measuring camera 1 or the second measuring camera 2 have been obtained in step 4) in the auxiliary camera coordinate system O 3 X 3 Y 3 Z 3 The exact pose of the first measuring camera 1 can be determined by simple coordinate transformation 1 X 1 Y 1 Z 1 Measurement camera coordinate system and target L on second measurement camera 2 at O 2 X 2 Y 2 Z 2 And measuring the pose under the coordinate system of the camera.
And 6) calibrating external parameters of mutual shooting: shooting targets on the opposite side measuring cameras by using the two measuring cameras, calculating the pose of the opposite side targets, and calculating the current external parameters of the two measuring cameras through conversion;
the pose of the target L under the coordinate system of the self-measuring camera is calculated through the step 5), the target L on the second measuring camera 2 is shot by using the first measuring camera 1, or the target L on the first measuring camera 1 is shot by using the second measuring camera 2, then the pose of the opposite target L is calculated by using a PnP algorithm, and the current external parameters of the two measuring cameras can be calculated through conversion;
step 7) rotating the measuring camera to a working posture by using a precision rotating table and measuring:
In step 6) the first measuring camera 1 and the second measuring camera 2 face each other, but both measuring cameras need to face the measured object when measurement is performed; therefore both measuring cameras must be rotated to an angle facing the measured object before measuring. The rotation angles of the first measuring camera 1 and the second measuring camera 2 from the calibration posture to the working posture are recorded by the precision rotary tables 4, and the external parameters of the two measuring cameras in the working posture are obtained by calculation, so that measurement can be carried out.
After rotation, the rotation matrices in the external parameters of the two measuring cameras change again; the rotation angle of each precision rotary table 4 must be converted into a rotation matrix and the external parameter matrix updated. The relation between rotation angle and rotation matrix is as follows:
if the rotation angle is θ and the rotation axes are X, Y and Z axes, respectively, the rotation matrices are:
$$
R_x(\theta) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{bmatrix},\qquad
R_y(\theta) = \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix},\qquad
R_z(\theta) = \begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}
$$
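The three axis rotation matrices can be written directly in code; a small self-check (not from the patent) verifies the orthonormality property $R^T R = I$ that any valid rotation matrix must satisfy:

```python
import math

def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def rot_y(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def rot_z(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

# Self-check: a rotation matrix satisfies R^T R = I
R = rot_x(0.3)
RtR = [[sum(R[k][i] * R[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
max_dev = max(abs(RtR[i][j] - (1.0 if i == j else 0.0))
              for i in range(3) for j in range(3))
print(max_dev < 1e-12)  # True
```

The recorded turntable angle would be passed to the matrix matching the physical rotation axis to update the working-posture extrinsics.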
the above description of the embodiments of the invention has been presented in connection with the drawings but these descriptions should not be construed as limiting the scope of the invention, which is defined by the appended claims, and any changes based on the claims are intended to be covered by the invention.

Claims (7)

1. A method for calibrating external azimuth parameters of a multi-camera by adopting a mutual shooting method is characterized by comprising the following steps of: the external azimuth parameter calibration method of the multi-camera adopting the mutual shooting method comprises the following steps of:
step 1) establishing a calibration system of the multi-camera space measurement system: the system comprises a first measuring camera (1), a second measuring camera (2), a target (L), an auxiliary camera (3) and a precision rotating table (4); the lower ends of the first measuring camera (1), the second measuring camera (2) and the auxiliary camera (3) are respectively provided with a precise rotary table (4), and the three precise rotary tables (4) are arranged in a triangle; a target (L) is respectively arranged on the first measuring camera (1) and the second measuring camera (2);
step 2) calibrating parameters in the monocular camera: respectively establishing mathematical models of a first measuring camera (1), a second measuring camera (2) and an auxiliary camera (3) according to the small-hole imaging model; calibrating an internal reference matrix and distortion parameters of the first measuring camera (1), the second measuring camera (2) and the auxiliary camera (3) by means of a checkerboard calibration plate according to a homography matrix mapping principle and a nonlinear optimization principle;
step 3) auxiliary cameras are combined with a measurement camera to calibrate external parameters of the binocular vision system: a binocular vision system is formed by a first measuring camera (1), a second measuring camera (2) and an auxiliary camera (3), and then external parameter matrixes of the two binocular vision systems are calibrated according to a binocular vision external parameter calibration principle;
step 4) calculating the pose of the target under an auxiliary camera coordinate system: rotating the auxiliary camera, and obtaining an external parameter matrix between the rotated auxiliary camera and the first or second measuring camera by using the external parameter matrix of the calibrated binocular vision system: shooting a target by using an auxiliary camera, and calculating the accurate pose of the target under an auxiliary camera coordinate system;
step 5) calculating the pose of the target under the self-measuring camera coordinate system: calculating the pose of the target under the coordinate system of the self-measuring camera through simple coordinate conversion by utilizing the external parameter matrix between the rotated auxiliary camera and the first measuring camera or the second measuring camera obtained in the step 4) and the calculated pose of the target under the coordinate system of the auxiliary camera;
and 6) calibrating external parameters of mutual shooting: shooting targets on the opposite side measuring cameras by using the two measuring cameras, calculating the pose of the opposite side targets, and calculating the current external parameters of the two measuring cameras through conversion;
and 7) rotating the measuring camera to a working posture by using the precision rotating table and measuring.
2. The method for calibrating external azimuth parameters of a multi-camera by adopting a mutual shooting method according to claim 1, wherein the method is characterized in that: in the step 2), the parameter calibration in the monocular camera is realized by a calibration tool box in Matlab or a calibration function in OpenCV.
3. The method for calibrating external azimuth parameters of a multi-camera by adopting a mutual shooting method according to claim 1, wherein the method is characterized in that: in step 3), the method for calibrating the external parameters of the binocular vision system by the auxiliary camera combined measurement camera comprises the following steps: firstly, a binocular vision system is formed by a first measuring camera (1) and a second measuring camera (2) and an auxiliary camera (3), then, shooting is carried out on a checkerboard calibration plate by utilizing the first measuring camera (1) and the auxiliary camera (3) as well as the second measuring camera (2) and the auxiliary camera (3), space feature point matching is carried out after shooting, an essential matrix is calculated after paired space feature points are obtained, and a rotation matrix R and a translation vector T in an external parameter matrix can be decomposed through the essential matrix.
4. The method for calibrating external azimuth parameters of multiple cameras by the mutual-shooting method according to claim 1, characterized in that: in step 4), the pose of the target in the auxiliary camera coordinate system is calculated as follows: first, the precision turntable (4) rotates the auxiliary camera (3) to face the target (L) on the first measuring camera (1) or the second measuring camera (2); the auxiliary camera (3) then images the target (L); from the image, the pixel coordinates of the target (L) are extracted, and since the world coordinates of the target (L) are known, the pose of the target (L) in the auxiliary camera coordinate system is obtained from the correspondence between its world coordinates and pixel coordinates.
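The world-to-pixel correspondence used here is the pinhole projection x ~ K[R|t]X, so with the target's world coordinates known the pose can be recovered by a Direct Linear Transform. The patent does not prescribe a particular solver (a real implementation would typically call cv2.solvePnP); the following is a generic NumPy sketch verified on synthetic, noise-free data:

```python
import numpy as np

def pose_from_points(K, world_pts, pixel_pts):
    """Recover the camera pose (R, t) from >= 6 non-coplanar 3D-2D
    correspondences by the Direct Linear Transform on P = K[R|t]."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        Xh = np.array([X, Y, Z, 1.0])
        A.append(np.concatenate([Xh, np.zeros(4), -u * Xh]))
        A.append(np.concatenate([np.zeros(4), Xh, -v * Xh]))
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)          # null vector of A, up to scale/sign
    M = np.linalg.inv(K) @ P          # proportional to [R | t]
    s = np.linalg.norm(M[2, :3])      # rows of R have unit norm
    if np.linalg.det(M[:, :3]) < 0:   # fix the projective sign ambiguity
        s = -s
    return M[:, :3] / s, M[:, 3] / s

# Synthetic check: project known points with a known pose, then recover it.
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
R_true = np.eye(3)
t_true = np.array([0.1, -0.05, 2.0])
world = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0, 1, 1.]])
cam = (R_true @ world.T).T + t_true
px = (K @ cam.T).T
px = px[:, :2] / px[:, 2:3]

R_est, t_est = pose_from_points(K, world, px)
print(np.allclose(R_est, R_true, atol=1e-6), np.allclose(t_est, t_true, atol=1e-6))
```

With real images the pixel coordinates are noisy, so the DLT result would normally be refined by nonlinear reprojection-error minimization.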
5. The method for calibrating external azimuth parameters of multiple cameras by the mutual-shooting method according to claim 1, characterized in that: in step 5), the pose of the target in the self-measuring camera coordinate system is calculated as follows: the external parameter matrix between the rotated auxiliary camera (3) and the first measuring camera (1) or the second measuring camera (2) obtained in step 4) is multiplied by the calculated pose of the target (L) in the auxiliary camera coordinate system, yielding the pose of the target (L) in the self-measuring camera coordinate system.
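The "multiplication" in this claim is composition of homogeneous transforms: if T_meas_aux maps auxiliary-camera coordinates into measuring-camera coordinates and T_aux_target is the target pose in the auxiliary frame, then T_meas_target = T_meas_aux · T_aux_target. A minimal NumPy sketch (the numeric poses are illustrative, not from the patent):

```python
import numpy as np

def make_T(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# T_meas_aux: auxiliary camera expressed in the measuring camera's frame
# (the step-4 extrinsic); T_aux_target: target pose in the auxiliary frame.
th = np.deg2rad(30.0)
Rz = np.array([[np.cos(th), -np.sin(th), 0.],
               [np.sin(th),  np.cos(th), 0.],
               [0., 0., 1.]])
T_meas_aux = make_T(Rz, np.array([0.5, 0., 0.]))
T_aux_target = make_T(np.eye(3), np.array([0., 0., 2.]))

# Target pose in the measuring camera's own frame (claim 5).
T_meas_target = T_meas_aux @ T_aux_target
print(T_meas_target[:3, 3])
```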
6. The method for calibrating external azimuth parameters of multiple cameras by the mutual-shooting method according to claim 1, characterized in that: in step 6), the mutual-shooting external parameter calibration is performed as follows: with the pose of the target (L) in the self-measuring camera coordinate system obtained in step 5), the first measuring camera (1) photographs the target (L) on the second measuring camera (2), or the second measuring camera (2) photographs the target (L) on the first measuring camera (1); the pose of the opposite target (L) is then computed with the PnP algorithm, and the current external parameters of the two measuring cameras are obtained by transformation.
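Given the target's pose in its host camera's own frame (claim 5) and the pose of the same target measured by the opposite camera via PnP, the camera-to-camera extrinsic follows by cancelling the shared target frame: T_c1_c2 = T_c1_L2 · T_c2_L2⁻¹. A NumPy sketch of this conversion with synthesized poses (illustrative only; in the real procedure T_c1_L2 would come from the PnP measurement):

```python
import numpy as np

def make_T(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Ground-truth relative pose between the two measuring cameras.
th = np.deg2rad(45.0)
Ry = np.array([[np.cos(th), 0., np.sin(th)],
               [0., 1., 0.],
               [-np.sin(th), 0., np.cos(th)]])
T_c1_c2 = make_T(Ry, np.array([1.0, 0., 0.]))

# Pose of camera 2's target in camera 2's own frame (from claim 5) ...
T_c2_L2 = make_T(np.eye(3), np.array([0.05, 0.1, 0.]))
# ... and as seen from camera 1 via PnP (synthesized here for the check).
T_c1_L2 = T_c1_c2 @ T_c2_L2

# Claim 6 conversion: eliminate the target frame.
T_est = T_c1_L2 @ np.linalg.inv(T_c2_L2)
print(np.allclose(T_est, T_c1_c2))
```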
7. The method for calibrating external azimuth parameters of multiple cameras by the mutual-shooting method according to claim 1, characterized in that: in step 7), the measuring cameras are rotated to the working attitude with the precision turntable and measurement is performed as follows: before measurement, the first measuring camera (1) and the second measuring camera (2) are rotated to face the measured object; the precision turntable (4) records the rotation angles of the first measuring camera (1) and the second measuring camera (2) from the calibration attitude to the working attitude, the external parameters of the two measuring cameras in the working attitude are obtained by calculation, and the measurement work can then be carried out.
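One way to read this final step: each camera's change from calibration attitude to working attitude is a known turntable rotation about a fixed axis, so the mutual-shooting extrinsic can be updated by conjugating with the two recorded rotations. A hedged sketch, assuming each camera rotates about its turntable's z-axis and that axis passes through the camera frame origin (the patent does not spell out these conventions):

```python
import numpy as np

def rot_z(angle_deg):
    """4x4 homogeneous rotation about the z-axis."""
    a = np.deg2rad(angle_deg)
    T = np.eye(4)
    T[:3, :3] = np.array([[np.cos(a), -np.sin(a), 0.],
                          [np.sin(a),  np.cos(a), 0.],
                          [0., 0., 1.]])
    return T

# Extrinsic between the two cameras in the calibration attitude (claim 6).
T_c1_c2_cal = np.eye(4)
T_c1_c2_cal[:3, 3] = [1.0, 0., 0.]

# Recorded turntable angles from calibration to working attitude.
theta1, theta2 = 20.0, -15.0

# Each working-attitude frame relates to its calibration-attitude frame by
# the recorded rotation, so the new extrinsic is the conjugated product:
# T_c1'_c2' = T_c1'_c1 @ T_c1_c2_cal @ T_c2_c2'
T_c1_c2_work = np.linalg.inv(rot_z(theta1)) @ T_c1_c2_cal @ rot_z(theta2)
print(T_c1_c2_work.shape)
```

If the turntable axis does not pass through the camera origin, the fixed camera-to-turntable offset must also be included in the conjugation (a hand-eye-style calibration problem).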
CN202010131077.8A 2020-02-28 2020-02-28 External azimuth parameter calibration method for multi-camera by adopting mutual shooting method Active CN111415391B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010131077.8A CN111415391B (en) 2020-02-28 2020-02-28 External azimuth parameter calibration method for multi-camera by adopting mutual shooting method

Publications (2)

Publication Number Publication Date
CN111415391A CN111415391A (en) 2020-07-14
CN111415391B true CN111415391B (en) 2023-04-28

Family

ID=71491099

Country Status (1)

Country Link
CN (1) CN111415391B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111445537B (en) * 2020-06-18 2020-09-29 浙江中控技术股份有限公司 Calibration method and system of camera
CN111964693B (en) * 2020-07-21 2022-03-22 中国科学院长春光学精密机械与物理研究所 High-precision calibration method for internal and external orientation elements of surveying and mapping camera
CN111862238B (en) * 2020-07-23 2022-05-10 中国民航大学 Full-space monocular light pen type vision measurement method
CN112700501B (en) * 2020-12-12 2024-03-05 西北工业大学 Underwater monocular subpixel relative pose estimation method
CN112837373B (en) * 2021-03-03 2024-04-26 福州视驰科技有限公司 Multi-camera pose estimation method without feature point matching
CN113256742B (en) * 2021-07-15 2021-10-15 禾多科技(北京)有限公司 Interface display method and device, electronic equipment and computer readable medium
CN115526941B (en) * 2022-11-25 2023-03-10 海伯森技术(深圳)有限公司 Calibration device and calibration method for telecentric camera
CN116704045B (en) * 2023-06-20 2024-01-26 北京控制工程研究所 Multi-camera system calibration method for monitoring starry sky background simulation system
CN117197241A (en) * 2023-09-14 2023-12-08 上海智能制造功能平台有限公司 Robot tail end absolute pose high-precision tracking method based on multi-eye vision

Citations (4)

Publication number Priority date Publication date Assignee Title
CN102831641A (en) * 2012-08-08 2012-12-19 浙江华震数字化工程有限公司 Method for shooting and three-dimensional reduction and reconstruction
CN104807476A (en) * 2015-04-23 2015-07-29 上海大学 Pose estimation-based quick probe calibration device and method
CN108663043A (en) * 2018-05-16 2018-10-16 北京航空航天大学 Distributed boss's POS node relative pose measurement method based on single camera auxiliary
WO2019062291A1 (en) * 2017-09-29 2019-04-04 歌尔股份有限公司 Binocular vision positioning method, device, and system

Non-Patent Citations (2)

Title
Wu Jun; Ma Kai; Xu Haitao; Wang Zhijun; Yu Zhijing. Point cloud measurement method for aero-engine blades using virtual stereo vision. Mechanical Science and Technology for Aerospace Engineering, 2018, No. 11, full text. *
Li Xin. Hardware design of a fabric defect detection system based on multi-camera vision. Mechanical Research & Application, 2011, No. 5, full text. *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Wu Jun; Li Zechuan; Guo Runxia; Xu Jun; Li Yanling; Li Xin

Inventor before: Wu Jun; Li Zechuan; Xu Jun; Li Yanling; Li Xin

GR01 Patent grant