CN111415391A - Multi-view camera external orientation parameter calibration method adopting inter-shooting method - Google Patents

Multi-view camera external orientation parameter calibration method adopting inter-shooting method

Info

Publication number
CN111415391A
Authority
CN
China
Prior art keywords
camera
measuring
target
auxiliary
pose
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010131077.8A
Other languages
Chinese (zh)
Other versions
CN111415391B (en)
Inventor
吴军
李泽川
徐鋆
李雁玲
李鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Civil Aviation University of China
Original Assignee
Civil Aviation University of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Civil Aviation University of China filed Critical Civil Aviation University of China
Priority to CN202010131077.8A priority Critical patent/CN111415391B/en
Publication of CN111415391A publication Critical patent/CN111415391A/en
Application granted granted Critical
Publication of CN111415391B publication Critical patent/CN111415391B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/10 Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 Engine management systems

Abstract

A multi-view camera external orientation parameter calibration method adopting an inter-shooting (mutual shooting) method. The method comprises the steps of establishing a multi-camera space measurement system calibration setup; calibrating the intrinsic parameters of each monocular camera; combining the auxiliary camera with each measuring camera to calibrate the external parameters of a binocular vision system; calculating the pose of each target in the auxiliary camera coordinate system; calculating the pose of each target in the coordinate system of its own measuring camera; performing mutual-shooting external parameter calibration; and rotating the measuring cameras to the working attitude and measuring. The method can quickly and accurately calibrate the external parameters between the measuring cameras in a multi-camera measuring system and can flexibly adapt to a variety of working environments, so that measurement can be carried out efficiently. Because the external parameters between cameras can be calibrated quickly, accurate parameters are provided for subsequent measurement, working efficiency and precision are improved, and measurement cost is reduced.

Description

Multi-view camera external orientation parameter calibration method adopting inter-shooting method
Technical Field
The invention belongs to the technical field of machine vision three-dimensional space coordinate measurement, and particularly relates to a multi-view camera external orientation parameter calibration method by adopting an inter-shooting method.
Background
With the continuous development of machinery manufacturing, aerospace, robotics and related fields in China, and in particular the advance of domestic large-aircraft production technology, the requirements on spatial measurement accuracy during manufacturing and assembly keep rising. The accuracy of the measurement technique directly determines the accuracy of industrial manufacturing and assembly. A traditional coordinate measuring machine is bulky and computationally inefficient, and traditional measuring methods must contact the measured object, which easily damages it; such methods therefore cannot meet modern industry's requirements on the precision and efficiency of a measuring system.
With the improvement of hardware, the computing speed of computers has also increased rapidly, so computer vision and photogrammetry techniques can be widely applied in many industries, greatly improving the working efficiency and accuracy of measurement systems while reducing labor and time costs. In the field of computer-vision-based spatial three-coordinate measurement, multi-camera vision measurement systems are widely used thanks to their high measurement accuracy, strong adaptability, high measurement efficiency and low cost. Compared with traditional three-coordinate measuring methods, they offer high accuracy and cause no damage to the measured object. Research on multi-camera vision measurement systems therefore has important practical significance.
In a multi-camera measurement system, the measurement method mainly adopted is a triangulation algorithm, and the calibration accuracy of the internal and external camera parameters used in the algorithm directly affects the measurement accuracy of spatial coordinates. The internal parameters of a camera can be calibrated accurately with mature tools such as Zhang's calibration method. However, as the working scene changes, the external parameters between the cameras in the system change as well, and the limitations of the working environment often prevent calibration with the traditional checkerboard-based binocular vision external parameter calibration method. Research on a fast, accurate and flexible camera external parameter calibration method is therefore of central importance to the development of multi-camera spatial three-coordinate measurement systems.
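For orientation, the following Python/OpenCV sketch (not taken from the patent) shows the triangulation step that the calibrated internal and external parameters feed into; the names K1, K2, R, T, uv1 and uv2 are illustrative assumptions rather than symbols defined in this document.

```python
# Hedged sketch: reconstruct 3D points from two calibrated views.
# P1 = K1 [I | 0] and P2 = K2 [R | T] are the projection matrices of the
# two measuring cameras; any error in R, T propagates into the result.
import cv2
import numpy as np

def triangulate(K1, K2, R, T, uv1, uv2):
    """uv1, uv2: 2xN arrays of matched pixel coordinates in camera 1 and 2;
    R, T: extrinsics of camera 2 relative to camera 1 (assumed convention)."""
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K2 @ np.hstack([R, np.asarray(T, dtype=float).reshape(3, 1)])
    X_h = cv2.triangulatePoints(P1, P2, uv1.astype(np.float64), uv2.astype(np.float64))
    return (X_h[:3] / X_h[3]).T  # Nx3 Euclidean coordinates
```

This is only meant to illustrate why the calibration accuracy of R and T directly bounds the achievable measurement accuracy.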
Disclosure of Invention
In order to solve the above problems, an object of the present invention is to provide a method for calibrating external orientation parameters of a multi-view camera by using an inter-shooting method.
In order to achieve the above object, the method for calibrating external orientation parameters of a multi-view camera by using an inter-shooting method provided by the invention comprises the following steps in sequence:
step 1) establishing a calibration system of a multi-camera space measurement system: the system comprises a first measuring camera, a second measuring camera, targets, an auxiliary camera and precision rotating platforms; a precision rotating platform is arranged at the lower end of each of the first measuring camera, the second measuring camera and the auxiliary camera, and the three precision rotating platforms are arranged in a triangle; a target is arranged on each of the first measuring camera and the second measuring camera;
step 2) monocular camera intrinsic parameter calibration: respectively establishing mathematical models of the first measuring camera, the second measuring camera and the auxiliary camera according to the pinhole imaging model; calibrating the internal parameter matrix and distortion parameters of the first measuring camera, the second measuring camera and the auxiliary camera by means of a checkerboard calibration plate according to the homography matrix mapping principle and the nonlinear optimization principle;
step 3) binocular vision system external parameter calibration by combining the auxiliary camera with each measuring camera: the first measuring camera and the second measuring camera each form a binocular vision system with the auxiliary camera, and the external parameter matrices of the two binocular vision systems are then calibrated according to the binocular vision external parameter calibration principle;
step 4) calculating the pose of the target in the auxiliary camera coordinate system: rotating the auxiliary camera, and obtaining the external parameter matrix between the rotated auxiliary camera and the first or second measuring camera from the calibrated external parameter matrix of the binocular vision system; then shooting the target with the auxiliary camera and calculating the accurate pose of the target in the auxiliary camera coordinate system;
step 5) calculating the pose of the target in the coordinate system of its own measuring camera: using the external parameter matrix between the rotated auxiliary camera and the first or second measuring camera obtained in step 4) and the calculated pose of the target in the auxiliary camera coordinate system, the pose of the target in the coordinate system of its own measuring camera is obtained by a simple coordinate transformation;
step 6) mutual-shooting external parameter calibration: each measuring camera shoots the target on the other measuring camera, the pose of the other camera's target is calculated, and the current external parameters of the two measuring cameras are obtained by transformation;
and step 7) rotating the measuring cameras to the working attitude with the precision rotating platforms and performing measurement.
In step 2), the monocular camera intrinsic parameter calibration is implemented with the calibration toolbox in Matlab or the calibration functions in OpenCV.
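As a concrete illustration of this step, a minimal OpenCV sketch is given below; it assumes a printed checkerboard whose inner-corner count and square size are passed through the illustrative parameters pattern_size and square_mm, and it stands in for the Matlab/OpenCV tooling mentioned above rather than reproducing code from the patent.

```python
# Hedged sketch: Zhang-style intrinsic calibration from checkerboard images.
import glob
import cv2
import numpy as np

def calibrate_intrinsics(image_glob, pattern_size=(9, 6), square_mm=25.0):
    # 3D corner coordinates of the board in its own plane (Z = 0).
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
    objp *= square_mm

    obj_points, img_points, img_size = [], [], None
    for path in glob.glob(image_glob):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if not found:
            continue
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)
        img_size = gray.shape[::-1]

    # Returns the intrinsic matrix K, distortion coefficients and RMS error.
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, img_size, None, None)
    return K, dist, rms
```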
In step 3), the method for calibrating the external parameters of the binocular vision systems formed by the auxiliary camera and each measuring camera is as follows: first, the first measuring camera and the second measuring camera each form a binocular vision system with the auxiliary camera; then the first measuring camera and the auxiliary camera, as well as the second measuring camera and the auxiliary camera, simultaneously shoot the checkerboard calibration board; after shooting, the spatial feature points are matched, the essential matrix is computed from the matched spatial feature points, and the rotation matrix R and the translation vector T in the external parameter matrix are obtained by decomposing the essential matrix.
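A hedged sketch of this essential-matrix route is shown below; the function and array names are assumed for the example, and the translation recovered from the essential matrix is known only up to scale (a metric scale would come from the known checkerboard geometry, for instance via cv2.stereoCalibrate).

```python
# Hedged sketch: recover R and the translation direction T between one
# measuring camera and the auxiliary camera from matched feature points.
import cv2
import numpy as np

def binocular_extrinsics(pts_meas, pts_aux, K_meas, K_aux):
    """pts_meas, pts_aux: Nx2 matched pixel coordinates of the same spatial
    feature points seen by the measuring and auxiliary cameras."""
    # Normalize with each camera's intrinsic matrix so that a single
    # essential matrix relates the two views.
    pm = cv2.undistortPoints(pts_meas.reshape(-1, 1, 2).astype(np.float64), K_meas, None)
    pa = cv2.undistortPoints(pts_aux.reshape(-1, 1, 2).astype(np.float64), K_aux, None)
    E, mask = cv2.findEssentialMat(pm, pa, np.eye(3), method=cv2.RANSAC, threshold=1e-3)
    # recoverPose decomposes E into the rotation R and a unit translation T.
    _, R, T, _ = cv2.recoverPose(E, pm, pa, np.eye(3), mask=mask)
    return R, T
```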
In step 4), the method for calculating the pose of the target in the auxiliary camera coordinate system is as follows: first, the auxiliary camera is rotated with the precision rotating platform until it faces the target on the first or second measuring camera; the auxiliary camera then images the target, from which the pixel coordinates of the target features are obtained; since the world coordinates of the target features are known, the pose of the target in the auxiliary camera coordinate system is obtained from the correspondence between the world coordinates and the pixel coordinates of the target.
In step 5), the method for calculating the pose of the target in the coordinate system of its own measuring camera is as follows: the external parameter matrix between the rotated auxiliary camera and the first or second measuring camera obtained in step 4) is multiplied by the calculated pose of the target in the auxiliary camera coordinate system, which yields the pose of the target in the coordinate system of its own measuring camera.
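An informal numpy sketch of this multiplication is given below, using 4x4 homogeneous matrices; the argument names and frame convention (measuring camera from auxiliary camera, auxiliary camera from target) are assumptions for the example.

```python
# Hedged sketch: chain the extrinsic of the rotated auxiliary camera with
# the target pose measured in the auxiliary camera frame to express the
# target pose in the frame of its own measuring camera.
import numpy as np

def homog(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 matrix."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = np.asarray(t, dtype=float).ravel()
    return M

def target_in_measuring_frame(R_meas_aux, t_meas_aux, R_aux_target, t_aux_target):
    """T_meas<-target = T_meas<-aux @ T_aux<-target (assumed frame convention)."""
    return homog(R_meas_aux, t_meas_aux) @ homog(R_aux_target, t_aux_target)
```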
In step 6), the method for mutual-shooting external parameter calibration is as follows: with the pose of each target in the coordinate system of its own measuring camera known from step 5), the first measuring camera shoots the target on the second measuring camera, or the second measuring camera shoots the target on the first measuring camera; the pose of the other camera's target is then calculated with the PnP algorithm, and the current external parameters of the two measuring cameras are obtained by transformation.
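A hedged sketch of the corresponding composition is given below: T_cam1_target2 would come from the PnP step (the target on the second measuring camera observed by the first), and T_cam2_target2 is that target's fixed pose in the second camera's own frame from step 5); both names and the 4x4 homogeneous representation are assumptions for the example.

```python
# Hedged sketch: extrinsics between the two measuring cameras from two
# poses of the same target (mutual-shooting step).
import numpy as np

def mutual_extrinsics(T_cam1_target2, T_cam2_target2):
    """Both inputs are 4x4 poses of target 2, expressed in camera 1 and in
    camera 2 respectively. Returns the pose of camera 2 in camera 1."""
    return T_cam1_target2 @ np.linalg.inv(T_cam2_target2)
```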
In step 7), the method for rotating the measuring cameras to the working attitude with the precision rotating platforms and performing measurement is as follows: before measurement, the first measuring camera and the second measuring camera are rotated to an angle facing the measured object; the rotation angles of the first and second measuring cameras from the calibration attitude to the working attitude are recorded by the precision rotating platforms, and the external parameters of the two measuring cameras in the working attitude are obtained by calculation, after which the measurement work can be carried out.
The multi-view camera external orientation parameter calibration method adopting the mutual shooting method can quickly and accurately calibrate the external parameters between the measuring cameras in a multi-camera measuring system and can flexibly adapt to a variety of working environments, so that measurement can be carried out efficiently. Because the external parameters between cameras can be calibrated quickly, accurate parameters are provided for subsequent measurement, working efficiency and precision are improved, and measurement cost is reduced.
Drawings
Fig. 1 is a flowchart of a calibration method for external orientation parameters of a multi-view camera by using an inter-shooting method according to the present invention.
Fig. 2 is a schematic diagram of the binocular vision system external parameter calibration process in the invention.
FIG. 3 is a schematic diagram of coordinate transformation according to the present invention.
Detailed Description
The following describes the external orientation parameter calibration method of the multi-view camera using the inter-shooting method in detail with reference to the accompanying drawings and specific embodiments. The drawings are for reference and illustration purposes only and are not to be construed as limiting the scope of the present invention.
As shown in fig. 1, 2 and 3, the method for calibrating external orientation parameters of a multi-view camera by using an inter-shooting method provided by the invention comprises the following steps in sequence:
Step 1) establishing a calibration system of a multi-camera space measurement system: the system comprises a first measuring camera 1, a second measuring camera 2, targets L, an auxiliary camera 3 and precision rotating platforms 4. A precision rotating platform 4 is arranged at the lower end of each of the first measuring camera 1, the second measuring camera 2 and the auxiliary camera 3, and the three precision rotating platforms 4 are arranged in a triangle; a target L is arranged on each of the first measuring camera 1 and the second measuring camera 2. The first measuring camera 1 and the second measuring camera 2 are image acquisition devices; the target L is an auxiliary tool for mutual-shooting calibration, used to realize rapid external parameter calibration before the measurement work; the auxiliary camera 3 is used for calculating the coordinates of each target L in the measuring camera coordinate system; and the precision rotating platform 4 is used for rotating the camera mounted on it to a suitable measurement angle.
Step 2) monocular camera intrinsic parameter calibration: respectively establishing mathematical models of the first measuring camera 1, the second measuring camera 2 and the auxiliary camera 3 according to the pinhole imaging model, and calibrating the internal parameter matrix and distortion parameters of the first measuring camera 1, the second measuring camera 2 and the auxiliary camera 3 by means of a checkerboard calibration plate according to the homography matrix mapping principle and the nonlinear optimization principle.
The monocular camera internal parameter calibration is realized with the calibration toolbox in Matlab or the calibration functions in OpenCV.
Step 3) binocular vision system external parameter calibration by combining the auxiliary camera with each measuring camera: the first measuring camera 1 and the second measuring camera 2 each form a binocular vision system with the auxiliary camera 3, and the external parameter matrices of the two binocular vision systems are then calibrated according to the binocular vision external parameter calibration principle.
Fig. 2 is a schematic diagram of the binocular vision system external parameter calibration process in the invention. As shown in FIG. 2, $O_LX_LY_LZ_L$ is the measuring camera coordinate system, whose imaging plane coordinate system is $o_lx_ly_l$ and whose optical axis direction is $Z_L$; likewise, $O_RX_RY_RZ_R$ is the auxiliary camera coordinate system, whose imaging plane coordinate system is $o_Rx_Ry_R$ and whose optical axis direction is $Z_R$. $P_L$ and $P_R$ are respectively the pixel coordinates of the imaging points of a spatial feature point on the image planes of the measuring camera and the auxiliary camera, and the intersection point $P_W$ of the two rays gives the coordinates of the spatial feature point in the world coordinate system $X_WY_WZ_W$. The coordinate transformation relationship between any two coordinate systems is:
\[
\begin{bmatrix} X_L \\ Y_L \\ Z_L \end{bmatrix} = R \begin{bmatrix} X_R \\ Y_R \\ Z_R \end{bmatrix} + T
\]
wherein
\[
R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}
\]
represents the rotation matrix from the coordinate system $O_RX_RY_RZ_R$ to the coordinate system $O_LX_LY_LZ_L$, and $T=(t_1\ t_2\ t_3)^T$ represents the translation vector corresponding to the rotation matrix R.
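A small numpy illustration of this relationship is given below, assuming R and T are the calibrated rotation and translation above and p_R is a point expressed in the right (auxiliary) camera frame; the names are placeholders.

```python
# Hedged sketch: apply the calibrated R, T to map a point from the right
# (auxiliary) camera frame into the left (measuring) camera frame.
import numpy as np

def to_left_frame(p_R, R, T):
    return R @ np.asarray(p_R, dtype=float) + np.asarray(T, dtype=float).ravel()
```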
Taking the measuring camera coordinate system $O_1X_1Y_1Z_1$ and the auxiliary camera coordinate system $O_3X_3Y_3Z_3$ in FIG. 3 as an example, the coordinate transformation relationship is:
\[
\begin{bmatrix} X_3 \\ Y_3 \\ Z_3 \end{bmatrix} = R \begin{bmatrix} X_1 \\ Y_1 \\ Z_1 \end{bmatrix} + T
\]
wherein
\[
R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}
\]
represents the rotation matrix from the coordinate system $O_1X_1Y_1Z_1$ to the coordinate system $O_3X_3Y_3Z_3$, and $T=(t_1\ t_2\ t_3)^T$ represents the translation vector corresponding to the rotation matrix R.
The binocular vision system external parameter calibration is the process of solving the rotation matrix R and the translation vector T. The external parameter matrices of the two binocular vision systems are denoted $R_{13}, T_{13}$ and $R_{23}, T_{23}$ in FIG. 3.
Step 4) calculating the pose of the target in the auxiliary camera coordinate system: rotating the auxiliary camera, and obtaining the external parameter matrix between the rotated auxiliary camera and the first or second measuring camera from the calibrated external parameter matrix of the binocular vision system; then shooting the target with the auxiliary camera and calculating the accurate pose of the target in the auxiliary camera coordinate system.
After the rotation matrix R and the translation vector T are obtained in step 3), the auxiliary camera 3 needs to be rotated by a certain angle with the precision rotating platform 4 so that it can shoot the target L fixed on the first measuring camera 1 or the second measuring camera 2. At this time the auxiliary camera 3 rotates about the Z axis of the measuring camera coordinate system, and the relationship between the rotation angle θ and the rotation matrix $R_z(\theta)$ is:
\[
R_z(\theta)=\begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}
\]
the rotation matrix R and the translation vector T which are marked in the step 3) and the rotation matrix R obtained by rotating the precision rotating platform 4zAnd (theta) multiplying to obtain an external parameter matrix between the current auxiliary camera 3 and the first measuring camera 1 or the second measuring camera 2 after the current auxiliary camera rotates.
The precision rotating platform 4 under the auxiliary camera 3 is rotated until the auxiliary camera 3 can shoot the target L on the first measuring camera 1 or the second measuring camera 2, and the target L is then shot. The pose of the target L in the auxiliary camera coordinate system is calculated with the PnP algorithm, and the re-projection error of the feature points on the target is then optimized with a nonlinear optimization algorithm, which yields the accurate pose of the target L in the auxiliary camera coordinate system.
Shooting the target L with the auxiliary camera 3 prepares for solving the pose of the target L in the coordinate system of its own measuring camera.
the principle of the PnP algorithm is as follows:
PNP (Passive-N-Points) is an algorithm for solving the pose of a camera according to 3D to 2D point pairs, and describes how to solve the pose of the camera when N pairs of space and image matching Points exist.
There are N three-dimensional feature points P on the target L and N corresponding projection points p on the imaging plane, and the pose R, T of the target L, whose Lie algebra representation is ξ, needs to be calculated. Suppose the coordinates of a spatial feature point on the target L are $P_i=[X_i\ Y_i\ Z_i]^T$ and the corresponding pixel coordinates on the imaging plane are $U_i=[u_i\ v_i]^T$. The relationship between the pixel position and the spatial feature point is:
\[
s_i \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = K \exp(\xi^{\wedge}) \begin{bmatrix} X_i \\ Y_i \\ Z_i \\ 1 \end{bmatrix}
\]
wherein K is the internal parameter matrix of the camera obtained by the calibration in step 2), and ξ is the pose of the target L represented in Lie algebra; written in matrix form this is $s_iU_i=K\exp(\xi^{\wedge})P_i$. Because the pose of the target L, imaging noise and other factors introduce errors into this equation, the invention sums all the errors to construct a nonlinear least squares problem, and iterative solution yields an accurate pose:
\[
\xi^{*}=\arg\min_{\xi}\ \frac{1}{2}\sum_{i=1}^{N}\left\| U_i-\frac{1}{s_i}K\exp(\xi^{\wedge})P_i \right\|_2^2
\]
the error term in the PnP problem is an error obtained by comparing the observed pixel coordinates with the position of the 3D point projected onto the imaging plane according to the currently estimated pose, and is called a reprojection error. There are many methods for solving the nonlinear least squares problem, such as nonlinear optimization algorithms including the steepest descent method of the first order, the gauss-newton method of the second order, and the levenberg marquarl method.
Step 5) calculating the pose of the target in the coordinate system of its own measuring camera: using the external parameter matrix between the rotated auxiliary camera and the first or second measuring camera obtained in step 4) and the calculated pose of the target in the auxiliary camera coordinate system, the pose of the target in the coordinate system of its own measuring camera is obtained by a simple coordinate transformation.
because the target L is fixedly installed on the first measuring camera 1 and the second measuring camera 2, that is, the pose of the target L in the coordinate system of the self-measuring camera is not changed, the pose of the calculated target in the coordinate system of the self-measuring camera is schematically shown in fig. 3. the appearance parameter calibration in step 3) can obtain the appearance parameter matrix R of the two binocular vision systems13T13And R23T23Is known and the external reference matrix between the rotated auxiliary camera 3 and the first measuring camera 1 or the second measuring camera 2 and the target L in the auxiliary camera coordinate system O have been obtained in step 4)3X3Y3Z3Accurate pose of the target L on the first measuring camera 1 can be found by simple coordinate transformation at O1X1Y1Z1Measurement camera coordinate system and target L on second measurement camera 2 at O2X2Y2Z2And measuring the pose of the camera in a coordinate system.
Step 6) mutual-shooting external parameter calibration: each measuring camera shoots the target on the other measuring camera, the pose of the other camera's target is calculated, and the current external parameters of the two measuring cameras are obtained by transformation.
the pose of the target L under the coordinate system of the measuring camera per se is calculated through the step 5), the first measuring camera 1 is used for shooting the target L on the second measuring camera 2, or the second measuring camera 2 is used for shooting the target L on the first measuring camera 1, then the pose of the opposite target L is calculated through a PnP algorithm, and the current external parameters of the two measuring cameras can be calculated through conversion;
step 7), rotating the measuring camera to a working posture by using the precision rotating platform and measuring:
in step 6), the first measuring camera 1 and the second measuring camera 2 are opposite to each other, but both measuring cameras need to face the object to be measured when the measuring work is performed, and therefore, the measuring cameras need to be rotated to an angle facing the object to be measured before the measuring work is performed. And recording the rotation angles of the first measuring camera 1 and the second measuring camera 2 from the calibration posture to the working posture by using the precision rotating platform 3, and obtaining the external parameters of the two measuring cameras in the working posture through calculation, so that the measuring work can be carried out.
After this rotation, the rotation matrices in the external parameters of the two measuring cameras change again, so the rotation angles of the precision rotating platforms 4 need to be converted into rotation matrices and the external parameter matrix updated. The relationship between the rotation angle and the rotation matrix is as follows:
If the rotation angle is θ, the rotation matrices about the X, Y and Z axes are respectively:
\[
R_x(\theta)=\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{bmatrix}
\]
\[
R_y(\theta)=\begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix}
\]
\[
R_z(\theta)=\begin{bmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix}
\]
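A hedged numpy sketch of this update is given below; the assumption that each stage axis coincides with its camera's Z axis and the signs of the recorded angles θ1, θ2 are illustrative and should be validated on a known target.

```python
# Hedged sketch: update the mutual extrinsics after each measuring camera
# is turned in place from the calibration attitude to the working attitude.
import numpy as np

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def homog(R, t=None):
    M = np.eye(4)
    M[:3, :3] = R
    if t is not None:
        M[:3, 3] = np.asarray(t, dtype=float).ravel()
    return M

def working_extrinsics(T_cam1_cam2, theta1, theta2):
    """T_cam1_cam2: 4x4 extrinsic in the calibration attitude (camera 2 in
    camera 1); theta1, theta2: recorded stage angles (radians). Each camera
    is assumed to rotate in place about its own Z axis."""
    T_new1_old1 = homog(rot_z(theta1)).T   # inverse, since translation is zero
    T_old2_new2 = homog(rot_z(theta2))
    return T_new1_old1 @ T_cam1_cam2 @ T_old2_new2
```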
while the invention has been described in connection with specific embodiments thereof, it will be understood that these should not be construed as limiting the scope of the invention, which is defined in the following claims, and any variations which fall within the scope of the claims are intended to be embraced thereby.

Claims (7)

1. A multi-view camera external orientation parameter calibration method adopting an inter-shooting method is characterized in that: the method for calibrating the external orientation parameters of the multi-view camera by adopting the mutual shooting method comprises the following steps in sequence:
step 1) establishing a calibration system of a multi-camera space measurement system, wherein the system comprises a first measuring camera (1), a second measuring camera (2), targets (L), an auxiliary camera (3) and precision rotating platforms (4); a precision rotating platform (4) is arranged at the lower end of each of the first measuring camera (1), the second measuring camera (2) and the auxiliary camera (3); the three precision rotating platforms (4) are arranged in a triangle; and a target (L) is arranged on each of the first measuring camera (1) and the second measuring camera (2);
step 2) monocular camera intrinsic parameter calibration: respectively establishing mathematical models of the first measuring camera (1), the second measuring camera (2) and the auxiliary camera (3) according to the pinhole imaging model; calibrating the internal parameter matrix and distortion parameters of the first measuring camera (1), the second measuring camera (2) and the auxiliary camera (3) by means of a checkerboard calibration plate according to the homography matrix mapping principle and the nonlinear optimization principle;
step 3) binocular vision system external parameter calibration by combining the auxiliary camera with each measuring camera: the first measuring camera (1) and the second measuring camera (2) each form a binocular vision system with the auxiliary camera (3), and the external parameter matrices of the two binocular vision systems are then calibrated according to the binocular vision external parameter calibration principle;
step 4) calculating the pose of the target in the auxiliary camera coordinate system: rotating the auxiliary camera, and obtaining the external parameter matrix between the rotated auxiliary camera and the first or second measuring camera from the calibrated external parameter matrix of the binocular vision system; then shooting the target with the auxiliary camera and calculating the accurate pose of the target in the auxiliary camera coordinate system;
step 5) calculating the pose of the target in the coordinate system of its own measuring camera: using the external parameter matrix between the rotated auxiliary camera and the first or second measuring camera obtained in step 4) and the calculated pose of the target in the auxiliary camera coordinate system, the pose of the target in the coordinate system of its own measuring camera is obtained by a simple coordinate transformation;
step 6) mutual-shooting external parameter calibration: each measuring camera shoots the target on the other measuring camera, the pose of the other camera's target is calculated, and the current external parameters of the two measuring cameras are obtained by transformation;
and step 7) rotating the measuring cameras to the working attitude with the precision rotating platforms and performing measurement.
2. The method for calibrating the external orientation parameters of a multi-view camera by the mutual shooting method according to claim 1, wherein: in step 2), the monocular camera intrinsic parameter calibration is implemented with the calibration toolbox in Matlab or the calibration functions in OpenCV.
3. The method for calibrating the external orientation parameters of a multi-view camera by the mutual shooting method according to claim 1, wherein: in step 3), the method for calibrating the external parameters of the binocular vision systems by combining the auxiliary camera with each measuring camera is as follows: first, the first measuring camera (1) and the second measuring camera (2) each form a binocular vision system with the auxiliary camera (3); then the first measuring camera (1) and the auxiliary camera (3), as well as the second measuring camera (2) and the auxiliary camera (3), simultaneously shoot the checkerboard calibration board; after shooting, the spatial feature points are matched, the essential matrix is computed from the matched spatial feature points, and the rotation matrix R and the translation vector T in the external parameter matrix are obtained by decomposing the essential matrix.
4. The method for calibrating the external orientation parameters of a multi-view camera by the mutual shooting method according to claim 1, wherein in step 4), the pose of the target in the auxiliary camera coordinate system is calculated by first rotating the auxiliary camera (3) with the precision rotating platform (4) until it faces the target (L) on the first measuring camera (1) or the second measuring camera (2), and then imaging the target (L) with the auxiliary camera (3) to obtain the pixel coordinates of the target (L); since the world coordinates of the target (L) are known, the pose of the target (L) in the auxiliary camera coordinate system is determined from the correspondence between the world coordinates and the pixel coordinates of the target (L).
5. The method for calibrating the external orientation parameters of a multi-view camera by the mutual shooting method according to claim 1, wherein in step 5), the pose of the target in the coordinate system of its own measuring camera is calculated by multiplying the external parameter matrix between the rotated auxiliary camera (3) and the first measuring camera (1) or the second measuring camera (2) obtained in step 4) by the calculated pose of the target (L) in the auxiliary camera coordinate system, which yields the pose of the target (L) in the coordinate system of its own measuring camera.
6. The method for calibrating the external orientation parameters of a multi-view camera by the mutual shooting method according to claim 1, wherein in step 6), with the pose of the target (L) in the coordinate system of its own measuring camera calculated in step 5), the target (L) on the second measuring camera (2) is shot by the first measuring camera (1), or the target (L) on the first measuring camera (1) is shot by the second measuring camera (2); the pose of the other camera's target (L) is then calculated with the PnP algorithm, and the current external parameters of the two measuring cameras are obtained by transformation.
7. The method for calibrating the external orientation parameters of a multi-view camera by the mutual shooting method according to claim 1, wherein: in step 7), the method for rotating the measuring cameras to the working attitude with the precision rotating platforms and performing measurement is as follows: before measurement, the first measuring camera (1) and the second measuring camera (2) are rotated to an angle facing the measured object; the rotation angles of the first measuring camera (1) and the second measuring camera (2) from the calibration attitude to the working attitude are recorded by the precision rotating platforms (4), and the external parameters of the two measuring cameras in the working attitude are obtained by calculation, after which the measurement work can be carried out.
CN202010131077.8A 2020-02-28 2020-02-28 External azimuth parameter calibration method for multi-camera by adopting mutual shooting method Active CN111415391B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010131077.8A CN111415391B (en) 2020-02-28 2020-02-28 External azimuth parameter calibration method for multi-camera by adopting mutual shooting method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010131077.8A CN111415391B (en) 2020-02-28 2020-02-28 External azimuth parameter calibration method for multi-camera by adopting mutual shooting method

Publications (2)

Publication Number Publication Date
CN111415391A true CN111415391A (en) 2020-07-14
CN111415391B CN111415391B (en) 2023-04-28

Family

ID=71491099

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010131077.8A Active CN111415391B (en) 2020-02-28 2020-02-28 External azimuth parameter calibration method for multi-camera by adopting mutual shooting method

Country Status (1)

Country Link
CN (1) CN111415391B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111445537A (en) * 2020-06-18 2020-07-24 浙江中控技术股份有限公司 Calibration method and system of camera
CN111862238A (en) * 2020-07-23 2020-10-30 中国民航大学 Full-space monocular light pen type vision measurement method
CN111964693A (en) * 2020-07-21 2020-11-20 中国科学院长春光学精密机械与物理研究所 High-precision calibration method for internal and external orientation elements of surveying and mapping camera
CN112700501A (en) * 2020-12-12 2021-04-23 西北工业大学 Underwater monocular sub-pixel relative pose estimation method
CN112837373A (en) * 2021-03-03 2021-05-25 福州视驰科技有限公司 Multi-camera pose estimation method without feature point matching
CN113256742A (en) * 2021-07-15 2021-08-13 禾多科技(北京)有限公司 Interface display method and device, electronic equipment and computer readable medium
CN115526941A (en) * 2022-11-25 2022-12-27 海伯森技术(深圳)有限公司 Calibration device and calibration method for telecentric camera
CN116704045A (en) * 2023-06-20 2023-09-05 北京控制工程研究所 Multi-camera system calibration method for monitoring starry sky background simulation system
CN117197241A (en) * 2023-09-14 2023-12-08 上海智能制造功能平台有限公司 Robot tail end absolute pose high-precision tracking method based on multi-eye vision

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102831641A (en) * 2012-08-08 2012-12-19 浙江华震数字化工程有限公司 Method for shooting and three-dimensional reduction and reconstruction
CN104807476A (en) * 2015-04-23 2015-07-29 上海大学 Pose estimation-based quick probe calibration device and method
CN108663043A (en) * 2018-05-16 2018-10-16 北京航空航天大学 Distributed boss's POS node relative pose measurement method based on single camera auxiliary
WO2019062291A1 (en) * 2017-09-29 2019-04-04 歌尔股份有限公司 Binocular vision positioning method, device, and system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102831641A (en) * 2012-08-08 2012-12-19 浙江华震数字化工程有限公司 Method for shooting and three-dimensional reduction and reconstruction
CN104807476A (en) * 2015-04-23 2015-07-29 上海大学 Pose estimation-based quick probe calibration device and method
WO2019062291A1 (en) * 2017-09-29 2019-04-04 歌尔股份有限公司 Binocular vision positioning method, device, and system
CN108663043A (en) * 2018-05-16 2018-10-16 北京航空航天大学 Distributed boss's POS node relative pose measurement method based on single camera auxiliary

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
吴军; 马凯; 徐海涛; 王志军; 于之靖: "Aero-engine blade point cloud measurement method using virtual stereo vision" *
李鑫: "Hardware design of a fabric defect detection system based on multi-view vision" *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111445537A (en) * 2020-06-18 2020-07-24 浙江中控技术股份有限公司 Calibration method and system of camera
CN111964693A (en) * 2020-07-21 2020-11-20 中国科学院长春光学精密机械与物理研究所 High-precision calibration method for internal and external orientation elements of surveying and mapping camera
CN111862238B (en) * 2020-07-23 2022-05-10 中国民航大学 Full-space monocular light pen type vision measurement method
CN111862238A (en) * 2020-07-23 2020-10-30 中国民航大学 Full-space monocular light pen type vision measurement method
CN112700501A (en) * 2020-12-12 2021-04-23 西北工业大学 Underwater monocular sub-pixel relative pose estimation method
CN112700501B (en) * 2020-12-12 2024-03-05 西北工业大学 Underwater monocular subpixel relative pose estimation method
CN112837373A (en) * 2021-03-03 2021-05-25 福州视驰科技有限公司 Multi-camera pose estimation method without feature point matching
CN112837373B (en) * 2021-03-03 2024-04-26 福州视驰科技有限公司 Multi-camera pose estimation method without feature point matching
CN113256742A (en) * 2021-07-15 2021-08-13 禾多科技(北京)有限公司 Interface display method and device, electronic equipment and computer readable medium
CN115526941A (en) * 2022-11-25 2022-12-27 海伯森技术(深圳)有限公司 Calibration device and calibration method for telecentric camera
CN116704045A (en) * 2023-06-20 2023-09-05 北京控制工程研究所 Multi-camera system calibration method for monitoring starry sky background simulation system
CN116704045B (en) * 2023-06-20 2024-01-26 北京控制工程研究所 Multi-camera system calibration method for monitoring starry sky background simulation system
CN117197241A (en) * 2023-09-14 2023-12-08 上海智能制造功能平台有限公司 Robot tail end absolute pose high-precision tracking method based on multi-eye vision

Also Published As

Publication number Publication date
CN111415391B (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN111415391B (en) External azimuth parameter calibration method for multi-camera by adopting mutual shooting method
CN110728715B (en) Intelligent inspection robot camera angle self-adaptive adjustment method
CN109859275B (en) Monocular vision hand-eye calibration method of rehabilitation mechanical arm based on S-R-S structure
CN110378969B (en) Convergent binocular camera calibration method based on 3D geometric constraint
CN108053450B (en) High-precision binocular camera calibration method based on multiple constraints
CN108648237B (en) Space positioning method based on vision
CN109794963B (en) Robot rapid positioning method facing curved surface component
WO2020199439A1 (en) Single- and dual-camera hybrid measurement-based three-dimensional point cloud computing method
CN112229323B (en) Six-degree-of-freedom measurement method of checkerboard cooperative target based on monocular vision of mobile phone and application of six-degree-of-freedom measurement method
CN115861445B (en) Hand-eye calibration method based on three-dimensional point cloud of calibration plate
CN113870366B (en) Calibration method and calibration system of three-dimensional scanning system based on pose sensor
CN109059755B (en) High-precision hand-eye calibration method for robot
CN111879354A (en) Unmanned aerial vehicle measurement system that becomes more meticulous
CN110363801B (en) Method for matching corresponding points of workpiece real object and three-dimensional CAD (computer-aided design) model of workpiece
Wei et al. Flexible calibration of a portable structured light system through surface plane
CN112857328B (en) Calibration-free photogrammetry method
CN113409395B (en) High-precision detection and positioning method for catheter end
CN109342008B (en) Wind tunnel test model attack angle single-camera video measuring method based on homography matrix
CN111145267A (en) IMU (inertial measurement unit) assistance-based 360-degree panoramic view multi-camera calibration method
Zou et al. Flexible Extrinsic Parameter Calibration for Multicameras With Nonoverlapping Field of View
Mavrinac et al. Calibration of dual laser-based range cameras for reduced occlusion in 3D imaging
CN100370220C (en) Single-image self-calibration for relative parameter of light structural three-dimensional system
Liu et al. Binocular camera calibration based on three-dimensional reconstruction error
Zhang et al. Camera Calibration Algorithm for Long Distance Binocular Measurement
CN114565714B (en) Monocular vision sensor hybrid high-precision three-dimensional structure recovery method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Wu Jun

Inventor after: Li Zechuan

Inventor after: Guo Runxia

Inventor after: Xu Jun

Inventor after: Li Yanling

Inventor after: Li Xin

Inventor before: Wu Jun

Inventor before: Li Zechuan

Inventor before: Xu Jun

Inventor before: Li Yanling

Inventor before: Li Xin

CB03 Change of inventor or designer information
GR01 Patent grant
GR01 Patent grant