CN113160331A - External parameter calibration method based on vision system imaging - Google Patents

Info

Publication number: CN113160331A (application CN202110426793.3A; granted as CN113160331B)
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: coordinate system, calibration frame, camera, calibration, rotation
Legal status: Granted; Active
Inventors: 张烁, 彭松, 张建利, 温博, 刘少创, 贾阳, 马友青, 亓晨, 鄢咏折, 吴运佳
Assignee (current and original): Beijing Institute of Spacecraft System Engineering; Aerospace Information Research Institute of CAS
Application filed by Beijing Institute of Spacecraft System Engineering and Aerospace Information Research Institute of CAS.

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The invention discloses an external parameter calibration method based on vision system imaging. First, the three-dimensional coordinates of each target in a calibration frame are measured in the calibration frame coordinate system with two high-precision theodolites. Several conversion targets are arranged on the tooling that fixes the camera to be measured, and a corresponding number of conversion targets are selected in the calibration frame. The camera then performs multi-angle sequence imaging of the calibration frame, and the image coordinates of the target points in the calibration frame are extracted. From these image coordinates and the world three-dimensional coordinates, accurate values of the rotation and translation parameters from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system are obtained. Finally, a least-squares coordinate conversion algorithm yields the rotation and translation parameters from the projection center coordinate system of the camera to be measured to the reference mirror coordinate system. The method solves the problem that a theodolite cannot aim at the projection center of a camera, and also the problem that the external parameters estimated by the Zhang Zhengyou calibration plate method are not the parameters required in spacecraft final assembly and task implementation.

Description

External parameter calibration method based on vision system imaging
Technical Field
The invention relates to the technical field of aerospace camera calibration, in particular to an external parameter calibration method based on vision system imaging.
Background
In the aerospace field, with the continuous advance of China's lunar and planetary exploration programs in recent years, structurally complex probes such as Chang'e-3, Chang'e-4, Chang'e-5 and a Mars probe have been developed in succession. In order to better explore extraterrestrial bodies and perform tasks, these probes are usually equipped with a vision system. For example, the engineering cameras installed on the Yutu-2 lunar rover include a pair of binocular navigation cameras and a pair of binocular obstacle-avoidance cameras; the installed scientific cameras include a pair of binocular panoramic cameras. Depending on the task, the cameras are designed differently, may be monocular or binocular, and may be installed at different positions on the whole probe. After a camera is integrated on the whole probe, in order to complete subsequent on-orbit tasks, the rotation (φ, ω, κ) and translation (X, Y, Z) parameters from the projection center of the camera to its reference mirror need to be obtained. Fig. 1 is a schematic diagram of a camera unit installed on a deep space probe in the prior art; the reference mirror facilitates the aiming of a theodolite during precision measurement of the whole assembly, so as to measure the accurate installation position of the unit on the whole probe.
At present, in spacecraft final assembly and integration, a theodolite system is generally used to accurately measure the installation position of a camera or other equipment on the whole spacecraft. However, a vision system is a complex optical system. In theodolite measurement, the surveyor finds the target to be aimed at, usually the cross hair of a reference mirror, in the theodolite eyepiece. For a vision system no such aiming target can be found, because the projection center of the camera is a virtual point located inside the lens with no physical embodiment in space, while theodolite measurement requires a solid target that can be aimed at. The compromise adopted in the prior art is to substitute the position of the reference mirror cross hair for the installation position of the vision system on the whole probe, but this treatment is inaccurate.
Another camera calibration method in the prior art is the Zhang Zhengyou chessboard calibration method, which is widely applied in the computer vision and optics fields. Specifically, a calibration scene is constructed with a calibration plate, and the camera images the plate while its position and attitude are changed (generally 10-20 times). After the test, with the exact grid size of the calibration plate known, the external parameters of the camera are estimated by a post-processing algorithm. The external parameters calibrated by this method are the rotation and translation parameters from the camera projection center to the calibration plate coordinate system, because the method defines the world coordinate system at the first corner point at the lower left of the calibration plate; for a binocular camera, the estimated three-dimensional external parameters are the rotation and translation parameters from the right camera coordinate system to the left camera coordinate system. In a spacecraft development task, however, these are not the desired external parameters, so this method cannot obtain the external parameters from the camera to the whole-spacecraft coordinate system.
Disclosure of Invention
The invention aims to provide an external parameter calibration method based on vision system imaging, which can solve the problem that a theodolite cannot aim at the camera projection center, and also the problem that the external parameters estimated by the Zhang Zhengyou calibration plate method are not the parameters required in spacecraft final assembly and task implementation.
The purpose of the invention is realized by the following technical scheme:
a method for external parameter calibration based on vision system imaging, the method comprising:
step 1, firstly, measuring the three-dimensional coordinates of each target in a calibration frame under the calibration frame coordinate system by using two high-precision theodolites together with a standard ruler, with a forward-intersection method over multiple measurement rounds; the calibration frame is a low-deformation control field made of tungsten steel, containing 92 targets inside, the accurate three-dimensional coordinates of which are fixed relative to one another;
step 2, arranging a plurality of conversion targets on a camera tool for fixing a camera to be detected, selecting a corresponding number of conversion targets in the calibration frame, and measuring three-dimensional coordinates of the conversion targets arranged on the camera tool in a reference mirror coordinate system through the high-precision theodolite;
step 3, performing multi-angle sequence imaging on the calibration frame by using the camera to be tested;
step 4, extracting image coordinates of the target points in the calibration frame from the image obtained in the step 3;
step 5, obtaining initial values and accurate values of rotation and translation parameters from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame;
step 6, after the camera to be detected performs each imaging, measuring three-dimensional coordinates of a plurality of conversion targets arranged on the camera tooling and a plurality of conversion targets corresponding to the calibration frame in a theodolite coordinate system by using a high-precision theodolite, and solving by adopting a least square coordinate conversion method to obtain rotation and translation parameters from the calibration frame coordinate system to a reference mirror coordinate system;
and 7, obtaining the accurate values of the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the calibration frame coordinate system and the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system obtained in the step 6 by utilizing the step 5, and solving by adopting a least square coordinate conversion algorithm to obtain the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the reference mirror coordinate system so as to finish the solving of the external parameters of the camera to be detected.
The technical scheme provided by the invention solves the problem that the theodolite cannot aim at the projection center of the camera, and also overcomes the problem that the external parameters estimated by the Zhang Zhengyou calibration plate method are not the parameters required in spacecraft final assembly and task implementation, thereby providing an effective, high-precision geometric parameter estimation and measurement solution for spacecraft final assembly and task implementation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
FIG. 1 is a schematic diagram of a single camera mounted on a deep space probe in the prior art;
FIG. 2 is a schematic flow chart of an external parameter calibration method based on vision system imaging according to an embodiment of the present invention;
fig. 3 is a schematic diagram of sequential imaging according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the present invention will be further described in detail with reference to the accompanying drawings, and fig. 2 is a schematic flow chart of an external parameter calibration method based on vision system imaging provided by the embodiment of the present invention, where the method includes:
step 1, firstly, measuring the three-dimensional coordinates of each target in a calibration frame under the calibration frame coordinate system by using two high-precision theodolites together with a standard ruler, with a forward-intersection method over multiple measurement rounds;
the calibration frame is a low-deformation control field made of tungsten steel, containing 92 targets inside, the accurate three-dimensional coordinates of which are fixed relative to one another;
in the specific implementation, the high-precision theodolite is an auto-collimation electronic theodolite, can measure the horizontal angle and the vertical angle of a target point of an object, and calculates the three-dimensional coordinate of the target point of the object according to the triangulation principle, and the auto-collimation electronic theodolite has an auto-collimation function and can be used for establishing a three-axis orthogonal coordinate system;
the standard ruler is used as a scale reference in the control point coordinate calculation process, if the nominal length of the standard ruler is inaccurate, the coordinates of the control points measured by the theodolite are also inaccurate, and the length error of the calibrated first-level invar leveling ruler is 0.02 mm.
Step 2, arranging a plurality of conversion targets on a camera tool for fixing a camera to be detected, selecting a corresponding number of conversion targets in the calibration frame, and measuring three-dimensional coordinates of the conversion targets arranged on the camera tool in a reference mirror coordinate system through the high-precision theodolite;
in this step, the plurality of conversion targets is at least 3.
Step 3, performing multi-angle sequence imaging on the calibration frame by using the camera to be tested;
in this step, the embodiment of the present invention is designed to take 5 shots, based on experience reported in the domestic and foreign literature.
In a specific implementation, the multi-angle imaging of the calibration frame is not shot randomly in arbitrary attitudes. Fig. 3 is a schematic diagram of the sequence imaging according to the embodiment of the present invention, where: C denotes the camera coordinate system, J the reference mirror coordinate system, B the calibration frame coordinate system, and T the theodolite coordinate system; P1 to P5 denote the 5 imaging positions of the camera; T1 and T2 denote the mounting positions of the two theodolites. The camera to be measured images the center of the calibration frame from the outermost boundaries on both sides of the frame; relative to the frontal position aimed at the frame center, the single-side beam angle is not more than 30 degrees, and the included angle between the two sides is not more than 60 degrees.
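As a quick plausibility check of the imaging geometry described above, the one-sided beam angle can be computed from planned camera station positions. This is an illustrative sketch, not part of the patent; the function name and station layout are assumptions.

```python
import numpy as np

def beam_angle_deg(station, frontal_station, frame_center):
    """Angle at the calibration-frame center between the ray to a side
    imaging station and the ray to the frontal station (the one-sided
    beam angle, which the method limits to 30 degrees)."""
    v1 = np.asarray(station, float) - np.asarray(frame_center, float)
    v2 = np.asarray(frontal_station, float) - np.asarray(frame_center, float)
    c = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))
```

For example, a station placed 25 degrees off the frontal axis at the same distance satisfies the single-side limit, and two such stations on opposite sides stay within the 60-degree total.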
Step 4, extracting image coordinates of the target points in the calibration frame from the image obtained in the step 3;
in this step, the target point image to be extracted is magnified 10 times, then smoothed for noise reduction, and finally the image coordinates of the target point are collected;
wherein, the image coordinate extraction precision of the target point is better than 0.25 pixel.
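A minimal sketch of the extraction step above, assuming a grayscale patch cropped around one target: the patch is magnified 10 times, smoothed, and an intensity-weighted centroid is taken. The nearest-neighbour magnification and 3x3 box blur are stand-ins chosen for illustration; the patent does not specify the interpolation or filter, and no claim is made that this sketch reaches the stated 0.25-pixel accuracy.

```python
import numpy as np

def target_centroid(patch, zoom=10):
    """Sub-pixel target center: magnify, smooth, intensity-weighted centroid.

    Returns (x, y) in the original pixel coordinates of `patch`.
    """
    big = np.kron(np.asarray(patch, float), np.ones((zoom, zoom)))  # magnify
    pad = np.pad(big, 1, mode="edge")
    # 3x3 box blur implemented as the mean of the nine shifted copies
    sm = sum(pad[i:i + big.shape[0], j:j + big.shape[1]]
             for i in range(3) for j in range(3)) / 9.0
    yy, xx = np.mgrid[0:sm.shape[0], 0:sm.shape[1]]
    w = sm.sum()
    mx, my = (xx * sm).sum() / w, (yy * sm).sum() / w
    # map the magnified-pixel centroid back to original pixel units
    return (mx - (zoom - 1) / 2) / zoom, (my - (zoom - 1) / 2) / zoom
```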
Step 5, obtaining initial values and accurate values of rotation and translation parameters from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame;
in the step, specifically, according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame, a direct linear transformation algorithm DLT (the algorithm does not need an initial value of a parameter to be solved) is used for obtaining an initial value of a rotation parameter and a translation parameter from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system;
and then, solving the accurate value of the external parameter through least square iteration by using a multi-image back intersection algorithm.
In the specific implementation, the calculation formula of the DLT is as follows:
x = (l1·X + l2·Y + l3·Z + l4) / (l9·X + l10·Y + l11·Z + 1)
y = (l5·X + l6·Y + l7·Z + l8) / (l9·X + l10·Y + l11·Z + 1)    (1)
in the formula (1), (X, Y, Z) represents the three-dimensional coordinates of a target point in the calibration frame under the calibration frame coordinate system; (x, y) represents the two-dimensional coordinates of the target point extracted from the image; l1-l11 represent imaging parameters and are the unknowns to be solved; specifically, l1-l11 are solved by the following formula (2):
l1·Xn + l2·Yn + l3·Zn + l4 - xn·(l9·Xn + l10·Yn + l11·Zn) - xn = 0
l5·Xn + l6·Yn + l7·Zn + l8 - yn·(l9·Xn + l10·Yn + l11·Zn) - yn = 0    (2)
In the above formula, n denotes the nth target point. The translation parameters (XS, YS, ZS) from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system are solved with formulas (3) to (5):
l1·XS + l2·YS + l3·ZS = -l4    (3)
l5·XS + l6·YS + l7·ZS = -l8    (4)
l9·XS + l10·YS + l11·ZS = -1    (5)
γ = 1/√(l9^2 + l10^2 + l11^2),  a3 = γ·l9,  b3 = γ·l10,  c3 = γ·l11, with the remaining elements a1, a2, b1, b2, c1, c2 following analogously from l1-l8 and the interior orientation elements
Wherein, a1, a2, ..., c3 denote the 9 elements of the rotation matrix R formed by the rotation parameters (φ, ω, κ) from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system;
solving the rotation parameters (φ, ω, κ) from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system by adopting the formulas (6) to (8):
tan φ = -a3 / c3    (6)
sin ω = -b3    (7)
tan κ = b1 / b2    (8)
Then, translation and rotation parameters from the projection center coordinate system of the camera to be measured to the coordinate system of the calibration frame can be calculated through formulas (3) to (8);
the calculated translation and rotation parameters from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system are then substituted as initial values into the multi-image back intersection algorithm; the specific calculation model is shown in formula (9):
x = -f · [a1(X - XS) + b1(Y - YS) + c1(Z - ZS)] / [a3(X - XS) + b3(Y - YS) + c3(Z - ZS)]
y = -f · [a2(X - XS) + b2(Y - YS) + c2(Z - ZS)] / [a3(X - XS) + b3(Y - YS) + c3(Z - ZS)]    (9)
in the formula (9), f represents the camera principal distance;
performing a first-order Taylor series expansion of formula (9) and solving for the unknowns (XS, YS, ZS, φ, ω, κ) by an iterative method yields the accurate values of the translation and rotation parameters from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system.
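The DLT initialization of step 5 can be sketched as follows, assuming the pinhole model of formula (1): the coefficients l1-l11 are fitted by linear least squares over all target points (formula (2)), and the projection center is recovered from formulas (3) to (5). Function names are illustrative, and the iterative refinement via formula (9) is omitted.

```python
import numpy as np

def dlt_fit(xy, XYZ):
    """Fit the 11 DLT coefficients l1..l11 of formula (1) by linear least
    squares (formula (2)); needs at least 6 non-coplanar control points."""
    A, b = [], []
    for (x, y), (X, Y, Z) in zip(xy, XYZ):
        # cross-multiplied form of formula (1), linear in l1..l11
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z]); b.append(x)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z]); b.append(y)
    l, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float),
                            rcond=None)
    return l

def projection_center(l):
    """Initial translation (XS, YS, ZS) from formulas (3)-(5): the point
    where numerator and denominator planes of formula (1) all vanish."""
    M = np.array([l[0:3], l[4:7], l[8:11]])
    return np.linalg.solve(M, [-l[3], -l[7], -1.0])
```

On noise-free synthetic pinhole data the recovered center matches the true projection center, which is what makes it a usable initial value for the least-squares iteration.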
Step 6, after the camera to be detected performs each imaging, measuring three-dimensional coordinates of a plurality of conversion targets arranged on the camera tooling and a plurality of conversion targets corresponding to the calibration frame in a theodolite coordinate system by using a high-precision theodolite, and solving by adopting a least square coordinate conversion method to obtain rotation and translation parameters from the calibration frame coordinate system to a reference mirror coordinate system;
in the step, on the premise that the three-dimensional coordinates of the plurality of conversion targets arranged on the camera tool under the coordinate system of the reference mirror are known, and the three-dimensional coordinates of the plurality of conversion targets corresponding to the calibration frame under the coordinate system of the calibration frame are known, the rotation and translation parameters from the coordinate system of the calibration frame to the coordinate system of the reference mirror are obtained by solving through a least square coordinate conversion method.
In the concrete implementation, the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system are obtained by solving the following three-dimensional space similarity transformation error equation:
[XJ, YJ, ZJ]^T = R_B-J · [XB, YB, ZB]^T + T_B-J    (10)
In formula (10), R_B-J represents the rotation matrix from the calibration frame coordinate system to the reference mirror coordinate system; T_B-J represents the translation vector from the calibration frame coordinate system to the reference mirror coordinate system; (XB, YB, ZB) represents the three-dimensional coordinates of the conversion targets under the calibration frame coordinate system; (XJ, YJ, ZJ) represents the three-dimensional coordinates of the conversion targets under the reference mirror coordinate system.
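The least-squares coordinate conversion of formula (10) can be sketched with the closed-form SVD (Kabsch) solution below; the patent only states that a least-squares method is used, so this particular solver and the function name are assumptions.

```python
import numpy as np

def fit_rigid(src, dst):
    """Least-squares R, T with dst ~ R @ src + T (formula (10)),
    for matched 3D point sets (at least 3 non-collinear points)."""
    src = np.asarray(src, float); dst = np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)        # centroids
    H = (src - cs).T @ (dst - cd)                      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # guard against a reflection in the least-squares solution
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])
    R = Vt.T @ D @ U.T
    T = cd - R @ cs
    return R, T
```

Here `src` would hold the conversion-target coordinates in the calibration frame system and `dst` the same targets in the reference mirror system.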
And 7, obtaining the accurate values of the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the calibration frame coordinate system and the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system obtained in the step 6 by utilizing the step 5, and solving by adopting a least square coordinate conversion algorithm to obtain the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the reference mirror coordinate system so as to finish the solving of the external parameters of the camera to be detected.
In this step, the rotation matrix R_C-J from the projection center coordinate system of the camera to be measured to the reference mirror coordinate system is calculated by the following formula:
R_C-J = R_B-J · R_C-B    (11)
In formula (11), R_B-J represents the rotation matrix from the calibration frame coordinate system to the reference mirror coordinate system; R_C-B represents the rotation matrix from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system;
and the translation parameter T_C-J from the projection center coordinate system of the camera to be measured to the reference mirror coordinate system is calculated by the following formula:
T_C-J = R_B-J · T_C-B + T_B-J    (12)
In formula (12), T_C-B represents the translation vector from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system; T_B-J represents the translation vector from the calibration frame coordinate system to the reference mirror coordinate system.
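Formulas (11) and (12) amount to composing two rigid transforms; a minimal sketch (names illustrative):

```python
import numpy as np

def compose(R_BJ, T_BJ, R_CB, T_CB):
    """Chain camera-to-frame with frame-to-mirror (formulas (11)-(12)):
    R_CJ = R_BJ @ R_CB,  T_CJ = R_BJ @ T_CB + T_BJ."""
    return R_BJ @ R_CB, R_BJ @ T_CB + T_BJ
```

Mapping a point through the composed transform gives the same result as applying the two transforms in sequence, which is the defining property of formulas (11) and (12).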
It is noted that details not described herein in the embodiments of the present invention are known to those skilled in the art.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (9)

1. An external parameter calibration method based on vision system imaging is characterized by comprising the following steps:
step 1, firstly, measuring the three-dimensional coordinates of each target in a calibration frame under the calibration frame coordinate system by using two high-precision theodolites together with a standard ruler, with a forward-intersection method over multiple measurement rounds; the calibration frame is a low-deformation control field made of tungsten steel, containing 92 targets inside, the accurate three-dimensional coordinates of which are fixed relative to one another;
step 2, arranging a plurality of conversion targets on a camera tool for fixing a camera to be detected, selecting a corresponding number of conversion targets in the calibration frame, and measuring three-dimensional coordinates of the conversion targets arranged on the camera tool in a reference mirror coordinate system through the high-precision theodolite;
step 3, performing multi-angle sequence imaging on the calibration frame by using the camera to be tested;
step 4, extracting image coordinates of the target points in the calibration frame from the image obtained in the step 3;
step 5, obtaining initial values and accurate values of rotation and translation parameters from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame;
step 6, after the camera to be detected performs each imaging, measuring three-dimensional coordinates of a plurality of conversion targets arranged on the camera tooling and a plurality of conversion targets corresponding to the calibration frame in a theodolite coordinate system by using a high-precision theodolite, and solving by adopting a least square coordinate conversion method to obtain rotation and translation parameters from the calibration frame coordinate system to a reference mirror coordinate system;
and 7, obtaining the accurate values of the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the calibration frame coordinate system and the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system obtained in the step 6 by utilizing the step 5, and solving by adopting a least square coordinate conversion algorithm to obtain the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the reference mirror coordinate system so as to finish the solving of the external parameters of the camera to be detected.
2. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein, in step 1,
the high-precision theodolite is an auto-collimation electronic theodolite, can measure the horizontal angle and the vertical angle of a target point of an object, and calculates the three-dimensional coordinate of the target point of the object by a triangulation principle, and the auto-collimation electronic theodolite has an auto-collimation function and can be used for establishing a three-axis orthogonal coordinate system;
the standard ruler is used as the scale reference in the control point coordinate calculation; a calibrated first-level invar levelling staff is adopted, with a length error of 0.02 mm.
3. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein in step 3, the camera to be tested images the center of the calibration frame from the outermost boundaries on both sides of the calibration frame; relative to the frontal position aimed at the frame center, the single-side beam angle is not more than 30 degrees and the included angle between the two sides is not more than 60 degrees.
4. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein in step 4, the target point image to be extracted is amplified by 10 times, then smoothed and denoised, and finally the image coordinates of the target point are collected;
wherein, the image coordinate extraction precision of the target point is better than 0.25 pixel.
5. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein said step 5 process specifically comprises:
obtaining initial values of rotation and translation parameters from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system by using a direct linear transformation algorithm DLT according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame;
and then, solving the accurate value of the external parameter through least square iteration by using a multi-image back intersection algorithm.
6. The vision system imaging-based extrinsic parameter calibration method according to claim 5, wherein said DLT is calculated as follows:
x = (l1·X + l2·Y + l3·Z + l4) / (l9·X + l10·Y + l11·Z + 1)
y = (l5·X + l6·Y + l7·Z + l8) / (l9·X + l10·Y + l11·Z + 1)    (1)
in the formula (1), (X, Y, Z) represents the three-dimensional coordinates of a target point in the calibration frame under the calibration frame coordinate system; (x, y) represents the two-dimensional coordinates of the target point extracted from the image; l1-l11 represent imaging parameters and are the unknowns to be solved; specifically, l1-l11 are solved by the following formula (2):
l1·Xn + l2·Yn + l3·Zn + l4 - xn·(l9·Xn + l10·Yn + l11·Zn) - xn = 0
l5·Xn + l6·Yn + l7·Zn + l8 - yn·(l9·Xn + l10·Yn + l11·Zn) - yn = 0    (2)
In the above formula, n denotes the nth target point. The translation parameters (XS, YS, ZS) from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system are solved with formulas (3) to (5):
l1·XS + l2·YS + l3·ZS = -l4    (3)
l5·XS + l6·YS + l7·ZS = -l8    (4)
l9·XS + l10·YS + l11·ZS = -1    (5)
γ = 1/√(l9^2 + l10^2 + l11^2),  a3 = γ·l9,  b3 = γ·l10,  c3 = γ·l11, with the remaining elements a1, a2, b1, b2, c1, c2 following analogously from l1-l8 and the interior orientation elements
wherein a1, a2, ..., c3 denote the 9 elements of the rotation matrix R formed by the rotation parameters (φ, ω, κ) from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system;
solving the rotation parameters (φ, ω, κ) from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system by adopting the formulas (6) to (8):
tan φ = -a3 / c3    (6)
sin ω = -b3    (7)
tan κ = b1 / b2    (8)
Then, calculating translation and rotation parameters from the projection center coordinate system of the camera to be measured to the coordinate system of the calibration frame through formulas (3) to (8);
and substituting the calculated translation and rotation parameters from the projection center coordinate system of the camera to be measured to the coordinate system of the calibration frame into a multi-image back intersection algorithm by taking the translation and rotation parameters as initial values, wherein a specific calculation model is shown as a formula (9):
x = -f · [a1(X - XS) + b1(Y - YS) + c1(Z - ZS)] / [a3(X - XS) + b3(Y - YS) + c3(Z - ZS)]
y = -f · [a2(X - XS) + b2(Y - YS) + c2(Z - ZS)] / [a3(X - XS) + b3(Y - YS) + c3(Z - ZS)]    (9)
in the formula (9), f represents the camera principal distance;
performing a first-order Taylor series expansion of formula (9) and solving for the unknowns (XS, YS, ZS, φ, ω, κ) by an iterative method, thereby obtaining the accurate values of the translation and rotation parameters from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system.
7. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein in step 6, on the premise that the three-dimensional coordinates of the 7 conversion targets installed on the camera rig are known in both the reference mirror coordinate system and the calibration frame coordinate system, a least-squares coordinate transformation method is used to solve the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system.
8. The vision system imaging-based extrinsic parameter calibration method according to claim 7, wherein in step 6, the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system are obtained by solving the following three-dimensional space similarity transformation error equation:

$$\begin{pmatrix} X_J \\ Y_J \\ Z_J \end{pmatrix} = R_{B\text{-}J} \begin{pmatrix} X_B \\ Y_B \\ Z_B \end{pmatrix} + T_{B\text{-}J} \quad (10)$$

In formula (10), $R_{B\text{-}J}$ represents the rotation matrix from the calibration frame coordinate system to the reference mirror coordinate system; $T_{B\text{-}J}$ represents the translation vector from the calibration frame coordinate system to the reference mirror coordinate system; $(X_B, Y_B, Z_B)$ represents the three-dimensional coordinates of the conversion targets in the calibration frame coordinate system; $(X_J, Y_J, Z_J)$ represents the three-dimensional coordinates of the conversion targets in the reference mirror coordinate system.
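The claims do not name a specific solver for the model in formula (10); since the formula as written carries no scale factor, a rigid rotation-plus-translation fit suffices, and a common least-squares choice is the SVD-based (Kabsch) method, sketched here with NumPy as one standard option rather than the patent's stated algorithm:

```python
import numpy as np

def fit_rigid_transform(B, J):
    """Least-squares R, T such that J ~ R @ B + T (Kabsch/SVD),
    one common way to solve the model of formula (10) from matched
    target coordinates. B, J: (N, 3) arrays of corresponding points."""
    B, J = np.asarray(B, float), np.asarray(J, float)
    Bc, Jc = B.mean(axis=0), J.mean(axis=0)
    H = (B - Bc).T @ (J - Jc)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = Jc - R @ Bc
    return R, T
```

At least three non-collinear targets are required; the 7 conversion targets in the claims give redundancy for the least-squares fit.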
9. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein in step 7, the rotation matrix $R_{C\text{-}J}$ from the projection center coordinate system of the camera to be measured to the reference mirror coordinate system is calculated by the following formula:

$$R_{C\text{-}J} = R_{B\text{-}J} \cdot R_{C\text{-}B} \quad (11)$$

In formula (11), $R_{B\text{-}J}$ represents the rotation matrix from the calibration frame coordinate system to the reference mirror coordinate system; $R_{C\text{-}B}$ represents the rotation matrix from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system;

and the translation parameter $T_{C\text{-}J}$ from the projection center coordinate system of the camera to be measured to the reference mirror coordinate system is calculated by the following formula:

$$T_{C\text{-}J} = R_{B\text{-}J} \cdot T_{C\text{-}B} + T_{B\text{-}J} \quad (12)$$

In formula (12), $T_{C\text{-}B}$ represents the translation vector from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system; $T_{B\text{-}J}$ represents the translation vector from the calibration frame coordinate system to the reference mirror coordinate system.
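Formulas (11) and (12) simply chain the two previously estimated transforms; a sketch (function name hypothetical):

```python
import numpy as np

def compose_transforms(R_BJ, T_BJ, R_CB, T_CB):
    """Chain camera -> calibration frame -> reference mirror,
    per formulas (11) and (12).
    Returns (R_CJ, T_CJ) such that X_J = R_CJ @ X_C + T_CJ."""
    R_CJ = R_BJ @ R_CB             # formula (11)
    T_CJ = R_BJ @ T_CB + T_BJ      # formula (12)
    return R_CJ, T_CJ
```

Applying the composed transform to a camera-frame point gives the same result as applying the two transforms in sequence, which is the consistency property the formulas rely on.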
CN202110426793.3A 2021-04-20 2021-04-20 External parameter calibration method based on visual system imaging Active CN113160331B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110426793.3A CN113160331B (en) 2021-04-20 2021-04-20 External parameter calibration method based on visual system imaging


Publications (2)

Publication Number Publication Date
CN113160331A true CN113160331A (en) 2021-07-23
CN113160331B CN113160331B (en) 2023-02-24

Family ID: 76867609

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114463422A (en) * 2022-04-14 2022-05-10 北京深度奇点科技有限公司 Method and system for image measurement correction
WO2024032663A1 (en) * 2022-08-11 2024-02-15 深圳大学 Underwater photogrammetry-based method for measurement during docking of immersed tube segments

Citations (5)

Publication number Priority date Publication date Assignee Title
CN102012236A (en) * 2010-09-26 2011-04-13 郑州辰维科技股份有限公司 Method for calibrating moon rover binocular vision obstacle avoidance system
US20150029345A1 (en) * 2012-01-23 2015-01-29 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
CN105469418A (en) * 2016-01-04 2016-04-06 中车青岛四方机车车辆股份有限公司 Photogrammetry-based wide-field binocular vision calibration device and calibration method
CN107610178A * 2017-07-27 2018-01-19 北京航天计量测试技术研究所 Mobile calibration method for camera parameters of an industrial photogrammetry system
CN108645426A * 2018-04-09 2018-10-12 北京空间飞行器总体设计部 On-orbit self-calibration method for a space target relative navigation vision measurement system


Non-Patent Citations (2)

Title
YOUQING MA et al.: "A precise visual localisation method for the Chinese Chang'e-4 Yutu-2 rover", The Photogrammetric Record *
MA Youqing et al.: "High-precision prediction experiments of terrain occlusion based on Chang'e-4 lunar rover images", Scientia Sinica *



Similar Documents

Publication Publication Date Title
CN103759716B Dynamic target position and attitude measurement method based on monocular vision at the end of a mechanical arm
CN108986070B (en) Rock crack propagation experiment monitoring method based on high-speed video measurement
JP5992184B2 (en) Image data processing apparatus, image data processing method, and image data processing program
CN107014399B (en) Combined calibration method for satellite-borne optical camera-laser range finder combined system
CN108444449B Object space attitude measurement method for targets with parallel-line features
CN113160331B (en) External parameter calibration method based on visual system imaging
US20120257792A1 (en) Method for Geo-Referencing An Imaged Area
US11212511B1 (en) Residual error mitigation in multiview calibration
CN113048980B (en) Pose optimization method and device, electronic equipment and storage medium
CN108801218B (en) High-precision orientation and orientation precision evaluation method of large-size dynamic photogrammetry system
CN110363758B (en) Optical remote sensing satellite imaging quality determination method and system
CN113947638A (en) Image orthorectification method for fisheye camera
CN113870366A (en) Calibration method and calibration system of three-dimensional scanning system based on pose sensor
WO2017131547A1 (en) A method and apparatus for single camera optical measurements
García-Moreno et al. Error propagation and uncertainty analysis between 3D laser scanner and camera
CN111815712B (en) High-precision camera-single laser instrument combined calibration method
JP4935769B2 (en) Plane region estimation apparatus and program
JP4804371B2 (en) Sensor bias error estimation device
CN111220118A (en) Laser range finder based on visual inertial navigation system and range finding method
CN216116064U (en) Pose calibration system of heading machine
US11137247B2 (en) Method and system for measuring the orientation of one rigid object relative to another
CN114705223A (en) Inertial navigation error compensation method and system for multiple mobile intelligent bodies in target tracking
CN112665613A (en) Pose calibration method and system of heading machine
Shen et al. Accurate direct georeferencing of aerial imagery in national coordinates
JPH1137736A (en) Method and device for measuring 3-dimentional shape

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant