CN113160331B - External parameter calibration method based on visual system imaging - Google Patents


Publication number
CN113160331B
Authority
CN
China
Prior art keywords
coordinate system
calibration frame
camera
calibration
rotation
Prior art date
Legal status
Active
Application number
CN202110426793.3A
Other languages
Chinese (zh)
Other versions
CN113160331A
Inventor
张烁
彭松
张建利
温博
刘少创
贾阳
马友青
亓晨
鄢咏折
吴运佳
Current Assignee
Beijing Institute of Spacecraft System Engineering
Aerospace Information Research Institute of CAS
Original Assignee
Beijing Institute of Spacecraft System Engineering
Aerospace Information Research Institute of CAS
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Spacecraft System Engineering, Aerospace Information Research Institute of CAS filed Critical Beijing Institute of Spacecraft System Engineering
Priority: CN202110426793.3A
Publication of application: CN113160331A
Publication of grant: CN113160331B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The invention discloses an external parameter calibration method based on vision system imaging. First, the three-dimensional coordinates of each target of a calibration frame are measured in the calibration-frame coordinate system with two high-precision theodolites. Several conversion targets are arranged on the tool that fixes the camera to be measured, and corresponding conversion targets are selected on the calibration frame. The camera then images the calibration frame in a multi-angle sequence, and the image coordinates of the target points of the calibration frame are extracted. From these image coordinates and the world three-dimensional coordinates, accurate values of the rotation and translation parameters from the projection-center coordinate system of the camera to the calibration-frame coordinate system are obtained. Finally, a least-squares coordinate conversion algorithm is used to solve for the rotation and translation parameters from the projection-center coordinate system of the camera to the reference-mirror coordinate system. The method solves the problem that a theodolite cannot aim at the camera projection center, and also the problem that the external parameters estimated by the Zhang Zhengyou calibration-plate method are not the parameters required in spacecraft final assembly and task implementation.

Description

External parameter calibration method based on visual system imaging
Technical Field
The invention relates to the technical field of aerospace camera calibration, in particular to an external parameter calibration method based on vision system imaging.
Background
In the aerospace field, with the continuous advance of China's lunar and planetary exploration programs in recent years, structurally complex probes such as Chang'e-3, Chang'e-4, Chang'e-5 and a Mars probe have been developed in succession. To better explore extraterrestrial bodies and carry out their tasks, these probes are usually equipped with a vision system. For example, the engineering cameras installed on the Yutu-2 lunar rover include a pair of binocular navigation cameras and a pair of binocular obstacle-avoidance cameras, and the scientific cameras installed include a pair of binocular panoramic cameras. Depending on the task, the cameras are designed differently (they may be monocular or binocular), and their installation positions on the probe also differ. After a camera is integrated on the complete probe, the rotation (φ, ω, κ) and translation (X, Y, Z) parameters from the camera projection center to its reference mirror must be obtained in order to complete subsequent on-orbit tasks. Fig. 1 is a schematic diagram of a camera unit installed on a deep space probe in the prior art; the reference mirror is provided so that a theodolite can conveniently aim at it during precision measurement of the assembled probe, thereby measuring the accurate installation position of the unit on the whole probe.
At present, during spacecraft final assembly and integration, a theodolite system is generally used to accurately measure the installation position of a camera or other equipment on the spacecraft. A vision system, however, is a complex optical system. In theodolite measurement, the surveyor finds the target to be aimed at in the theodolite eyepiece, usually the cross-hairs of a reference mirror. For a vision system no such aiming target exists: the camera projection center is a virtual point inside the lens with no physical embodiment in space, whereas theodolite measurement requires a solid target that can be sighted. The compromise in the prior art is therefore to use the position of the reference-mirror cross-hairs as the installation position of the vision system on the spacecraft, but this treatment is inaccurate.
Another prior-art camera calibration method is the Zhang Zhengyou chessboard calibration method, widely applied in computer vision and optics. A calibration scene is built with a calibration plate, and calibration images are taken while the position and attitude of the plate are changed (generally 10 to 20 times). After the test, with the exact grid size of the plate known, the external parameters of the camera are estimated by a post-processing algorithm. The external parameters calibrated in this way are the rotation and translation from the camera projection center to the calibration-plate coordinate system, because the method defines the world coordinate system at the first lower-left corner point of the plate; for a stereo pair, the estimated external parameters are the rotation and translation from the right-camera to the left-camera coordinate system. In a spacecraft development task these are not the desired external parameters, so the method cannot provide the external parameters from the camera to the coordinate system of the whole spacecraft.
Disclosure of Invention
The invention aims to provide an external parameter calibration method based on vision system imaging, which can solve the problem that a theodolite cannot aim at the camera projection center, and also the problem that the external parameters estimated by the Zhang Zhengyou calibration-plate method are not the parameters required in spacecraft final assembly and task implementation.
The purpose of the invention is realized by the following technical scheme:
a method for external parameter calibration based on vision system imaging, the method comprising:
step 1, firstly, measuring the three-dimensional coordinates of each target of a calibration frame in the calibration-frame coordinate system by using two high-precision theodolites and a multi-round forward intersection method together with a standard ruler; the calibration frame is a low-deformation control field made of tungsten steel, its interior contains 92 targets, and the accurate three-dimensional coordinates of the targets are fixed relative to one another;
step 2, arranging a plurality of conversion targets on a camera tool for fixing a camera to be detected, selecting a corresponding number of conversion targets in the calibration frame, and measuring three-dimensional coordinates of the plurality of conversion targets arranged on the camera tool in a reference mirror coordinate system through the high-precision theodolite;
step 3, performing multi-angle sequence imaging on the calibration frame by using the camera to be tested;
step 4, extracting image coordinates of the target points in the calibration frame from the image obtained in the step 3;
step 5, obtaining initial values and accurate values of rotation and translation parameters from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame;
step 6, after the camera to be detected performs each imaging, measuring three-dimensional coordinates of a plurality of conversion targets arranged on the camera tooling and a plurality of conversion targets corresponding to the calibration frame in a theodolite coordinate system by using a high-precision theodolite, and solving by adopting a least square coordinate conversion method to obtain rotation and translation parameters from the calibration frame coordinate system to a reference mirror coordinate system;
and 7, obtaining the accurate values of the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the calibration frame coordinate system and the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system obtained in the step 6 by utilizing the step 5, and solving by adopting a least square coordinate conversion algorithm to obtain the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the reference mirror coordinate system so as to finish the solving of the external parameters of the camera to be detected.
The technical scheme provided by the invention solves the problem that the theodolite cannot aim at the projection center of the camera, and also overcomes the problem that the external parameters estimated by the Zhang Zhengyou calibration-plate method are not the parameters required in spacecraft final assembly and task implementation, thereby providing an effective, high-precision geometric parameter estimation and measurement solution for spacecraft final assembly and task implementation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
FIG. 1 is a schematic diagram of a single camera mounted on a deep space probe in the prior art;
FIG. 2 is a schematic flow chart of an external parameter calibration method based on vision system imaging according to an embodiment of the present invention;
fig. 3 is a schematic diagram of sequential imaging according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the present invention will be further described in detail with reference to the accompanying drawings, and fig. 2 is a schematic flow chart of an external parameter calibration method based on vision system imaging provided by the embodiment of the present invention, where the method includes:
step 1, firstly, measuring the three-dimensional coordinates of each target of the calibration frame in the calibration-frame coordinate system by using two high-precision theodolites and a multi-round forward intersection method together with a standard ruler;
the calibration frame is a control field made of tungsten steel and small in deformation, the interior of the calibration frame is composed of 92 targets, and the accurate three-dimensional coordinate of each target is relatively fixed;
in the specific implementation, the high-precision theodolite is an auto-collimation electronic theodolite, can measure the horizontal angle and the vertical angle of a target point of an object, and calculates the three-dimensional coordinate of the target point of the object according to the triangulation principle, and the auto-collimation electronic theodolite has an auto-collimation function and can be used for establishing a three-axis orthogonal coordinate system;
the standard ruler is used as a scale reference in the control point coordinate calculation process, if the nominal length of the standard ruler is inaccurate, the coordinates of the control point measured by the theodolite are also inaccurate, and the length error of the calibrated first-level invar leveling ruler is 0.02 mm.
Step 2, arranging a plurality of conversion targets on a camera tool for fixing a camera to be detected, selecting a corresponding number of conversion targets in the calibration frame, and measuring three-dimensional coordinates of the conversion targets arranged on the camera tool in a reference mirror coordinate system through the high-precision theodolite;
in this step, the plurality of conversion targets is at least 3.
Step 3, performing multi-angle sequence imaging on the calibration frame by using the camera to be tested;
in this step, the embodiment of the present invention takes 5 images, a number chosen from experience reported in domestic and foreign literature.
In a specific implementation, the multi-angle imaging of the calibration frame is not shot at random attitudes. Fig. 3 is a schematic diagram of the sequence imaging according to the embodiment of the present invention, where: C denotes the camera coordinate system, J the reference-mirror coordinate system, B the calibration-frame coordinate system, and T the theodolite coordinate system; P1 to P5 denote the 5 imaging positions of the camera; T1 and T2 denote the installation positions of the two theodolites. The camera to be measured images the center of the calibration frame from the outermost positions on both sides of the frame; the included angle between the single-side rays and the ray obtained by imaging the center of the frame from directly in front of it must not exceed 30 degrees, and the included angle between the two sides must not exceed 60 degrees.
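The angular constraint above can be checked numerically. In the sketch below the imaging positions are made-up values on a 25-degree arc, and the middle pose is assumed to be the frontal view; both the 30-degree single-side limit and the 60-degree total limit are then satisfied:

```python
import numpy as np

def side_angles_deg(cam_positions, frame_center):
    """Angle between each camera-to-frame-center ray and the frontal ray.

    cam_positions: imaging positions P1..P5; the middle one is assumed
    to lie directly in front of the calibration frame. Returns degrees.
    """
    rays = np.asarray(frame_center, float) - np.asarray(cam_positions, float)
    rays /= np.linalg.norm(rays, axis=1, keepdims=True)
    frontal = rays[len(rays) // 2]
    return np.degrees(np.arccos(np.clip(rays @ frontal, -1.0, 1.0)))

# Hypothetical poses on an arc of radius 3 around the frame center.
theta = np.radians([-25, -12.5, 0, 12.5, 25])
positions = 3.0 * np.stack([np.sin(theta), np.zeros(5), np.cos(theta)], axis=1)
angles = side_angles_deg(positions, np.zeros(3))
```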
Step 4, extracting image coordinates of the target points in the calibration frame from the image obtained in the step 3;
in this step, the target point image to be extracted is magnified 10 times and smoothed for noise reduction, and the image coordinates of the target point are then measured;
wherein, the image coordinate extraction precision of the target point is better than 0.25 pixel.
Step 5, obtaining initial values and accurate values of rotation and translation parameters from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame;
in this step, specifically, the direct linear transformation algorithm DLT (which requires no initial parameter values) is applied to the image coordinates and the world three-dimensional coordinates of the target points of the calibration frame to obtain initial values of the rotation and translation parameters from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system;
and then, solving the accurate value of the external parameter through least square iteration by using a multi-image back intersection algorithm.
In a specific implementation, the DLT is computed as:

\[ x = \frac{l_1 X + l_2 Y + l_3 Z + l_4}{l_9 X + l_{10} Y + l_{11} Z + 1}, \qquad y = \frac{l_5 X + l_6 Y + l_7 Z + l_8}{l_9 X + l_{10} Y + l_{11} Z + 1} \tag{1} \]
In formula (1), (X, Y, Z) are the three-dimensional coordinates of a target point of the calibration frame in the calibration-frame coordinate system; (x, y) are the two-dimensional image coordinates of the extracted target point; l_1 to l_{11} are the imaging parameters, the unknowns to be solved. They are obtained from the linear system of formula (2):

\[ \begin{aligned} X_n l_1 + Y_n l_2 + Z_n l_3 + l_4 - x_n X_n l_9 - x_n Y_n l_{10} - x_n Z_n l_{11} &= x_n \\ X_n l_5 + Y_n l_6 + Z_n l_7 + l_8 - y_n X_n l_9 - y_n Y_n l_{10} - y_n Z_n l_{11} &= y_n \end{aligned} \tag{2} \]
In the above formula, n indexes the n-th target point. The translation parameters (X_S, Y_S, Z_S) from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system are then solved from formulas (3) to (5):

\[ l_1 X_S + l_2 Y_S + l_3 Z_S = -l_4 \tag{3} \]
\[ l_5 X_S + l_6 Y_S + l_7 Z_S = -l_8 \tag{4} \]
\[ l_9 X_S + l_{10} Y_S + l_{11} Z_S = -1 \tag{5} \]
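A compact numerical sketch of this DLT step is given below (Python with NumPy; the point coordinates in the usage are synthetic and the helper name `dlt` is ours, not the patent's). It stacks the linear system (2), least-squares solves for l_1 to l_{11}, and then recovers the projection center from equations (3) to (5):

```python
import numpy as np

def dlt(obj_pts, img_pts):
    """Direct linear transformation, equations (1)-(5).

    obj_pts: (n, 3) target coordinates in the calibration-frame system.
    img_pts: (n, 2) extracted image coordinates; needs n >= 6 points
    that are not all coplanar. Lens distortion is ignored in this sketch.
    """
    A, b = [], []
    for (X, Y, Z), (x, y) in zip(obj_pts, img_pts):
        # Two rows of the linear system (2) per observed target point.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z])
        b.append(x)
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z])
        b.append(y)
    l = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)[0]
    # Equations (3)-(5): recover the projection-center translation.
    M = np.array([l[0:3], l[4:7], l[8:11]])
    rhs = np.array([-l[3], -l[7], -1.0])
    return l, np.linalg.solve(M, rhs)   # (l1..l11), (Xs, Ys, Zs)
```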
where a_1, a_2, …, c_3 are the nine elements of the rotation matrix R from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system, formed by the rotation angles (φ, ω, κ):

\[ R = \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix} \]
The rotation parameters (φ, ω, κ) from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system are solved from formulas (6) to (8):

\[ \tan\varphi = -\frac{a_3}{c_3} \tag{6} \]
\[ \sin\omega = -b_3 \tag{7} \]
\[ \tan\kappa = \frac{b_1}{b_2} \tag{8} \]
Then, translation and rotation parameters from the projection center coordinate system of the camera to be measured to the coordinate system of the calibration frame can be calculated through formulas (3) to (8);
The solved translation and rotation parameters from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system are then taken as initial values and substituted into the multi-image back intersection algorithm, whose calculation model is the collinearity condition of formula (9):

\[ x = -f\,\frac{a_1 (X - X_S) + b_1 (Y - Y_S) + c_1 (Z - Z_S)}{a_3 (X - X_S) + b_3 (Y - Y_S) + c_3 (Z - Z_S)}, \qquad y = -f\,\frac{a_2 (X - X_S) + b_2 (Y - Y_S) + c_2 (Z - Z_S)}{a_3 (X - X_S) + b_3 (Y - Y_S) + c_3 (Z - Z_S)} \tag{9} \]
In formula (9), f denotes the principal distance (focal length) of the camera. Formula (9) is expanded in a first-order Taylor series, and the unknowns (X_S, Y_S, Z_S, φ, ω, κ) are solved iteratively to obtain the accurate values of the translation and rotation parameters from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system.
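For illustration, the back-intersection refinement can be sketched as a Gauss-Newton loop on the collinearity equations. The rotation convention below is chosen to reproduce equations (6) to (8) (tan φ = -a3/c3, sin ω = -b3, tan κ = b1/b2); the synthetic pose, points and step sizes in the usage are our own assumptions, not values from the patent:

```python
import numpy as np

def rot(phi, omega, kappa):
    # Photogrammetric rotation matrix with elements a1..c3 arranged so
    # that tan(phi) = -a3/c3, sin(omega) = -b3, tan(kappa) = b1/b2.
    sp, cp = np.sin(phi), np.cos(phi)
    so, co = np.sin(omega), np.cos(omega)
    sk, ck = np.sin(kappa), np.cos(kappa)
    return np.array([
        [cp * ck - sp * so * sk, -cp * sk - sp * so * ck, -sp * co],
        [co * sk,                 co * ck,                -so],
        [sp * ck + cp * so * sk, -sp * sk + cp * so * ck,  cp * co]])

def collinearity(params, obj, f):
    # Image coordinates predicted by equation (9);
    # params = (Xs, Ys, Zs, phi, omega, kappa).
    C, R = params[:3], rot(*params[3:])
    q = (obj - C) @ R                     # numerator/denominator terms
    return -f * q[:, :2] / q[:, 2:3]

def back_intersection(obj, img, f, x0, iters=15):
    # Gauss-Newton refinement starting from the DLT initial values x0,
    # with a forward-difference numerical Jacobian.
    p = np.asarray(x0, float)
    for _ in range(iters):
        r = (collinearity(p, obj, f) - img).ravel()
        J = np.empty((r.size, 6))
        for j in range(6):
            dp = np.zeros(6)
            dp[j] = 1e-7
            J[:, j] = ((collinearity(p + dp, obj, f) - img).ravel() - r) / 1e-7
        p = p - np.linalg.lstsq(J, r, rcond=None)[0]
    return p
```

With noise-free synthetic observations the loop converges back to the generating pose; in the real procedure the redundant observations from all imaging positions enter the same least-squares iteration.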
Step 6, after the camera to be detected performs each imaging, measuring three-dimensional coordinates of a plurality of conversion targets arranged on the camera tooling and a plurality of conversion targets corresponding to the calibration frame in a theodolite coordinate system by using a high-precision theodolite, and solving by adopting a least square coordinate conversion method to obtain rotation and translation parameters from the calibration frame coordinate system to a reference mirror coordinate system;
in the step, on the premise that the three-dimensional coordinates of the plurality of conversion targets arranged on the camera tool under the coordinate system of the reference mirror are known, and the three-dimensional coordinates of the plurality of conversion targets corresponding to the calibration frame under the coordinate system of the calibration frame are known, the rotation and translation parameters from the coordinate system of the calibration frame to the coordinate system of the reference mirror are obtained by solving through a least square coordinate conversion method.
In specific implementation, the following three-dimensional space similarity transformation error equation is adopted to solve to obtain rotation and translation parameters from a calibration frame coordinate system to a reference mirror coordinate system:
\[ \begin{bmatrix} X_J \\ Y_J \\ Z_J \end{bmatrix} = R_{B-J} \begin{bmatrix} X_B \\ Y_B \\ Z_B \end{bmatrix} + T_{B-J} \tag{10} \]

In formula (10), R_{B-J} is the rotation matrix from the calibration-frame coordinate system to the reference-mirror coordinate system; T_{B-J} is the translation vector from the calibration-frame coordinate system to the reference-mirror coordinate system; (X_B, Y_B, Z_B) are the three-dimensional coordinates of the conversion targets in the calibration-frame coordinate system; (X_J, Y_J, Z_J) are the three-dimensional coordinates of the conversion targets in the reference-mirror coordinate system.
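The least-squares coordinate conversion in this step is, in effect, a rigid-body (unit-scale) fit of equation (10). A minimal sketch using the standard SVD (Kabsch) solution is shown below; the function name `rigid_fit` and the test coordinates are illustrative, not from the patent:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares R, T with dst ~ R @ src + T (equation (10), unit scale).

    src: conversion-target coordinates in the calibration-frame system B;
    dst: the same targets in the reference-mirror system J. Needs at
    least 3 non-collinear targets.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs
```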
And 7, obtaining the accurate values of the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the calibration frame coordinate system and the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system obtained in the step 6 by utilizing the step 5, and solving by adopting a least square coordinate conversion algorithm to obtain the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the reference mirror coordinate system so as to finish the solving of the external parameters of the camera to be detected.
In this step, the rotation matrix R_{C-J} from the projection-center coordinate system of the camera to be measured to the reference-mirror coordinate system is calculated by:

\[ R_{C-J} = R_{B-J} \cdot R_{C-B} \tag{11} \]

In formula (11), R_{B-J} is the rotation matrix from the calibration-frame coordinate system to the reference-mirror coordinate system, and R_{C-B} is the rotation matrix from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system. The translation parameter T_{C-J} from the projection-center coordinate system of the camera to be measured to the reference-mirror coordinate system is calculated by:

\[ T_{C-J} = R_{B-J} \cdot T_{C-B} + T_{B-J} \tag{12} \]

In formula (12), T_{C-B} is the translation vector from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system, and T_{B-J} is the translation vector from the calibration-frame coordinate system to the reference-mirror coordinate system.
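The chaining of the two transforms can be written out in a few lines (a sketch; the variable names are ours):

```python
import numpy as np

def compose(R_bj, T_bj, R_cb, T_cb):
    """Chain camera-to-calibration-frame (C-B) with calibration-frame-
    to-reference-mirror (B-J) to get camera-to-reference-mirror (C-J),
    i.e. equations (11) and (12)."""
    R_cj = R_bj @ R_cb              # equation (11)
    T_cj = R_bj @ T_cb + T_bj       # equation (12)
    return R_cj, T_cj
```

Applying C-B and then B-J to any point gives the same result as applying the composed C-J transform once, which is exactly what equations (11) and (12) state.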
It is noted that details well known to those skilled in the art are not described herein.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (5)

1. An external parameter calibration method based on vision system imaging is characterized by comprising the following steps:
step 1, firstly, measuring the three-dimensional coordinates of each target of a calibration frame in the calibration-frame coordinate system by using two high-precision theodolites and a multi-round forward intersection method together with a standard ruler; the calibration frame is a low-deformation control field made of tungsten steel, its interior contains 92 targets, and the accurate three-dimensional coordinates of the targets are fixed relative to one another;
step 2, arranging a plurality of conversion targets on a camera tool for fixing a camera to be detected, selecting a corresponding number of conversion targets in the calibration frame, and measuring three-dimensional coordinates of the conversion targets arranged on the camera tool in a reference mirror coordinate system through the high-precision theodolite;
step 3, performing multi-angle sequence imaging on the calibration frame by using the camera to be tested;
step 4, extracting image coordinates of the target points in the calibration frame from the image obtained in the step 3;
step 5, obtaining initial values and accurate values of rotation and translation parameters from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame;
the process of the step 5 specifically comprises the following steps:
obtaining initial values of rotation and translation parameters from a projection center coordinate system of the camera to be measured to a calibration frame coordinate system by using a direct linear transformation algorithm DLT according to the image coordinate and the world three-dimensional coordinate of the target point in the calibration frame;
then, solving the accurate value of the external parameter through least square iteration by using a multi-image back intersection algorithm;
the calculation formula of the direct linear transformation algorithm DLT is as follows:
\[ x = \frac{l_1 X + l_2 Y + l_3 Z + l_4}{l_9 X + l_{10} Y + l_{11} Z + 1}, \qquad y = \frac{l_5 X + l_6 Y + l_7 Z + l_8}{l_9 X + l_{10} Y + l_{11} Z + 1} \tag{1} \]
in formula (1), (X, Y, Z) are the three-dimensional coordinates of a target point of the calibration frame in the calibration-frame coordinate system; (x, y) are the two-dimensional image coordinates of the extracted target point; l_1 to l_{11} are the imaging parameters, the unknowns to be solved, obtained from the linear system of formula (2):

\[ \begin{aligned} X_n l_1 + Y_n l_2 + Z_n l_3 + l_4 - x_n X_n l_9 - x_n Y_n l_{10} - x_n Z_n l_{11} &= x_n \\ X_n l_5 + Y_n l_6 + Z_n l_7 + l_8 - y_n X_n l_9 - y_n Y_n l_{10} - y_n Z_n l_{11} &= y_n \end{aligned} \tag{2} \]
in the above formula, n indexes the n-th target point; the translation parameters (X_S, Y_S, Z_S) from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system are solved from formulas (3) to (5):

\[ l_1 X_S + l_2 Y_S + l_3 Z_S = -l_4 \tag{3} \]
\[ l_5 X_S + l_6 Y_S + l_7 Z_S = -l_8 \tag{4} \]
\[ l_9 X_S + l_{10} Y_S + l_{11} Z_S = -1 \tag{5} \]
\[ R = \begin{bmatrix} a_1 & a_2 & a_3 \\ b_1 & b_2 & b_3 \\ c_1 & c_2 & c_3 \end{bmatrix} \]

wherein a_1, a_2, …, c_3 are the nine elements of the rotation matrix R from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system, formed by the rotation angles (φ, ω, κ);
the rotation parameters (φ, ω, κ) from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system are solved from formulas (6) to (8):

\[ \tan\varphi = -\frac{a_3}{c_3} \tag{6} \]
\[ \sin\omega = -b_3 \tag{7} \]
\[ \tan\kappa = \frac{b_1}{b_2} \tag{8} \]
Then, calculating translation and rotation parameters from the projection center coordinate system of the camera to be measured to the coordinate system of the calibration frame through formulas (3) to (8);
the calculated translation and rotation parameters from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system are taken as initial values and substituted into the multi-image back intersection algorithm, whose calculation model is formula (9):

\[ x = -f\,\frac{a_1 (X - X_S) + b_1 (Y - Y_S) + c_1 (Z - Z_S)}{a_3 (X - X_S) + b_3 (Y - Y_S) + c_3 (Z - Z_S)}, \qquad y = -f\,\frac{a_2 (X - X_S) + b_2 (Y - Y_S) + c_2 (Z - Z_S)}{a_3 (X - X_S) + b_3 (Y - Y_S) + c_3 (Z - Z_S)} \tag{9} \]
in formula (9), f denotes the principal distance (focal length) of the camera; formula (9) is expanded in a first-order Taylor series, and the unknowns (X_S, Y_S, Z_S, φ, ω, κ) are solved by iteration, thereby obtaining the accurate values of the translation and rotation parameters from the projection-center coordinate system of the camera to be measured to the calibration-frame coordinate system;
step 6, after the camera to be detected performs each imaging, measuring three-dimensional coordinates of a plurality of conversion targets arranged on the camera tooling and a plurality of conversion targets corresponding to the calibration frame in a theodolite coordinate system by using a high-precision theodolite, and solving by adopting a least square coordinate conversion method to obtain rotation and translation parameters from the calibration frame coordinate system to a reference mirror coordinate system;
in step 6, on the premise that the three-dimensional coordinates of 7 conversion targets arranged on the camera tool are known under a reference mirror coordinate system and the three-dimensional coordinates of 7 corresponding conversion targets in the calibration frame are known under the calibration frame coordinate system, solving by using a least square coordinate conversion method to obtain rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system;
specifically, the following three-dimensional space similarity transformation error equation is adopted to solve to obtain rotation and translation parameters from a calibration frame coordinate system to a reference mirror coordinate system:
\[ \begin{bmatrix} X_J \\ Y_J \\ Z_J \end{bmatrix} = R_{B-J} \begin{bmatrix} X_B \\ Y_B \\ Z_B \end{bmatrix} + T_{B-J} \tag{10} \]

in formula (10), R_{B-J} is the rotation matrix from the calibration-frame coordinate system to the reference-mirror coordinate system; T_{B-J} is the translation vector from the calibration-frame coordinate system to the reference-mirror coordinate system; (X_B, Y_B, Z_B) are the three-dimensional coordinates of the conversion targets in the calibration-frame coordinate system; (X_J, Y_J, Z_J) are the three-dimensional coordinates of the conversion targets in the reference-mirror coordinate system;
and 7, obtaining the accurate values of the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the calibration frame coordinate system and the rotation and translation parameters from the calibration frame coordinate system to the reference mirror coordinate system obtained in the step 6 by using the step 5, and solving the rotation and translation parameters from the projection center coordinate system of the camera to be detected to the reference mirror coordinate system by using a least square coordinate conversion algorithm to complete the calculation of the external parameters of the camera to be detected.
2. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein, in step 1,
the high-precision theodolite is an auto-collimation electronic theodolite, which measures the horizontal and vertical angles of an object-space target point and calculates the three-dimensional coordinates of the target point according to the triangulation principle; the auto-collimation function of the electronic theodolite can be used to establish a three-axis orthogonal coordinate system;
the standard rule serves as the scale reference in the control-point coordinate calculation; a certified first-level invar levelling staff is adopted, with a length error of 0.02 mm.
3. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein, in step 3, the camera to be measured images the outermost boundary targets on both sides of the calibration frame; the included angle between each side imaging beam and the beam directed at the center of the calibration frame from directly in front of the calibration frame is not more than 30 degrees, and the included angle between the two side beams is not more than 60 degrees.
4. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein, in step 4, the target point image to be extracted is magnified 10 times, then smoothed and denoised, and finally the image coordinates of the target point are extracted;
wherein the image coordinate extraction precision of the target point is better than 0.25 pixel.
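Claim 4 does not name the subpixel operator used after smoothing; an intensity-weighted centroid over a small window is one common way to reach better-than-0.25-pixel precision on a denoised target image. The sketch below is illustrative only; the window size and the simple background subtraction are assumptions, not taken from the patent.

```python
import numpy as np

def subpixel_centroid(img, x0, y0, win=5):
    """Intensity-weighted centroid of a bright target near (x0, y0).

    img: 2-D grayscale array; x0, y0: integer pixel near the target;
    win: half-width of the (2*win+1)^2 analysis window.
    Returns subpixel (x, y) image coordinates.
    """
    patch = img[y0 - win:y0 + win + 1, x0 - win:x0 + win + 1].astype(float)
    ys, xs = np.mgrid[-win:win + 1, -win:win + 1]
    w = patch - patch.min()        # crude background removal (assumption)
    s = w.sum()
    return x0 + (xs * w).sum() / s, y0 + (ys * w).sum() / s
```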
5. The vision system imaging-based extrinsic parameter calibration method according to claim 1, wherein, in step 7, the rotation matrix R_C-J from the projection center coordinate system of the camera to be measured to the reference mirror coordinate system is calculated by the following formula:
R_C-J = R_B-J · R_C-B (11)
In formula (11), R_B-J represents the rotation matrix from the calibration frame coordinate system to the reference mirror coordinate system; R_C-B represents the rotation matrix from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system;
and the translation vector T_C-J from the projection center coordinate system of the camera to be measured to the reference mirror coordinate system is calculated by the following formula:
T_C-J = R_B-J · T_C-B + T_B-J (12)
In formula (12), T_C-B represents the translation vector from the projection center coordinate system of the camera to be measured to the calibration frame coordinate system; T_B-J represents the translation vector from the calibration frame coordinate system to the reference mirror coordinate system.
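Formulas (11) and (12) are the standard composition of two rigid transforms: applying camera-to-frame and then frame-to-mirror to any point x_C gives R_B-J(R_C-B·x_C + T_C-B) + T_B-J = (R_B-J·R_C-B)·x_C + (R_B-J·T_C-B + T_B-J). The illustrative sketch below (names not from the patent) encodes this chaining:

```python
import numpy as np

def compose(R_CB, T_CB, R_BJ, T_BJ):
    """Chain camera->calibration-frame with calibration-frame->mirror."""
    R_CJ = R_BJ @ R_CB            # formula (11)
    T_CJ = R_BJ @ T_CB + T_BJ     # formula (12)
    return R_CJ, T_CJ
```

Transforming a point through the two steps in sequence agrees with applying the composed (R_C-J, T_C-J) directly.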
CN202110426793.3A 2021-04-20 2021-04-20 External parameter calibration method based on visual system imaging Active CN113160331B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110426793.3A CN113160331B (en) 2021-04-20 2021-04-20 External parameter calibration method based on visual system imaging

Publications (2)

Publication Number Publication Date
CN113160331A CN113160331A (en) 2021-07-23
CN113160331B true CN113160331B (en) 2023-02-24

Family

ID=76867609

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110426793.3A Active CN113160331B (en) 2021-04-20 2021-04-20 External parameter calibration method based on visual system imaging

Country Status (1)

Country Link
CN (1) CN113160331B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114463422B (en) * 2022-04-14 2022-08-16 北京深度奇点科技有限公司 Method and system for image measurement correction
CN115371639B (en) * 2022-08-11 2023-04-18 深圳大学 Underwater photogrammetry immersed tube joint butt joint measurement method

Citations (4)

Publication number Priority date Publication date Assignee Title
CN102012236A (en) * 2010-09-26 2011-04-13 郑州辰维科技股份有限公司 Method for calibrating moon rover binocular vision obstacle avoidance system
CN105469418A (en) * 2016-01-04 2016-04-06 中车青岛四方机车车辆股份有限公司 Photogrammetry-based wide-field binocular vision calibration device and calibration method
CN107610178A (en) * 2017-07-27 2018-01-19 北京航天计量测试技术研究所 A kind of industrial photogrammetry system camera parameter movable type scaling method
CN108645426A (en) * 2018-04-09 2018-10-12 北京空间飞行器总体设计部 A kind of in-orbit self-calibrating method of extraterrestrial target Relative Navigation vision measurement system

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
EP2808645B1 (en) * 2012-01-23 2019-02-20 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program


Non-Patent Citations (2)

Title
A precise visual localisation method for the Chinese Chang'e-4 Yutu-2 rover; Youqing Ma et al.; The Photogrammetric Record; 2020-03-31; full text *
High-precision prediction experiments of terrain occlusion based on Chang'e-4 lunar rover images; Ma Youqing et al.; Scientia Sinica; 2019-09-24; full text *

Also Published As

Publication number Publication date
CN113160331A (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN108986070B (en) Rock crack propagation experiment monitoring method based on high-speed video measurement
JP5992184B2 (en) Image data processing apparatus, image data processing method, and image data processing program
KR102345886B1 (en) Method for the three-dimensional measuring of moving objects during known movement
CN113160331B (en) External parameter calibration method based on visual system imaging
CN108444449B (en) It is a kind of to the object space attitude measurement method with parallel lines feature
CN107014399B (en) Combined calibration method for satellite-borne optical camera-laser range finder combined system
KR101282718B1 (en) Absolute misalignment calibration method between attitude sensors and linear array image sensor
US20120257792A1 (en) Method for Geo-Referencing An Imaged Area
US11212511B1 (en) Residual error mitigation in multiview calibration
CN105378794A (en) 3d recording device, method for producing 3d image, and method for setting up 3d recording device
CN109520476B (en) System and method for measuring dynamic pose of rear intersection based on inertial measurement unit
CN107144278B (en) Lander visual navigation method based on multi-source characteristics
CN108801218B (en) High-precision orientation and orientation precision evaluation method of large-size dynamic photogrammetry system
CN110363758B (en) Optical remote sensing satellite imaging quality determination method and system
CN114332191A (en) Three-dimensional point cloud error compensation method and device
CN113870366A (en) Calibration method and calibration system of three-dimensional scanning system based on pose sensor
CN113947638A (en) Image orthorectification method for fisheye camera
García-Moreno et al. Error propagation and uncertainty analysis between 3D laser scanner and camera
CN111815712B (en) High-precision camera-single laser instrument combined calibration method
JP4804371B2 (en) Sensor bias error estimation device
Ye et al. A calibration trilogy of monocular-vision-based aircraft boresight system
JP2010009236A (en) Plane area estimation device and program
KR20180038761A (en) Method for target data acquisition
US11137247B2 (en) Method and system for measuring the orientation of one rigid object relative to another
CN114299477A (en) Vehicle vision positioning method, system, equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant