CN113012234B - High-precision camera calibration method based on plane transformation - Google Patents


Info

Publication number
CN113012234B
Authority
CN
China
Prior art keywords
coordinates
image
calibration
center
point
Prior art date
Legal status
Active
Application number
CN202110282708.0A
Other languages
Chinese (zh)
Other versions
CN113012234A (en)
Inventor
郭君斌
彭妍
于传强
李游
王旭平
孙晓艳
Current Assignee
Rocket Force University of Engineering of PLA
Original Assignee
Rocket Force University of Engineering of PLA
Priority date
Filing date
Publication date
Application filed by Rocket Force University of Engineering of PLA
Priority to CN202110282708.0A
Publication of CN113012234A
Application granted
Publication of CN113012234B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/70: Denoising; Smoothing
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/13: Edge detection
    • G06T 7/60: Analysis of geometric attributes
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20112: Image segmentation details
    • G06T 2207/20164: Salient point detection; Corner detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a high-precision camera calibration method based on plane transformation, comprising the following steps. S1: on a planar calibration plate that uses circles as mark points, extract the corner points on the inner and outer frames of the calibration plate, refine the corner coordinates to sub-pixel level, and project each ellipse into an approximately standard circle by perspective transformation. S2: extract the center coordinates of each standard circle by computing the centroid from image moments. S3: project the extracted circle-center coordinates back onto the original calibration plate plane by inverse perspective transformation to obtain the actual pixel coordinates of the mark point centers. S4: complete the camera calibration with the Zhang Zhengyou calibration method, using the pixel coordinates and the corresponding space coordinates of the circular mark point centers. The method effectively reduces camera calibration error and improves camera calibration precision.

Description

High-precision camera calibration method based on plane transformation
Technical Field
The invention relates to the technical field of camera calibration, in particular to a high-precision camera calibration method based on plane transformation.
Background
In the field of computer vision, camera calibration plays an irreplaceable role. At present, commonly used camera calibration methods fall into three classes: traditional calibration methods, self-calibration methods, and active-vision calibration methods. Self-calibration needs no calibration object and is very flexible, but its robustness and precision are poor. Active-vision calibration is simple and admits a linear solution, but it cannot be applied when the camera motion parameters are unknown. Traditional calibration methods are highly accurate and widely used in high-precision measurement and three-dimensional reconstruction, but they require a calibration object; typical examples include the direct linear transformation (DLT) calibration method, the Tsai two-step calibration method, and the Zhang Zhengyou calibration method.
The Zhang Zhengyou calibration method needs only a checkerboard as a planar calibration plate and, in actual use, is simple to operate yet highly accurate, which has made it one of the most widely used calibration algorithms in high-precision industrial measurement today.
When a planar calibration plate with circular mark points is used to calibrate the camera, the calibration accuracy is determined by how accurately the circle-center coordinates of the mark points are extracted. When the plate is photographed, its plane and the camera imaging plane are generally not parallel but inclined at some angle, so each circular mark point is projected to an ellipse; because of this perspective deviation, the center of the extracted ellipse is not the projection of the true physical circle center. Traditional methods usually take the ellipse center as a substitute for that projection, which inevitably lowers the camera calibration precision.
Disclosure of Invention
In view of these existing problems, the invention aims to provide a high-precision camera calibration method based on plane transformation that effectively reduces camera calibration error and improves camera calibration precision.
To achieve this purpose, the technical scheme adopted by the invention is as follows.
The high-precision camera calibration method based on plane transformation comprises the following steps:
S1: on a planar calibration plate that uses circles as mark points, extract the corner points on the inner and outer frames of the calibration plate, refine the corner coordinates to sub-pixel level, and project each ellipse into an approximately standard circle by perspective transformation;
S2: extract the center coordinates of each standard circle by computing the centroid from image moments;
S3: project the extracted circle-center coordinates back onto the original calibration plate plane by inverse perspective transformation to obtain the actual pixel coordinates of the mark point centers;
S4: complete the camera calibration with the Zhang Zhengyou calibration method, using the pixel coordinates and corresponding space coordinates of the circular mark point centers.
Further, the specific operation of step S1 includes:
S101: preparing a dot-matrix calibration plate;
S102: fixing the camera, changing the posture and position of the calibration plate, and capturing images of the plate from different viewing angles;
S103: preprocessing the acquired images;
S104: performing the planar perspective transformation on the preprocessed images.
Further, the specific operation of preprocessing the acquired image in step S103 includes:
S1031: converting the acquired image into a grayscale image;
S1032: denoising the grayscale image with Gaussian filtering;
S1033: binarizing the denoised grayscale image with the maximum between-class variance (Otsu) method.
Further, the specific operation of performing the planar perspective transformation on the preprocessed image in step S104 includes:
S1041: detecting the image edges, screening them using combined area and length constraints, and locating the outer frame of the calibration plate;
S1042: detecting the corner points of the outer frame with the Shi-Tomasi algorithm and refining the corner coordinates to sub-pixel level;
S1043: repeating steps S1041 and S1042 on the inner frame of the image, detecting five corner points;
S1044: extending two sides of the pentagonal inner frame to form a quadrangle, obtaining a new corner point at their intersection;
S1045: using the random sample consensus (RANSAC) algorithm iteratively, estimating the optimal perspective transformation matrix T from the eight sub-pixel corner coordinates of the inner and outer frames, and making the calibration plate plane parallel to the imaging plane through the perspective transformation, so that the mark points become approximately standard circles.
The eight sub-pixel corner coordinates of the inner and outer frames comprise the four sub-pixel corner coordinates of the outer frame and the four sub-pixel corner coordinates of the quadrangle formed by extending two sides of the inner frame.
Further, the perspective transformation described in step S1045 essentially transforms the image from one view plane to a new view plane, using the formula

[x, y, z]^T = T · [u, v, w]^T,   T = [a_11, a_12, a_13; a_21, a_22, a_23; a_31, a_32, a_33]

where (u, v, w) and (x, y, z) are the coordinates before and after the perspective transformation of the image, and T is the perspective transformation matrix, a homogeneous matrix with 8 degrees of freedom and a_33 = 1.

Assuming the pixel coordinates of the image before and after the perspective transformation are (u', v') and (x', y') respectively, substituting the homogeneous pixel coordinates (u', v', 1) and solving the perspective transformation formula gives the pixel coordinates after the transformation:

x' = (a_11·u' + a_12·v' + a_13) / (a_31·u' + a_32·v' + 1)
y' = (a_21·u' + a_22·v' + a_23) / (a_31·u' + a_32·v' + 1)
Further, the specific operation of extracting the center coordinates of the standard circles in step S2 includes:
S201: regard each mark point as an h × w digital image; by the definition of image moments, the (p+q)-order moment of the image is

m_pq = Σ_{u=1..h} Σ_{v=1..w} u^p · v^q · f(u, v)

where f(u, v) is the gray value of the image at pixel coordinate (u, v);
S202: using the zeroth-order moment m_00 and the first-order moments m_10, m_01 of each mark point, calculate its centroid coordinates (u_c, v_c):

u_c = m_10 / m_00,   v_c = m_01 / m_00;

S203: since each mark point is approximately a standard circle, the centroid coordinates of each mark point can be regarded as the pixel coordinates of the standard circle's center.
Further, the specific operation of step S3 includes the following steps:
S301: invert the perspective transformation matrix T of step S1045 to obtain the inverse perspective transformation matrix T_inv = T^(-1);
S302: apply the inverse perspective transformation to the pixel coordinates of each standard-circle center obtained in step S202, according to

[u', v', w']^T = T_inv · [x', y', z']^T

where (x', y', z') and (u', v', w') are the coordinates before and after the inverse perspective transformation; the pixel coordinates of each standard-circle center obtained in step S202 are the input coordinates (x', y', z'), and the output (u', v', w') gives the actual center coordinates of the mark point;
S303: through the inverse perspective transformation of step S302, the circular mark points are projected from the new view plane back onto the original calibration plate plane, yielding the pre-perspective coordinates of each standard-circle center, i.e. the actual pixel coordinates of each circular mark point center.
Further, the specific operation of step S4 includes the following steps:
S401: computing the intrinsic and extrinsic camera parameters under the ideal distortion-free condition from the pixel coordinates and corresponding space coordinates of the circular mark point centers;
S402: refining the intrinsic and extrinsic parameters obtained in step S401 by maximum likelihood estimation;
S403: computing the geometric distortion coefficients by least squares under the nonlinear distortion condition;
S404: combining the intrinsic and extrinsic parameters with the distortion coefficients and refining the overall estimate by maximum likelihood estimation to obtain the final camera intrinsic and extrinsic parameters and distortion coefficients.
Further, in step S401, under the ideal distortion-free condition the camera imaging model is the pinhole model. Let the space coordinate of a circular mark point center be P = [X_W, Y_W, Z_W]^T and its projection on the calibration plate plane, i.e. the actual pixel coordinate of the mark point center obtained in step S303, be p = [u, v]^T, with corresponding homogeneous coordinates P~ = [X_W, Y_W, Z_W, 1]^T and p~ = [u, v, 1]^T. The projection imaging model is represented as

s · p~ = K · [R t] · P~,   K = [f_x, γ, u_0; 0, f_y, v_0; 0, 0, 1]

where s is an arbitrary scale factor, K is the intrinsic matrix, R and t are the rotation matrix and translation vector from the world coordinate system to the camera coordinate system (together forming the extrinsic matrix), (u_0, v_0) are the principal point coordinates of the image, f_x and f_y are the effective focal lengths along the horizontal and vertical image axes, and γ is the skew factor.
Further, in step S403, in the case of nonlinear distortion, the nonlinear distortion model is expressed as

x_u = x_d + δ_x(x_d, y_d)
y_u = y_d + δ_y(x_d, y_d)

where (x_d, y_d) are the actual distorted coordinates of the imaging point, (x_u, y_u) are its ideal undistorted coordinates, and δ_x(x_d, y_d) and δ_y(x_d, y_d) are the distortion amounts in the x and y directions at (x_d, y_d).

Considering the radial and tangential distortion of the lens, the distortion amounts are respectively

δ_x(x_d, y_d) = x_d·(k_1·r^2 + k_2·r^4 + k_3·r^6) + 2·p_1·x_d·y_d + p_2·(r^2 + 2·x_d^2)
δ_y(x_d, y_d) = y_d·(k_1·r^2 + k_2·r^4 + k_3·r^6) + p_1·(r^2 + 2·y_d^2) + 2·p_2·x_d·y_d

where k_1, k_2, k_3 are the radial distortion coefficients, p_1, p_2 are the tangential distortion coefficients, and r^2 = x_d^2 + y_d^2.
the invention has the beneficial effects that:
the invention provides a high-precision camera calibration method based on plane transformation, which comprises the steps of carrying out first transformation on a plane of a calibration plate, projecting an elliptical mark point into an approximate standard circle, and extracting coordinates of the center of the standard circle; and performing secondary transformation on the plane of the calibration plate, projecting the circle center coordinates extracted by the primary transformation onto the original elliptic mark points, and acquiring the actual pixel coordinates of the circle centers of the mark points, thereby avoiding the interference of perspective deviation on the extraction of the circle center coordinates. And then, the camera is calibrated by combining a Zhang Zhengyou calibration method, and compared with the traditional method, the total average reprojection error of the calibration method disclosed by the invention is reduced by 66.169%, so that the camera calibration precision is greatly improved.
Drawings
FIG. 1 is a flow chart of a camera calibration method according to the present invention;
FIG. 2 is a dot matrix calibration plate used for image acquisition in the present invention;
FIG. 3 is a result of pre-processing of an acquired image in accordance with the present invention;
FIG. 4 shows the result of extracting sub-pixel-level corner points from the outer square frame;
FIG. 5 shows the result of extracting sub-pixel-level corner points for the pentagonal inner frames in the present invention;
FIG. 6 is a perspective transformed calibration plate of the present invention;
FIG. 7 is a result of extracting the center of a standard circle according to the present invention;
FIG. 8 is a result of extracting the center of a circle of a circular mark point according to the present invention;
FIG. 9 is the process of extracting the circle centers of the circular mark points for images ① and ② in the embodiment of the present invention;
FIG. 10 is a schematic view of the world coordinate system axis distribution of the dot-matrix calibration plate in the embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the following further describes the technical solution of the present invention with reference to the drawings and the embodiments.
As shown in fig. 1, the high-precision camera calibration method based on plane transformation includes the following steps,
S1: on a planar calibration plate that uses circles as mark points, extract the corner points on the inner and outer frames of the calibration plate, refine the corner coordinates to sub-pixel level, and project each ellipse into an approximately standard circle by perspective transformation.
Specifically, S101: prepare a dot-matrix calibration plate. The invention adopts the dot-matrix calibration plate specified by Halcon, which consists of 49 equal-diameter black circular mark points in 7 rows and 7 columns, as shown in FIG. 2.
S102: fix the camera, change the posture and position of the calibration plate, and capture 16-20 images of the plate from different viewing angles;
S103: preprocessing the acquired image;
Specifically, the preprocessing of the acquired image comprises:
S1031: converting the acquired image into a grayscale image;
S1032: denoising the grayscale image with Gaussian filtering;
S1033: binarizing the denoised grayscale image with the maximum between-class variance (Otsu) method.
As shown in FIG. 3, (a) is the original image before preprocessing, (b) the grayscale image, (c) the denoised image, and (d) the binarized image. The maximum between-class variance method is an algorithm that determines a segmentation threshold and binarizes the image; it is insensitive to image brightness and contrast and is widely applied in image processing, but it is sensitive to noise, so the image must be denoised before the method is applied.
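The binarization of S1033 can be illustrated with a minimal pure-Python sketch of Otsu's maximum between-class variance threshold. The function name and the toy pixel list below are illustrative assumptions, not taken from the patent; the grayscale image is flattened to a list of gray values, and the preceding Gaussian denoising is assumed already done:

```python
def otsu_threshold(gray, levels=256):
    """Return the threshold that maximizes the between-class variance (Otsu)."""
    hist = [0] * levels
    for v in gray:
        hist[v] += 1
    total = len(gray)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_bg = 0.0          # running sum of gray values in the background class
    w_bg = 0              # running pixel count of the background class
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy "image": dark marker pixels near 40-45, bright background near 200-210.
pixels = [40] * 50 + [45] * 30 + [200] * 60 + [210] * 40
t = otsu_threshold(pixels)
binary = [0 if p <= t else 255 for p in pixels]
```

On well-separated bimodal data like this the threshold lands between the two modes, which is what lets the dark circular mark points be segmented cleanly from the bright background.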
S104: and carrying out plane perspective transformation on the preprocessed image.
The perspective transformation transforms the image from one view plane to a new view plane, using the formula

[x, y, z]^T = T · [u, v, w]^T,   T = [a_11, a_12, a_13; a_21, a_22, a_23; a_31, a_32, a_33]

where (u, v, w) and (x, y, z) are the coordinates before and after the perspective transformation of the image, and T is the perspective transformation matrix, a homogeneous matrix with 8 degrees of freedom and a_33 = 1.

Assuming the pixel coordinates of the image before and after the perspective transformation are (u', v') and (x', y') respectively, substituting the homogeneous pixel coordinates (u', v', 1) and solving the perspective transformation formula gives the pixel coordinates after the transformation:

x' = (a_11·u' + a_12·v' + a_13) / (a_31·u' + a_32·v' + 1)
y' = (a_21·u' + a_22·v' + a_23) / (a_31·u' + a_32·v' + 1)
According to this solution, the perspective transformation matrix can be obtained from four pairs of pixel coordinates. Because the image suffers from noise, corner mis-extraction and similar problems, the matrix is generally solved from more than four pairs of pixel coordinates, and the image is then plane-transformed with it.
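A minimal sketch of how such a matrix maps pixel coordinates, following the x' and y' expressions above; the 3×3 matrix here is a hypothetical pure translation chosen so the output is easy to verify by hand, not a matrix from the patent:

```python
def warp_point(T, u, v):
    """Apply a 3x3 perspective (homography) matrix to a pixel coordinate."""
    x = T[0][0] * u + T[0][1] * v + T[0][2]
    y = T[1][0] * u + T[1][1] * v + T[1][2]
    w = T[2][0] * u + T[2][1] * v + T[2][2]   # a_33 is fixed to 1
    return x / w, y / w

# Hypothetical matrix: pure translation by (10, 20), an easy case to check.
T = [[1, 0, 10],
     [0, 1, 20],
     [0, 0, 1]]
pt = warp_point(T, 5, 5)   # (15.0, 25.0)
```

In practice T would come from the estimation of step S1045 rather than being written down directly.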
The specific operation of performing the planar perspective transformation on the preprocessed image includes the following steps:
S1041: detect the image edges, screen them using combined area and length constraints, and locate the outer frame of the calibration plate;
S1042: detect the corner points of the outer frame with the Shi-Tomasi algorithm and refine the corner coordinates to sub-pixel level, as shown in FIG. 4;
S1043: repeat steps S1041 and S1042 on the inner frame of the image, detecting five corner points;
S1044: extend two sides of the pentagonal inner frame to form a quadrangle, obtaining a new corner point at their intersection, as shown in FIG. 5;
S1045: because the image may contain noise, mis-extracted corners and similar problems, the obtained sub-pixel corner coordinates may not all be accurate; therefore the random sample consensus (RANSAC) algorithm is applied iteratively to estimate the optimal perspective transformation matrix T from the eight sub-pixel corner coordinates of the inner and outer frames, and the calibration plate plane is made parallel to the imaging plane through the perspective transformation, with the result shown in FIG. 6; the mark points become approximately standard circles.
The eight sub-pixel corner coordinates of the inner and outer frames comprise the four sub-pixel corner coordinates of the outer frame and the four sub-pixel corner coordinates of the quadrangle formed by extending two sides of the inner frame.
Further, step S2: extract the center coordinates of each standard circle by computing the centroid from image moments.
Image moments are operators that describe image features; in essence they apply a particular weighting to the gray values of an image. The specific operation of extracting the standard-circle center coordinates via the centroid includes:
S201: regard each mark point as an h × w digital image; by the definition of image moments, the (p+q)-order moment of the image is

m_pq = Σ_{u=1..h} Σ_{v=1..w} u^p · v^q · f(u, v)

where f(u, v) is the gray value of the image at pixel coordinate (u, v);
S202: using the zeroth-order moment m_00 and the first-order moments m_10, m_01 of each mark point, calculate its centroid coordinates (u_c, v_c):

u_c = m_10 / m_00,   v_c = m_01 / m_00;

S203: since each mark point is approximately a standard circle, the centroid coordinates of each mark point can be regarded as the pixel coordinates of the standard circle's center, as shown in FIG. 7.
Further, step S3: project the extracted circle-center coordinates back onto the original calibration plate plane by inverse perspective transformation to obtain the actual pixel coordinates of the mark point centers.
The inverse perspective transformation is the inverse process of the perspective transformation: it transforms the image from the new view plane back to the original view plane. Specifically:
S301: invert the perspective transformation matrix T of step S1045 to obtain the inverse perspective transformation matrix T_inv = T^(-1);
S302: apply the inverse perspective transformation to the pixel coordinates of each standard-circle center obtained in step S202, according to

[u', v', w']^T = T_inv · [x', y', z']^T

where (x', y', z') and (u', v', w') are the coordinates before and after the inverse perspective transformation; the pixel coordinates of each standard-circle center obtained in step S202 are the input coordinates (x', y', z'), and the output (u', v', w') gives the actual center coordinates of the mark point;
S303: through the inverse perspective transformation of step S302, the circular mark points are projected from the new view plane back onto the original calibration plate plane, yielding the pre-perspective coordinates of each standard-circle center, i.e. the actual pixel coordinates of each circular mark point center, as shown in FIG. 8.
Further, step S4: complete the camera calibration with the Zhang Zhengyou calibration method, using the pixel coordinates and corresponding space coordinates of the circular mark point centers.
Camera calibration establishes the projection imaging model between the world coordinates of three-dimensional space and the pixel coordinates of the two-dimensional image. Specifically:
S401: compute the intrinsic and extrinsic camera parameters under the ideal distortion-free condition from the pixel coordinates and corresponding space coordinates of the circular mark point centers.
Specifically, the pixel coordinates corresponding to the circular mark point centers are the actual pixel coordinates obtained in step S3. The corresponding space coordinates are determined as follows: the axis distribution of the world coordinate system of the dot-matrix calibration plate is determined automatically from the position of the shortest side AB of the pentagonal inner frame; the calibration plate is assumed to lie on the plane z = 0 of the world coordinate system; the dot closest to AB is selected as the origin, with the x axis horizontal to the right and the y axis vertical downward; and since the physical distance between adjacent circular mark points is known, the three-dimensional coordinates of every mark point center can be determined.
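Once the origin and axes are fixed, the grid of space coordinates follows mechanically; a sketch for the 7 × 7 board, where the 10 mm dot pitch is an assumed value for illustration rather than the plate's actual specification:

```python
def board_world_points(rows=7, cols=7, spacing_mm=10.0):
    """World coordinates (on the z = 0 plane) of a rows x cols dot grid.
    The origin is the dot nearest side AB; x runs right, y runs down.
    The 10 mm spacing is an assumed value, not the patent's actual pitch."""
    return [(c * spacing_mm, r * spacing_mm, 0.0)
            for r in range(rows) for c in range(cols)]

pts = board_world_points()   # 49 marker centres for the 7x7 board
```

Pairing this list row-by-row with the 49 extracted pixel centres gives the 2D-3D correspondences the Zhang Zhengyou method consumes.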
Under the ideal distortion-free condition the camera imaging model is the pinhole model. Let the space coordinate of a circular mark point center be P = [X_W, Y_W, Z_W]^T and its projection on the calibration plate plane, i.e. the actual pixel coordinate of the mark point center obtained in step S303, be p = [u, v]^T, with corresponding homogeneous coordinates P~ = [X_W, Y_W, Z_W, 1]^T and p~ = [u, v, 1]^T. The projection imaging model is represented as

s · p~ = K · [R t] · P~,   K = [f_x, γ, u_0; 0, f_y, v_0; 0, 0, 1]

where s is an arbitrary scale factor, K is the intrinsic matrix, R and t are the rotation matrix and translation vector from the world coordinate system to the camera coordinate system (together forming the extrinsic matrix), (u_0, v_0) are the principal point coordinates of the image, f_x and f_y are the effective focal lengths along the horizontal and vertical image axes, and γ is the skew factor.
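A sketch of this pinhole projection s·p~ = K·[R t]·P~ with hypothetical intrinsics and pose (identity rotation, the board 1 m in front of the camera); the numbers are illustrative, not calibration results from the patent:

```python
def project(K, R, t, P):
    """Pinhole projection: s * [u, v, 1]^T = K [R | t] [Xw, Yw, Zw, 1]^T."""
    Pc = [sum(R[i][j] * P[j] for j in range(3)) + t[i] for i in range(3)]  # camera frame
    m = [sum(K[i][j] * Pc[j] for j in range(3)) for i in range(3)]          # homogeneous pixel
    return m[0] / m[2], m[1] / m[2]

# Hypothetical intrinsics: f_x = f_y = 800, principal point (320, 240), zero skew.
K = [[800.0,   0.0, 320.0],
     [  0.0, 800.0, 240.0],
     [  0.0,   0.0,   1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]   # identity rotation
t = [0.0, 0.0, 1000.0]                                     # board 1000 mm away
u, v = project(K, R, t, (50.0, 20.0, 0.0))
```

A board point 50 mm right and 20 mm down from the world origin lands at pixel (360, 256): the world offsets scale by f/Z = 0.8 and shift by the principal point.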
S402: refine the intrinsic and extrinsic parameters obtained in step S401 by maximum likelihood estimation.
Specifically, to obtain the optimal camera calibration parameters, the Zhang Zhengyou calibration method constructs a reprojection-error objective function and refines all parameters by maximum likelihood estimation:

min Σ_{i=1..n} Σ_{j=1..m} || p_ij − p̂(K, D, R_i, t_i, P_j) ||^2

where n is the number of calibration images, m is the number of feature points on each calibration image, K is the camera intrinsic matrix, D holds the camera distortion coefficients, R_i and t_i are the extrinsic parameters of image i, and p_ij and p̂(K, D, R_i, t_i, P_j) are, for the j-th feature point P_j on image i, the actual projected point and the projected point obtained from the camera imaging model.
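The quantity inside the double sum is the per-point reprojection error; a minimal sketch of the mean reprojection error over a handful of points, with made-up residuals for illustration:

```python
import math

def mean_reprojection_error(observed, reprojected):
    """Mean Euclidean distance between detected centres p_ij and model projections."""
    assert len(observed) == len(reprojected)
    total = sum(math.dist(p, q) for p, q in zip(observed, reprojected))
    return total / len(observed)

obs = [(100.0, 100.0), (200.0, 150.0)]       # detected circle centres
rep = [(100.3, 100.4), (200.0, 150.0)]       # hypothetical model reprojections
err = mean_reprojection_error(obs, rep)      # (0.5 + 0.0) / 2 = 0.25 px
```

It is this averaged figure that the comparison at the end of the description reports as reduced by 66.169% relative to the traditional ellipse-centre method.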
S403: compute the geometric distortion coefficients by least squares under the nonlinear distortion condition.
In practice the camera imaging model cannot reach a perfectly ideal linear model, owing to geometric distortion introduced by lens manufacturing accuracy and assembly variation. In the case of nonlinear distortion, the nonlinear distortion model is expressed as

x_u = x_d + δ_x(x_d, y_d)
y_u = y_d + δ_y(x_d, y_d)

where (x_d, y_d) are the actual distorted coordinates of the imaging point, (x_u, y_u) are its ideal undistorted coordinates, and δ_x(x_d, y_d) and δ_y(x_d, y_d) are the distortion amounts in the x and y directions at (x_d, y_d).

Lens distortion divides mainly into radial distortion, tangential distortion, thin-lens distortion and so on, of which radial and tangential distortion have the most significant influence; the invention therefore considers the radial and tangential distortion of the lens, for which the distortion amounts are respectively

δ_x(x_d, y_d) = x_d·(k_1·r^2 + k_2·r^4 + k_3·r^6) + 2·p_1·x_d·y_d + p_2·(r^2 + 2·x_d^2)
δ_y(x_d, y_d) = y_d·(k_1·r^2 + k_2·r^4 + k_3·r^6) + p_1·(r^2 + 2·y_d^2) + 2·p_2·x_d·y_d

where k_1, k_2, k_3 are the radial distortion coefficients, p_1, p_2 are the tangential distortion coefficients, and r^2 = x_d^2 + y_d^2.
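Applying these distortion amounts to a normalized image point can be sketched directly from the formulas; the coefficients below are hypothetical (mild barrel distortion, no tangential term):

```python
def correct_distortion(xd, yd, k1, k2, k3, p1, p2):
    """x_u = x_d + delta_x, y_u = y_d + delta_y, per the model above."""
    r2 = xd * xd + yd * yd
    radial = k1 * r2 + k2 * r2**2 + k3 * r2**3
    dx = xd * radial + 2 * p1 * xd * yd + p2 * (r2 + 2 * xd * xd)
    dy = yd * radial + p1 * (r2 + 2 * yd * yd) + 2 * p2 * xd * yd
    return xd + dx, yd + dy

# Hypothetical coefficients: k1 < 0 pulls points inward (barrel distortion).
xu, yu = correct_distortion(0.5, 0.0, k1=-0.1, k2=0.0, k3=0.0, p1=0.0, p2=0.0)
```

For a point on the x axis at radius 0.5, the radial term shifts it by 0.5 · k1 · r² = −0.0125, so the corrected coordinate is 0.4875; the least-squares step of S403 fits k1, k2, k3, p1, p2 to many such correspondences.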
S404: combine the intrinsic and extrinsic parameters with the distortion coefficients and refine the overall estimate by maximum likelihood estimation to obtain the final camera intrinsic and extrinsic parameters and distortion coefficients.
Embodiment 1 is as follows:
The camera resolution used in this example was … pixels, and the dot-matrix calibration plate was printed on A4 paper; its specification is shown in Table 1 below. The experimental environment was the image-processing open-source library OpenCV 3.2.0 under Visual Studio 2019, on a PC with an Intel Core i5 1.80 GHz CPU running Windows 10.
TABLE 1 Dot-matrix calibration plate specification
The camera was fixed and the calibration plate was photographed in different postures and positions, 16 different images being taken in total. For 2 of these images (denoted image ① and image ②), the pixel coordinates of the circle centers were extracted by the circular-mark-point center extraction method of steps S1 to S3; the process is shown in Fig. 9. Fig. 9 shows, from top to bottom, the original image, the inner- and outer-frame corner points, the plane transformation, the standard-circle centers, and the extracted circular-mark-point centers.
Further, according to the pixel coordinates and the corresponding space coordinates of the circular mark point centers, camera calibration is completed in combination with the Zhang Zhengyou calibration method, where the space coordinates are determined as follows: the axis layout of the world coordinate system of the dot-matrix calibration plate is determined automatically from the position of the shortest side AB of the pentagonal inner frame. Assuming the calibration plate lies in the plane z = 0 of the world coordinate system, the dot closest to AB is selected as the origin, with the x-axis horizontal and pointing right and the y-axis vertical and pointing down; since the physical distance between adjacent circular mark points is known, the three-dimensional coordinates of each circular mark point center can be determined, as shown in Fig. 10.
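Under the stated assumptions (plate in the plane z = 0, origin at the dot nearest AB, x rightward, y downward, known center spacing), the space coordinates of the circle centers form a regular grid. A sketch in Python; the 4 x 5 grid size and 20 mm spacing are illustrative, not the patent's values:

```python
import numpy as np

def object_points(rows, cols, spacing):
    """World coordinates of the circle centers on the z = 0 plate plane:
    origin at the dot nearest side AB, x to the right, y downward."""
    pts = np.zeros((rows * cols, 3))
    for r in range(rows):
        for c in range(cols):
            pts[r * cols + c] = (c * spacing, r * spacing, 0.0)
    return pts

# Illustrative 4 x 5 grid with 20 mm center spacing (not the patent's values).
pts = object_points(4, 5, 20.0)
```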
Further, the results of calibrating the camera by using the method of the present invention and the conventional method are compared.
Specifically, for the 16 captured images, the circular mark point centers were extracted both by the method of the present invention and by the conventional method, and the camera was then calibrated in combination with the Zhang Zhengyou calibration method; the resulting calibration results are shown in Table 2 below.
TABLE 2 comparison of camera calibration results
In the Zhang Zhengyou calibration method, the reprojection error is generally used to judge the camera calibration accuracy. The reprojection error is the deviation between the new projection point coordinates, obtained by reprojecting a three-dimensional spatial point with the calibrated internal and external parameters and distortion coefficients, and the original imaging point coordinates of that spatial point on the image. In general, the smaller the reprojection error, the higher the camera calibration accuracy. The reprojection error of each image and the average reprojection error over all images, for the method of the present invention and for the conventional method, are shown in Tables 3 and 4 below, respectively. As can be seen from Table 3, the reprojection error of the method of the present invention is 54.921%-75.410% lower than that of the conventional method for every image. As can be seen from Table 4, the overall average error of the method of the present invention is 0.0340, which is 66.169% lower than that of the conventional method. Together, Tables 3 and 4 demonstrate that the method greatly improves the accuracy of camera calibration, and also demonstrate the feasibility and effectiveness of the method of the present invention.
TABLE 3 Comparison of per-image camera calibration reprojection errors (unit: pixel)
TABLE 4 Comparison of the overall average camera calibration reprojection error (unit: pixel)
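The reprojection error described above can be sketched as follows: project the space points with the calibrated parameters and measure the pixel deviation from the originally imaged points. A minimal NumPy version of the ideal, distortion-free case; the intrinsics, pose, and points are illustrative, not calibration results from the patent:

```python
import numpy as np

def reproject(K, R, t, P):
    """Project N x 3 world points P through intrinsics K and pose (R, t)."""
    cam = R @ P.T + t.reshape(3, 1)      # world -> camera coordinates
    uv = K @ (cam / cam[2])              # perspective division, then intrinsics
    return uv[:2].T                      # N x 2 pixel coordinates

def reprojection_error(K, R, t, P, measured):
    """Mean Euclidean distance between reprojected and measured pixels."""
    diff = reproject(K, R, t, P) - measured
    return float(np.mean(np.linalg.norm(diff, axis=1)))

# Illustrative intrinsics and pose (not values from the patent).
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([0.0, 0.0, 10.0])
P = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
measured = reproject(K, R, t, P)         # perfect measurements -> zero error
err = reprojection_error(K, R, t, P, measured)
```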
The foregoing shows and describes the general principles, principal features, and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which are given by way of illustration of the principles of the present invention, but that various changes and modifications may be made without departing from the spirit and scope of the invention, and such changes and modifications are within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (8)

1. A high-precision camera calibration method based on plane transformation, characterized by comprising the following steps:
s1: on a plane calibration plate taking a circle as a marking point, extracting angular points on inner and outer frames of the calibration plate, accurately adjusting coordinates of the angular points to a sub-pixel level, and projecting an ellipse into an approximate standard circle through perspective transformation;
s2: calculating the center of mass by using the image moment to complete the extraction of the center coordinates of the standard circle;
s3: projecting the extracted circle center coordinates back to the original calibration plate plane through inverse perspective transformation, and acquiring actual pixel coordinates of the circle center of the calibration point;
s4: according to the pixel coordinates and the corresponding space coordinates of the circular mark point centers, completing camera calibration in combination with the Zhang Zhengyou calibration method;
wherein, the specific operation steps of step S1 include,
s101: preparing a dot matrix calibration plate;
s102: fixing a camera, changing the postures and positions of the calibration plates, and shooting images of the calibration plates at different viewing angles;
s103: preprocessing the acquired image;
s104: carrying out plane perspective transformation on the preprocessed image;
the specific operation of performing the planar perspective transformation on the preprocessed image in step S104 includes,
s1041: detecting the edge of the image, screening the edge by synthesizing the area and length constraints of the edge, and positioning the edge on the outer frame of the calibration plate;
s1042: detecting the corner points of the outer frame by using a Shi-Tomasi algorithm, and searching corner point coordinates at a sub-pixel level;
s1043: repeating the step S1041 and the step S1042, carrying out the same operation on the inner frame of the image, and detecting to obtain five corner points;
s1044: extending two sides of the pentagonal inner frame to form a quadrangle, and obtaining a new corner point at the intersection point;
s1045: estimating an optimal perspective transformation matrix T from the eight sub-pixel-level corner coordinates of the inner and outer frames, using the random sample consensus (RANSAC) algorithm iteratively, and making the calibration plate plane parallel to the imaging plane through the perspective transformation, whereby the mark points become approximately standard circles;
the eight sub-pixel-level corner coordinates of the inner frame and the outer frame comprise four sub-pixel-level corner coordinates of the outer frame and four sub-pixel-level corner coordinates of a quadrangle formed after two sides of the inner frame are extended.
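The estimation of the perspective transformation matrix T from corner correspondences (step S1045) is commonly done with the direct linear transform; a minimal NumPy sketch is given below. The RANSAC sampling loop and iterative refinement named in the claim are omitted here, and the point correspondences are illustrative:

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct linear transform: estimate the 3 x 3 perspective matrix T
    (8 degrees of freedom, a33 = 1) mapping src points to dst points."""
    A = []
    for (u, v), (x, y) in zip(src, dst):
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x, -x])
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y, -y])
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    T = Vt[-1].reshape(3, 3)     # right singular vector of smallest value
    return T / T[2, 2]           # normalize so that a33 = 1

# Illustrative correspondences: a pure translation by (5, -3).
src = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 3)]
dst = [(u + 5, v - 3) for u, v in src]
T = estimate_homography(src, dst)
```

A RANSAC layer would repeatedly fit this estimator on random minimal subsets and keep the matrix with the most inliers, which is the iterative idea the claim describes.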
2. The method for calibrating a high precision camera based on planar transformation according to claim 1, wherein the specific operation step of preprocessing the acquired image in step S103 includes,
s1031: converting the collected image into a gray image;
s1032: denoising the gray level image by Gaussian filtering;
s1033: and carrying out binarization processing on the denoised gray level image by using a maximum inter-class variance method.
3. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 1, wherein the perspective transformation in step S1045 transforms the image from its original viewing plane onto a new viewing plane, using the formula
[x, y, z]^T = T [u, v, w]^T, with
T = | a_11 a_12 a_13 |
    | a_21 a_22 a_23 |
    | a_31 a_32 a_33 |
where (u, v, w) and (x, y, z) are the coordinates before and after the perspective transformation of the image, and T is the perspective transformation matrix, a homogeneous matrix with 8 degrees of freedom in which a_33 = 1;
Assuming that the pixel coordinates of the image before and after the perspective transformation are (u', v') and (x', y') respectively, solving the perspective transformation formula gives the transformed pixel coordinates:
x' = (a_11 u' + a_12 v' + a_13) / (a_31 u' + a_32 v' + 1)
y' = (a_21 u' + a_22 v' + a_23) / (a_31 u' + a_32 v' + 1)
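The division by the third homogeneous component in the perspective transformation can be sketched as follows; the matrix is illustrative (a scale by 2 plus a translation), not one estimated from real corners:

```python
import numpy as np

def warp_point(T, u, v):
    """Apply the 3 x 3 perspective matrix T to pixel (u, v): multiply in
    homogeneous coordinates, then divide by the third component."""
    x, y, w = T @ np.array([u, v, 1.0])
    return x / w, y / w

# Illustrative matrix: scale by 2 and translate by (1, -1).
T = np.array([[2.0, 0.0, 1.0], [0.0, 2.0, -1.0], [0.0, 0.0, 1.0]])
x, y = warp_point(T, 3.0, 4.0)
```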
4. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 2, wherein the specific operation of extracting the coordinates of the standard-circle centers in step S2 includes,
s201: regarding each mark point as an h × w digital image; according to the definition of image moments, the (p+q)-th order moment of the image is expressed as
m_pq = Σ_{u=1..w} Σ_{v=1..h} u^p v^q f(u, v)
where f(u, v) is the gray value of the image at pixel coordinates (u, v);
s202: using the zeroth-order moment m_00 and the first-order moments m_10 and m_01 of each mark point, calculating the centroid coordinates of each mark point as
(u_c, v_c) = (m_10 / m_00, m_01 / m_00)
s203: because each mark point is approximately a standard circle, the centroid coordinates of each mark point can be taken as the pixel coordinates of the standard-circle center.
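The moment-based centroid computation of steps S201 and S202 can be sketched as follows; the synthetic disc image standing in for a mark point is illustrative:

```python
import numpy as np

def centroid(img):
    """Centroid from image moments m_pq = sum_u sum_v u^p v^q f(u, v),
    with 0-based pixel indices: u along columns, v along rows."""
    h, w = img.shape
    v_idx, u_idx = np.mgrid[0:h, 0:w]
    m00 = img.sum()
    m10 = (u_idx * img).sum()
    m01 = (v_idx * img).sum()
    return m10 / m00, m01 / m00

# Illustrative synthetic mark point: a filled disc centered at (u, v) = (10, 12).
vv, uu = np.mgrid[0:25, 0:25]
img = ((uu - 10) ** 2 + (vv - 12) ** 2 <= 36).astype(float)
uc, vc = centroid(img)
```

By symmetry of the disc, the recovered centroid equals the true center, which is the property step S203 relies on for near-standard circles.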
5. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 4, wherein the specific operation of step S3 includes the following steps,
s301: inverting the perspective transformation matrix T of step S1045 to obtain the inverse perspective transformation matrix T_inv = T^(-1);
s302: performing the inverse perspective transformation on the pixel coordinates of each standard-circle center obtained in step S202, according to
[u', v', w']^T = T_inv [x', y', z']^T
where (x', y', z') and (u', v', w') are the coordinates before and after the inverse perspective transformation of the image, respectively; the pixel coordinates of each standard-circle center obtained in step S202 are input as (x', y', z'), and the output (u', v', w') gives the coordinates of the actual mark-point center;
s303: through the inverse perspective transformation of step S302, projecting the circular mark points from the new viewing plane back onto the original calibration plate plane, thereby obtaining the pre-transformation coordinates of each standard-circle center, i.e., the actual pixel coordinates of the circular mark point centers.
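Steps S301 to S303 amount to a homogeneous round trip: transform to the new plane with T, then map the extracted center back with T_inv = T^(-1). A minimal NumPy sketch with an illustrative matrix and center point:

```python
import numpy as np

def apply_T(T, p):
    """Homogeneous transform of a pixel p = (x, y) by a 3 x 3 matrix."""
    x, y, w = T @ np.array([p[0], p[1], 1.0])
    return np.array([x / w, y / w])

# Illustrative perspective matrix (not one estimated from real corners).
T = np.array([[1.2, 0.1, 5.0], [-0.1, 0.9, 3.0], [0.01, 0.0, 1.0]])
T_inv = np.linalg.inv(T)                 # step S301: T_inv = T^(-1)

center = np.array([40.0, 25.0])          # a standard-circle center (step S202)
warped = apply_T(T, center)              # forward transform (step S1045)
back = apply_T(T_inv, warped)            # steps S302/S303: back to plate plane
```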
6. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 5, wherein the specific operation of step S4 includes the following steps,
s401: calculating internal and external parameters of the camera under an ideal distortion-free condition according to the pixel coordinate and the space coordinate corresponding to the circle center of the circular mark point;
s402: improving the internal and external parameter precision of the camera obtained in the step S401 by utilizing maximum likelihood estimation;
s403: under the condition of nonlinear distortion, calculating a geometric distortion coefficient by using a least square method;
s404: and integrating the internal and external parameters and the distortion coefficients, and improving the overall estimation precision by utilizing the maximum likelihood estimation to obtain the final internal and external parameters and the distortion coefficients of the camera.
7. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 6, wherein in step S401, under the ideal, distortion-free condition, the camera imaging model is a pinhole model; the spatial coordinates of a circular mark point center are denoted P = [X_W, Y_W, Z_W]^T, and its projection point on the calibration plate plane, i.e., the actual pixel coordinates of the circular mark point center obtained in step S303, is p = [u, v]^T; the corresponding homogeneous coordinates are respectively
P̃ = [X_W, Y_W, Z_W, 1]^T and p̃ = [u, v, 1]^T;
the projection imaging model is represented as
s p̃ = K [R t] P̃, with
K = | f_x  γ   u_0 |
    |  0   f_y v_0 |
    |  0    0   1  |
where s is an arbitrary scale factor, K is the intrinsic parameter matrix, R and t are respectively the rotation matrix and translation vector from the world coordinate system to the camera coordinate system, together forming the extrinsic parameter matrix, (u_0, v_0) are the principal point coordinates of the image, f_x and f_y are the effective focal lengths along the horizontal and vertical image axes respectively, and γ is the skew factor.
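The projection model s p̃ = K [R t] P̃ can be sketched as follows; all numeric values (focal lengths, principal point, pose) are illustrative, not calibrated parameters:

```python
import numpy as np

# Illustrative intrinsics: focal lengths fx, fy, skew gamma, principal point.
fx, fy, gamma, u0, v0 = 1000.0, 1000.0, 0.0, 640.0, 360.0
K = np.array([[fx, gamma, u0], [0.0, fy, v0], [0.0, 0.0, 1.0]])

R = np.eye(3)                            # illustrative pose: axes aligned,
t = np.array([[0.0], [0.0], [2.0]])      # plate 2 units in front of the camera

P_h = np.array([0.1, -0.05, 0.0, 1.0])   # homogeneous world point, Z_W = 0
sp = K @ np.hstack([R, t]) @ P_h         # s * [u, v, 1]^T
u, v = sp[0] / sp[2], sp[1] / sp[2]      # divide out the scale factor s
```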
8. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 7, wherein in step S403, in the case of nonlinear distortion, the nonlinear distortion model is expressed as
x_u = x_d + δ_x(x_d, y_d)
y_u = y_d + δ_y(x_d, y_d)
where (x_d, y_d) are the coordinates of the imaging point in the ideal case, (x_u, y_u) are the actual distorted coordinates of the imaging point, and δ_x(x_d, y_d) and δ_y(x_d, y_d) are the amounts of distortion in the x and y directions at the imaging point (x_d, y_d);
taking into account the radial and tangential distortion of the lens, the distortion amounts δ_x(x_d, y_d) and δ_y(x_d, y_d) are respectively
δ_x(x_d, y_d) = x_d(k_1 r^2 + k_2 r^4 + k_3 r^6) + 2p_1 x_d y_d + p_2(r^2 + 2x_d^2)
δ_y(x_d, y_d) = y_d(k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1(r^2 + 2y_d^2) + 2p_2 x_d y_d
where k_1, k_2, k_3 are the radial distortion coefficients, p_1 and p_2 are the tangential distortion coefficients, and r^2 = x_d^2 + y_d^2.
CN202110282708.0A 2021-03-16 2021-03-16 High-precision camera calibration method based on plane transformation Active CN113012234B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110282708.0A CN113012234B (en) 2021-03-16 2021-03-16 High-precision camera calibration method based on plane transformation


Publications (2)

Publication Number Publication Date
CN113012234A CN113012234A (en) 2021-06-22
CN113012234B true CN113012234B (en) 2022-09-02

Family

ID=76408584

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110282708.0A Active CN113012234B (en) 2021-03-16 2021-03-16 High-precision camera calibration method based on plane transformation

Country Status (1)

Country Link
CN (1) CN113012234B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838138B (en) * 2021-08-06 2024-06-21 杭州灵西机器人智能科技有限公司 System calibration method, system, device and medium for optimizing feature extraction
CN114782553B (en) * 2022-05-11 2023-07-28 江南大学 Iterative camera calibration method and device based on elliptic dual conic
CN115147499A (en) * 2022-07-11 2022-10-04 深圳思谋信息科技有限公司 Calibration parameter determination method, hybrid calibration plate, device, equipment and medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101303768A (en) * 2008-06-17 2008-11-12 东南大学 Method for correcting circle center error of circular index point when translating camera perspective projection
CN102915535A (en) * 2012-08-23 2013-02-06 深圳大学 Method and system for correcting circle center deviation of round mark points during camera projection transformation
CN109829948A (en) * 2018-12-13 2019-05-31 昂纳自动化技术(深圳)有限公司 Camera calibration plate, calibration method and camera
CN109859277A (en) * 2019-01-21 2019-06-07 陕西科技大学 A kind of robotic vision system scaling method based on Halcon

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN105981074B (en) * 2014-11-04 2018-02-02 深圳市大疆创新科技有限公司 For demarcating system, the method and apparatus of imaging device


Non-Patent Citations (3)

Title
Zhang's camera calibration method based on a circular-array calibration plate; Wang Shoukun et al.; Transactions of Beijing Institute of Technology; 2019-08-31; sections 1, 2.2, 3 *
High-precision camera calibration method based on calculating the true image coordinates of circle centers; Lu Xiaodong et al.; Chinese Journal of Lasers; 2019-11-07; entire document *
Three-dimensional measurement system for small features based on structured light; Zhang Guanjin et al.; Modular Machine Tool & Automatic Manufacturing Technique; 2018-09-30; section 2.1 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant