CN113012234A - High-precision camera calibration method based on plane transformation - Google Patents
- Publication number
- CN113012234A (application CN202110282708.0A)
- Authority
- CN
- China
- Prior art keywords
- coordinates
- image
- center
- calibration
- point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/70—Denoising; Smoothing
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20112—Image segmentation details
- G06T2207/20164—Salient point detection; Corner detection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Geometry (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a high-precision camera calibration method based on plane transformation, comprising the following steps. S1: on a planar calibration plate with circles as mark points, extract the corner points of the plate's inner and outer frames, refine their coordinates to sub-pixel accuracy, and project each ellipse into an approximately standard circle by perspective transformation; S2: extract the center coordinates of each standard circle by computing the centroid from image moments; S3: project the extracted center coordinates back onto the original calibration plate plane by inverse perspective transformation to obtain the actual pixel coordinates of the mark-point centers; S4: complete the camera calibration with the Zhang Zhengyou calibration method using the pixel coordinates and the corresponding spatial coordinates of the circular mark-point centers. The method effectively reduces camera calibration error and improves camera calibration precision.
Description
Technical Field
The invention relates to the technical field of camera calibration, in particular to a high-precision camera calibration method based on plane transformation.
Background
In the field of computer vision, camera calibration plays an irreplaceable role. Commonly used camera calibration methods fall into three categories: traditional calibration methods, self-calibration methods, and active-vision calibration methods. Self-calibration requires no calibration object and is highly flexible, but suffers from poor robustness and low precision. Active-vision calibration is simple and admits a linear solution, but cannot be applied when the camera motion parameters are unknown. Traditional calibration achieves high precision and is widely used in high-precision measurement and three-dimensional reconstruction, but requires a calibration object; typical methods include the direct linear transformation (DLT) calibration method, the Tsai two-step method, and the Zhang Zhengyou calibration method.
The Zhang Zhengyou calibration method requires only a single checkerboard as a planar calibration plate and, in practical use, offers simple operation and high calibration precision, which has made it one of the most widely used calibration algorithms in high-precision industrial measurement.
When a planar calibration plate with circles as mark points is used, calibration accuracy is determined by how accurately the center coordinates of the circular mark points are extracted. When the plate is photographed, its plane is generally not parallel to the camera's imaging plane; because of the resulting tilt, each circular mark point is projected as an ellipse, and owing to perspective bias the center extracted from that ellipse is not the projection of the circle's true physical center. Traditional methods nevertheless take the ellipse center as a substitute for that projection, which inevitably reduces camera calibration precision.
Disclosure of Invention
Aiming at the existing problems, the invention aims to provide a high-precision camera calibration method based on plane transformation, which can effectively reduce the camera calibration error and improve the camera calibration precision.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
the high-precision camera calibration method based on plane transformation is characterized by comprising the following steps:
s1: on a plane calibration plate taking a circle as a marking point, extracting angular points on inner and outer frames of the calibration plate, accurately adjusting coordinates of the angular points to a sub-pixel level, and projecting an ellipse into an approximate standard circle through perspective transformation;
s2: the center of mass is calculated by using the image moment to complete the extraction of the center coordinates of the standard circle;
s3: projecting the extracted circle center coordinates back to the original calibration plate plane through inverse perspective transformation, and acquiring actual pixel coordinates of the circle center of the calibration point;
s4: completing the camera calibration with the Zhang Zhengyou calibration method according to the pixel coordinates and the corresponding spatial coordinates of the circular mark-point centers.
Further, the specific operation of step S1 includes,
s101: preparing a dot matrix calibration plate;
s102: fixing a camera, changing the postures and positions of the calibration plates, and shooting images of the calibration plates at different viewing angles;
s103: preprocessing the acquired image;
s104: and carrying out plane perspective transformation on the preprocessed image.
Further, the specific operation step of preprocessing the acquired image in step S103 includes,
s1031: converting the collected image into a gray image;
s1032: denoising the gray level image by Gaussian filtering;
s1033: carrying out binarization processing on the denoised grayscale image by using the maximum between-class variance (Otsu) method.
Further, the specific operation step of performing the planar perspective transformation on the preprocessed image in step S104 includes,
s1041: detecting the edges of the image, screening them by combining area and length constraints on the edges, and locating the outer frame of the calibration plate;
s1042: detecting the corner points of the outer frame by using a Shi-Tomasi algorithm, and searching corner point coordinates at a sub-pixel level;
s1043: repeating the step S1041 and the step S1042, carrying out the same operation on the inner frame of the image, and detecting to obtain five corner points;
s1044: extending two sides of the pentagonal inner frame to form a quadrangle, and obtaining a new corner point at the intersection point;
s1045: estimating an optimal perspective transformation matrix T from the eight sub-pixel corner coordinates of the inner and outer frames by iteratively applying a random sample consensus (RANSAC) algorithm, and rendering the calibration plate plane parallel to the imaging plane through the perspective transformation, whereby the mark points become approximately standard circles;
the eight sub-pixel-level corner coordinates contained in the inner frame and the outer frame comprise four sub-pixel-level corner coordinates of the outer frame and four quadrilateral sub-pixel-level corner coordinates formed after two sides of the inner frame are extended.
Further, the perspective transformation described in step S1045 essentially transforms the image from one view plane to a new view plane using the formula
$$\begin{bmatrix} x \\ y \\ z \end{bmatrix} = T \begin{bmatrix} u \\ v \\ w \end{bmatrix}, \qquad T = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix},$$
where (u, v, w) and (x, y, z) are the coordinates before and after the perspective transformation of the image, and T is the perspective transformation matrix, a homogeneous matrix with 8 degrees of freedom and $a_{33} = 1$;
Assuming that the pixel coordinates of the image before and after the perspective transformation are (u', v') and (x', y') respectively, solving the formula of the perspective transformation yields the pixel coordinates after the transformation:
$$x' = \frac{x}{z} = \frac{a_{11}u' + a_{12}v' + a_{13}}{a_{31}u' + a_{32}v' + a_{33}}, \qquad y' = \frac{y}{z} = \frac{a_{21}u' + a_{22}v' + a_{23}}{a_{31}u' + a_{32}v' + a_{33}}$$
further, the specific operation of extracting the coordinates of the center of the standard circle in step S2 includes,
s201: regarding each mark point as an h × w digital image; from the definition of image moments, the (p+q)-order moment of the image is
$$m_{pq} = \sum_{u=1}^{h}\sum_{v=1}^{w} u^{p}\,v^{q}\,f(u,v),$$
where f(u, v) is the gray value of the image at pixel coordinates (u, v);
s202: using the zeroth-order moment $m_{00}$ and the first-order moments $m_{10}$, $m_{01}$ of each mark point, calculating the centroid coordinates of each mark point as
$$(\bar{u}, \bar{v}) = \left(\frac{m_{10}}{m_{00}}, \frac{m_{01}}{m_{00}}\right);$$
S203: because each mark point is approximate to a standard circle, the centroid coordinate of each mark point can be regarded as the pixel coordinate of the center of the standard circle.
Further, the specific operation of step S3 includes the following steps,
s301: performing the inverse operation on the perspective transformation matrix T from step S1045 to obtain the inverse perspective transformation matrix $T_{inv} = T^{-1}$;
S302: applying the inverse perspective transformation to the pixel coordinates of each standard-circle center obtained in step S202, according to
$$\begin{bmatrix} u' \\ v' \\ w' \end{bmatrix} = T_{inv} \begin{bmatrix} x' \\ y' \\ z' \end{bmatrix},$$
where (x', y', z') and (u', v', w') are the coordinates before and after the inverse perspective transformation of the image; the pixel coordinates of each standard-circle center from step S202 serve as the input coordinates (x', y', z'), and the output (u', v', w') gives the coordinates of the actual mark-point center;
s303: through the inverse perspective transformation of step S302, transforming the circular mark points from the new view plane and projecting them back onto the original calibration plate plane, yielding the pre-perspective pixel coordinates of each standard-circle center, i.e. the actual pixel coordinates of the circular mark-point centers.
Further, the specific operation of step S4 includes the following steps,
s401: calculating internal and external parameters of the camera under an ideal distortion-free condition according to the pixel coordinate and the space coordinate corresponding to the circle center of the circular mark point;
s402: improving the internal and external parameter precision of the camera obtained in the step S401 by utilizing maximum likelihood estimation;
s403: under the condition of nonlinear distortion, calculating a geometric distortion coefficient by using a least square method;
s404: and integrating the internal and external parameters and the distortion coefficients, and improving the overall estimation precision by utilizing the maximum likelihood estimation to obtain the final internal and external parameters and the distortion coefficients of the camera.
Further, in step S401, in the ideal distortion-free case the camera imaging model is a pinhole model. Let the spatial coordinate of a circular mark-point center be $P = [X_W, Y_W, Z_W]^T$ and its projection on the calibration plate plane, i.e. the actual pixel coordinate obtained in step S303, be $p = [u, v]^T$, with corresponding homogeneous coordinates $\tilde{P} = [X_W, Y_W, Z_W, 1]^T$ and $\tilde{p} = [u, v, 1]^T$. The projection imaging model is expressed as
$$s\,\tilde{p} = K\,[R \mid t]\,\tilde{P}, \qquad K = \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix},$$
where s is an arbitrary scale factor, K is the intrinsic matrix, R and t are the rotation and translation matrices from the world coordinate system to the camera coordinate system (together forming the extrinsic matrix), $(u_0, v_0)$ are the principal-point coordinates of the image, $f_x$ and $f_y$ are the effective focal lengths along the horizontal and vertical image axes, and $\gamma$ is the tilt (skew) factor.
Further, in step S403, under nonlinear distortion the distortion model is expressed as
$$x_u = x_d + \delta_x(x_d, y_d), \qquad y_u = y_d + \delta_y(x_d, y_d),$$
where $(x_d, y_d)$ are the imaging-point coordinates in the ideal case, $(x_u, y_u)$ are the actual distorted imaging-point coordinates, and $\delta_x(x_d, y_d)$ and $\delta_y(x_d, y_d)$ are the distortion amounts in the x and y directions at $(x_d, y_d)$;
Taking the radial and tangential distortion of the lens into account, the distortion amounts $\delta_x(x_d, y_d)$ and $\delta_y(x_d, y_d)$ are respectively
$$\delta_x(x_d, y_d) = x_d\,(k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x_d y_d + p_2 (r^2 + 2 x_d^2),$$
$$\delta_y(x_d, y_d) = y_d\,(k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1 (r^2 + 2 y_d^2) + 2 p_2 x_d y_d,$$
with $r^2 = x_d^2 + y_d^2$, where $k_1$, $k_2$, $k_3$ are the radial distortion coefficients and $p_1$, $p_2$ the tangential distortion coefficients.
The invention has the following beneficial effects:
the invention provides a high-precision camera calibration method based on plane transformation, which comprises the steps of carrying out first transformation on a plane of a calibration plate, projecting an elliptical mark point into an approximate standard circle, and extracting coordinates of the center of the standard circle; and performing secondary transformation on the plane of the calibration plate, projecting the circle center coordinates extracted by the primary transformation onto the original elliptic mark points, and acquiring the actual pixel coordinates of the circle centers of the mark points, thereby avoiding the interference of perspective deviation on the extraction of the circle center coordinates. And then, the camera is calibrated by combining a Zhang Zhengyou calibration method, and compared with the traditional method, the total average reprojection error of the calibration method disclosed by the invention is reduced by 66.169%, so that the camera calibration precision is greatly improved.
Drawings
FIG. 1 is a flow chart of a camera calibration method according to the present invention;
FIG. 2 is a dot matrix calibration plate used for image acquisition in the present invention;
FIG. 3 is a result of pre-processing of an acquired image in accordance with the present invention;
FIG. 4 shows the result of extracting sub-pixel-level corner points from the outer quadrilateral frame;
FIG. 5 shows the result of extracting sub-pixel-level corner points for the pentagonal inner frames in the present invention;
FIG. 6 is a perspective transformed calibration plate of the present invention;
FIG. 7 shows the result of extracting the center of a standard circle according to the present invention;
FIG. 8 shows the result of extracting the center of a circle of a circular mark point according to the present invention;
FIG. 9 is the process of extracting the circle centers of the circular mark points of images ① and ② in the first embodiment of the present invention;
FIG. 10 is a schematic view of a world coordinate system axis distribution of a midpoint array calibration plate according to an embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the following further describes the technical solution of the present invention with reference to the drawings and the embodiments.
As shown in fig. 1, the method for calibrating a high-precision camera based on planar transformation comprises the following steps,
s1: on a plane calibration plate taking a circle as a marking point, extracting angular points on inner and outer frames of the calibration plate, accurately adjusting coordinates of the angular points to a sub-pixel level, and projecting an ellipse into an approximate standard circle through perspective transformation;
specifically, S101: preparing a dot matrix calibration plate; in the invention, a dot matrix calibration plate specified by Halcon is adopted and consists of 49 black round mark points with equal diameters in 7 rows and 7 columns, as shown in figure 2.
S102: fixing the camera, changing the posture and position of the calibration plate, and shooting 16-20 images of the calibration plate at different viewing angles;
s103: preprocessing the acquired image;
specifically, the specific operation steps of preprocessing the acquired image include,
s1031: converting the collected image into a gray image;
s1032: denoising the gray level image by Gaussian filtering;
s1033: carrying out binarization processing on the denoised grayscale image by using the maximum between-class variance (Otsu) method.
As shown in fig. 3, (a) is the original image before preprocessing, (b) the grayscale image, (c) the denoised image, and (d) the binarized image. The maximum between-class variance (Otsu) method is an algorithm that determines a segmentation threshold for binarizing an image; it is insensitive to image brightness and contrast and is widely used in image processing, but it is sensitive to noise, so the image must be denoised before the method is applied.
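The threshold selection described above can be sketched as follows. This is a minimal numpy implementation of the maximum between-class variance (Otsu) criterion on a synthetic bimodal image; the function name `otsu_threshold` and the toy image are illustrative, not part of the patent's implementation.

```python
import numpy as np

def otsu_threshold(gray):
    """Maximum between-class variance (Otsu) threshold for an 8-bit grayscale image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    cum_w = np.cumsum(hist)                      # pixel count at or below each level
    cum_mu = np.cumsum(hist * np.arange(256))    # cumulative intensity sum
    best_t, best_var = 0, -1.0
    for t in range(256):
        w0 = cum_w[t] / total                    # background class weight
        w1 = 1.0 - w0
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = cum_mu[t] / cum_w[t]               # background class mean
        mu1 = (cum_mu[-1] - cum_mu[t]) / (cum_w[-1] - cum_w[t])
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# synthetic bimodal image: a dark dot on a bright background
img = np.full((20, 20), 200, dtype=np.uint8)
img[5:10, 5:10] = 30
t = otsu_threshold(img)
binary = (img > t).astype(np.uint8) * 255
```

Any threshold separating the two modes maximizes the criterion here; on real calibration images the histogram is noisier, which is why the denoising step precedes binarization.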
S104: and carrying out plane perspective transformation on the preprocessed image.
The perspective transformation essentially transforms an image from one view plane to a new view plane, using the formula
$$\begin{bmatrix} x \\ y \\ z \end{bmatrix} = T \begin{bmatrix} u \\ v \\ w \end{bmatrix}, \qquad T = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix},$$
where (u, v, w) and (x, y, z) are the coordinates before and after the perspective transformation of the image, and T is the perspective transformation matrix, a homogeneous matrix with 8 degrees of freedom and $a_{33} = 1$;
Assuming that the pixel coordinates of the image before and after the perspective transformation are (u', v') and (x', y') respectively, solving the formula of the perspective transformation yields the pixel coordinates after the transformation:
$$x' = \frac{x}{z} = \frac{a_{11}u' + a_{12}v' + a_{13}}{a_{31}u' + a_{32}v' + a_{33}}, \qquad y' = \frac{y}{z} = \frac{a_{21}u' + a_{22}v' + a_{23}}{a_{31}u' + a_{32}v' + a_{33}}$$
From this solution, the perspective transformation matrix can be obtained from four pairs of pixel coordinates. Because the image may suffer from noise, erroneous corner extraction and similar problems, the matrix is generally solved from more than four pairs of pixel coordinates, and the resulting matrix is then used to apply the plane transformation to the image.
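The linear solve described above can be sketched as follows: each correspondence contributes two equations in the eight unknowns of T, and a least-squares solve handles more than four pairs. The function names `solve_homography` and `warp_point` and the sample coordinates are illustrative assumptions, not the patent's code.

```python
import numpy as np

def solve_homography(src, dst):
    """Estimate the 3x3 perspective matrix T with a33 = 1 from n >= 4
    point correspondences by linear least squares (DLT)."""
    A, b = [], []
    for (u, v), (x, y) in zip(src, dst):
        # x = (a11*u + a12*v + a13) / (a31*u + a32*v + 1), similarly for y
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x])
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y])
        b.extend([x, y])
    h, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(T, pt):
    """Apply T to a 2-D point in homogeneous form and dehomogenize."""
    x, y, z = T @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / z, y / z])

# four corners of a tilted quadrilateral mapped to an axis-aligned square
src = [(0, 0), (10, 1), (11, 12), (-1, 10)]
dst = [(0, 0), (10, 0), (10, 10), (0, 10)]
T = solve_homography(src, dst)
```

With exactly four non-degenerate correspondences the 8×8 system is solved exactly; with more pairs the least-squares fit averages out corner-extraction noise.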
The specific operation of performing the planar perspective transformation on the preprocessed image includes the following steps,
s1041: detecting the edges of the image, screening them by combining area and length constraints on the edges, and locating the outer frame of the calibration plate;
s1042: the corner points of the outer frame are detected by using the Shi-Tomasi algorithm, and corner point coordinates at the sub-pixel level are searched, as shown in fig. 4.
S1043: repeating the step S1041 and the step S1042, carrying out the same operation on the inner frame of the image, and detecting to obtain five corner points;
s1044: extending two sides of the pentagonal inner frame to form a quadrangle, and obtaining a new corner point at the intersection point; as shown in fig. 5.
S1045: because the image may contain noise and erroneously extracted corners, the obtained sub-pixel corner coordinates may not all be accurate. Therefore, using the random sample consensus (RANSAC) algorithm iteratively, the optimal perspective transformation matrix T is estimated from the eight sub-pixel corner coordinates of the inner and outer frames; the perspective transformation then renders the calibration plate plane parallel to the imaging plane, with the result shown in fig. 6, and the mark points become approximately standard circles;
the eight sub-pixel-level corner coordinates contained in the inner frame and the outer frame comprise four sub-pixel-level corner coordinates of the outer frame and four quadrilateral sub-pixel-level corner coordinates formed after two sides of the inner frame are extended.
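The RANSAC estimation of steps S1045 can be sketched as below: repeatedly fit T to four randomly sampled correspondences, keep the hypothesis with the most inliers, and refit on all inliers. This is a generic, self-contained numpy sketch under assumed tolerances and synthetic data, not the patent's implementation; `fit_T`, `reproj_err` and `ransac_T` are hypothetical names.

```python
import numpy as np

def fit_T(src, dst):
    # linear least-squares (DLT) fit of a 3x3 perspective matrix with a33 = 1
    A, b = [], []
    for (u, v), (x, y) in zip(src, dst):
        A.append([u, v, 1, 0, 0, 0, -u * x, -v * x])
        A.append([0, 0, 0, u, v, 1, -u * y, -v * y])
        b.extend([x, y])
    h, *_ = np.linalg.lstsq(np.array(A, float), np.array(b, float), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

def reproj_err(T, src, dst):
    # per-point distance between T-mapped src points and dst points
    p = np.c_[src, np.ones(len(src))] @ T.T
    return np.linalg.norm(p[:, :2] / p[:, 2:3] - dst, axis=1)

def ransac_T(src, dst, n_iter=100, tol=0.5, seed=0):
    # keep the 4-point hypothesis with the most inliers, then refit on them
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_iter):
        idx = rng.choice(len(src), 4, replace=False)
        inliers = reproj_err(fit_T(src[idx], dst[idx]), src, dst) < tol
        if best is None or inliers.sum() > best.sum():
            best = inliers
    return fit_T(src[best], dst[best])

# synthetic correspondences under a known matrix, with one gross outlier
T_true = np.array([[1.1, 0.05, 2.0],
                   [0.02, 0.95, -1.0],
                   [5e-4, 1e-3, 1.0]])
src = np.array([[0, 0], [10, 0], [0, 10], [10, 10],
                [3, 5], [7, 2], [2, 7], [9, 4]], float)
h = np.c_[src, np.ones(len(src))] @ T_true.T
dst = h[:, :2] / h[:, 2:3]
dst[0] += 20.0                      # corrupt one correspondence
T_est = ransac_T(src, dst)
```

The refit on the consensus set is what makes the estimate robust to the corrupted corner while still averaging over all accurate ones.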
Further, step S2: the center of mass is calculated by using the image moment to complete the extraction of the center coordinates of the standard circle;
image moments are operators describing image features, and essentially perform a special weighting on the gray-scale values of an image. The specific operation steps of utilizing the image moment to calculate the mass center to complete the extraction of the center coordinates of the standard circle comprise,
s201: regarding each mark point as an h × w digital image; from the definition of image moments, the (p+q)-order moment of the image is
$$m_{pq} = \sum_{u=1}^{h}\sum_{v=1}^{w} u^{p}\,v^{q}\,f(u,v),$$
where f(u, v) is the gray value of the image at pixel coordinates (u, v);
s202: using the zeroth-order moment $m_{00}$ and the first-order moments $m_{10}$, $m_{01}$ of each mark point, calculating the centroid coordinates of each mark point as
$$(\bar{u}, \bar{v}) = \left(\frac{m_{10}}{m_{00}}, \frac{m_{01}}{m_{00}}\right);$$
S203: since each mark point is approximate to a standard circle, the coordinates of the centroid of each mark point can be regarded as the pixel coordinates of the center of the standard circle, as shown in fig. 7.
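The moment-based centroid computation of steps S201-S203 can be sketched as follows; the synthetic disc stands in for a binarized mark point, and the function name is an illustrative choice.

```python
import numpy as np

def centroid_from_moments(f):
    """Centroid of an image patch via raw image moments:
    m_pq = sum_u sum_v u^p v^q f(u, v); centroid = (m10/m00, m01/m00)."""
    f = np.asarray(f, float)
    u = np.arange(f.shape[0])[:, None]   # row index
    v = np.arange(f.shape[1])[None, :]   # column index
    m00 = f.sum()
    m10 = (u * f).sum()
    m01 = (v * f).sum()
    return m10 / m00, m01 / m00

# a filled disc of radius 5 centred at (12, 9) inside a 25x20 patch
U, V = np.meshgrid(np.arange(25), np.arange(20), indexing="ij")
disc = ((U - 12) ** 2 + (V - 9) ** 2 <= 25).astype(float)
cu, cv = centroid_from_moments(disc)
```

For a symmetric standard circle the moment centroid recovers the center exactly, which is why the perspective transformation to a near-circular shape precedes this step.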
Further, step S3: projecting the extracted circle center coordinates back to the original calibration plate plane through inverse perspective transformation, and acquiring actual pixel coordinates of the circle center of the calibration point;
the inverse perspective transformation is the inverse process of the perspective transformation, and essentially transforms the image from the new viewing plane to the original viewing plane, and in particular,
s301: performing inverse operation on the perspective transformation matrix T in the step S1045 to obtain an inverse perspective transformation matrix Tinv,Tinv=T-1;
S302: applying the inverse perspective transformation to the pixel coordinates of each standard-circle center obtained in step S202, according to
$$\begin{bmatrix} u' \\ v' \\ w' \end{bmatrix} = T_{inv} \begin{bmatrix} x' \\ y' \\ z' \end{bmatrix},$$
where (x', y', z') and (u', v', w') are the coordinates before and after the inverse perspective transformation of the image; the pixel coordinates of each standard-circle center from step S202 serve as the input coordinates (x', y', z'), and the output (u', v', w') gives the coordinates of the actual mark-point center;
s303: through the inverse perspective transformation of step S302, the circular mark points are transformed from the new view plane and projected back onto the original calibration plate plane, yielding the pre-perspective pixel coordinates of each standard-circle center, i.e. the actual pixel coordinates of the circular mark-point centers, as shown in fig. 8.
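The back-projection of steps S301-S303 can be sketched as follows: invert T and map each extracted center through it with dehomogenization. The matrix T and center coordinates here are assumed illustrative values.

```python
import numpy as np

def back_project(T, pts):
    """Map points from the fronto-parallel (transformed) plane back to the
    original image plane via the inverse perspective matrix T_inv = T^-1."""
    T_inv = np.linalg.inv(T)
    out = []
    for x, y in pts:
        u, v, w = T_inv @ np.array([x, y, 1.0])
        out.append((u / w, v / w))   # dehomogenize
    return out

# hypothetical perspective matrix with a mild projective component
T = np.array([[1.2, 0.1, 5.0],
              [0.05, 0.9, -3.0],
              [1e-3, 2e-3, 1.0]])
centers = [(10.0, 20.0), (30.5, 40.25)]
orig = back_project(T, centers)
```

Mapping a back-projected point forward through T recovers the original center, confirming the two transformations are exact inverses.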
Further, step S4: completing the camera calibration with the Zhang Zhengyou calibration method according to the pixel coordinates and the corresponding spatial coordinates of the circular mark-point centers.
Camera calibration refers to establishing a projection imaging model between world coordinates of a three-dimensional space and pixel coordinates of a two-dimensional image. In particular, the method comprises the following steps of,
s401: calculating internal and external parameters of the camera under an ideal distortion-free condition according to the pixel coordinate and the space coordinate corresponding to the circle center of the circular mark point;
Specifically, the pixel coordinates corresponding to the circular mark-point centers are the actual pixel coordinates obtained in step S3. The spatial coordinates are determined as follows: the axis distribution of the dot-matrix calibration plate's world coordinate system is determined automatically from the position of the shortest side AB of the pentagonal inner frame. Assuming the calibration plate lies on the plane z = 0 of the world coordinate system, the dot closest to AB is chosen as the origin, with the x axis horizontal to the right and the y axis vertical downward; since the physical distance between adjacent circular mark points is known, the three-dimensional coordinates of every mark-point center can be determined.
In the ideal distortion-free case, the camera imaging model is a pinhole model. Let the spatial coordinate of a circular mark-point center be $P = [X_W, Y_W, Z_W]^T$ and its projection on the calibration plate plane, i.e. the actual pixel coordinate obtained in step S303, be $p = [u, v]^T$, with corresponding homogeneous coordinates $\tilde{P} = [X_W, Y_W, Z_W, 1]^T$ and $\tilde{p} = [u, v, 1]^T$. The projection imaging model is expressed as
$$s\,\tilde{p} = K\,[R \mid t]\,\tilde{P}, \qquad K = \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix},$$
where s is an arbitrary scale factor, K is the intrinsic matrix, R and t are the rotation and translation matrices from the world coordinate system to the camera coordinate system (together forming the extrinsic matrix), $(u_0, v_0)$ are the principal-point coordinates of the image, $f_x$ and $f_y$ are the effective focal lengths along the horizontal and vertical image axes, and $\gamma$ is the tilt (skew) factor.
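The pinhole projection model above can be sketched in a few lines; all numeric intrinsics and extrinsics below are assumed illustrative values, not calibration results from the patent.

```python
import numpy as np

def project(K, R, t, Pw):
    """Project a world point through the ideal pinhole model s*p = K[R|t]P."""
    Pc = R @ Pw + t                  # world -> camera coordinates
    p = K @ Pc                       # camera -> homogeneous pixel coordinates
    return p[:2] / p[2]              # dehomogenize

# illustrative intrinsics: fx = fy = 800, principal point (320, 240), zero skew
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                        # camera axes aligned with the world frame
t = np.array([0.0, 0.0, 5.0])        # plate 5 units in front of the camera
uv = project(K, R, t, np.array([0.1, -0.2, 0.0]))
```

The scale factor s appears implicitly as the third homogeneous component `p[2]` divided out in the last step.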
S402: improving the internal and external parameter precision of the camera obtained in the step S401 by utilizing maximum likelihood estimation;
Specifically, to obtain the optimal camera calibration parameters, the Zhang Zhengyou calibration method refines all parameters by maximum likelihood estimation, minimizing the reprojection-error objective
$$\min \sum_{i=1}^{n}\sum_{j=1}^{m} \left\| p_{ij} - \hat{p}\,(K, D, R_i, t_i, P_j) \right\|^2,$$
where n is the number of calibration images, m the number of feature points per image, K the camera intrinsic matrix, D the distortion coefficients of the camera, $R_i$ and $t_i$ the extrinsic parameters of image i, and $p_{ij}$ and $\hat{p}(K, D, R_i, t_i, P_j)$ respectively the actual projection of the j-th feature point $P_j$ on image i and its projection through the camera imaging model.
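The reprojection-error objective of step S402 evaluates to a single scalar; a minimal sketch with assumed toy coordinates (not data from the patent's experiments):

```python
import numpy as np

def total_reproj_error(observed, predicted):
    """Sum over images i and points j of ||p_ij - p_hat_ij||^2, the
    objective minimized by the maximum-likelihood refinement."""
    obs = np.asarray(observed, float)
    pred = np.asarray(predicted, float)
    return float(((obs - pred) ** 2).sum())

obs = [[(100.0, 50.0), (200.0, 80.0)]]     # one image, two observed points
pred = [[(100.5, 50.0), (199.0, 81.0)]]    # the same points reprojected
err = total_reproj_error(obs, pred)
```

In practice this objective is minimized over K, D and all $(R_i, t_i)$ with a nonlinear solver such as Levenberg-Marquardt.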
S403: under the condition of nonlinear distortion, calculating a geometric distortion coefficient by using a least square method;
In practice, the camera imaging model cannot achieve a perfectly ideal linear model because of geometric distortion introduced by lens manufacturing accuracy and variations in assembly. Under nonlinear distortion, the distortion model is expressed as
$$x_u = x_d + \delta_x(x_d, y_d), \qquad y_u = y_d + \delta_y(x_d, y_d),$$
where $(x_d, y_d)$ are the imaging-point coordinates in the ideal case, $(x_u, y_u)$ are the actual distorted imaging-point coordinates, and $\delta_x(x_d, y_d)$ and $\delta_y(x_d, y_d)$ are the distortion amounts in the x and y directions at $(x_d, y_d)$;
Lens distortion mainly comprises radial distortion, tangential distortion, thin-prism distortion and the like, of which radial and tangential distortion are the most significant. The invention therefore considers the radial and tangential distortion of the lens, with distortion amounts $\delta_x(x_d, y_d)$ and $\delta_y(x_d, y_d)$ respectively
$$\delta_x(x_d, y_d) = x_d\,(k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x_d y_d + p_2 (r^2 + 2 x_d^2),$$
$$\delta_y(x_d, y_d) = y_d\,(k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1 (r^2 + 2 y_d^2) + 2 p_2 x_d y_d,$$
with $r^2 = x_d^2 + y_d^2$, where $k_1$, $k_2$, $k_3$ are the radial distortion coefficients and $p_1$, $p_2$ the tangential distortion coefficients.
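The radial-plus-tangential distortion amounts can be sketched directly from the standard Brown-Conrady form (the coefficient values in the demo are arbitrary assumptions for illustration):

```python
def distort(xd, yd, k1, k2, k3, p1, p2):
    """Distortion amounts (delta_x, delta_y) at (x_d, y_d) with
    r^2 = x_d^2 + y_d^2, combining radial and tangential terms."""
    r2 = xd * xd + yd * yd
    radial = k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    dx = xd * radial + 2 * p1 * xd * yd + p2 * (r2 + 2 * xd * xd)
    dy = yd * radial + p1 * (r2 + 2 * yd * yd) + 2 * p2 * xd * yd
    return dx, dy

# with all coefficients zero the model degenerates to the ideal pinhole case
dx, dy = distort(0.3, -0.2, k1=0.0, k2=0.0, k3=0.0, p1=0.0, p2=0.0)
```

Note that $(x_d, y_d)$ here are normalized image coordinates; distortion grows with $r^2$, so points far from the principal point are affected most.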
s404: and integrating the internal and external parameters and the distortion coefficients, and improving the overall estimation precision by utilizing the maximum likelihood estimation to obtain the final internal and external parameters and the distortion coefficients of the camera.
The first embodiment is as follows:
The camera resolution used in this example was pixel, and the dot-matrix calibration plate was printed on A4 paper with the specification shown in Table 1 below. The experimental environment was the image-processing open-source library OpenCV 3.2.0 under Visual Studio 2019, on a PC with an Intel Core i5 1.80 GHz CPU running Windows 10.
TABLE 1 Dot-matrix calibration plate specification
The camera was fixed and the calibration plate photographed in different postures and positions, 16 images in total; for 2 of these images (denoted image ① and image ②), the pixel coordinates of the circle centers were extracted by the mark-point center-extraction method of steps S1-S3, as illustrated in fig. 9. In fig. 9, from top to bottom, are the original image, the inner and outer corner points, the plane transformation, the standard-circle centers, and the circular mark-point centers.
Further, according to the pixel coordinates and the corresponding spatial coordinates of the circle centers of the circular mark points, camera calibration is completed in combination with the Zhang Zhengyou calibration method, where the spatial coordinates are determined as follows: the axis distribution of the world coordinate system of the dot-matrix calibration plate is determined automatically from the position of the shortest side AB of the pentagonal inner frame. Assuming the calibration plate lies on the plane z = 0 of the world coordinate system, the dot closest to AB is selected as the origin, and the x and y axes are established horizontally to the right and vertically downward, respectively; since the physical distance between adjacent circular mark points is known, the three-dimensional coordinates of the center of each circular mark point can be determined, as shown in FIG. 10.
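The world-coordinate assignment described above can be sketched as follows (an illustrative fragment only; the grid dimensions and the 10 mm dot spacing are assumptions, not values from the patent):

```python
def grid_world_coordinates(rows, cols, spacing):
    """3-D coordinates (x, y, 0) of the dot centers on the z = 0 plane:
    origin at the dot nearest side AB, x to the right along columns,
    y downward along rows, with a known physical spacing between dots."""
    return [(c * spacing, r * spacing, 0.0)
            for r in range(rows) for c in range(cols)]

pts = grid_world_coordinates(2, 3, 10.0)  # assumed 10 mm between adjacent dots
print(pts[0], pts[-1])  # (0.0, 0.0, 0.0) (20.0, 10.0, 0.0)
```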
Further, the results of calibrating the camera by using the method of the present invention and the conventional method are compared.
Specifically, for the 16 captured images, the circle centers of the circular mark points are extracted by the method of the present invention and by the conventional method, and the camera is then calibrated in combination with the Zhang Zhengyou calibration method; the calibration results obtained are shown in Table 2 below.
TABLE 2 comparison of camera calibration results
In the Zhang Zhengyou calibration method, the reprojection error is generally used to evaluate camera calibration accuracy. The reprojection error is the deviation between the new projection point coordinates, obtained by reprojecting a spatial three-dimensional point with the calibrated internal and external parameters and distortion coefficients of the camera, and the original imaging point coordinates of that spatial point on the image. In general, the smaller the reprojection error, the higher the camera calibration accuracy. The reprojection error for each image and the average reprojection error over all images, for the method of the present invention and the conventional method, are shown in Tables 3 and 4 below, respectively. As can be seen from Table 3, the reprojection error of the method of the present invention is reduced by 54.921%-75.410% compared with the conventional method, and is lower for every image. As can be seen from Table 4, the overall average error of the method of the present invention is 0.0340, a 66.169% reduction relative to the conventional method. Taken together, Tables 3 and 4 show that the method of the present invention greatly improves camera calibration accuracy, and demonstrate its feasibility and effectiveness.
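The reprojection-error metric described above can be sketched as a mean Euclidean pixel distance (an illustrative helper; the function name and the sample coordinates are invented for the example):

```python
import numpy as np

def mean_reprojection_error(observed, reprojected):
    """Mean Euclidean distance, in pixels, between the original imaging
    points and the points reprojected with the calibrated parameters."""
    observed = np.asarray(observed, dtype=float)
    reprojected = np.asarray(reprojected, dtype=float)
    return float(np.linalg.norm(observed - reprojected, axis=1).mean())

# Two points, one exact and one off by a 3-4-5 pixel offset: mean error 2.5.
err = mean_reprojection_error([[10, 10], [20, 20]], [[10, 10], [23, 24]])
print(err)  # 2.5
```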
TABLE 3 Comparison of reprojection errors for camera calibration Unit: pixel
TABLE 4 Comparison of overall average reprojection errors for camera calibration Unit: pixel
The foregoing has shown and described the general principles, essential features, and advantages of the invention. It will be understood by those skilled in the art that the invention is not limited to the embodiments described above; the embodiments and the description merely illustrate the principle of the invention, and various changes and modifications may be made without departing from the spirit and scope of the invention, all of which fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.
Claims (10)
1. A high-precision camera calibration method based on plane transformation, characterized by comprising the following steps:
s1: on a plane calibration plate taking a circle as a marking point, extracting angular points on inner and outer frames of the calibration plate, accurately adjusting coordinates of the angular points to a sub-pixel level, and projecting an ellipse into an approximate standard circle through perspective transformation;
s2: the center of mass is calculated by using the image moment to complete the extraction of the center coordinates of the standard circle;
s3: projecting the extracted circle center coordinates back to the original calibration plate plane through inverse perspective transformation, and acquiring actual pixel coordinates of the circle center of the calibration point;
s4: according to the pixel coordinates and the corresponding spatial coordinates of the circle centers of the circular mark points, completing the camera calibration in combination with the Zhang Zhengyou calibration method.
2. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 1, wherein the specific operation of step S1 includes,
s101: preparing a dot matrix calibration plate;
s102: fixing a camera, changing the postures and positions of the calibration plates, and shooting images of the calibration plates at different viewing angles;
s103: preprocessing the acquired image;
s104: and carrying out plane perspective transformation on the preprocessed image.
3. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 2, wherein the specific operation step of preprocessing the acquired image in step S103 includes,
s1031: converting the collected image into a gray image;
s1032: denoising the gray level image by Gaussian filtering;
s1033: carrying out binarization processing on the denoised gray-scale image by using the maximum inter-class variance method.
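The maximum inter-class variance (Otsu) binarization of step S1033 can be sketched as follows; this is a minimal illustration for 8-bit images, not the implementation used in the patent (the experiments there use OpenCV):

```python
import numpy as np

def otsu_threshold(gray):
    """Threshold maximizing the between-class variance sigma_b^2(t)
    over all candidate thresholds t of an 8-bit grayscale image."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                     # class-0 probability up to t
    mu = np.cumsum(p * np.arange(256))       # class-0 cumulative mean mass
    mu_t = mu[-1]                            # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)       # 0/0 at the extremes -> 0
    return int(np.argmax(sigma_b2))

# A two-level image: background 0, foreground 200; Otsu separates them.
img = np.zeros((10, 10), dtype=np.uint8)
img[:, 5:] = 200
th = otsu_threshold(img)
binary = img > th   # True exactly on the bright half
```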
4. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 2, wherein the specific operation step of performing planar perspective transformation on the preprocessed image in step S104 includes,
s1041: detecting the edge of the image, screening the edge by synthesizing the area and length constraints of the edge, and positioning the edge on the outer frame of the calibration plate;
s1042: detecting the corner points of the outer frame by using a Shi-Tomasi algorithm, and searching corner point coordinates at a sub-pixel level;
s1043: repeating the step S1041 and the step S1042, carrying out the same operation on the inner frame of the image, and detecting to obtain five corner points;
s1044: extending two sides of the pentagonal inner frame to form a quadrangle, and obtaining a new corner point at the intersection point;
s1045: estimating an optimal perspective transformation matrix T from the angular point coordinates of eight subpixel levels contained in the inner frame and the outer frame by utilizing a random sampling consistency algorithm and an iterative idea, and enabling the plane of the calibration plate to be parallel to the imaging plane through perspective transformation, wherein the marking points are approximate to standard circles;
the eight sub-pixel-level corner coordinates contained in the inner frame and the outer frame comprise four sub-pixel-level corner coordinates of the outer frame and four quadrilateral sub-pixel-level corner coordinates formed after two sides of the inner frame are extended.
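For illustration only (not part of the claims), the homography estimation underlying step S1045 can be sketched with a plain direct linear transform over the eight corner correspondences; the RANSAC loop and iterative refinement named in the claim are omitted here, and the sample points are invented:

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct-linear-transform estimate of the 3x3 perspective matrix T
    mapping src -> dst from >= 4 point pairs: each pair contributes two
    rows of A, and the solution is the null vector of A (smallest
    singular vector), normalized so that a33 = 1."""
    A = []
    for (u, v), (x, y) in zip(src, dst):
        A.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        A.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    T = Vt[-1].reshape(3, 3)
    return T / T[2, 2]

src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(2, 3), (3, 3), (3, 4), (2, 4)]   # same square translated by (2, 3)
T = estimate_homography(src, dst)        # recovers [[1,0,2],[0,1,3],[0,0,1]]
```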
5. The method for calibrating a high-precision camera based on plane transformation as claimed in claim 4, wherein the perspective transformation in step S1045 transforms the image from one viewing plane to a new viewing plane by the formula

$[x, y, z]^T = T\,[u, v, w]^T, \quad T = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}$

wherein $(u, v, w)$ and $(x, y, z)$ are the coordinates before and after the perspective transformation of the image, respectively, and T is the perspective transformation matrix, a homogeneous matrix with 8 degrees of freedom and $a_{33} = 1$;
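As an illustrative sketch (not part of the claims), applying the perspective transformation of claim 5 to a point amounts to a matrix product followed by dehomogenization; the translation-only matrix below is invented for the example:

```python
import numpy as np

def perspective_map(T, u, v):
    """Map an image point (u, v) through the 3x3 perspective matrix T
    ([x, y, z]^T = T [u, v, 1]^T) and divide by z to dehomogenize."""
    x, y, z = T @ np.array([u, v, 1.0])
    return float(x / z), float(y / z)

# A pure-translation homography (a33 = 1; 8 free parameters in general).
T_trans = np.array([[1.0, 0.0, 5.0],
                    [0.0, 1.0, -2.0],
                    [0.0, 0.0, 1.0]])
print(perspective_map(T_trans, 10.0, 10.0))  # (15.0, 8.0)
```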
6. the method for calibrating a high-precision camera based on planar transformation as claimed in claim 5, wherein the specific operation step of extracting the coordinates of the center of the standard circle in step S2 includes,
s201: regarding each mark point as an h × w digital image, and, according to the essence of the image moment, expressing the (p+q)-order moment of the image as

$m_{pq} = \sum_{u=1}^{w} \sum_{v=1}^{h} u^p v^q f(u, v)$

wherein f(u, v) is the gray value of the image at pixel coordinates (u, v);
s202: using the zeroth-order moment $m_{00}$ and the first-order moments $m_{10}$, $m_{01}$ of each marker point, calculating the centroid coordinates of each marker point as

$(u_c, v_c) = \left( \dfrac{m_{10}}{m_{00}}, \dfrac{m_{01}}{m_{00}} \right);$
S203: because each mark point is approximate to a standard circle, the centroid coordinate of each mark point can be regarded as the pixel coordinate of the center of the standard circle.
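Steps S201-S203 can be sketched as follows (an illustrative fragment, not part of the claims; the grid convention, u along columns and v along rows, and the sample patch are assumptions):

```python
import numpy as np

def centroid_from_moments(f):
    """Centroid (u_c, v_c) = (m10/m00, m01/m00) of an h x w gray patch f,
    using the moments m_pq = sum_u sum_v u^p v^q f(u, v)."""
    h, w = f.shape
    v_idx, u_idx = np.mgrid[0:h, 0:w]   # v indexes rows, u indexes columns
    m00 = f.sum()
    m10 = (u_idx * f).sum()
    m01 = (v_idx * f).sum()
    return float(m10 / m00), float(m01 / m00)

# A uniform 3x3 bright square centered at (u, v) = (3, 2) in a dark patch:
# the centroid of the approximately circular mark is its center pixel.
patch = np.zeros((5, 7))
patch[1:4, 2:5] = 255.0
print(centroid_from_moments(patch))  # (3.0, 2.0)
```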
7. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 6, wherein the specific operation of step S3 includes the following steps,
s301: performing an inverse operation on the perspective transformation matrix T of step S1045 to obtain the inverse perspective transformation matrix $T_{inv} = T^{-1}$;
S302: performing inverse perspective transformation on the pixel coordinates of the center of each standard circle obtained in step S202, the operation principle being

$[u', v', w']^T = T_{inv}\,[x', y', z']^T$

wherein $(x', y', z')$ and $(u', v', w')$ are respectively the coordinates before and after the inverse perspective transformation of the image; the pixel coordinates of the center of each standard circle obtained in step S202 are input as the coordinates $(x', y', z')$ before the inverse perspective transformation, and the output $(u', v', w')$ gives the coordinates of the actual center of the mark point;
s303: and transforming and projecting the circular mark points from the new viewing plane to the original calibration plate plane through the inverse perspective transformation in the step S302 to obtain the coordinates of the pixel coordinates of the center of each standard circle before perspective, namely the actual pixel coordinates of the center of each circular mark point.
8. The method for calibrating a high-precision camera based on planar transformation as claimed in claim 7, wherein the specific operation of step S4 includes the following steps,
s401: calculating internal and external parameters of the camera under an ideal distortion-free condition according to the pixel coordinate and the space coordinate corresponding to the circle center of the circular mark point;
s402: improving the internal and external parameter precision of the camera obtained in the step S401 by utilizing maximum likelihood estimation;
s403: under the condition of nonlinear distortion, calculating a geometric distortion coefficient by using a least square method;
s404: and integrating the internal and external parameters and the distortion coefficients, and improving the overall estimation precision by utilizing the maximum likelihood estimation to obtain the final internal and external parameters and the distortion coefficients of the camera.
9. The method for calibrating a high-precision camera based on plane transformation as claimed in claim 8, wherein in step S401, under the ideal distortion-free condition, the camera imaging model is the pinhole model; the spatial coordinate of the center of a circular mark point is denoted $P = [X_W, Y_W, Z_W]^T$, its projection point on the calibration plate plane, i.e. the actual pixel coordinate of the circle center obtained in step S303, is $p = [u, v]^T$, and the corresponding homogeneous coordinates are $\tilde{P} = [X_W, Y_W, Z_W, 1]^T$ and $\tilde{p} = [u, v, 1]^T$, respectively; the projection imaging model is expressed as

$s\,\tilde{p} = K\,[R \mid t]\,\tilde{P}, \quad K = \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$

wherein s is an arbitrary scale factor, K is the intrinsic parameter matrix, R and t are respectively the rotation matrix and the translation vector from the world coordinate system to the camera coordinate system, jointly forming the extrinsic parameter matrix, $(u_0, v_0)$ are the principal point coordinates of the image, $f_x$ and $f_y$ are the effective focal lengths along the horizontal and vertical axes of the image, respectively, and $\gamma$ is the skew factor.
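The pinhole projection model of claim 9 can be sketched numerically as follows (an illustrative fragment, not part of the claims; the intrinsic values are invented for the example):

```python
import numpy as np

def project(K, R, t, Pw):
    """Pinhole projection s * p~ = K [R | t] P~ of a world point Pw;
    returns pixel coordinates (u, v) after dividing out the scale s."""
    p = K @ (R @ np.asarray(Pw, dtype=float) + t)
    return float(p[0] / p[2]), float(p[1] / p[2])

# Identity extrinsics and simple intrinsics (fx = fy = 100, principal
# point (50, 50), zero skew): a point on the optical axis projects to
# the principal point, and depth halves the lateral offset.
K = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros(3)
print(project(K, R, t, [0.0, 0.0, 1.0]))  # (50.0, 50.0)
print(project(K, R, t, [1.0, 0.0, 2.0]))  # (100.0, 50.0)
```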
10. The method for calibrating a high-precision camera based on plane transformation as claimed in claim 8, wherein in step S403, in the case of nonlinear distortion, the nonlinear distortion model is expressed as

$x_u = x_d + \delta_x(x_d, y_d), \quad y_u = y_d + \delta_y(x_d, y_d)$

wherein $(x_d, y_d)$ are the coordinates of the imaging point in the ideal case, $(x_u, y_u)$ are the actual distorted coordinates of the imaging point, and $\delta_x(x_d, y_d)$ and $\delta_y(x_d, y_d)$ respectively denote the distortion amounts in the x and y directions at the imaging point $(x_d, y_d)$;

taking into account the radial and tangential distortion of the lens, the distortion amounts $\delta_x(x_d, y_d)$ and $\delta_y(x_d, y_d)$ are respectively

$\delta_x(x_d, y_d) = x_d(k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1(3x_d^2 + y_d^2) + 2p_2 x_d y_d$
$\delta_y(x_d, y_d) = y_d(k_1 r^2 + k_2 r^4 + k_3 r^6) + p_2(x_d^2 + 3y_d^2) + 2p_1 x_d y_d$

where $r^2 = x_d^2 + y_d^2$.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110282708.0A CN113012234B (en) | 2021-03-16 | 2021-03-16 | High-precision camera calibration method based on plane transformation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113012234A true CN113012234A (en) | 2021-06-22 |
CN113012234B CN113012234B (en) | 2022-09-02 |
Family
ID=76408584
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110282708.0A Active CN113012234B (en) | 2021-03-16 | 2021-03-16 | High-precision camera calibration method based on plane transformation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113012234B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113838138A (en) * | 2021-08-06 | 2021-12-24 | 杭州灵西机器人智能科技有限公司 | System calibration method, system, device and medium for optimizing feature extraction |
CN114529613A (en) * | 2021-12-15 | 2022-05-24 | 深圳市华汉伟业科技有限公司 | Method for extracting characteristic point high-precision coordinates of circular array calibration plate |
CN114782553A (en) * | 2022-05-11 | 2022-07-22 | 江南大学 | Iterative camera calibration method and device based on elliptic dual quadratic curve |
CN115115619A (en) * | 2022-07-27 | 2022-09-27 | 深圳见得空间科技有限公司 | Feature point extraction method, device and equipment based on circle fitting and storage medium |
WO2024011764A1 (en) * | 2022-07-11 | 2024-01-18 | 深圳思谋信息科技有限公司 | Calibration parameter determination method and apparatus, hybrid calibration board, device, and medium |
CN118570312A (en) * | 2024-08-01 | 2024-08-30 | 国科大杭州高等研究院 | Multi-camera collaborative calibration method suitable for dynamic vision sensor and application |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101303768A (en) * | 2008-06-17 | 2008-11-12 | 东南大学 | Method for correcting circle center error of circular index point when translating camera perspective projection |
CN102915535A (en) * | 2012-08-23 | 2013-02-06 | 深圳大学 | Method and system for correcting circle center deviation of round mark points during camera projection transformation |
US20170221226A1 (en) * | 2014-11-04 | 2017-08-03 | SZ DJI Technology Co., Ltd. | Camera calibration |
CN109829948A (en) * | 2018-12-13 | 2019-05-31 | 昂纳自动化技术(深圳)有限公司 | Camera calibration plate, calibration method and camera |
CN109859277A (en) * | 2019-01-21 | 2019-06-07 | 陕西科技大学 | A kind of robotic vision system scaling method based on Halcon |
Non-Patent Citations (3)
Title |
---|
卢晓冬 et al.: "High-precision camera calibration method based on calculation of the true image coordinates of the circle center", Chinese Journal of Lasers (《中国激光》) *
张观锦 et al.: "Three-dimensional measurement system for small features based on structured light", Modular Machine Tool & Automatic Manufacturing Technique (《组合机床与自动化加工技术》) *
汪首坤 et al.: "Zhang's camera calibration method based on a circular-array calibration plate", Transactions of Beijing Institute of Technology (《北京理工大学学报》) *
Also Published As
Publication number | Publication date |
---|---|
CN113012234B (en) | 2022-09-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||