CN114998448A - Method for calibrating multi-constraint binocular fisheye camera and positioning space point - Google Patents


Info

Publication number
CN114998448A
Authority
CN
China
Prior art keywords
image
fisheye camera
constraint
calibration
binocular fisheye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210655001.4A
Other languages
Chinese (zh)
Inventor
张文明
杨新宇
李海滨
李雅倩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yanshan University
Original Assignee
Yanshan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yanshan University
Priority to CN202210655001.4A
Publication of CN114998448A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T3/047
    • G06T3/06
    • G06T5/00 Image enhancement or restoration
    • G06T5/40 Image enhancement or restoration by the use of histogram techniques
    • G06T7/70 Determining position or orientation of objects or cameras

Abstract

The invention discloses a method for calibrating a multi-constraint binocular fisheye camera and positioning space points. A circular calibration object is first photographed with left and right binocular fisheye cameras to obtain original distorted images. The distorted images are threshold-segmented with a global threshold algorithm based on Gaussian fitting to obtain the boundary of each calibration circle, and the centroid of each calibration circle in the distorted image is computed as a feature point. Initial binocular camera parameters are then computed with the Kannala model, space points are obtained by back projection from the initial parameters, and the spatial geometric error, epipolar constraint error, distance constraint error, perpendicularity constraint error and collinearity constraint error are computed to establish an optimization objective function, which is solved with the Levenberg-Marquardt method to obtain high-precision optimal binocular camera parameters.

Description

Method for calibrating multi-constraint binocular fisheye camera and positioning space point
Technical Field
The invention relates to the technical field of camera calibration, in particular to a method for calibrating a multi-constraint binocular fisheye camera and positioning a space point.
Background
Binocular stereo vision is one of the important research directions of computer vision, and the technology is widely applied in aerial surveying and mapping, building measurement, robot positioning, three-dimensional reconstruction and other fields. Cameras are an important means of acquiring external information for stereo vision; however, a conventional camera has a limited field of view and, without auxiliary facilities such as a rotary platform, can hardly meet the requirements of large-scale navigation, positioning and three-dimensional reconstruction. The fisheye camera alleviates the small field of view of an ordinary camera: a fisheye lens is an ultra-wide-angle lens whose field of view can reach 180 degrees, and in some cases even 270 degrees. However, although the field of view of a fisheye lens is larger than that of a normal lens, some visual information is lost: fisheye images are strongly distorted, and in environments with higher precision requirements this error inevitably limits the applications of the fisheye lens. Therefore, a mathematical model for optimization needs to be established and an optimization algorithm used to find accurate intrinsic and extrinsic camera parameters, so as to complete high-precision calibration of the camera.
Existing calibration methods can generally be divided into two categories. The first is camera self-calibration, which completes calibration using only image-point correspondences; it is flexible and fast but sacrifices calibration precision, so it is unsuitable for application scenarios requiring high precision. The second is target-based calibration, of which the most common approach is based on checkerboard detection; however, redundant checkerboard corners during calibration may cause false or missed corner detections, and such methods only reach pixel-level precision.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a multi-constraint binocular fisheye camera calibration and space point positioning method: after feature points are found by threshold segmentation, a multi-constraint objective function comprising reprojection, epipolar, distance, perpendicularity and collinearity constraints is established, the parameters to be calibrated are iteratively optimized by an optimization algorithm, and the parameters of the binocular fisheye camera are finally obtained.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
a method for calibrating a multi-constraint binocular fisheye camera and positioning a space point comprises the following steps:
step S1, obtaining the original images of the calibration plate: a left and a right binocular fisheye camera are selected and shoot the calibration object from a plurality of angles; for each shot the relative pose between the left and right fisheye cameras is kept unchanged while the position of the camera pair relative to the calibration object is changed, acquiring a calibration-plate image group consisting of several pairs of distorted images;
step S2, initializing internal parameters: the effective region of the distorted image from step S1 is extracted, the elliptical imaging boundary of the fisheye image is fitted by the least-squares method, and the general ellipse equation obtained from the fit is Au² + 2Buv + Cv² + 2Du + 2Ev + F = 0; from the fitting result an initial estimate of the effective focal lengths (f_x, f_y) of the left and right binocular fisheye cameras is made;
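The least-squares conic fit described in step S2 can be sketched as follows (an illustrative sketch only, not the patent's implementation; all function names are hypothetical). The coefficients (A, B, C, D, E, F) of Au² + 2Buv + Cv² + 2Du + 2Ev + F = 0 are recovered, up to scale, as the right singular vector of the design matrix associated with the smallest singular value:

```python
import numpy as np

def fit_conic(u, v):
    """Least-squares fit of A*u^2 + 2*B*u*v + C*v^2 + 2*D*u + 2*E*v + F = 0.

    The coefficient vector (A, B, C, D, E, F) is determined up to scale as
    the right singular vector of the design matrix with the smallest
    singular value.
    """
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    M = np.column_stack([u * u, 2 * u * v, v * v,
                         2 * u, 2 * v, np.ones_like(u)])
    _, _, vt = np.linalg.svd(M)
    return vt[-1]

# Synthetic imaging boundary: axis-aligned ellipse, semi-axes 3.2 and 2.4
t = np.linspace(0.0, 2.0 * np.pi, 200)
u = 6.4 + 3.2 * np.cos(t)
v = 4.8 + 2.4 * np.sin(t)
A, B, C, D, E, F = fit_conic(u, v)

# Residual of the fitted conic at the boundary samples (should be ~0)
residual = np.abs(A * u * u + 2 * B * u * v + C * v * v
                  + 2 * D * u + 2 * E * v + F)
```

For an ellipse the fitted coefficients satisfy AC − B² > 0, which can serve as a sanity check on the recovered boundary.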
step S3, extracting the image coordinates of the control points in the original calibration-plate images: threshold segmentation is performed on the effective region of the distorted image with a global threshold algorithm based on Gaussian fitting, the boundary of each calibration circle in the distorted image is obtained, and the centroid of each calibration circle in the distorted image is computed and used as a feature point pair of the left and right fisheye cameras;
step S4, performing monocular calibration on the left and right binocular fisheye cameras: using a spherical projection model, monocular calibration is performed on the left and right fisheye cameras respectively according to the Kannala method, obtaining the left-camera intrinsic parameter matrix p_l, the left-camera extrinsic rotation matrix R_l and translation vector t_l, the right-camera intrinsic parameter matrix p_r, and the right-camera extrinsic rotation matrix R_r and translation vector t_r, which serve as the initial parameters of the left and right binocular fisheye cameras;
step S5, calculating space point coordinates by back-projecting the feature points of the distorted images: the feature point pairs on the distorted images obtained in step S3 are back-projected according to the intrinsic parameters p_l, p_r and extrinsic parameters R_l, R_r, t_l, t_r to obtain the incident-ray angles θ_l, θ_r, where θ_l is the angle between the incident ray of the left image and the z-axis of the camera coordinate system, θ_r is the corresponding angle for the right image, φ_l is the angle between the projection of the left incident ray onto the xy-plane of the camera coordinate system and the x-axis, and φ_r is the corresponding angle for the right image. The feature point pairs are back-projected onto the unit sphere to obtain the normalized coordinates

x_l = (sin θ_l cos φ_l, sin θ_l sin φ_l, cos θ_l)^T, x_r = (sin θ_r cos φ_r, sin θ_r sin φ_r, cos θ_r)^T,

where x_l and x_r denote the sphere-normalized back-projections of the left- and right-image feature points. According to the transformation between the left and right fisheye cameras,

λ_r x_r = R(λ_l x_l) + t,

the coefficients λ_l and λ_r are determined, and the space point coordinate is finally obtained as X = λ_l x_l;
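The determination of λ_l and λ_r from the left-right transformation can be sketched as a small linear least-squares problem (a minimal sketch assuming the transformation X_r = R X_l + t between camera frames; names are illustrative). Stacking λ_l (R x_l) − λ_r x_r = −t gives three equations in the two unknown depths:

```python
import numpy as np

def sphere_point(theta, phi):
    """Unit-sphere back-projection: theta measured from the z-axis,
    phi measured from the x-axis in the xy-plane."""
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def triangulate(x_l, x_r, R, t):
    """Solve lambda_r * x_r = R (lambda_l * x_l) + t for the two depths
    in the least-squares sense; return X = lambda_l * x_l in the
    left-camera frame together with both depths."""
    A = np.column_stack([R @ x_l, -x_r])       # 3x2 system A @ [l_l, l_r] = -t
    lambdas, *_ = np.linalg.lstsq(A, -t, rcond=None)
    lam_l, lam_r = lambdas
    return lam_l * x_l, lam_l, lam_r

# Synthetic check: a known point seen by two cameras 0.2 m apart
R = np.eye(3)
t = np.array([-0.2, 0.0, 0.0])                 # X_r = R X_l + t
X_true = np.array([0.1, 0.05, 1.0])
x_l = X_true / np.linalg.norm(X_true)          # left unit ray
X_r_cam = R @ X_true + t
x_r = X_r_cam / np.linalg.norm(X_r_cam)        # right unit ray
X_est, lam_l, lam_r = triangulate(x_l, x_r, R, t)
```

With noise-free rays the system is exactly consistent and the recovered point matches the ground truth; with noisy rays the least-squares solution gives the closest intersection in this algebraic sense.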
Step S6, matching with world coordinates: for each feature point M_w with known coordinates in the circular calibration pattern under the world coordinate system, M_w is transformed from the world coordinate system centered on the calibration object into the point M in the camera coordinate system centered on the left fisheye camera. The computed space points X are compared with M, an objective function minimizing the difference between the two point sets in the three-dimensional measurement coordinate system is constructed, and nonlinear optimization is performed:

J_3D = Σ_i Σ_j || X_ij − M_ij ||²

where i denotes the image index and j the index of the feature point within each image;
step S7, constructing the optimization objective function: to accelerate convergence of the nonlinear optimization, an epipolar constraint J_e, a distance constraint J_dis, a collinearity constraint J_co and a perpendicularity constraint J_⊥ are added on the basis of the three-dimensionally reconstructed feature points. The epipolar constraint J_e is an inherent property of a binocular vision system, the distance constraint J_dis checks the reconstructed distances, and the collinearity constraint J_co and perpendicularity constraint J_⊥ check the back-projection result. Combining J_3D, J_e, J_dis, J_co and J_⊥ yields the complete objective function J = J_3D + J_e + J_dis + J_co + J_⊥, which is solved by the Levenberg-Marquardt method to obtain the optimal camera parameters and complete the high-precision calibration of the left and right binocular fisheye cameras.
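The Levenberg-Marquardt solve of the combined objective can be illustrated with a minimal damped Gauss-Newton loop (a toy sketch: the patent's actual parameter vector bundles the intrinsics and extrinsics of both cameras, which is omitted here, and all names are hypothetical). The update (JᵀJ + λI)Δp = −Jᵀr is the same one used to minimize J = J_3D + J_e + J_dis + J_co + J_⊥:

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, p0, n_iter=50, lam0=1e-3):
    """Minimal Levenberg-Marquardt loop: damped normal equations
    (J^T J + lam * I) dp = -J^T r, with the damping lam decreased after
    an accepted step and increased after a rejected one."""
    p = np.asarray(p0, dtype=float)
    lam = lam0
    cost = 0.5 * np.sum(residual(p) ** 2)
    for _ in range(n_iter):
        r = residual(p)
        J = jacobian(p)
        H = J.T @ J + lam * np.eye(p.size)
        dp = np.linalg.solve(H, -J.T @ r)
        new_cost = 0.5 * np.sum(residual(p + dp) ** 2)
        if new_cost < cost:              # accept step, relax damping
            p, cost, lam = p + dp, new_cost, lam * 0.5
        else:                            # reject step, strengthen damping
            lam *= 10.0
    return p, cost

# Toy problem: fit y = a * exp(-b * x) to exact data from a=2, b=0.5
x = np.linspace(0.0, 5.0, 30)
y = 2.0 * np.exp(-0.5 * x)
res = lambda p: p[0] * np.exp(-p[1] * x) - y
jac = lambda p: np.column_stack([np.exp(-p[1] * x),
                                 -p[0] * x * np.exp(-p[1] * x)])
p_opt, final_cost = levenberg_marquardt(res, jac, p0=[1.0, 1.0])
```

In practice a library implementation (for example `scipy.optimize.least_squares` with `method='lm'`) would replace this loop; the sketch only shows the damping mechanism.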
The technical scheme of the invention is further improved as follows: when the left and right binocular fisheye cameras shoot in step S1, the calibration object is a black-and-white circular pattern on a liquid crystal display; the radius of each calibration circle in the pattern is obtained from the display resolution and the physical size of the screen, and it is ensured during shooting that the black-and-white circular pattern is completely imaged in both the left and right images.
The technical scheme of the invention is further improved as follows: the specific process of step S3 is as follows: first the distorted image is converted to a grayscale image and its gray histogram is computed; the image is then segmented with the Otsu global threshold algorithm to obtain an initial threshold th_0. The current threshold th_i (i is the iteration number) divides the gray histogram into two parts C_0 and C_1, whose means μ_0, μ_1 and variances σ_0², σ_1² are computed, and two Gaussian distributions N(μ_0, σ_0²) and N(μ_1, σ_1²) are fitted. The probability density of each distribution is evaluated; μ_begin denotes the mean of the Gaussian with the smaller variance and μ_end the mean of the Gaussian with the larger variance. Traversing from μ_begin toward μ_end, the first gray level x_i at which p(x_i | Begin) ≤ p(x_i | End) is taken as the threshold found in this iteration, th_{i+1} = x_i. When |th_{i+1} − th_i| ≤ 1, th_i is the global optimal threshold and the iteration stops; the circle-center image coordinates of the circular calibration pattern are then computed by least-squares ellipse fitting.
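The Gaussian-fit global threshold procedure above can be sketched as follows (a simplified sketch on a synthetic gray-level sample; the Otsu initializer and all names are illustrative, and the "fit" here simply uses the sample mean and variance of each side of the current threshold):

```python
import numpy as np

def otsu_threshold(hist):
    """Initial threshold th_0 by Otsu's between-class variance criterion."""
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 255):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def gaussian_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def gaussian_fit_threshold(gray):
    """Iterated two-Gaussian threshold of step S3: split at th_i, fit a
    Gaussian to each side, then search from the small-variance mean toward
    the large-variance mean for the first level where its density drops
    below the other; stop when the threshold moves by <= 1."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    th = otsu_threshold(hist)
    for _ in range(100):
        c0, c1 = gray[gray < th], gray[gray >= th]
        mu0, var0 = c0.mean(), max(c0.var(), 1e-6)
        mu1, var1 = c1.mean(), max(c1.var(), 1e-6)
        if var0 <= var1:
            mu_b, var_b, mu_e, var_e = mu0, var0, mu1, var1
        else:
            mu_b, var_b, mu_e, var_e = mu1, var1, mu0, var0
        step = 1 if mu_e >= mu_b else -1
        new_th = th
        for xg in range(int(round(mu_b)), int(round(mu_e)), step):
            if gaussian_pdf(xg, mu_b, var_b) <= gaussian_pdf(xg, mu_e, var_e):
                new_th = xg
                break
        if abs(new_th - th) <= 1:
            return th
        th = new_th
    return th

# Synthetic bimodal sample: dark background ~40, bright circles ~200
rng = np.random.default_rng(0)
gray = np.concatenate([rng.normal(40, 8, 5000), rng.normal(200, 12, 2000)])
gray = np.clip(gray, 0, 255)
th = gaussian_fit_threshold(gray)
```

On such a bimodal sample the returned threshold lands in the valley between the two modes, which is the behavior the calibration-circle segmentation relies on.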
The technical scheme of the invention is further improved as follows: in step S4, the left binocular fisheye camera coordinate system is a three-dimensional cartesian coordinate system based on the left binocular fisheye camera, and the right binocular fisheye camera coordinate system is a three-dimensional cartesian coordinate system based on the right binocular fisheye camera.
The technical scheme of the invention is further improved as follows: the epipolar constraint J_e in step S7 is computed from the normalized coordinates obtained by back-projecting the feature point pairs onto the unit sphere together with the essential matrix E = [t]_× R, which satisfies the epipolar relation x_r^T E x_l = 0.
The medians of {R_i} and {t_i} are selected (i denotes the image-pair index starting from 1), and the epipolar constraint of the feature points of the left and right image pairs is computed as

J_e = Σ_i Σ_j d²(x_rij, E x_lij)

where x_lij denotes the sphere-normalized coordinate of the j-th feature point in the left image of the i-th pair, x_rij the corresponding coordinate in the right image, d(·,·)² denotes the squared geometric distance between the two coordinates, and E denotes the essential matrix.
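The construction E = [t]_× R and a per-point epipolar check can be sketched as follows (illustrative; for simplicity the residual computed here is the algebraic error x_rᵀ E x_l, which vanishes exactly when the constraint holds, rather than the geometric distance used in the patent):

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential(R, t):
    """Essential matrix E = [t]_x R of the stereo pair."""
    return skew(t) @ R

def epipolar_residuals(E, x_l, x_r):
    """Algebraic epipolar error x_r^T E x_l for each row of x_l, x_r."""
    return np.einsum('ij,jk,ik->i', x_r, E, x_l)

# Rays consistent with the geometry X_r = R X_l + t give ~0 residuals
R = np.eye(3)
t = np.array([-0.2, 0.0, 0.0])
E = essential(R, t)
X = np.array([[0.1, 0.05, 1.0], [-0.3, 0.2, 2.0], [0.0, -0.1, 1.5]])
x_l = X / np.linalg.norm(X, axis=1, keepdims=True)
Xr = X @ R.T + t
x_r = Xr / np.linalg.norm(Xr, axis=1, keepdims=True)
res = epipolar_residuals(E, x_l, x_r)
```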
The technical scheme of the invention is further improved as follows: the distance constraint J_dis in step S7 is calculated as follows: the distance between the back-projected space points corresponding to each pair of adjacent feature points of the black-and-white circular pattern is computed in the row direction, and likewise in the column direction; the difference between each such distance and the actual spacing of adjacent feature points on the displayed circular pattern is taken as the adjacent-distance error, giving the distance constraint

J_dis = Σ_h Σ_v || L_dis − d(M_h, M_v) ||²

where L_dis is a constant equal to the minimum spacing of the circular calibration pattern, and M_h and M_v are adjacent feature points computed by back projection in the horizontal and vertical directions on the calibration object.
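The adjacent-distance error can be sketched over a rows × cols grid of reconstructed points (an illustrative sketch; the grid layout and all names are assumptions):

```python
import numpy as np

def distance_constraint(points, spacing):
    """J_dis over a rows x cols x 3 grid of reconstructed points: squared
    difference between each adjacent-point distance (row and column
    direction) and the known physical spacing of the circle centers."""
    d_row = np.linalg.norm(points[:, 1:] - points[:, :-1], axis=-1)
    d_col = np.linalg.norm(points[1:, :] - points[:-1, :], axis=-1)
    return np.sum((d_row - spacing) ** 2) + np.sum((d_col - spacing) ** 2)

# A perfect 11 x 20 grid at the patent's 29.45257 mm pitch has zero error;
# a uniformly scaled (mis-reconstructed) grid does not.
spacing = 29.45257
rows, cols = 11, 20
grid = np.zeros((rows, cols, 3))
grid[..., 0] = np.arange(cols) * spacing
grid[..., 1] = np.arange(rows)[:, None] * spacing
J_perfect = distance_constraint(grid, spacing)
J_scaled = distance_constraint(grid * 1.01, spacing)
```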
The technical scheme of the invention is further improved as follows: the perpendicularity constraint J_⊥ in step S7 is calculated as follows: among the three-dimensionally reconstructed feature points obtained by back-projecting the feature point pairs, for any non-edge point the line joining its left and right neighboring reconstructed points is perpendicular to the line joining its upper and lower neighboring reconstructed points; the perpendicularity constraint over the interior (non-edge) reconstructed points of all images is therefore computed as

J_⊥ = Σ_i Σ_j (x_ij · y_ij)²

where x_ij is the vector from the left to the right neighboring reconstructed point of the j-th feature point of the i-th image, and y_ij is the vector from the upper to the lower neighboring reconstructed point.
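The perpendicularity error can be sketched the same way (illustrative; the dot-product penalty is an assumption consistent with the perpendicularity condition stated above):

```python
import numpy as np

def perpendicularity_constraint(points):
    """J_perp over the interior of a rows x cols x 3 grid of reconstructed
    points: for every non-edge point, the vector between its horizontal
    neighbours should be perpendicular to the vector between its vertical
    neighbours, so their dot product is penalised."""
    x = points[1:-1, 2:] - points[1:-1, :-2]   # left-to-right neighbour vectors
    y = points[2:, 1:-1] - points[:-2, 1:-1]   # top-to-bottom neighbour vectors
    dots = np.sum(x * y, axis=-1)
    return np.sum(dots ** 2)

# A rectangular planar grid satisfies the constraint exactly;
# a sheared grid does not.
rows, cols = 5, 7
grid = np.zeros((rows, cols, 3))
grid[..., 0] = np.arange(cols)
grid[..., 1] = np.arange(rows)[:, None]
J_ok = perpendicularity_constraint(grid)
sheared = grid.copy()
sheared[..., 0] += 0.3 * grid[..., 1]          # shear breaks perpendicularity
J_sheared = perpendicularity_constraint(sheared)
```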
The technical scheme of the invention is further improved as follows: the collinearity constraint J_co in step S7 is calculated as follows: any edge point among the three-dimensionally reconstructed feature points obtained by back-projecting the feature point pairs lies on the line through the interior reconstructed points of its column and on the line through the interior reconstructed points of its row; the cumulative sum of the perpendicular distances of the edge reconstructed points to these lines over all view images is taken as the collinearity constraint

J_co = Σ_i Σ_j d²(p_ij, l(x_1ij, x_2ij))

where p_ij denotes the j-th edge feature point in the i-th image, x_1ij and x_2ij denote interior feature points in the same row or column as p_ij in the i-th image, and l(x_1ij, x_2ij) denotes the line through them.
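The collinearity error can be sketched as a point-to-line distance over the grid edge (illustrative; which interior points define each line is an assumption):

```python
import numpy as np

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    d = b - a
    return np.linalg.norm(np.cross(p - a, d)) / np.linalg.norm(d)

def collinearity_constraint(points):
    """J_co over a rows x cols x 3 grid of reconstructed points: each edge
    point is measured against the line through the two nearest interior
    points of its own row or column."""
    rows, cols, _ = points.shape
    total = 0.0
    for j in range(cols):   # top and bottom edge points vs. their column
        total += point_line_distance(points[0, j], points[1, j], points[2, j]) ** 2
        total += point_line_distance(points[-1, j], points[-2, j], points[-3, j]) ** 2
    for i in range(rows):   # left and right edge points vs. their row
        total += point_line_distance(points[i, 0], points[i, 1], points[i, 2]) ** 2
        total += point_line_distance(points[i, -1], points[i, -2], points[i, -3]) ** 2
    return total

# A planar grid is exactly collinear along rows and columns;
# lifting one corner off its row and column lines is penalised.
rows, cols = 5, 7
grid = np.zeros((rows, cols, 3))
grid[..., 0] = np.arange(cols)
grid[..., 1] = np.arange(rows)[:, None]
J_flat = collinearity_constraint(grid)
bent = grid.copy()
bent[0, 0, 2] += 0.5
J_bent = collinearity_constraint(bent)
```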
Owing to the adoption of the above technical scheme, the invention achieves the following technical progress:
The invention combines the ideas of three-dimensional histogram reconstruction and dimensionality reduction and proposes a robust threshold-optimization framework, a global threshold algorithm based on Gaussian-distribution fitting, which makes full use of the pattern characteristics of the calibration board, effectively improves the extraction precision of the control-point coordinates on the calibration board, improves the noise resistance of the algorithm and yields a high-precision calibration result;
The invention constructs novel three-dimensional geometric constraint conditions and establishes a mathematical model comprising reprojection, epipolar, adjacent-distance, perpendicularity and collinearity constraints among the feature points reconstructed from the calibration parameters. It comprehensively uses the constraint information of the two-dimensional image plane together with the constraints between the three-dimensionally reconstructed feature points, introducing a more comprehensive geometric constraint relation, both local and global, between the reconstructed points.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a flow chart of threshold segmentation;
FIG. 3 is a graph of the results of a Gaussian fit global threshold algorithm;
fig. 4 is a binocular camera imaging system.
Detailed Description
The present invention will be described in further detail with reference to the following examples:
as shown in fig. 1, a method for calibrating a multi-constraint binocular fisheye camera and positioning a spatial point includes the following steps:
step S1, obtaining the original images of the calibration plate: a left and a right binocular fisheye camera are selected; the binocular fisheye cameras are NM33-F fisheye cameras following the equisolid-angle projection, and a circular calibration pattern on a liquid crystal display is used to calibrate them. The background of the circular calibration object is black; the white circular patterns are arranged in the calibration board in 11 rows and 20 columns at equal intervals, with a center-to-center spacing of 29.45257 mm and a circle radius of 11.0448 mm. The left and right fisheye cameras shoot the calibration object from a plurality of angles; for each shot the relative pose between the two cameras is kept unchanged while their position relative to the calibration object is changed, with the distance between the binocular cameras and the calibration board kept at about 1 m so that the circular calibration pattern is fully visible in both the left and right camera images. The binocular camera captures one image pair from each of 13 viewing directions, 26 images in total; the calibration-plate images are several pairs of distorted images;
step S2, initializing internal parameters: the effective region of the distorted image from step S1 is extracted, the elliptical imaging boundary of the fisheye image is fitted by the least-squares method, and the general ellipse equation obtained from the fit is Au² + 2Buv + Cv² + 2Du + 2Ev + F = 0; from the fitting result an initial estimate of the effective focal lengths (f_x, f_y) of the left and right binocular fisheye cameras is made;
step S3, extracting the image coordinates of the control points in the original calibration-plate images: threshold segmentation is performed on the effective region of the distorted image with a global threshold algorithm based on Gaussian fitting, the boundary of each calibration circle in the distorted image is obtained, and the centroid of each calibration circle is computed and used as a feature point pair of the left and right fisheye cameras. The specific process of step S3 is shown in fig. 2: the control points are the center points of the circular patterns in the calibration plate. First the distorted image is converted to a grayscale image and its gray histogram is computed; the image is then segmented with the Otsu global threshold algorithm to obtain an initial threshold th_0. The current threshold th_i (i is the iteration number) divides the gray histogram into two parts C_0 and C_1, whose means μ_0, μ_1 and variances σ_0², σ_1² are computed, and two Gaussian distributions N(μ_0, σ_0²) and N(μ_1, σ_1²) are fitted. The probability density of each distribution is evaluated; μ_begin denotes the mean of the Gaussian with the smaller variance and μ_end the mean of the Gaussian with the larger variance. Traversing from μ_begin toward μ_end, the first gray level x_i at which p(x_i | Begin) ≤ p(x_i | End) is taken as the threshold found in this iteration, th_{i+1} = x_i. When |th_{i+1} − th_i| ≤ 1, th_i is the global optimal threshold and the iteration stops; the circle-center image coordinates of the circular calibration pattern are then computed by least-squares ellipse fitting, with the fitting result shown in fig. 3.
Step S4, performing monocular calibration on the left and right binocular fisheye cameras: using a spherical projection model, monocular calibration is performed on the left and right fisheye cameras respectively according to the method proposed by Kannala in "A Generic Camera Model and Calibration Method for Conventional, Wide-Angle, and Fish-Eye Lenses", obtaining the left-camera intrinsic parameter matrix p_l, the left-camera extrinsic rotation matrix R_l and translation vector t_l, the right-camera intrinsic parameter matrix p_r, and the right-camera extrinsic rotation matrix R_r and translation vector t_r, which serve as the initial parameters of the left and right binocular fisheye cameras. The left binocular fisheye camera coordinate system is a three-dimensional Cartesian coordinate system based on the left binocular fisheye camera, and the right binocular fisheye camera coordinate system is a three-dimensional Cartesian coordinate system based on the right binocular fisheye camera. The world coordinate system is a three-dimensional Cartesian coordinate system based on the calibration object.
Step S5, calculating space point coordinates by back-projecting the feature points of the distorted images:
As shown in fig. 4, the feature point pairs on the distorted images obtained in step S3 are back-projected according to the intrinsic parameters p_l, p_r and extrinsic parameters R_l, R_r, t_l, t_r to obtain the incident-ray angles θ_l, θ_r, where θ_l is the angle between the incident ray of the left image and the z-axis of the camera coordinate system, θ_r is the corresponding angle for the right image, φ_l is the angle between the projection of the left incident ray onto the xy-plane of the camera coordinate system and the x-axis, and φ_r is the corresponding angle for the right image. The feature point pairs are back-projected onto the unit sphere to obtain the normalized coordinates

x_l = (sin θ_l cos φ_l, sin θ_l sin φ_l, cos θ_l)^T, x_r = (sin θ_r cos φ_r, sin θ_r sin φ_r, cos θ_r)^T,

where x_l and x_r denote the sphere-normalized back-projections of the left- and right-image feature points. According to the transformation between the left and right fisheye cameras,

λ_r x_r = R(λ_l x_l) + t,

the coefficients λ_l and λ_r are determined, and the space point coordinate is finally obtained as X = λ_l x_l.
Step S6, matching with world coordinates: for each feature point M_w with known coordinates in the circular calibration pattern under the world coordinate system, M_w is transformed from the world coordinate system centered on the calibration object into the point M in the camera coordinate system centered on the left fisheye camera. The computed space points X are compared with M, an objective function minimizing the difference between the two point sets in the three-dimensional measurement coordinate system is constructed, and nonlinear optimization is performed:

J_3D = Σ_i Σ_j || X_ij − M_ij ||²

where i denotes the image index and j the index of the feature point within each image;
step S7, constructing the optimization objective function: to accelerate convergence of the nonlinear optimization, an epipolar constraint J_e, a distance constraint J_dis, a collinearity constraint J_co and a perpendicularity constraint J_⊥ are added on the basis of the three-dimensionally reconstructed feature points. The epipolar constraint J_e is an inherent property of a binocular vision system, the distance constraint J_dis checks the reconstructed distances, and the collinearity constraint J_co and perpendicularity constraint J_⊥ check the back-projection result.
The epipolar constraint J_e is computed from the normalized coordinates obtained by back-projecting the feature point pairs onto the unit sphere together with the essential matrix E = [t]_× R, which satisfies the epipolar relation x_r^T E x_l = 0.
Since the estimation of the extrinsic parameters in step S3 may not be accurate enough, the initial rotation matrix R and translation vector t differ from image pair to image pair; for a better initial estimate, the medians of {R_i} and {t_i} are selected (i denotes the image-pair index starting from 1), and the epipolar constraint of the feature points of the left and right image pairs is computed as

J_e = Σ_i Σ_j d²(x_rij, E x_lij)

where x_lij denotes the sphere-normalized coordinate of the j-th feature point in the left image of the i-th pair, x_rij the corresponding coordinate in the right image, d(·,·)² denotes the squared geometric distance between the two coordinates, and E denotes the essential matrix.
The distance constraint J_dis is calculated as follows: the distance between the back-projected space points corresponding to each pair of adjacent feature points of the black-and-white circular pattern is computed in the row direction, and likewise in the column direction; the difference between each such distance and the actual spacing of adjacent feature points on the displayed circular pattern is taken as the adjacent-distance error, giving the distance constraint

J_dis = Σ_h Σ_v || L_dis − d(M_h, M_v) ||²

where L_dis is a constant equal to the minimum spacing of the circular calibration pattern, and M_h and M_v are adjacent feature points computed by back projection in the horizontal and vertical directions on the calibration object.
The vertical constraint J_+ is computed as follows: among the three-dimensionally reconstructed feature points obtained by back-projecting the feature point pairs, for any point not on the edge, the line connecting its left and right reconstructed neighbors is perpendicular to the line connecting its upper and lower reconstructed neighbors; therefore the vertical constraint J_+ is computed over the interior (non-edge) reconstructed feature points of all images:
J_+ = ∑_i ∑_j (x_ij · y_ij)²
where x_ij is the vector from the left reconstructed neighbor to the right reconstructed neighbor of the j-th feature point of the i-th image, and y_ij is the vector from its upper reconstructed neighbor to its lower reconstructed neighbor.
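The vertical constraint can be sketched the same way, again under our assumption of an (H, W, 3) grid layout of reconstructed centers:

```python
import numpy as np

def vertical_cost(M):
    """J_+: penalize non-orthogonality at interior grid points. For each
    interior point, x is the left->right neighbor vector and y the
    up->down neighbor vector; their dot product should be zero."""
    x = M[1:-1, 2:] - M[1:-1, :-2]   # horizontal neighbor vectors
    y = M[2:, 1:-1] - M[:-2, 1:-1]   # vertical neighbor vectors
    dots = np.einsum('ijk,ijk->ij', x, y)
    return float(np.sum(dots ** 2))
```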
Among the reconstructed feature points obtained by back-projecting the feature point pairs, any point lying on the edge of the grid lies on the line through the interior reconstructed points of its column and on the line through the interior reconstructed points of its row; the cumulative sum of the perpendicular distances of the edge points to these interior lines, over all view images, is computed as the collinearity constraint J_co:
J_co = ∑_i ∑_j d²(p_ij, l(x_1ij, x_2ij))
where l(x_1ij, x_2ij) is the line through x_1ij and x_2ij, p_ij denotes the j-th edge feature point in the i-th image, and x_1ij, x_2ij denote interior feature points of the i-th image lying in the same row or column as p_ij.
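A sketch of the collinearity cost as a sum of squared point-to-line distances; the pairing of each edge point with its two interior anchor points is left to the caller, and all names are ours:

```python
import numpy as np

def point_line_dist(p, a, b):
    """Perpendicular distance from point p to the 3D line through a and b."""
    d = b - a
    return np.linalg.norm(np.cross(p - a, d)) / np.linalg.norm(d)

def collinearity_cost(edge_pts, anchors):
    """J_co: sum of squared distances of each edge point p_ij to the line
    through two interior points x_1ij, x_2ij of its row/column."""
    return float(sum(point_line_dist(p, a, b) ** 2
                     for p, (a, b) in zip(edge_pts, anchors)))
```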
The geometric constraints J_3D, J_e, J_dis, J_co and J_+ are combined to obtain the complete objective function J = J_3D + J_e + J_dis + J_co + J_+. Solving this objective function with the Levenberg-Marquardt method yields the optimal camera parameters and completes the high-precision calibration of the binocular camera.
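The Levenberg-Marquardt refinement pattern can be sketched with SciPy's `least_squares`. The residual here is a toy linear model standing in for the stacked J_3D, J_e, J_dis, J_co and J_+ residual blocks; in the patent's setting the parameter vector would hold the camera intrinsics and extrinsics:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(p, x, y):
    """Toy residual vector; LM minimizes its sum of squares."""
    a, b = p
    return a * x + b - y

x = np.linspace(0.0, 1.0, 20)
y = 3.0 * x + 1.0
sol = least_squares(residuals, x0=[0.0, 0.0], args=(x, y), method='lm')
```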

Claims (8)

1. A method for calibrating a multi-constraint binocular fisheye camera and positioning a space point, characterized by comprising the following steps:
step S1, acquiring original calibration-board images: select left and right binocular fisheye cameras and shoot the calibration object from a plurality of angles, keeping the relative pose between the two cameras unchanged at each shot while changing their position relative to the calibration object, to acquire a calibration-board image group consisting of a plurality of pairs of distorted images;
step S2, initializing internal parameters: extract the effective region of the distorted images from step S1, fit the elliptical imaging boundary of each fisheye image by the least-squares method, obtaining the general ellipse equation Au² + 2Buv + Cv² + 2Du + 2Ev + F = 0, and from the fitting result make an initial estimate of the effective focal lengths (f_x, f_y) of the left and right fisheye cameras:
[equation image omitted in source: initial estimate of (f_x, f_y) from the ellipse coefficients]
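A minimal least-squares fit of the general ellipse equation above might look as follows. Fixing F = −1 to remove the scale ambiguity of the conic is our assumption; the claim does not state a normalization:

```python
import numpy as np

def fit_ellipse(u, v):
    """Least-squares fit of A u^2 + 2B uv + C v^2 + 2D u + 2E v + F = 0
    to the fisheye imaging-boundary pixels, with F fixed to -1."""
    A = np.column_stack([u**2, 2*u*v, v**2, 2*u, 2*v])
    coeffs, *_ = np.linalg.lstsq(A, np.ones_like(u), rcond=None)
    a, b, c, d, e = coeffs
    return a, b, c, d, e, -1.0
```

On a circular boundary of radius r centered at the origin the fit recovers A = C = 1/r² with the cross and linear terms near zero.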
step S3, extracting image coordinates of the control points in the original calibration-board images: perform threshold segmentation on the effective region of the distorted images with a Gaussian-fitted global threshold algorithm, obtain the boundaries of the calibration circles in the distorted images, and calculate the centroids of the calibration circles, which serve as feature point pairs of the left and right fisheye cameras;
step S4, performing monocular calibration on the left and right binocular fisheye cameras: calibrate each camera separately with a spherical projection model according to the Kannala method, obtaining the internal parameter matrix p_l of the left fisheye camera with its external rotation matrix R_l and translation vector t_l, and the internal parameter matrix p_r of the right fisheye camera with its external rotation matrix R_r and translation vector t_r, which serve as the initial parameters of the left and right fisheye cameras;
step S5, calculating space-point coordinates by back-projecting the feature points of the distorted images: back-project the feature point pairs obtained in step S3 using the internal parameters p_l, p_r and the external parameters R_l, R_r, t_l, t_r to obtain the incident-ray parameters θ_l, θ_r:
[equation image omitted in source: back projection giving θ_l and θ_r]
where θ_l represents the angle between the incident ray of the left image and the z-axis of the camera's three-dimensional coordinate system, θ_r the corresponding angle for the right image, φ_l the angle between the projection of the left-image incident ray onto the xy plane of the camera coordinate system and the x-axis, and φ_r the corresponding angle for the right image; back-projecting the feature point pairs onto the unit sphere gives the normalized coordinates
x_l = (sin θ_l cos φ_l, sin θ_l sin φ_l, cos θ_l)^T
x_r = (sin θ_r cos φ_r, sin θ_r sin φ_r, cos θ_r)^T
where x_l denotes the normalized coordinates of a left-image feature point back-projected onto the sphere and x_r those of the corresponding right-image feature point; according to the transformation between the left and right fisheye cameras
λ_r x_r = λ_l R x_l + t
the coefficients λ_l and λ_r are determined, and finally the space-point coordinate X = λ_l x_l is obtained;
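The back projection and the linear solve for λ_l, λ_r can be sketched as a midpoint-style least-squares triangulation, assuming the standard binocular relation λ_r x_r = λ_l R x_l + t; the exact solver the patent uses is not stated and the helper names are ours:

```python
import numpy as np

def sphere_dir(theta, phi):
    """Unit incident-ray direction from the back-projected angles."""
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def triangulate(x_l, x_r, R, t):
    """Solve lambda_l * R x_l - lambda_r * x_r = -t in the least-squares
    sense and return X = lambda_l * x_l in left-camera coordinates."""
    A = np.column_stack([R @ x_l, -x_r])
    lam, *_ = np.linalg.lstsq(A, -t, rcond=None)
    return lam[0] * x_l
```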
step S6, matching with world coordinates: for each feature point M_w with known coordinates in the circular calibration pattern under the world coordinate system, convert M_w from the world coordinate system centered on the calibration object to the feature point M in the camera coordinate system centered on the left fisheye camera; compare the calculated space points X with M, and construct an objective function minimizing the difference between the two groups of feature points in the three-dimensional measurement coordinate system for nonlinear optimization:
J_3D = ∑_i ∑_j ||X_ij − M_ij||²
wherein i represents the serial number of the image, and j represents the serial number of the feature point in each image;
step S7, constructing the optimization objective function: to accelerate convergence of the nonlinear optimization, the epipolar constraint J_e, distance constraint J_dis, collinearity constraint J_co and vertical constraint J_+ are added based on the three-dimensionally reconstructed feature points; the epipolar constraint J_e is an inherent property of a binocular vision system, the distance constraint J_dis is an index checking the reconstructed distances, and the collinearity constraint J_co and vertical constraint J_+ are indices checking the back-projection result; J_3D, J_e, J_dis, J_co and J_+ are combined to obtain the complete objective function J = J_3D + J_e + J_dis + J_co + J_+, which is solved by the Levenberg-Marquardt method to obtain the optimal camera parameters and complete high-precision calibration of the left and right binocular fisheye cameras.
2. The method for multi-constraint binocular fisheye camera calibration and spatial point positioning of claim 1, wherein: when the left and right binocular fisheye cameras shoot in step S1, the calibration object is a black-and-white circular pattern on a liquid crystal display; the radius of each calibration circle is obtained from the display resolution and the physical size of the screen, and the pattern is imaged completely in both the left and right images during shooting.
3. The method for multi-constraint binocular fisheye camera calibration and spatial point positioning of claim 1, wherein: the specific process of step S3 is as follows: first convert the distorted image to a gray image and compute its gray histogram; then segment the image with the Otsu global threshold algorithm to obtain an initial threshold th_0; use the current threshold th_i, where i is the iteration number, to divide the gray histogram into two parts C_0 and C_1; compute the means μ_0, μ_1 and variances σ_0², σ_1² of the two parts and fit two Gaussian distributions N(μ_0, σ_0²) and N(μ_1, σ_1²); evaluate the probability density function of each distribution, denoting by μ_begin the mean of the Gaussian with the smaller variance and by μ_end the mean of the Gaussian with the larger variance; traverse from μ_begin toward μ_end, and when p(x_i|Begin) ≤ p(x_i|End), the optimal threshold found in this iteration is th_{i+1} = x_i; when |th_{i+1} − th_i| < 1, th_i is the global optimal threshold and the iteration stops; finally compute the circle-center image coordinates of the calibration pattern by least-squares ellipse fitting.
4. The method for multi-constraint binocular fisheye camera calibration and spatial point positioning of claim 1, wherein: in step S4, the left fisheye camera coordinate system is a three-dimensional Cartesian coordinate system centered on the left fisheye camera, and the right fisheye camera coordinate system is a three-dimensional Cartesian coordinate system centered on the right fisheye camera.
5. The method for multi-constraint binocular fisheye camera calibration and spatial point positioning of claim 4, wherein: the epipolar constraint J_e in step S7 is obtained from the normalized coordinates of the feature point pairs back-projected onto the unit sphere and the fundamental matrix E = [t]_× R; the medians of {R_i} and {t_i} are selected, where i indexes the image pairs from 1, and the epipolar constraint over the feature points of the left and right image pairs is
J_e = ∑_i ∑_j d²(x_rij, E x_lij)
where x_lij denotes the sphere-normalized coordinates of the j-th feature point in the left image of the i-th image pair, x_rij denotes those of the j-th feature point in the right image of the i-th image pair, d(·,·)² denotes the squared geometric distance, and E denotes the fundamental matrix.
6. The method for multi-constraint binocular fisheye camera calibration and spatial point positioning of claim 5, wherein: the distance constraint J_dis in step S7 is calculated as follows: compute the distance between the back-projected space points corresponding to adjacent feature points of the black-and-white circular pattern in the row direction, and likewise in the column direction; the difference between each such spacing and the actual spacing of adjacent circles on the display is taken as the adjacent-distance error, giving
J_dis = ∑_h ∑_v ||L_dis − d(M_h, M_v)||²
where L_dis is a constant representing the minimum spacing of the circular calibration pattern, and M_h and M_v are adjacent feature points computed by back projection in the horizontal and vertical directions on the calibration pattern.
7. The method for multi-constraint binocular fisheye camera calibration and spatial point positioning of claim 6, wherein: the vertical constraint J_+ in step S7 is calculated as follows: among the three-dimensionally reconstructed feature points obtained by back-projecting the feature point pairs, for any non-edge point the line connecting its left and right reconstructed neighbors is perpendicular to the line connecting its upper and lower reconstructed neighbors; the vertical constraint over the interior (non-edge) reconstructed points of all images is
J_+ = ∑_i ∑_j (x_ij · y_ij)²
where x_ij is the vector from the left reconstructed neighbor to the right reconstructed neighbor of the j-th feature point of the i-th image, and y_ij is the vector from its upper reconstructed neighbor to its lower reconstructed neighbor.
8. The method for multi-constraint binocular fisheye camera calibration and spatial point positioning of claim 7, wherein: the collinearity constraint J_co in step S7 is calculated as follows: any reconstructed feature point on the edge lies on the line through the interior reconstructed points of its column and on the line through the interior reconstructed points of its row; the cumulative sum of the squared perpendicular distances of the edge points to these interior lines over all view images is
J_co = ∑_i ∑_j d²(p_ij, l(x_1ij, x_2ij))
where l(x_1ij, x_2ij) is the line through x_1ij and x_2ij, p_ij denotes the j-th edge feature point in the i-th image, and x_1ij, x_2ij denote interior feature points of the i-th image lying in the same row or column as p_ij.
CN202210655001.4A 2022-05-19 2022-05-19 Method for calibrating multi-constraint binocular fisheye camera and positioning space point Pending CN114998448A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210655001.4A CN114998448A (en) 2022-05-19 2022-05-19 Method for calibrating multi-constraint binocular fisheye camera and positioning space point

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210655001.4A CN114998448A (en) 2022-05-19 2022-05-19 Method for calibrating multi-constraint binocular fisheye camera and positioning space point

Publications (1)

Publication Number Publication Date
CN114998448A true CN114998448A (en) 2022-09-02

Family

ID=83032803

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210655001.4A Pending CN114998448A (en) 2022-05-19 2022-05-19 Method for calibrating multi-constraint binocular fisheye camera and positioning space point

Country Status (1)

Country Link
CN (1) CN114998448A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116543057A (en) * 2023-06-27 2023-08-04 华南理工大学 Underwater multi-camera and IMU integrated calibration method
CN116543057B (en) * 2023-06-27 2023-10-10 华南理工大学 Underwater multi-camera and IMU integrated calibration method
CN117541661A (en) * 2024-01-04 2024-02-09 北京友友天宇系统技术有限公司 Binocular camera external parameter automatic correction method, system, device and storage medium
CN117541661B (en) * 2024-01-04 2024-04-05 北京友友天宇系统技术有限公司 Binocular camera external parameter automatic correction method, system, device and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination