
Robust high-precision camera calibration system

Info

Publication number: CN116563382A
Application number: CN202210111310.5A
Authority: CN
Prior art keywords: camera, calibration, parameters, module, image
Legal status: Pending
Inventors: 杨煦, 黄龙祥, 朱力, 吕方璐, 汪博
Applicant and current assignee: Shenzhen Guangjian Technology Co Ltd
Priority: CN202210111310.5A
Other languages: Chinese (zh)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T3/00: Geometric image transformation in the plane of the image
    • G06T3/60: Rotation of a whole image or part thereof
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30244: Camera pose
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Abstract

The invention provides a robust high-precision camera calibration system, which comprises: an internal reference module for acquiring a first camera internal reference; a measuring module for emitting structured light from the camera to a calibration object and changing the distance between the camera and the calibration object so as to obtain a plurality of calibration images at different depths; a first feature module for processing the plurality of calibration images and acquiring the position information of a plurality of feature points; an external parameter module for calculating, from the feature point positions, the camera internal parameters and the external parameters of the calibration plate relative to the camera; a projection module for selecting the calibration image at a certain distance as a reference image, matching the other calibration images against the reference image, and then computing a projection center from the 3D points generated from the corresponding matched points on the different calibration images; and an optimization module for performing staged iterative optimization on the data acquired by the projection module to obtain all parameters of the camera.

Description

Robust high-precision camera calibration system
Technical Field
The invention relates to the field of camera calibration, in particular to a robust high-precision camera calibration system.
Background
In image measurement and machine vision applications, determining the relationship between the three-dimensional position of a point on the surface of an object in space and its corresponding point in the image requires a geometric model of camera imaging; the parameters of this geometric model are the camera parameters. Camera manufacturers therefore calibrate every camera before it leaves the factory so that it can be applied effectively.
Camera calibration falls into two categories: camera self-calibration, and calibration methods that rely on a calibration reference. In the former, the camera images its surroundings and the camera parameters are obtained by digital image processing and geometric computation; because no dedicated reference is used, all of the camera parameters are fitted from a limited number of trials, the resulting error is relatively large, and the approach is unsuitable for high-precision applications. In the latter, the camera images a calibrated reference and the internal and external parameters are computed by digital image processing followed by spatial arithmetic. This approach normally uses a specific object with distinctive features and accounts for the transformations between different coordinate systems, so its calibration accuracy is high and it suits applications with demanding precision requirements. The scheme of this patent application optimizes the second calibration method.
Camera calibration requires conversions between the world coordinate system, the camera coordinate system, the image coordinate system, and the pixel coordinate system; each conversion is represented by its own set of parameters, and the final parameters are obtained by matrix operations. The calibration process is the process of solving for the external and internal parameters. The external parameters are the rotation and translation of the camera relative to the world coordinate system. The internal parameters are inherent properties of the camera, essentially the focal length and pixel size. The distance of the object from the optical center is also an important factor affecting imaging: if the object sits at different positions relative to the camera during calibration, the camera must be calibrated for each position. To obtain good results, camera manufacturers spend considerable manpower and material resources calibrating in many directions, but because there are many parameters to calibrate, and the process is also affected by the number of captured calibration targets and the accuracy with which they are identified, the optimization is prone to large errors; repeated calibrations of the same camera easily produce several different results, so calibration consistency is poor.
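As a concrete illustration of this chain of coordinate transformations, the following minimal sketch (not part of the patent; the intrinsic and extrinsic values are hypothetical) projects a world point into pixel coordinates through the extrinsic and intrinsic matrices.

```python
import numpy as np

# Hypothetical intrinsics: focal lengths and principal point in pixels.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Hypothetical extrinsics: rotation (identity here) and translation in metres.
R = np.eye(3)
t = np.array([[0.0], [0.0], [0.5]])

def world_to_pixel(p_world):
    """World -> camera -> image -> pixel, ignoring lens distortion."""
    p_cam = R @ p_world.reshape(3, 1) + t   # world frame -> camera frame
    uvw = K @ p_cam                         # camera frame -> homogeneous image coords
    return (uvw[:2] / uvw[2]).ravel()       # perspective divide -> pixel coordinates

print(world_to_pixel(np.array([0.1, -0.05, 1.0])))  # a point 1.5 m in front of the camera
```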
To improve consistency, the prior art mainly improves the calibration object, improves the ambient light, or increases the number of photographs. For example, studies have shown that the calibration results become relatively stable when 20-30 pictures are used, but this clearly lengthens the calibration time and raises the cost in industrial applications.
Disclosure of Invention
Therefore, the invention reduces the number of parameters to be calibrated by using parameters that can be obtained in advance, so that the equations to be solved converge quickly; all parameters to be calibrated are then optimized as a whole. This makes each parameter more accurate, resolves the local-extremum problem in parameter optimization, and gives repeated calibrations better consistency, so that batch calibration of the same camera is more robust. This is an important lever for improving production efficiency and product yield and is of great significance for industrial production.
In a first aspect, the present invention provides a robust high-precision camera calibration system, suitable for monocular camera calibration, comprising:
the internal reference module is used for acquiring a first camera internal reference; the first camera internal reference refers to parameters of the core components of the camera, including the lens focal length, the sensor size, and the like;
the measuring module is used for emitting structured light from the camera to a calibration object and changing the distance between the camera and the calibration object so as to obtain a plurality of calibration images at different depths;
the first feature module is used for processing the plurality of calibration images and acquiring the position information of a plurality of feature points;
the external parameter module is used for calculating, from the feature point position information, the camera internal parameters and the external parameters of the calibration plate relative to the camera; wherein each calibration image corresponds to one plane equation;
the projection module is used for selecting the calibration image at a certain distance as a reference image, matching the other calibration images against the reference image, and then computing a projection center from the 3D points generated from the corresponding matched points on the different calibration images;
and the optimization module is used for optimizing the data acquired by the projection module, all parameters of the camera being obtained by a staged iterative optimization method during the optimization.
Optionally, the robust high-precision camera calibration system is characterized in that the measurement module includes:
a first depth unit, configured to place the calibration object in a field of view of the camera, where a distance from the camera is a first depth; wherein the calibration object covers at least half of the field of view of the camera;
a second depth unit for changing the distance between the calibration object and the camera to a second depth along the optical axis direction of the camera; wherein the calibration object covers at least half of the field of view of the camera;
and the first transmitting unit is used for controlling the camera to transmit the structured light to the calibration object and acquiring the calibration image.
Optionally, the robust high-precision camera calibration system is characterized in that at least two groups of calibration images are provided.
In a second aspect, the present invention provides a robust high-precision camera calibration system, suitable for binocular or multi-view camera calibration, comprising:
the original parameter module is used for acquiring first original parameters of the camera; the camera comprises a first camera and a second camera, wherein the first original parameters refer to parameters of core components of the binocular camera, including lens focal length, sensor size, baseline distance and the like;
the shooting module is used for controlling the camera to respectively obtain a first image and a second image of the calibration object; the first image is shot by the first camera, and the second image is shot by the second camera;
the second feature module is used for processing the first image and the second image and acquiring position information of a plurality of feature points;
And the optimization module is used for optimizing the data acquired by the second characteristic module, and acquiring all parameters of the camera by adopting a staged iterative optimization method during optimization.
Optionally, the robust high-precision camera calibration system is characterized in that the shooting module includes:
the third depth unit is used for placing the calibration object in the common field of view of the first camera and the second camera, the distance between the calibration object and the camera being a third depth;
a fourth depth unit for moving the calibration object to a fourth depth and located in a common field of view of the first camera and the second camera;
and the second transmitting unit is used for controlling the camera to transmit the structured light to the calibration object and acquiring the first image and the second image.
Optionally, the robust high-precision camera calibration system is characterized in that the calibration object covers more than half of the area of the common field of view of the first camera and the second camera.
Optionally, the robust high-precision camera calibration system is characterized in that the optimization method is a nonlinear optimization algorithm.
Optionally, the robust high-precision camera calibration system is applicable to the first aspect and the second aspect, and the optimization module includes:
the first optimizing unit is used for setting the translation parameters to their initial estimates, keeping them unchanged during the optimization, and optimizing the rotation quantities;
and the second optimizing unit is used for optimizing all the external parameters, including the rotation parameters and the translation parameters, taking the rotation quantities optimized in the first stage, together with the translation parameters, as initial values.
Optionally, the robust high-precision camera calibration system is further characterized by comprising:
and the third optimizing unit is used, when the first camera internal reference is unavailable or unreliable, for taking the values of the rotation parameters and translation parameters obtained by the second optimizing unit as initial values and optimizing the camera internal parameters and external parameters simultaneously so as to improve the calibration precision.
In a third aspect, the present invention provides a robust high-precision camera calibration apparatus, comprising:
the first track is used for moving the camera and holding the camera at a first preset position during calibration;
the second rail is used for moving the calibration object and fixing the calibration object at a plurality of second preset positions;
The second track is positioned at one side of the first track and is perpendicular to the first track.
Compared with the prior art, the invention has the following beneficial effects:
The invention uses the parameters that are already available so that the parameter calculation quickly converges to the neighbourhood of the target values, effectively resolving the local-extremum problem. The invention computes each parameter in a step-by-step manner, unlocking previously fixed parameters in later steps, so that every parameter can reach an optimal solution. Through step-by-step optimization of the parameters, the parameter determination process is more stable and the consistency of the optimization is greatly improved. Step-by-step optimization also greatly lowers the demands on the calibration operation, reduces the calibration workload, and improves industrial calibration efficiency. The invention makes batch calibration of the same camera more robust, which is an important lever for improving production efficiency and product yield and is of great significance for industrial production.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only embodiments of the present invention, and that other drawings can be obtained according to the provided drawings without inventive effort for a person skilled in the art. Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, given with reference to the accompanying drawings in which:
FIG. 1 is a schematic diagram of a robust high-precision camera calibration system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an optimized extremum according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a measurement module according to an embodiment of the present invention;
FIG. 4 illustrates a schematic diagram of another robust high precision camera calibration system in accordance with an embodiment of the present invention;
FIG. 5 is a flow chart of another calibration image acquisition in an embodiment of the present invention;
FIG. 6 is a flow chart of a staged optimization process in accordance with an embodiment of the present invention;
FIG. 7 is a light spot used in an embodiment of the present invention;
FIG. 8 is a set of results of tests in an embodiment of the invention;
fig. 9 is a schematic diagram of a robust high-precision camera calibration apparatus according to an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the present invention, but are not intended to limit the invention in any way. It should be noted that variations and modifications could be made by those skilled in the art without departing from the inventive concept. These are all within the scope of the present invention.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented, for example, in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical scheme of the present invention, and how it solves the technical problems above, is described in detail below through specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described again in some embodiments.
The invention provides a robust high-precision camera calibration system that can be used for monocular calibration with a single camera, for the calibration of a monocular structured-light camera or TOF camera consisting of one camera and one laser projector, or for binocular or multi-view calibration with two or more cameras. That is, the system provided by the invention applies to many kinds of camera calibration. Embodiments of the present invention are described below with reference to the accompanying drawings.
FIG. 1 shows a schematic diagram of a robust high-precision camera calibration system in accordance with an embodiment of the present invention. A robust high-precision camera calibration system is suitable for monocular camera calibration. Taking a monocular structured light camera as an example, the high-precision camera calibration system provided by the invention comprises:
An internal reference module 11, configured to obtain a first camera internal reference;
The first camera internal reference refers to parameters of the core components of the camera, including the lens focal length, the sensor size, and the like. A monocular structured-light camera includes a structured-light projector and an IR receiver; some monocular structured-light cameras also include an RGB camera, but this embodiment is described using a monocular structured-light camera with only a structured-light projector and an IR receiver. Calibrating the camera means obtaining the intrinsic matrix Kc of the camera and the intrinsic matrix Kp of the projector, together with the extrinsic parameters between the camera and the projector: the rotation matrix R and the translation matrix T. In addition, because lens distortion in the camera and the projector is unavoidable, the distortion coefficients of the camera and the projector must also be calibrated. Among the parameters to be calibrated, some are closely related to, or simply are, the parameters of the camera's core components. Typically the manufacturer of the camera component provides these parameters, so they can be used directly. If the manufacturer does not provide them, they can be calibrated together with the other parameters.
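As an illustration of how such component parameters can be used directly, the sketch below (an assumption-laden example rather than the patent's implementation; the function name and data-sheet values are hypothetical) converts a lens focal length and pixel pitch into an initial intrinsic matrix.

```python
import numpy as np

def intrinsics_from_datasheet(focal_mm, pixel_um, width_px, height_px):
    """Build an initial intrinsic matrix from component data-sheet values.

    fx = fy = focal length / pixel pitch; the principal point is assumed to
    start at the image centre until refined by calibration.
    """
    f_px = focal_mm / (pixel_um * 1e-3)     # focal length expressed in pixels
    return np.array([[f_px, 0.0,  width_px / 2.0],
                     [0.0,  f_px, height_px / 2.0],
                     [0.0,  0.0,  1.0]])

# Hypothetical values: 2.8 mm lens, 3.0 um pixels, VGA IR sensor.
Kc_init = intrinsics_from_datasheet(2.8, 3.0, 640, 480)
```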
A measurement module 12, configured to emit structured light from the camera to a calibration object, and change a distance between the camera and the calibration object, so as to obtain a plurality of calibration images with different depth distances;
Camera calibration is either self-calibration or calibration with a reference object. Self-calibration works by searching for feature points in ordinary images. Calibration with a calibration object has the advantages that the feature points are easy to compute and stable, and it is also the most widely used approach. The calibration object adopted in this embodiment is a calibration plate. Various calibration plates exist, such as ChArUco boards, checkerboards, and asymmetric circle grids. The following description takes a checkerboard as an example, but other calibration objects can obviously be used to achieve the purpose of this embodiment.
The measurement module 12 places the calibration object in the field of view of the camera and acquires a plurality of calibration images by changing the distance between the calibration object and the camera. A calibration image is an image, containing the calibration object, captured by the camera to be calibrated. To change the distance between the calibration object and the camera, the camera alone, the calibration object alone, or both may be moved; this embodiment does not restrict the specific implementation as long as the relative distance between the two can be changed. Compared with arranging several calibration objects at different distances from the camera, changing the distance by moving keeps the calibration object itself consistent, avoiding the small detail-level differences that exist between different calibration objects and thus giving better calibration consistency. A single calibration object is used within one calibration run to maintain calibration accuracy.
While the camera or the calibration object is being moved, it must move along the optical axis direction so that the calibration images obtained at different depths share the same viewing angle, which improves the consistency of the comparison. Preferably, the calibration object is placed perpendicular to the optical axis of the camera and lies on the optical axis.
The first feature module 13 is configured to process a plurality of calibration images to obtain position information of a plurality of feature points;
Different calibration objects have different feature points. For a checkerboard calibration plate, the feature points are the corner points where diagonally adjacent black and white squares meet; they are very easy to identify because each is the intersection of two straight lines and the boundary between two different colours. The regular distribution of many such feature points makes it easier to calibrate the camera over a larger area. Alternatively, if the calibration object is a circle-grid plate, the centres of the circles are extracted as features, and if it is a cross-hair target, the centre of the cross is extracted. After the features are extracted, the camera internal parameters and distortion coefficients are computed from the feature positions, for example with the Zhang Zhengyou calibration method.
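A minimal sketch of this feature-extraction-plus-Zhang-calibration step, written with OpenCV; the board geometry, square size, and image file names are assumptions, not values from the patent.

```python
import cv2
import numpy as np

# Assumed checkerboard geometry: 9x6 inner corners, 25 mm squares.
pattern = (9, 6)
square = 25.0
obj_grid = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj_grid[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for path in ["depth1.png", "depth2.png"]:          # hypothetical calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    ok, corners = cv2.findChessboardCorners(gray, pattern)
    if ok:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(obj_grid)
        img_points.append(corners)

# Zhang-style calibration: intrinsics K, distortion coefficients, per-view extrinsics.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```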
An external parameter module 14, configured to calculate, from the feature point position information, the camera internal parameters and the external parameters of the calibration board relative to the camera;
each calibration image corresponds to one plane equation, and a plurality of calibration images correspond to a plurality of different plane equations.
When the center of the calibration object lies on the optical axis of the camera, the coordinate mapping between the pixel coordinate system and the world coordinate system follows from Zhang's calibration method:

$$ s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & \gamma & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} $$

where u, v are coordinates in the pixel coordinate system; f_x, f_y, u_0, v_0, γ are the 5 camera internal parameters (γ is the skew between the two coordinate axes caused by manufacturing error; it is usually very small and the matrix operation may return a value of 0); R, t are the camera external parameters; and x_w, y_w, z_w are coordinates in the world coordinate system.

f_x and f_y are related to the physical focal length f by f_x = f·s_x and f_y = f·s_y, where s_x (respectively s_y) is the number of pixels per millimetre in the x (respectively y) direction.

(x_w, y_w) are the spatial coordinates of the calibration object; they are controlled by the designer and therefore known. (u, v) are the pixel coordinates, which are obtained directly from the camera. Each correspondence (x_w, y_w) ↔ (u, v) yields two equations.

There are 8 unknowns to solve for, so at least eight equations, i.e. four corresponding points, are required; from four points the homography matrix H from the image plane to the world plane can be computed. To improve accuracy, reduce error, and make the result more robust, many photographs are usually taken.
Because the size of the calibration plate and the position of the characteristic points are known, the plane equation of the calibration object under the camera coordinate system can be calculated through the position information of the characteristic points.
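One way to obtain that plane equation, sketched below under the assumption that the board pose is recovered with OpenCV's solvePnP (the patent does not prescribe a specific routine), is to take the board's z-axis in the camera frame as the plane normal.

```python
import cv2
import numpy as np

def board_plane_in_camera(obj_grid, corners, K, dist):
    """Return (n, d) of the board plane n.x + d = 0 in the camera frame.

    obj_grid: Nx3 board-frame corner coordinates (z = 0 on the board plane);
    corners:  Nx1x2 detected image corners for one calibration image.
    """
    ok, rvec, tvec = cv2.solvePnP(obj_grid, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    n = R[:, 2]                      # board z-axis expressed in the camera frame
    d = -float(n @ tvec.ravel())     # plane passes through the board origin at t
    return n, d
```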
The speckle image is processed with a speckle extraction method, for example blob analysis or image segmentation, to obtain the speckle centres. The speckles at different positions are matched with a matching algorithm such as block matching, SIFT, or optical flow.
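A minimal sketch of the blob-analysis variant using OpenCV's SimpleBlobDetector; the area thresholds and the bright-spot assumption are illustrative choices, not values from the patent.

```python
import cv2
import numpy as np

def speckle_centers(speckle_img_path):
    """Locate speckle centres by blob analysis (one of the methods named above)."""
    img = cv2.imread(speckle_img_path, cv2.IMREAD_GRAYSCALE)

    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 2.0          # assumed spot size range, in pixels^2
    params.maxArea = 60.0
    params.filterByColor = True
    params.blobColor = 255        # bright speckles on a dark background

    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(img)
    return np.array([kp.pt for kp in keypoints], dtype=np.float32)
```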
The projection module 15 is configured to select the calibration image at a certain distance as the reference image, match the other calibration images against the reference image, and then compute a projection center from the 3D points generated from the corresponding matched points on the different calibration images;
The first feature module 13 is used to locate the calibration object, yielding the homography between the calibration-object plane and the image plane. Using the camera internal parameters and distortion parameters computed by the first feature module 13, the plane equation of the physical calibration-object plane in the camera coordinate system is computed; at the same time, the speckle positions extracted in the external parameter module 14 are transformed onto the physical plane of the calibration object via this homography, giving the three-dimensional coordinates on the calibration object in the camera coordinate system. Using the matching relationship obtained by the external parameter module 14, the same speckle points are connected by straight lines. By the optical properties of the projector, the lines corresponding to the different speckles converge at the projector center. This center serves as the initial value for the subsequent optimization.
Identifying calibration images at different depths yields different feature point positions and the coordinates of feature points lying on the same optical path, from which the projection center can be obtained. In theory, two calibration images at different depths suffice to solve for the projection center. However, because many parameters must be determined, the prior art generally uses more than 4 groups of photographs to obtain them all. In this embodiment, some parameters are fixed using the first camera internal reference, so all remaining parameters can be obtained with only 2 groups of calibration images.
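One common way to compute that projection center, sketched below as an assumption rather than the patent's exact procedure, is a least-squares intersection of the rays defined by matched speckle points reconstructed at the two depths.

```python
import numpy as np

def projector_center(points_near, points_far):
    """Least-squares intersection of the rays joining matched 3D speckle points.

    points_near, points_far: Nx3 arrays of the same speckles reconstructed on the
    calibration plane at two different depths; each pair defines one projection ray.
    """
    P = np.asarray(points_near, dtype=float)
    D = np.asarray(points_far, dtype=float) - P
    D /= np.linalg.norm(D, axis=1, keepdims=True)       # unit ray directions

    # Solve sum_i (I - d_i d_i^T) x = sum_i (I - d_i d_i^T) p_i for x.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(P, D):
        M = np.eye(3) - np.outer(d, d)
        A += M
        b += M @ p
    return np.linalg.solve(A, b)                        # initial projector-centre estimate
```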
The optimizing module 16 is configured to optimize the data acquired by the projection module 15, and obtain all parameters of the camera by adopting a method of iterative optimization in stages during optimization.
When the computed data are optimized, a staged approach is used: part of the parameters are fixed first so that the other parameters converge quickly. In practice the objective has several local extrema, and the optimization often converges to a local extremum rather than the best one. Taking FIG. 2 as an example, there are three local extrema b, c, e, of which e is the best. Starting from the initial value a, the optimization tends to converge to the nearby local extremum b rather than to the optimal extremum e. By fixing part of the parameters first, the optimization module 16 forces the parameters towards the optimal values, which both accelerates convergence and avoids being trapped at a local extremum.
In terms of computation time, the optimization module 16 takes slightly longer than the one-shot overall optimization of the prior art, but the steps preceding the optimization module 16 take less time, so overall the approach saves time and calibration cost while improving data consistency, which better suits the needs of industrial production.
In some embodiments, as shown in fig. 3, the measurement module 12 includes:
a first depth unit 121, configured to place the calibration object in a field of view of the camera, where a distance from the camera is a first depth;
The calibration object is placed at different positions in the field of view of the monocular structured-light camera and calibration images are captured, preferably covering the whole field of view as far as possible; for example, if 9 calibration images are captured, the calibration object should appear in different regions of the image (upper-left corner, upper-right corner, left side, middle, right side, lower-left corner, lower-right corner, and so on), one image per position.
A second depth unit 122 for changing the distance between the calibration object and the camera to a second depth along the optical axis direction of the camera.
The distance between the calibration object and the camera is changed to a second depth in one of the ways described in the previous embodiment. The calibration object covers at least half of the field of view of the camera. Extensive testing by the applicant shows that, for depth cameras applied in this field, the difference between the second depth and the first depth should be at least 3 times the focal length of the camera to obtain good calibration accuracy while keeping to two groups of calibration images. The second depth differs from the first depth, and there may be several different second depths to achieve a better calibration effect.
A first emitting unit 123, configured to control the camera to emit structured light to the calibration object, so as to obtain the calibration image;
only the structured light is emitted to obtain the facula image. The first transmitting unit 123 may photograph both the calibration object of the first depth and the calibration object of the second depth. The calibration image includes a calibration object image and a speckle image. The calibration object is placed at different positions of the field of view of the monocular structured light camera, the image of the calibration object and the speckle image are photographed, preferably, the speckle image and the image of the calibration object are photographed separately, i.e. one image is the image of the calibration object and the other image is the image of the speckle irradiated on the calibration object, and the monocular structured light camera and the calibration object in the two images are kept fixed without relative movement. Preferably, the calibration object moves in a direction perpendicular to the optical axis of the camera. The calibration object covers at least half of the field of view of the camera.
This embodiment specifies the procedure for capturing several groups of calibration images and works together with existing image calibration, broadening the application scenarios of the embodiment. It ensures that as few as 2 groups of captured images meet the application requirements and yields better calibration accuracy under the same conditions, thereby improving the robustness of the data. Applying this embodiment allows industrial calibration data to be obtained faster and with better consistency, which is better suited to industrial use.
FIG. 4 illustrates a schematic diagram of another robust high precision camera calibration system in accordance with an embodiment of the present invention. A robust high precision camera calibration system suitable for binocular or multi-view camera calibration, comprising:
an original parameter module 21, configured to obtain a first original parameter of the camera;
Unlike a monocular camera, a binocular or multi-view camera has two or more imaging cameras; the following description takes a binocular camera as an example. A binocular camera consists of two monocular cameras whose separation (the baseline) is known and is used when estimating the spatial position of each pixel. Binocular cameras have clear advantages: the larger the baseline, the farther they can measure, and they can be used both indoors and outdoors. However, their configuration and calibration are complex, the depth range and accuracy are limited by the baseline and the resolution, special processing is needed, and the computational load is large. Compared with a monocular camera, a binocular camera has the baseline distance as a parameter in addition to the parameters of each individual camera. During calibration, the calibration object must be placed in the common field of view of the two separate cameras. The camera comprises a first camera head and a second camera head, and the first original parameters refer to parameters of the core components of the binocular camera, including the lens focal length, the sensor size, the baseline distance, and the like.
A shooting module 22, configured to control the camera to obtain a first image and a second image of the calibration object respectively;
The first image is captured by the first camera and the second image by the second camera. The two images are captured with all conditions held fixed, including the distance between the calibration object and the camera. Both images contain the calibration object, i.e. the calibration object lies in the common field of view of the binocular camera. Note that part of the calibration object may lie outside the common field of view.
A second feature module 23, configured to process the first image and the second image, and acquire a plurality of feature point location information;
a plurality of feature point position information can be obtained on both the first image and the second image, and each image obtains a set of feature point position information. The camera center can be obtained through a plurality of groups of characteristic point position information data. Like a monocular camera, a binocular camera also requires at least 2 sets of calibration images to obtain camera center data.
Unlike monocular calibration, binocular calibration also requires epipolar rectification so that the optical axes of the two cameras become exactly parallel, which allows the subsequent depth computation and three-dimensional reconstruction to proceed. For example, OpenCV implements Bouguet's epipolar rectification algorithm. Many specific algorithms have been published and are not repeated in this embodiment.
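A minimal sketch of this stereo calibration and Bouguet rectification step with OpenCV; the variable names and the use of CALIB_USE_INTRINSIC_GUESS to keep the data-sheet intrinsics are assumptions for illustration.

```python
import cv2

def calibrate_and_rectify(obj_points, left_points, right_points,
                          K1, d1, K2, d2, image_size):
    """Stereo calibration followed by Bouguet epipolar rectification.

    The correspondence lists come from the second feature module; K1, d1, K2, d2
    are seeded from the first original parameters (data-sheet values).
    """
    flags = cv2.CALIB_USE_INTRINSIC_GUESS   # keep the data-sheet intrinsics as a starting point
    (rms, K1, d1, K2, d2, R, T, E, F) = cv2.stereoCalibrate(
        obj_points, left_points, right_points, K1, d1, K2, d2, image_size, flags=flags)

    # Rectification makes the two optical axes parallel so that corresponding
    # points lie on the same image row, as required for later depth computation.
    R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(K1, d1, K2, d2, image_size, R, T)
    return R, T, R1, R2, P1, P2, Q
```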
The optimizing module 24 is configured to optimize the data acquired by the second feature module 23, and obtain all parameters of the camera by adopting a staged iterative optimization method during optimization.
The optimization method is a nonlinear optimization algorithm. This embodiment uses Levenberg-Marquardt (the LM algorithm) for the staged optimization, although other nonlinear optimization algorithms could be used. The optimization module 24 follows essentially the same workflow as the optimization module 16, except that the parameters differ because of the differences between binocular and monocular cameras. When the computed data are optimized, a staged approach is used: part of the parameters are fixed first so that the other parameters converge quickly. In practice the objective has several local extrema, and the optimization often converges to a local extremum rather than the best one. In this embodiment, part of the parameters are fixed first to force the parameters towards the optimal values, which both accelerates convergence and avoids being trapped at a local extremum.
In terms of computation time, the optimization module 24 takes slightly longer than the one-shot overall optimization of the prior art, but the steps preceding the optimization module 24 take less time, so overall the approach saves time and calibration cost while improving data consistency, which better suits the needs of industrial production.
By using the initial parameters of the binocular camera, the calibration process converges quickly and avoids being trapped at a local extremum, so the calibration results are more accurate and more consistent. This better suits the needs of industrial production, can greatly reduce the calibration cost for camera manufacturers, and improves calibration efficiency.
In some embodiments, as shown in fig. 5, the photographing module 22 includes:
a third depth unit 221, configured to place the calibration object in a common field of view of the first camera and the second camera, where a distance from the camera is a third depth;
and placing the calibration object in the common field of view of the first camera and the second camera, and recording the distance between the calibration object and the camera as a third depth. The calibration object may have a portion that is not in the common field of view. The calibration object covers more than half of the area of the common field of view of the first camera and the second camera.
A fourth depth unit 222 for moving the calibration object to a fourth depth and located in a common field of view of the first camera and the second camera.
The calibration object may have a portion that is not in the common field of view. At least one of the third depth and the fourth depth is a position where the calibration object is completely in the common field of view of the camera. The calibration object covers more than half of the area of the common field of view of the first camera and the second camera. The fourth depth is a different value than the third depth. The fourth depth may include a plurality of depth values to increase the number of calibration times and improve the calibration accuracy.
A second emitting unit 223, configured to control the camera to emit structured light to the calibration object, and acquire the first image and the second image;
the calibration object in the common field of view of the first image and the second image may be a set of calibration objects or may be a single calibration object with a larger size. The more the feature points of the calibration object are, the better the identification effect is. Through a large amount of data tests of the applicant, for a depth camera applied to the consumption field, the difference between the second depth and the first depth is at least 3 times of the focal length of the camera, so that better calibration accuracy can be obtained under the condition that two groups of calibration images are ensured. The number of the characteristic points of the calibration plate is at least 72, so that the calibration requirement of the depth camera applied to the consumer field can be met.
This embodiment specifies the procedure for capturing several groups of calibration images and works together with existing image calibration, broadening the application scenarios of the embodiment. It ensures that as few as 2 groups of captured images meet the application requirements and yields better calibration accuracy under the same conditions, thereby improving the robustness of the data. Applying this embodiment allows industrial calibration data to be obtained faster and with better consistency, which is better suited to industrial use.
As shown in fig. 6, applicable to the foregoing embodiments, in some embodiments, the optimization module 16 or the optimization module 24 includes:
The first optimizing unit 61 is configured to set the translation parameters to their initial estimates, keep them unchanged during the optimization, and optimize the rotation quantities;
This embodiment uses Levenberg-Marquardt (hereinafter the LM algorithm) for the staged optimization. Taking a monocular structured-light camera as an example, the rotation quantities Rx, Ry, Rz and the translation Tx are optimized; preferably, when the structural information of the monocular structured-light camera is known, for example when the camera and the projector are mounted horizontally in parallel, the baseline distance Tx is known, and Ty and Tz are close to 0, then Tx, Ty, Tz may be fixed to their design values in the first stage and excluded from the optimization, i.e. the first stage optimizes only Rx, Ry, Rz. Note that the parameters in this embodiment are illustrative; when more or fewer parameters are known, the optimization procedure is unaffected, and only the set of initially estimated parameters grows or shrinks accordingly.
The second optimizing unit 62 is configured to optimize all the external parameters, including the rotation parameters and the translation parameters, taking the rotation and translation values from the first stage as initial values.
That is, the second optimizing unit 62 takes the Rx, Ry, Rz and Tx, Ty, Tz from the first optimizing unit 61 as initial values and optimizes all the external parameters, including the rotation parameters Rx, Ry, Rz and the translation parameters Tx, Ty, Tz.
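A minimal sketch of this two-stage refinement using SciPy's Levenberg-Marquardt-style least_squares; the residual function is a placeholder that would wrap the reprojection error of the chosen camera model, and the staging mirrors the first and second optimizing units described above.

```python
import numpy as np
from scipy.optimize import least_squares

def staged_refine(residual_fn, R0, T0):
    """Two-stage refinement of the extrinsics [Rx, Ry, Rz, Tx, Ty, Tz].

    residual_fn(params) must return the reprojection residual vector for a full
    6-vector of extrinsics (its definition depends on the camera model and data).
    """
    # Stage 1: hold the translation at its initial estimate, optimise rotation only.
    stage1 = least_squares(lambda r: residual_fn(np.concatenate([r, T0])),
                           R0, method="lm")

    # Stage 2: start from the stage-1 result and optimise rotation and translation together.
    stage2 = least_squares(residual_fn,
                           np.concatenate([stage1.x, T0]), method="lm")
    return stage2.x
```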
To examine the effect of this embodiment, the applicant ran tests with the light spots shown in FIG. 7, using the binocular camera spacing as the test index. As shown in FIG. 8, the distance between the binocular cameras was set to 7 values in total: 5, 8, 10, 13, 15, 18, 20, and this embodiment was compared with the standard Zhang Zhengyou calibration method (the conventional scheme). The average error rate of the conventional scheme is 11%, while that of this embodiment is 0.88%, about 1/12 of the conventional error rate, so the measurement accuracy is greatly improved.
This embodiment specifies a subdivided structure for the step-by-step optimization and can flexibly exploit known parameters, making the camera calibration process more flexible and accurate. It suits the initial calibration of many kinds of cameras, speeds up the convergence of the calibration, avoids the local-extremum problem, and improves the consistency of the calibration results, resolving the problem of different results being produced by different operations, so the results are more robust.
In some embodiments, when the first camera internal reference is unavailable or unreliable, the system further comprises:
and a third optimizing unit 63 for optimizing the camera internal parameter and the camera external parameter to improve the calibration accuracy, with the values of the rotation parameter and the translation parameter obtained by the second optimizing unit 62 as initial values.
When the camera internal parameters are unreliable, for example because the camera was not calibrated accurately at the factory or a structural change of the camera altered the parameters, the optimization results of the first two stages can be used as initial values while the camera internal parameters, the distortion coefficients, and the camera external parameters (fx, fy, cx, cy, k1, k2, p1, p2, Rx, Ry, Rz, Tx, Ty, Tz) are optimized together, improving the calibration accuracy.
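Continuing the sketch above, the third stage can be expressed by simply widening the parameter vector to include the intrinsics and distortion coefficients; the function and argument names are again placeholders.

```python
import numpy as np
from scipy.optimize import least_squares

def stage3_refine(full_residual_fn, K_params, dist_params, extrinsics):
    """Joint refinement when the intrinsics are not trustworthy.

    full_residual_fn takes [fx, fy, cx, cy, k1, k2, p1, p2, Rx, Ry, Rz, Tx, Ty, Tz]
    and returns the reprojection residuals; the stage-2 extrinsics seed the tail.
    """
    x0 = np.concatenate([K_params, dist_params, extrinsics])
    return least_squares(full_residual_fn, x0, method="lm").x
```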
Alternatively, the optimization may also be performed by only the first optimizing unit 61 and the third optimizing unit 63 to shorten the optimization time.
The third optimizing unit 63 can be used as an alternative path for the case where the internal reference is unreliable, so as to meet industrial requirements for camera calibration. It can also be executed to obtain accurate calibration results when the calibration results are abnormal, which usually indicates that the camera internal reference is unreliable or that the parameters have changed because the camera is damaged.
Fig. 9 shows a robust high precision camera calibration apparatus in an embodiment of the invention. A robust high-precision camera calibration device comprises:
The first rail 91 is used for moving the camera and holding it at a first preset position during calibration. The first rail 91 may be connected to a camera production line so that the cameras produced can be calibrated quickly.
And a second rail 92 for moving the calibration object and fixing the calibration object at a plurality of second preset positions. The second rail 92 is located at one side of the first rail 91 and is perpendicular to the first rail.
On the first rail 91, a first positioning device is provided at the junction with the second rail 92 to fix the camera position, with the lens facing the second rail 92 so that it can photograph the calibration object on the second rail 92. The first positioning device may be a baffle, or a device that lifts the camera off the rail. The first positioning device is movable; once calibration is complete, the camera continues to be transported along the rail. If the first positioning device is a baffle, the baffle is placed on the first rail 91 when the camera reaches or is about to reach the second rail 92 so that the camera stops there; when calibration is complete, the baffle moves aside so the camera can pass and continue forward. If the first positioning device lifts the camera off the rail, it holds the camera at the first preset position, off the first rail 91, from when the camera reaches or is about to reach the second rail 92 until calibration is complete; the first positioning device then returns the camera to the first rail so that it enters the next process step. The position at which the first positioning device returns the camera to the first rail differs from the position at which it removed the camera from the rail.
A second positioning device 93 is provided on the second rail 92. The second positioning device 93 may be connected to the calibration plate and cause the calibration plate to rest at a plurality of second predetermined positions on the second rail 92. The second positioning device 93 may have the same structure as the first positioning device or may be different from the first positioning device.
When the first rail 91 and the second rail 92 are connected to a camera production line, they form one group, and one production line can be connected to several groups at the same time. Because a camera production line usually runs continuously while the first rail 91 and the second rail 92 must hold each camera for a period of time, several groups of first rails 91 and second rails 92 are needed to match the production line, meeting its throughput and improving efficiency.
Besides the monocular structured-light camera optimization described in the embodiments, the calibration system of the invention can be used in any optimization where some information is known with high confidence. For example, in monocular camera calibration, the Zhang Zhengyou method commonly used in industry makes no assumptions about the camera parameters and therefore does not exploit useful information available for some cameras. When the precise value of the lens focal length is known in advance, it can be used as an initial value that is fixed and excluded from the first optimization stage while the other unknown parameters are optimized to more accurate values; in the second stage, the more accurate result of the first stage serves as the initial value and all parameters to be optimized are optimized, achieving a more accurate result.
Similarly, in binocular camera calibration, the structural information of the binocular camera can be used as an initial value that is fixed and excluded from the first optimization stage while the other unknown parameters are optimized to more accurate values; in the second stage, the more accurate result of the first stage serves as the initial value and all parameters to be optimized are optimized, achieving a more accurate result.
The method addresses a practical difficulty: because there are many parameters and the calibration result depends strongly on how the calibration is carried out, different operators calibrating the same device, different placements of the calibration object, or errors in the calibration-object localization algorithm often cause the results for the same device to differ widely, and the optimization easily converges to a local extremum. Through staged calibration, constraints are added at the initial stage to obtain a better initial optimization value, and the constraints are then gradually released in later stages to optimize more parameters, achieving higher precision and better consistency.
Preferably, the invention uses prior information about the device, such as the camera focal length or the baseline distance of a monocular or binocular structured-light system, as a constraint in the optimization, so that the more accurate prior information participates in the optimization and the calibration results are more consistent. This reduces the influence of errors in the calibration procedure and in the calibration-object localization algorithm on the accuracy and consistency of the calibration results.
In the present specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different point from other embodiments, and identical and similar parts between the embodiments are all enough to refer to each other. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing describes specific embodiments of the present invention. It is to be understood that the invention is not limited to the particular embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the claims without affecting the spirit of the invention.

Claims (10)

1. A robust high precision camera calibration system suitable for monocular camera calibration, comprising:
The internal reference module is used for acquiring a first camera internal reference; the first camera internal parameters refer to parameters of core components of the camera, including lens focal length, sensor size and the like;
the measuring module is used for emitting structured light to a calibration object by the camera and changing the distance between the camera and the calibration object so as to obtain a plurality of calibration images with different depth distances;
the first feature module is used for processing the plurality of calibration images and acquiring position information of a plurality of feature points;
the external parameter module is used for calculating the camera internal parameters and the external parameters of the calibration plate relative to the camera by using the position information of the feature points; wherein each calibration image corresponds to a plane equation;
the projection module is used for selecting the calibration image with a certain distance as a reference image, matching other calibration images with the reference image, and then obtaining a projection center according to 3D points generated correspondingly to corresponding matching points on different calibration images;
and the optimization module is used for optimizing the data acquired by the projection module, and acquiring all parameters of the camera by adopting a staged iterative optimization method during optimization.
2. A robust high precision camera calibration system according to claim 1, wherein the measuring module comprises:
a first depth unit for placing the calibration object in the field of view of the camera at a distance of a first depth from the camera; wherein the calibration object covers at least half of the field of view of the camera;
a second depth unit for changing the distance between the calibration object and the camera to a second depth along the optical axis direction of the camera; wherein the calibration object covers at least half of the field of view of the camera;
and the first transmitting unit is used for controlling the camera to transmit the structured light to the calibration object and acquiring the calibration image.
3. A robust high precision camera calibration system according to claim 1, wherein there are at least two sets of calibration images.
4. A robust high precision camera calibration system suitable for binocular or multi-view camera calibration, comprising:
the original parameter module is used for acquiring first original parameters of the camera; wherein the camera comprises a first camera and a second camera, and the first original parameters are parameters of core components of the binocular camera, including the lens focal length, the sensor size, the baseline distance, and the like;
the shooting module is used for controlling the camera to respectively obtain a first image and a second image of the calibration object; the first image is shot by the first camera, and the second image is shot by the second camera;
The second feature module is used for processing the first image and the second image and acquiring position information of a plurality of feature points;
and the optimization module is used for optimizing the data acquired by the second characteristic module, and acquiring all parameters of the camera by adopting a staged iterative optimization method during optimization.
5. The robust high precision camera calibration system according to claim 4, wherein the shooting module comprises:
the third depth unit is used for placing the calibration object in the common field of view of the first camera and the second camera, at a distance of a third depth from the camera;
a fourth depth unit for moving the calibration object to a fourth depth while keeping it in the common field of view of the first camera and the second camera;
and the second transmitting unit is used for controlling the camera to transmit the structured light to the calibration object and acquiring the first image and the second image.
6. The robust high precision camera calibration system according to claim 4, wherein the calibration object covers more than half of an area of a common field of view of the first camera and the second camera.
7. The robust high precision camera calibration system according to claim 4, wherein the optimization method is a nonlinear optimization algorithm.
8. A robust high precision camera calibration system according to claim 1 or 4, wherein the optimization module comprises:
the first optimizing unit is used for adopting an initial estimate for the translation parameters among the parameters, keeping them unchanged during the optimization process, and optimizing the rotation quantities;
and the second optimizing unit is used for taking the rotation quantities optimized in the first stage and the translation parameters as initial values and optimizing all the external parameters, including the rotation parameters and the translation parameters.
9. The robust high precision camera calibration system according to claim 8, further comprising:
and the third optimizing unit is used for, when the first camera internal parameters or the first original parameters are unreliable, taking the values of the rotation parameters and the translation parameters obtained by the second optimizing unit as initial values and optimizing the camera internal parameters and the camera external parameters simultaneously, so as to improve the calibration precision.
10. A robust high precision camera calibration apparatus, comprising:
the first track is used for moving the camera and fixing the camera at a first preset position during the calibration;
the second track is used for moving the calibration object and fixing the calibration object at a plurality of second preset positions;
the second track is positioned at one side of the first track and is perpendicular to the first track.
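For illustration only, the projection-center computation named in claim 1 (obtaining a projection center from the 3D points generated by corresponding matches on calibration images at different depths) is not spelled out algorithmically in the claims. One common way such a step could be realized, under the assumption that each matched feature observed at two depths defines a ray passing through the projection center, is a least-squares ray intersection of the following form; all names are hypothetical and this sketch is not asserted to be the patented method.

import numpy as np

def estimate_projection_center(points_near, points_far):
    # points_near and points_far are (N, 3) arrays of 3D points reconstructed
    # for the same matched features on calibration images at two depths.
    # Each pair defines a line that should pass through the projection center;
    # solve for the point minimizing the squared distance to all such lines.
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, q in zip(points_near, points_far):
        d = (q - p) / np.linalg.norm(q - p)   # unit direction of the ray
        M = np.eye(3) - np.outer(d, d)        # projects onto the plane normal to the ray
        A += M
        b += M @ p
    return np.linalg.solve(A, b)              # requires rays that are not all parallel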
CN202210111310.5A 2022-01-29 2022-01-29 Robust high-precision camera calibration system Pending CN116563382A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210111310.5A CN116563382A (en) 2022-01-29 2022-01-29 Robust high-precision camera calibration system


Publications (1)

Publication Number Publication Date
CN116563382A (en) 2023-08-08

Family

ID=87498808

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210111310.5A Pending CN116563382A (en) 2022-01-29 2022-01-29 Robust high-precision camera calibration system

Country Status (1)

Country Link
CN (1) CN116563382A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination